There are people who will tell you that the Earth is flat. It is not simply ignorance, and it is not a case of falling victim to fake news. The idea that our planet is not a sphere, and that scientists, the government, and the media do not want us to know this, is one of many beliefs with complicated, multi-faceted social and psychological roots. Conspiracy theories are common: Bush did 9/11. Barack Obama is not a United States citizen. Vaccines cause autism. The Holocaust never happened. Hillary Clinton runs a child sex-trafficking ring out of a Washington, DC pizzeria. These theories, as far-fetched as they may seem, have gained popularity among seemingly ordinary people.
Belief in conspiracy theories is more complicated than being duped by an intentionally fake Facebook post. People choose to believe in conspiracy theories, and in doing so, they are consciously taking up a position on an issue. Conspiracy theories occupy a space between belief and fact. Social media sites, which offer conspiracy theorists a platform for their ideas, allow such theories to proliferate. This is not directly the fault of social media companies; rather, it is a result of the affordances of such platforms, which facilitate the social and psychological processes that allow people to find, discuss, believe, and perpetuate conspiracy theories.
Who believes conspiracy theories, and why?
The people most likely to believe a conspiracy theory are those who already believe in at least one other conspiracy theory (Goertzel, 1994). This makes sense in relation to other scholarly findings on conspiracy theory belief. According to Goertzel (1994), belief in conspiracy theories positively correlates with feelings of discontent and distrust of other people. Once you have accepted one conspiracy theory, you have accepted the idea that information is being withheld by people who are supposed to have your best interests at heart. If there is one conspiracy, it stands to reason that there are many more.
Conspiracy theories also thrive on negative emotions, particularly emotions of uncertainty, such as fear and anxiety (van Prooijen, 2018). When someone is distrustful of the world around them and is living in a state of uncertainty, an idea that suggests that the world really is manipulated by a group of influential others makes sense and justifies those negative emotions.
Oliver and Wood (2014) found that roughly half of the population of the United States believes in at least one conspiracy theory. A significant portion of the population also believes that “unseen, intentional forces” exist and are influencing events in our society (Oliver & Wood, 2014). Ironically, there are unseen forces influencing what we talk about, and they are called algorithms.
The power of algorithms
Content-filtering algorithms, such as Facebook’s, aim to personalize each user’s experience on a social media platform. In Twitter and Tear Gas, Zeynep Tufekci (2017) unpacks the decisions that go into news-feed-sorting algorithms, pointing out that the primary objective of social media companies like Facebook is to keep users engaged and thus generate advertising revenue. Facebook can shape the emotional tone of the content we see, and it is in Facebook’s best interest to show us content that we will engage with: content that taps into our emotions. The more you “like” a certain type of post, the more similar posts you will see. Groups of users who share content that feeds into an emotional state (either positive or negative) are therefore not only able to communicate and grow easily on algorithm-based social media sites, but are also valuable to the companies behind them.
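To make that mechanism concrete, here is a minimal sketch, in Python, of engagement-based feed ranking. It is purely illustrative: the post fields, the similarity measure, and the weight on emotional intensity are invented for this example, not drawn from Facebook’s actual system.

    # Illustrative only: rank posts by similarity to what the user already
    # likes, plus a boost for emotionally charged content.
    def engagement_score(post, user):
        shared = post["topics"] & user["liked_topics"]
        similarity = len(shared) / max(len(post["topics"]), 1)
        return similarity + 0.5 * post["emotional_intensity"]

    def rank_feed(posts, user):
        # The most "engaging" posts come first, regardless of accuracy.
        return sorted(posts, key=lambda p: engagement_score(p, user), reverse=True)

    user = {"liked_topics": {"vaccines", "government"}}
    posts = [
        {"id": 1, "topics": {"vaccines", "government"}, "emotional_intensity": 0.9},
        {"id": 2, "topics": {"gardening"}, "emotional_intensity": 0.1},
    ]
    print([p["id"] for p in rank_feed(posts, user)])  # prints [1, 2]

In a system like this, each “like” expands the user’s liked topics, so every interaction narrows the feed a little further toward similar, emotionally charged material.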
Feed-personalization algorithms are not the only algorithms shaping the content we see from behind the scenes. YouTube has recently come under scrutiny for its recommendation algorithm, which has apparently been exposing users to conspiracy theory videos. The algorithm, which displays suggested videos in a sidebar and after the current video ends, is meant to keep users engaged by nudging them toward content that is already popular and/or similar to the videos they already watch and like. (YouTube will also play the next related video automatically unless the user disables the autoplay feature.)
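A “watch next” recommender of the kind described above can be sketched in the same spirit. Again, the titles, tags, and weights below are hypothetical; the point is only that scoring candidates by similarity plus popularity, and autoplaying the top result, naturally steers viewers toward whatever similar content is already widely watched.

    # Hypothetical sketch of a "watch next" recommender: candidates are scored
    # by tag overlap with the current video plus a popularity boost.
    def recommend_next(current, candidates, top_n=3):
        def score(video):
            overlap = len(video["tags"] & current["tags"])
            return overlap + 0.001 * video["views"]
        return sorted(candidates, key=score, reverse=True)[:top_n]

    current = {"title": "Campaign rally, full speech", "tags": {"politics", "rally"}}
    candidates = [
        {"title": "Mainstream news report", "tags": {"politics"}, "views": 50_000},
        {"title": "Fringe conspiracy rant", "tags": {"politics", "rally"}, "views": 900_000},
    ]
    print(recommend_next(current, candidates)[0]["title"])
    # The more similar, more-watched video scores highest; with autoplay on,
    # it simply plays next.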
How algorithms feed conspiracy theories
Research has found that partisanship (a strong attachment to one political party or another) is a predictor of belief in conspiracy theories (Uscinski, Klofstad, & Atkinson, 2016). Unsurprisingly, Republicans are more likely to believe conspiracy theories about Democrats (such as the idea that former president Barack Obama was a foreign-born Muslim), and Democrats are more likely to believe conspiracy theories about Republicans. In a 2018 New York Times op-ed, Tufekci detailed her experience with YouTube’s radical recommendations. She was shocked to discover that when she watched videos of Trump rallies on YouTube while researching an article about his appeal, she began to get recommendations for videos that “featured white supremacist rants, Holocaust denials and other disturbing content” (Tufekci, 2018). YouTube’s recommendations were leading her down a path of radicalization. Tufekci found that the same held true on the left: after she watched Democratic videos, YouTube began recommending radical leftist content to her, including videos about the conspiracy theory that President Bush was behind the 9/11 terrorist attacks. Other investigations have found the same pattern for other topics, such as vegetarianism, vaccination, and school shootings, including thousands of videos promoting the conspiratorial idea that the students speaking out after shootings such as Parkland and Sandy Hook are actually crisis actors (Tufekci, 2018).
Certainly Google, which owns YouTube, does not have a secret plan to radicalize people and promote conspiracy theories. However, our attention is a product, and Google must capture and sell it to advertisers in order to profit. Feeding us progressively more extreme content seems to hold our attention, and to the recommendation system it does not ultimately matter what that content is, even though the content has very real consequences.
When the researcher Asheley Landrum interviewed thirty “flat-earthers” at a convention in North Carolina, she found that nearly all of them had come to believe that the Earth is flat after watching YouTube videos about the conspiracy (Sample, 2019). Most of the interviewees, consistent with Goertzel’s findings, had discovered the flat-Earth conspiracy because they were already watching material about other conspiracy theories. The internet and social media have allowed such flat-earthers to find one another, organize conventions, and amplify one another’s voices. The same is true for other conspiracy theorists, including climate change denialists and those who believe vaccines are unsafe for children.
Echo chambers
Conspiracy theories today can be amplified by the affordances of the online environments in which they proliferate. Social media environments facilitate the formation of echo chambers based on our already-held beliefs and interests. Humans have a tendency to seek out relationships and interactions with people who are similar to them, a phenomenon known as homophily (Sunstein, 2017). In everyday life, we are exposed to people who are different from us, and we cannot necessarily choose when and where such exposure will occur. However, social media sites allow us to sort ourselves into groups, interacting online with those who share our perspectives and seeing little to no content produced by others who do not share those perspectives.
Knowing what we know about the links between partisanship, paranoia, and belief in conspiracies (Uscinski, Klofstad, & Atkinson, 2016), we can see how algorithms and the affordances of social media can enable us to become engulfed in a network of others who support our feelings and beliefs about the world.
Imagine that a person has fallen down a YouTube rabbit hole of conspiracy videos after watching videos of a conservative news anchor. His political bias and fears about the world inform his beliefs, and he begins to believe that scientists and the media are lying about vaccines. He notices anti-vaccination posts from a friend on his Facebook feed. The more he “likes” such posts and visits pages that publish anti-vaccination content, the more they appear in his feed. As he interacts with these posts and other users in the anti-vaccination community, they come to dominate his feed and his experience on the platform, because Facebook’s algorithm aims to keep him engaged.
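A toy simulation makes the compounding effect of that loop visible. Every number below is invented; the sketch only shows how a small amount of engagement, fed back into what the platform shows next, can come to dominate a feed.

    # Hypothetical feedback loop: engagement with anti-vaccination content
    # increases how much of it the feed shows, which invites more engagement.
    feed_share = 0.05  # fraction of the feed that is anti-vaccination content
    for week in range(1, 6):
        interactions = feed_share * 100 * 0.4       # user engages with 40% of it
        feed_share = min(1.0, feed_share * (1 + 0.1 * interactions))
        print(f"week {week}: anti-vaccination share of feed = {feed_share:.0%}")

Under these made-up assumptions the share roughly quadruples in five rounds, not because anyone planned it, but because each interaction feeds the next recommendation.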
Without the existence of counter-narratives and diverse perspectives, his belief in the conspiracy that vaccines cause autism and that the government is hiding that information solidifies in the echo chamber. Research shows that the existence of a rumor community with a public forum for discussion, where people create social narratives around an idea and maintain a shared identity, is essential to conspiracy theory survival (Edy & Risley-Baird, 2016). Social media platforms with affordances that enable the formation of echo chambers present an ideal environment for a rumor community to thrive.
Why does all of this matter?
Does it really matter if conspiracy theorists find a community and spread their ideas through social media? Yes. Social media platforms, through echo chambers and algorithms, influence what we see and, most crucially, what we believe those around us believe (Taub & Fisher, 2018). A recent study by researchers at the University of Warwick and an investigation by the New York Times found that anti-refugee attacks in Germany increased in locations with higher-than-average Facebook use (Taub & Fisher, 2018). They suggest that this is because inflammatory right-wing content came to dominate people’s news feeds and, by extension, shifted social norms. When we believe that those around us share radical ideas, we feel compelled to act in ways that conform.
Perhaps the clearest recent example of this phenomenon is Pizzagate. During the 2016 election season, a conspiracy theory that Hillary Clinton and her staff were running a child sex trafficking ring out of a pizzeria in Washington, D.C., spread through right-wing communities on social media sites such as Reddit, YouTube, and Facebook. One man, immersed in conspiratorial rumor communities, decided to take matters into his own hands by entering the pizzeria with a gun and firing shots (Fisher, Cox, & Hermann, 2016).
This world of virtual rumor-spreading, polarized online political echo chambers, and distorted social norms shaped by social media’s affordances is real in its consequences. Algorithmic content filtering affects how we see our world and how we find outlets for our uncertainties and fears. In an echo chamber where every belief and claim can go unquestioned and every member of the community believes that outsiders are wrong, any action that furthers the cause of that conspiracy theory community seems right and justified, even violence.
References
Edy, J. A., & Risley-Baird, E. E. (2016). Rumor communities: The social dimensions of internet political misconceptions. Social Science Quarterly, 97(3), 588-602. doi:10.1111/ssqu.12309
Fisher, M., Cox, J. W., & Hermann, P. (2016, December 6). Pizzagate: From rumor, to hashtag, to gunfire in D.C. The Washington Post. Retrieved from https://www.washingtonpost.com
Goertzel, T. (1994). Belief in conspiracy theories. Political Psychology, 15(4), 731-742. Retrieved from https://www.jstor.org/stable/3791630
Oliver, J. E., & Wood, T. J. (2014). Conspiracy theories and the paranoid style(s) of mass opinion. American Journal of Political Science, 58(4), 952-966. doi:10.1111/ajps.12084
Sample, I. (2019, February 17). Study blames YouTube for rise in number of flat earthers. The Guardian. Retrieved from https://www.theguardian.com/science/
Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton, NJ: Princeton University Press.
Taub, A., & Fisher, M. (2018, August 21). Facebook fueled anti-refugee attacks in Germany, new research suggests. The New York Times. Retrieved from https://www.nytimes.com
Tufekci, Z. (2017). Twitter and tear gas: The power and fragility of networked protest. New Haven, CT: Yale University Press.
Tufekci, Z. (2018, March 10). YouTube, the great radicalizer. The New York Times. Retrieved from https://www.nytimes.com
Uscinski, J. E., Klofstad, C., & Atkinson, M. D. (2016). What drives conspiratorial beliefs? The role of informational cues and predispositions. Political Research Quarterly, 69(1), 57-71. doi:10.1177/106591291
van Prooijen, J. (2018, August 13). The psychology of Qanon: Why do seemingly sane people believe bizarre conspiracy theories? NBC News. Retrieved from https://www.nbcnews.com/think/