Attention Women’s Rights Advocates! Be Aware of Filter Bubbles.

Social media has become a useful tool for advocating for women's rights. However, the landscape of the digital age is changing. Have you noticed that, when browsing your social media feeds, you tend to see only content you already agree with? This is how most people now experience their time online: users see what they agree with because they are not being shown all of the information and content that is out there. In this post, I will explain this phenomenon and why women's rights advocates of the digital age need to be aware of it.

Filter Bubbles: What Are They and How Do They Work?

The term "filter bubble" is largely self-explanatory. On platforms such as Facebook, YouTube, and even Google, each of us sits inside a personalized bubble that filters the content we see. Inside our filter bubbles, we tend to see personal posts, news, and other information that align with our own opinions and beliefs, while everything else is filtered from our attention. However, "most are not deliberately screening out the contradictory views" (Bobok, 2016, p. 2). We, as social media users, have a choice in what we see, but we often leave this choice up to the sites and their personalized algorithms.


The content we encounter online is personalized based on the data that social media sites continuously collect from our activity. These sites gather such data through our login information and questionnaires, or more discreetly through cookies and our Wi-Fi activity. From this data, a personalized algorithm is built that dictates what we do and do not see when browsing a platform. As Sunstein (2017) states, algorithms "learn a great deal about you, and they will know what you want or will like, before you do, and better than you do" (p. 3). As a result, we are presented almost exclusively with content that supports our existing views, ensuring that we enjoy, and keep engaging with, what we see online.
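To make the mechanism concrete, here is a deliberately simplified sketch of how engagement-based filtering can work. This is not any real platform's algorithm; the function `rank_feed`, the topic labels, and the sample data are all hypothetical, invented purely for illustration.

```python
from collections import Counter

def rank_feed(engagement_history, candidate_posts, feed_size=3):
    """Rank candidate posts by how well their topics overlap with
    topics the user has engaged with before; everything below the
    cutoff is filtered out of the feed."""
    # Count how often each topic appears in the user's past engagement.
    interests = Counter(topic for post in engagement_history
                        for topic in post["topics"])

    def score(post):
        # A Counter returns 0 for unseen topics, so unfamiliar
        # content scores low and sinks out of the feed.
        return sum(interests[t] for t in post["topics"])

    ranked = sorted(candidate_posts, key=score, reverse=True)
    return ranked[:feed_size]

# Hypothetical user history and candidate posts.
history = [{"topics": ["pay-gap", "equality"]},
           {"topics": ["equality", "representation"]}]
candidates = [{"id": 1, "topics": ["pay-gap"]},
              {"id": 2, "topics": ["sports"]},
              {"id": 3, "topics": ["representation", "equality"]}]

feed = rank_feed(history, candidates, feed_size=2)
print([p["id"] for p in feed])  # posts matching past interests surface first
```

Even in this toy version, the bubble effect is visible: the post about an unrelated topic never makes the feed, not because the user rejected it, but because the ranking quietly filtered it out.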


Many social media users, including online activists, are unaware that filter bubbles exist. Usman (2017) states in his article that "we have a sense of naive realism in that we think all the information is available to us, and therefore the conclusions we make are automatically the most informed ones." Most users believe that the content in their feed is essentially all the information there is, and that everyone else is seeing the same content they are.

This matters for online activists because social media users are their audience, and activists need to know what their audience is seeing and why. The impact filter bubbles have on what that audience sees and engages with can be either good or bad, depending on how activists work around, or infiltrate, their audiences' bubbles to spread their message. Raising activists' awareness of filter bubbles and their effects, both positive and negative, is essential because that awareness opens up discussion about how these online environments shape the advocacy of important topics, such as women's rights, and creates room for more effective civic engagement strategies.

Filter Bubbles’ Positive Impact

The internet is known to give individuals "such immediate access to hundreds, thousands or even millions of people that agree with [them]" (Parker, 2017). Because the internet offers access to vast numbers of people with similar opinions and beliefs, filter bubbles pick those people out and focus our attention on them while keeping people with differing opinions hidden. In other words, filter bubbles make it easier to find and gravitate toward like-minded people.

Our filter bubbles work as a strange kind of magnet that, instead of attracting opposites, attracts similarities. For example, an individual who seeks out and engages with information about the gender pay gap is more likely to be digitally connected with someone in the same online network who engages with similar content, such as videos about the underrepresentation of women in the federal government. Filter bubbles therefore provide online activists with a ready-made mass of like-minded followers who can work together toward social change.
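This "magnet" effect can be illustrated with a toy similarity measure. The sketch below uses Jaccard similarity over interest sets; the users and interest lists are invented for illustration, and real platforms rely on far richer signals than this.

```python
def jaccard(a, b):
    """Overlap between two users' interest sets:
    0.0 means nothing in common, 1.0 means identical interests."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical users and their interests.
alice = ["pay-gap", "representation", "education"]
bea   = ["representation", "education", "reproductive-rights"]
carl  = ["sports", "cars"]

print(jaccard(alice, bea))   # high overlap: likely to be connected
print(jaccard(alice, carl))  # no overlap: kept apart by the filter
```

A system that connects users in proportion to a score like this will naturally pull like-minded people together and keep dissimilar people out of each other's view, which is exactly the magnet behavior described above.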

Viewed in terms of how they bring like-minded masses together, filter bubbles can be a positive force and a tool for advocating for women's rights on social media. In interviews for their research on spreading ideas through social media, Francesca Comunello and her colleagues (2016) found that "most of the interviewees uncritically use Facebook to carry out a political campaign, as it is perceived as effective in supporting the gathering of individuals sharing similar political beliefs and intents" (p. 528). Filter bubbles foster solidarity, which can then lead to action both online and offline.

There are various real-life examples of this, one being the 2017 Women's March on Washington, DC, which took place after the presidential election. The idea of a march for women's rights came from Teresa Shook and reached social media when she posted about it on Facebook. There, "Shook's vision of a women's march spread like wildfire after it was shared in several women's empowerment groups on Facebook" (Garrison, 2016). Because of personalized filter bubbles, Shook's idea was presented to those who, like her, believed the march was needed. This allowed men, women, and children with like-minded values to gather online in solidarity to organize the event, and then successfully take their action offline.

Filter Bubbles’ Negative Impact

It has been suggested that, "in a democratic society, people need to come across opinions that differ from their own opinions; otherwise, [they] might enter a spiral of attitudinal reinforcement and drift towards more extreme viewpoints" (Zuiderveen Borgesius et al., 2016). In other words, the overall result of an online environment where you see only content geared toward your views is polarization. Inside their filter bubbles, people begin clustering toward one distinctive "side," creating a sharp division into highly contrasting online groups. Visually, this would look like two masses at either end of a spectrum, with very few bubbles in the middle.

As polarization takes hold on social media, it hinders the broad spread of thoughts and ideas online. People on one side of an issue see information geared toward their opinions, while those on the opposite side are shown different information entirely. Because of this filtering barrier, online activists cannot fully reach both sides of an issue. They end up sharing their ideas only with those who already agree with them, while those on the other side remain sealed in their own filtered spaces.

Some argue that "the key to [digital advocacy] is to use social media for mobilization, not persuasion" (Parker, 2017). However, I believe that one of the main objectives for women's rights activists is to gain exposure and generate productive discussion about topics such as women's reproductive rights and education, rather than to persuade the "opposing side." As one article on filter bubbles and polarization puts it, "if we are not exposed to other ideas and perspectives beyond our own beliefs, then this artificial absence of contrary evidence or opinions can trick us into thinking we must be right because no counter-argument seems to exist" (How Do Filter Bubbles Lead to Political Polarization, 2018). This mindset, a product of filter bubbles, severely hinders productive conversation on social media, a tool that is meant to make sharing ideas easier.

Being Aware: A Point Worth Noting

To conclude, filter bubbles are a complex phenomenon with both positive and negative aspects. They can be a positive force when advocating for women's rights through social media because they let us easily connect with those who agree with and support our cause. Thanks to filter bubbles, gaining large amounts of support online becomes almost effortless, and organizing and publicizing offline events, such as the Women's March, is less challenging. However, filter bubbles also have negative effects, because they expose people only to content that matches their existing views and beliefs. This filtered barrier makes it much harder for online activists to spread and introduce their ideas to those who may not agree with them.

Again, I want to emphasize how important it is for online activists to understand both the positive and negative impacts that filter bubbles have on our online experience. What are we, as activists, supposed to do with this information? My hope is that awareness of filter bubbles will encourage activists to make use of their positive effects and to discover new strategies for infiltrating the bubbles that filter them out. The goal is to improve our ability to participate in digital civic engagement, and that goal is very achievable with more knowledge of the current digital environment.


How do filter bubbles lead to political polarization? (2018). Retrieved from…

Bobok, D. (2016). Selective exposure, filter bubbles and echo chambers on Facebook (Doctoral dissertation). Central European University, Budapest, Hungary.

Comunello, F., Mulargia, S., & Parisi, L. (2016). The 'proper' way to spread ideas through social media: Exploring the affordances and constraints of different social media platforms as perceived by Italian activists. The Sociological Review, 64(3), 515-532.

Garrison, O. (2016). Socially strong: Social media’s impact on the Women’s March on Washington. Retrieved from

Parker, E. (2017, May 22). In praise of echo chambers. The Washington Post. Retrieved from …post/wp/2017/05/22/in-praise-of-echo-chambers/?utm_term=.4657094cd971


Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton, NJ: Princeton University Press.

Usman, O. (2017). How invisible filter bubbles shape your social, political, and religious views. Retrieved from…

Zuiderveen Borgesius, F., Trilling, D., Möller, J., Bodó, B., De Vreese, C., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1). DOI: 10.14763/2016.1.401