MCO 427 Creation Activity: Learning about Filter Bubbles

When scrolling through social media platforms like Instagram or X (formerly Twitter), you may find that the content shown to you gradually becomes similar to the content you already like and share. This phenomenon is known as a filter bubble. According to Eli Pariser in his 2011 book “The Filter Bubble: What the Internet Is Hiding from You”, algorithms and search engines create a situation in which digital media users increasingly receive information that confirms their existing beliefs. What we view, like, and search shapes the personalized results we see in future online sessions, and this has both positive and negative aspects. The problem is that the filter bubble is never made visible to us, and it can limit the information we encounter in our daily media consumption. It is important for people, especially those just starting to use social media regularly, to understand the issues filter bubbles raise and how to avoid getting stuck in one.

Before I explore the positives and negatives of filter bubbles, I want to explain how they work. When consuming media online, we move between many websites, some of them search engines and others social media platforms. As a Farnam Street article explains, when you use social media sites or read news articles, you may comment on posts, like them, share a piece of media with someone, or interact with the content in other ways. You may be doing this innocently, without trying to “learn” anything specific; I find myself mindlessly scrolling Instagram and Twitter sometimes. However, the platform’s algorithm takes note of the content you “like,” and in order to serve you more content it thinks you are interested in, it will recommend posts that reflect the same opinions and perspectives as the content you previously interacted with. This, in turn, places you in what Pariser calls a filter bubble.

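To make this mechanism concrete, here is a minimal sketch in Python of like-based filtering. It is purely illustrative, not any platform’s actual algorithm: the posts, the topic scores, and the build_feed helper are all hypothetical. It simply shows how ranking content by similarity to past likes pushes unfamiliar perspectives toward the bottom of a feed.

```python
# Toy illustration of like-based filtering (NOT any platform's real algorithm).
# Each post carries a made-up topic vector; the feed ranks unseen posts by
# similarity to what the user has already liked, so the feed narrows toward
# past interests and dissimilar perspectives sink to the bottom.

posts = {
    "cats_video":      {"pets": 0.9, "politics": 0.0, "sports": 0.1},
    "election_story":  {"pets": 0.0, "politics": 0.9, "sports": 0.1},
    "dog_training":    {"pets": 0.8, "politics": 0.0, "sports": 0.2},
    "game_highlights": {"pets": 0.1, "politics": 0.1, "sports": 0.8},
}

def similarity(a, b):
    """Dot product of two topic vectors: higher means more alike."""
    return sum(a[topic] * b[topic] for topic in a)

def build_feed(liked_ids):
    """Rank posts the user hasn't engaged with by similarity to their likes."""
    liked = [posts[i] for i in liked_ids]
    candidates = [i for i in posts if i not in liked_ids]
    return sorted(
        candidates,
        key=lambda i: max(similarity(posts[i], vec) for vec in liked),
        reverse=True,
    )

# A user who liked one pet video now sees more pet content first, while the
# political story lands last: the beginning of a filter bubble.
print(build_feed(["cats_video"]))
# -> ['dog_training', 'game_highlights', 'election_story']
```

Real recommendation systems are vastly more complex than this sketch, but the narrowing effect rests on the same principle: what you engaged with before determines what you are shown next.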

As Chinmay Bhalerao explains, a filter bubble is a phenomenon that occurs when algorithms on social media platforms recommend content similar to what we have already liked or shared. One of the major problems with being in a filter bubble is that the algorithms that put you there end up isolating you from information and perspectives you might want to see; because you haven’t shown interest in them in the past, the algorithms essentially hide them from you. If a friend holds different viewpoints, you may stop seeing their posts altogether. Conversely, if you liked a post supporting one perspective, you will likely be shown more content like it, often from the same accounts that shared the original post. This can cut people off from other ideas, even ideas that might actually suit them better than the content they originally engaged with. Filter bubbles cause us to see only one side of an issue, which makes it hard to form an informed opinion. According to an article on GCFGlobal, you may not even realize you are in a filter bubble, because algorithms never ask permission to personalize content for you.

Filter bubbles are said to create “echo chambers”, another important topic in media literacy. This is one of the more harmful aspects of filter bubbles: echo chambers are online spaces that magnify the messages shared inside them and insulate those messages from rebuttal. As humans, we tend to seek out information that confirms our existing beliefs and biases, and filter bubbles intensify this confirmation bias by creating feedback loops in which users see only information that supports their viewpoints. According to an article in Nieman Lab, confirmation bias may be why false information spreads so quickly online, which makes filter bubbles even more harmful. When someone sees something that confirms their thinking, they not only feel validated; they share it within the echo chambers their filter bubbles created, spreading that information even further.

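To see how that feedback loop compounds, here is another hedged toy sketch, again with invented numbers rather than any real recommender. Each round the simulated feed shows whichever post best matches the user’s current interests, and engaging with that post pulls the interest profile further in the same direction.

```python
# Toy feedback loop (hypothetical numbers, not a real recommender).
# Each round the simulated feed shows whichever post best matches the user's
# current interest profile; engaging with it nudges the profile toward that
# post, so the profile drifts further into one topic every round.

interests = {"pets": 0.5, "politics": 0.5}  # start with balanced interests

posts = [
    {"pets": 0.9, "politics": 0.1},  # pet content
    {"pets": 0.1, "politics": 0.9},  # political content
]

def pick(profile):
    """Return the post whose topic mix best matches the profile.
    On the initial tie, max() keeps the first post, the pet content."""
    return max(posts, key=lambda p: sum(profile[t] * p[t] for t in profile))

for round_num in range(1, 4):
    shown = pick(interests)
    # Engagement shifts the profile 20% of the way toward what was shown.
    for topic in interests:
        interests[topic] = 0.8 * interests[topic] + 0.2 * shown[topic]
    print(round_num, {t: round(v, 2) for t, v in interests.items()})

# The profile drifts toward pets each round:
# 1 {'pets': 0.58, 'politics': 0.42}
# 2 {'pets': 0.64, 'politics': 0.36}
# 3 {'pets': 0.7, 'politics': 0.3}
```

The loop never shows the political post again after the first tie is broken, which is the same one-sidedness that echo chambers produce at scale.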

There are some positive aspects to filter bubbles. Because they are built by algorithms suggesting content judged relevant to media you have interacted with before, a filter bubble gives users a personalized experience with tailored content, and you may find the content you want faster than you would otherwise. However, these minor benefits come at the cost of limited exposure to diverse viewpoints, and it is far more valuable to understand the media you are consuming.

One of the major reasons I wanted to talk about filter bubbles is that, in an Ipsos study, three out of five adults said they regularly see fake news, and half of those people said they have believed a fake story before finding out it was fake. We therefore need to help people avoid ending up in a filter bubble and uncritically trusting content suggested to them based on previous activity. Further, as a young adult who didn’t know what a filter bubble was until recently, I think it is vital for people who are becoming more active online to know that algorithms quietly narrow the content they show you to fit a limited viewpoint. A study available through the National Library of Medicine captured the issue with a vivid analogy, stating “It is tempting to view the filter bubble as equivalent to an invisible in-car navigation system, which instead of suggesting the direction you should follow, simply takes control of your car and takes you where it thinks you want to go. An automation system that does not allow the user to take a pause and consider the effects of automation can lead to mis-use, frustration, and accidents”. Now consider that you would not even see the navigation system steering the car; this invisibility is what makes filter bubbles especially harmful.

It is imperative to learn about phenomena like filter bubbles and echo chambers, and to build some level of media literacy, in order to protect yourself online. Filter bubbles can lead to confirmation bias, reduced critical thinking, susceptibility to misinformation, manipulation, and more, especially if you are new to being online. To maintain a healthy environment online, we need to avoid getting caught in filter bubbles. Besides learning about them, you can clear your cookies, use an ad blocker, and make a deliberate effort to research other opinions and viewpoints. Tactics like lateral reading and the SIFT method (Stop, Investigate the source, Find better coverage, Trace claims) can help steer you away from the content that algorithms feed you.

Check out this TED Talk from Eli Pariser on filter bubbles for more information.

