
How Much Do Social Media Algorithms Influence Your Worldview?


The content you see on a given social media app isn’t random – it’s highly curated content made specifically for you by recommendation algorithms. These algorithms are designed to keep you on the app for as long as possible, and they do that by learning what you like and showing you more of that type of content. But is this always a good thing? In this episode of Above the Noise, Myles explores how recommendation algorithms can help you feel seen and build community, while also potentially trapping you in filter bubbles where you are only served content that reinforces what you already believe.

TEACHERS: Guide your students to practice civil discourse about current topics and get practice writing CER (claim, evidence, reasoning) responses. Explore lesson supports.

Why do you see what you see online and on social media?

What content you see on a given social app or search engine isn’t random – it’s super curated. And it’s curated by recommendation algorithms, which are essentially the computer instructions for how a given social app or search engine decides what to show you.

How do recommendation algorithms work?

Basically, these algorithms are designed to keep you on a given social app for as long as possible. They do that by learning what you like and showing you more of that content. They consider a bunch of signals when deciding what to show you: what you’re watching, clicking, liking, commenting on, sharing and buying, where you live, and so on. They also consider what everyone else is liking and watching. But exactly which signals are used, and how they’re weighted and ranked to produce the content in your feed, is top secret. Plus, companies are constantly tweaking and changing their algorithms.
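The basic idea – score each post by weighted engagement signals, then show the top-scoring posts – can be sketched in a few lines of Python. The signal names, weights and topics below are invented for illustration; real platforms keep their actual ranking factors secret:

```python
# Hypothetical weights for engagement signals (real values are secret).
WEIGHTS = {"watched": 1.0, "liked": 3.0, "commented": 4.0, "shared": 5.0}

def score(post, user_history):
    """Score a post by how often this user engaged with its topic."""
    total = 0.0
    for signal, weight in WEIGHTS.items():
        # How many times has the user given this signal to this topic?
        total += weight * user_history.get((post["topic"], signal), 0)
    return total

def build_feed(posts, user_history, n=3):
    """Return the n posts the algorithm predicts the user wants most."""
    return sorted(posts, key=lambda p: score(p, user_history), reverse=True)[:n]

posts = [
    {"id": 1, "topic": "cats"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "sports"},
    {"id": 4, "topic": "cats"},
]
# A user who watches and likes a lot of cat videos.
history = {("cats", "watched"): 10, ("cats", "liked"): 4, ("sports", "watched"): 1}

feed = build_feed(posts, history, n=2)
print([p["topic"] for p in feed])  # the cat posts rise to the top
```

Real systems predict engagement with machine-learned models over thousands of signals, but the loop is the same: past behavior in, ranked feed out.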


Why do social apps use recommendation algorithms?

There’s tons of content out there, so recommendation algorithms sift through it all and show us what they think is most relevant to us. In the end, social apps and YouTube want to keep you on the platform for as long as possible so they can show you more ads and make more money – and they do that by showing you the stuff they think is gonna keep you on the app the longest.

What’s dangerous about recommendation algorithms?

Recommendation algorithms can trap users in echo chambers or filter bubbles, where you are served content that just reinforces what you already believe. This is particularly true when it comes to news and politics, and it has been cited as a reason for increased political polarization in America. These algorithms can also spread misinformation, disinformation and propaganda: emotionally charged content tends to go viral because a lot of users engage with it, whether or not it’s true. There are also reports that users can get sucked into radicalization rabbit holes as algorithms serve up more and more extreme content.
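The filter-bubble effect comes from a feedback loop: content you engage with gets shown more, which gives you more chances to engage with it. A toy simulation in Python makes the loop visible (the topics and the simulated user are made up; this is a sketch of the dynamic, not of any real platform):

```python
import random

random.seed(0)  # make the simulation repeatable

TOPICS = ["news_left", "news_right", "sports", "music", "science"]

# Start with no preference data: every topic equally likely to be shown.
engagement = {t: 1 for t in TOPICS}

def pick_post():
    # The "algorithm": show topics in proportion to past engagement.
    topics, weights = zip(*engagement.items())
    return random.choices(topics, weights=weights)[0]

# A simulated user who only ever engages with one kind of news.
def user_engages(topic):
    return topic == "news_left"

for step in range(500):
    topic = pick_post()
    if user_engages(topic):
        engagement[topic] += 1  # feedback loop: more engagement -> shown more

share = engagement["news_left"] / sum(engagement.values())
print(f"news_left share of feed weight: {share:.0%}")
```

Even though the feed starts perfectly balanced, the one topic the user engages with quickly dominates the feed weight – the simulated user ends up seeing almost nothing that challenges their existing views.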



Facebook’s Algorithms Fueled… (The Conversation)

The Social Media Echo Chamber Is Real (Ars Technica)

Algorithms in Social Media Platforms (Internet Justice Society)

Fueling the Fire (NYU Stern)

How TikTok Reads Your Mind (NY Times)

For You Page: TikTok and Identity (Debating Networks and Communities Conference IX) 
