
Is the internet expanding or narrowing our minds?

Why do you see what you see online and on social media, and how can that influence your identity and behavior? Co-produced with @CommonSenseEducation

SUBSCRIBE so you never miss a video!
https://bit.ly/3tNKzhV

And follow us on Instagram and Twitter!
https://www.instagram.com/abovethenoisepbs
https://twitter.com/ATN_PBS

Why do you see what you see online and on social media?

What content you see on a given social app or search engine isn't random; it's heavily curated, and the curating is done by recommendation algorithms. Recommendation algorithms are essentially the computer instructions for how a given social app or search engine decides what to show you.

How do recommendation algorithms work?

Basically, these algorithms are designed to keep you on a given social app for as long as possible. They do that by learning what you like and showing you more of that kind of content. They weigh a bunch of signals when deciding what to show you. For instance, they collect data on what you're watching, clicking, liking, commenting on, sharing, and buying, plus things like where you live. They also consider what everyone else is liking and watching. But exactly how all these signals are weighted and ranked to produce the content that shows up in your feed is top secret. Plus, companies are constantly tweaking their algorithms.
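The general idea can be sketched as a toy scoring function. To be clear, everything here is an illustrative assumption: the signal names, the weights, and the interest boost are made up for this example, since the real formulas platforms use are proprietary.

```python
# Toy sketch of engagement-based recommendation ranking.
# The weights and fields are invented for illustration; no platform's
# actual (secret) formula looks exactly like this.

def score(post, user_interests, weights=None):
    """Score a post by overall engagement, boosted when the topic
    matches what this user already engages with."""
    if weights is None:
        # Assumed weights: shares "count" more than comments or likes.
        weights = {"likes": 1.0, "comments": 2.0, "shares": 3.0}
    engagement = sum(weights[k] * post[k] for k in weights)
    interest_boost = 2.0 if post["topic"] in user_interests else 1.0
    return engagement * interest_boost

def rank_feed(posts, user_interests):
    """Sort candidate posts so the highest-scoring ones appear first."""
    return sorted(posts, key=lambda p: score(p, user_interests), reverse=True)

posts = [
    {"topic": "cooking", "likes": 10, "comments": 2, "shares": 1},
    {"topic": "sports",  "likes": 50, "comments": 5, "shares": 2},
    {"topic": "cooking", "likes": 30, "comments": 8, "shares": 4},
]
feed = rank_feed(posts, user_interests={"cooking"})
print([p["topic"] for p in feed])  # → ['cooking', 'sports', 'cooking']
```

Notice how the interest boost pushes a moderately popular cooking post above a more popular sports post for this user; that per-user reweighting is the basic mechanism behind personalized feeds.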

Why do social apps use recommendation algorithms?

There's a ton of content out there, so recommendation algorithms sift through it and surface what they predict is most relevant to you. In the end, social apps and YouTube want to keep you on the platform for as long as possible so they can show you more ads and make more money, and they do that by showing you whatever they think will keep you on the app the longest.

What’s dangerous about recommendation algorithms?

Recommendation algorithms can trap users in echo chambers or filter bubbles, where you are served content that just reinforces what you already believe. This is particularly true when it comes to news and politics, and it has been cited as a reason for increased political polarization in America. These algorithms can also spread misinformation, disinformation, and propaganda: emotionally charged content tends to go viral because lots of users engage with it, and the algorithms amplify it whether or not it's true. There are also reports that users can get sucked into radicalization rabbit holes as algorithms serve up more and more extreme content.
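The filter-bubble dynamic described above is a feedback loop: engagement raises a topic's rank, higher rank produces more impressions, and more impressions produce more engagement. A deliberately oversimplified sketch (the topic names, starting counts, and always-recommend-the-leader rule are all assumptions for illustration):

```python
def simulate_feedback_loop(rounds=20):
    """Simulate a feed that always recommends whichever topic has the
    most engagement so far, with a user who always engages with what
    they're shown. A small initial lean snowballs into a one-topic feed."""
    clicks = {"politics_left": 6, "politics_right": 4}  # slight initial lean
    shown = []
    for _ in range(rounds):
        topic = max(clicks, key=clicks.get)  # recommend the current leader
        shown.append(topic)
        clicks[topic] += 1  # engagement feeds back into future ranking
    return shown

feed = simulate_feedback_loop()
print(set(feed))  # → {'politics_left'}
```

Even though the user started only slightly more engaged with one side, the loop never shows the other side again. Real ranking systems are far more complex and do mix in some variety, but this captures why the reinforcement happens.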

SELECTED SOURCES
Facebook’s Algorithms Fueled…. (The Conversation)
https://theconversation.com/facebooks-algorithms-fueled-massive-foreign-propaganda-campaigns-during-the-2020-election-heres-how-algorithms-can-manipulate-you-168229

The Social Media Echo Chamber Is Real (Ars Technica)
https://arstechnica.com/science/2017/03/the-social-media-echo-chamber-is-real/

Algorithms in Social Media Platforms (Internet Justice Society)
https://www.internetjustsociety.org/algorithms-in-social-media-platforms

Fueling the Fire (NYU Stern)
https://static1.squarespace.com/static/5b6df958f8370af3217d4178/t/613a4d4cc86b9d3810eb35aa/1631210832122/NYU+CBHR+Fueling+The+Fire_FINAL+ONLINE+REVISED+Sep7.pdf

How TikTok Reads Your Mind (NY Times)
https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html

For You Page: TikTok and Identity (Debating Networks and Communities Conference IX)
http://networkconference.netstudies.org/2020Curtin/2020/05/11/for-you-page-tiktok-as-a-lens-for-identity-discourse-in-western-culture/

TEACHERS
Get your students in the discussion on KQED Learn, a safe place for middle and high school students to investigate controversial topics and share their voices: https://learn.kqed.org/

Check out Common Sense Education's Digital Citizenship Curriculum: https://www.commonsense.org/education/

About KQED
KQED serves the people of Northern California with a public-supported alternative to commercial media. An NPR and PBS member station based in San Francisco, KQED is home to one of the most listened-to public radio stations in the nation, one of the highest-rated public television services, and an award-winning education program helping students and educators thrive in 21st-century classrooms. A trusted news source, leader, and innovator in interactive technology, KQED takes people of all ages on journeys of exploration — exposing them to new people, places, and ideas.

Funding for KQED Education is provided by the Corporation for Public Broadcasting, the Koret Foundation, the William and Flora Hewlett Foundation, the AT&T Foundation, the Crescent Porter Hale Foundation, the Silver Giving Foundation, Campaign 21 donors, and members of KQED.

CHAPTERS
00:00 Intro
00:55 What are recommendation algorithms?
02:00 How social media algorithms work
04:21 Pros of recommendation algorithms
04:51 Dangers of recommendation algorithms
07:10 Tips to make recommendation algorithms work for you
Created by Common Sense Education.


Video transcript

- You know who really gets me? TikTok. What up, world? Myles Bess, journalist, host of "Above the Noise", and world class chef. Y'all, I've spent hours scrolling through TikTok perfecting my vegan wing recipe. I've learned so much about cooking, and pots, and pans. It's just, it's changed my world. But while I was deep in vegan wing TikTok, I totally missed that, apparently, there's this thing called a Super Bowl, and it has nothing to do with cooking. It's a sporting event with football? What's next? You're gonna tell me that Obama isn't president anymore? What else am I missing out on? So today, we're getting philosophical in asking, is the internet expanding your mind or is it making you narrow-minded? Now, you might be thinking, "Dang, Myles, that's a deep question." And to that I say, yes, yes it is, and you're welcome. But to answer that question, we gotta talk about why you're seeing what you're seeing on your social media feeds. And the answer to that, my friends, has a lot to do with recommendation algorithms. Now, basically, an algorithm is a formula or a set of instructions to solve a problem. And when it comes to social media, recommendation algorithms are the computer instructions for how a given social media app decides what to show you. I mean, 500 hours of video are uploaded to YouTube every minute worldwide. That works out to 720,000 hours of new content per day. That's 82.2 years. Yes, years. That's literally a lifetime. - Did you see this one? This one? What about this one? - And on TikTok, more than a billion videos get viewed a day. That's a lot of videos to sift through, and you probably have no interest in a bunch of them. I mean, those pimple popping videos, ugh. No thank you. - And that's the point of the recommendation algorithm, to identify content that will appeal to you. - That's Alexander Nwala. He's a computer scientist who studies how information spreads on social media. 
We chatted with him to help us better understand how social media recommendation algorithms work. - In order for them to learn about you, they need to collect information. So as they collect some of the pages you browse, some of the content you engage with, it's taken into account in those algorithms. And then, they do a lot of sophisticated machine learning, a lot of mathematics behind the scenes, and say, "Okay, we think you are going to like this." And if you engage with that content, they give you more of that. So it's like you're telling the algorithm, you're giving it some feedback that says, "Give me more of this, give me more of this, give me more of this." - So these companies are pretty much tracking our every move. They're collecting data on what you're watching, clicking on, liking, commenting on, sharing, buying, how long you watch something, where you live, how old you are, et cetera. Then they use all of that data to make predictions to serve us the content they think we'll like. You know, stuff like this 27 year old male who lives in California, likes sneakers and cooking, and hates gross medical videos. So show him more cooking content, and don't show him people cutting into a cyst. Ugh. Oh my God. It's gross. And with this data, they can sell super targeted ads to us too. - And if you stay longer there, they make more money. Maybe through ad revenues, maybe make more money through the information they collect about you. But the whole idea is to find something that appeals to you that keeps you. - See? At the end of the day, these social media companies are businesses trying to make money. And the longer you're on the app, the more ads you'll see, and the more money they'll make. Y'all, TikTok made $4 billion from ads in 2021, and they want to triple that to $12 billion this year. In 2020, Instagram made $17.4 billion through ad sales. Y'all, that's like a lot of money. But here's the catch. 
These companies keep the inner workings of these algorithms under tight lock and key. I mean, it is their secret sauce after all. For example, TikTok publicly says that it takes into account shares, likes, follows, what you watch, and even how long you watch something, but we don't know for sure exactly how all these things are weighted or ranked to give you the content that's on your feed. And it's not just what you are engaging with. These social media algorithms also take into account what everyone else is liking and sharing too. They tend to amplify the posts that have the most likes, comments, and shares, even if it's misinformation or lies. Okay, so back to my original question, is the internet expanding your mind or making you narrow-minded? On the one hand, by following accounts and liking videos, you do have a certain amount of agency on what you see. I mean, the type of media we consume shapes how we see the world, and our place in it. It can help you find your people. Say, for instance, you really like knitting, but none of your real life friends do, so you start following some stuff on the Gram. And then bam, you've been connected to a whole new community. You're learning about knitting things you wouldn't have known otherwise. But on the flip side, as these algorithms learn your preferences, you'll never know if you're interested in stuff they don't show you. They're not, like, all of a sudden gonna be like, "Oh, you love knitting. Have you tried MMA?" And ultimately, that can limit your exposure to new ideas or interests. Research has shown that users can get trapped in filter bubbles or echo chambers, where you just, you know, you're getting served content that reinforces what you already believe. This is particularly strong when it comes to political beliefs or news you consume. How can you really think critically about something if you're not being challenged on it? 
- The more that you interact with content that is something that interests you, the more you're not going to see content that doesn't interest you. - That's Iretiolu Akinrinade. She studies adolescent wellbeing in digital spaces. - And you never really get to see what's going on in other holes of the platform. - She explains that some users have found creative ways to make the algorithm work better for them. Like this one TikTok user created a second account to follow conservative news. And there, he discovered there was this whole trend of COVID positive users going out into public spaces intentionally infecting people. Um, side note, that's awful. So anyway, he shared that on his main account, introducing his followers to this content that they otherwise wouldn't see. - And I really liked what this user did in terms of strategically creating a separate account and really playing with their algorithm, and trying to see what would happen if they had different identities, or if they were understood by an algorithm to be a completely different person. But what I really liked about their decision making, is that they came back to their original account and shared that information. - Talk about breaking down that echo chamber, chamber, chamber, chamber. So, social media algorithms are far from perfect. Similar to echo chambers, there have been reports of people being sucked into dangerous rabbit holes where these algorithms can lead users to more extreme content, even leading to radicalization. Like how on TikTok, engaging with transphobic content led users to white supremacist and other far-right content. And there are also reports of how what you see can affect your wellbeing too. Like, internal documents from Meta, the company formerly known as Facebook, show how Instagram's algorithms can make teen girls feel worse about their body image. Well, dang, that's a big downer. We can't end the video like that. No, no, no, no. That's not gonna fly. 
We want to leave you with something inspiring, some wisdom on how to make these types of algorithms work better for you. Number one, for starters, be mindful of how social media algorithms work. Ask yourself, what content am I seeing and why am I seeing it? Number two, think before you like and share. Those actions help amplify posts. So doing those things doesn't just affect what you see, but what everyone else sees too. Because after all. - As an informed citizen, we have a part to play when it comes to the content we share. - Well said. Number three, try to break out of that echo chamber. Maybe you don't need to create a separate spam account, or maybe you do. But, like, basically, just keep an eye out for stuff from the "other side". Number four, use your voice. It's not just about consuming content. It's also about what content you're creating, and the conversations you're starting with your audience. - We should all experiment with the different ways that we're seen online, so we can try and receive quality, edifying content not just in the very moment, but over time. So be intentional as you use the internet, especially social media, and talk with your friends and family about what you see online. You might be surprised by the differences and similarities. It's just a really interesting point of conversation. - And that's it from me. But I'm curious about what you all think. How do you see social media algorithms influencing you? Let us know in the comments below. Oh, and before we go, I want to give a big shout out to Common Sense Education, who we collabed with on this video. Speaking of algorithms, we need your help. If you liked this video, be sure to like, subscribe, and hit that bell notification so you can see more of our amazing stuff. It helps us with YouTube's algorithm too. Until next time. I'm Myles Bess. Peace out.