This post originally appeared in Rolling Stone.
In a world of hyper-personalized For You Pages and self-reinforcing search preferences, should social media platforms be responsible for exposing us to different voices and varying perspectives?
FOR MANY PEOPLE, myself included, our social media feeds feel like a safe space: a place where those we follow, and those who follow us, are ideologically aligned.
TikTok and its famous ‘For You’ algorithm have taken this idea of an internet safe space even further, presenting users with a perpetual feed of videos curated explicitly to their tastes.
In theory, it’s a fabulous tool. As a fan of stand-up comedy and design, I love that most of my feed is made up of both. But in practice, is it a good idea to be shown more of the same?
Beware the “Filter Bubbles”
In his 2011 TED Talk, author Eli Pariser warns audiences of the internet’s “filter bubbles,” which he defines as “the invisible, algorithmic editing of the Web.” And social media platforms aren’t the only culprits.
Google Search, for example, will show different results based on “things like time, context, or personalized results.”
Pariser demonstrated this when he asked two friends to search “Egypt” in Google and send a screenshot of their results. To his surprise, they were remarkably different.
From search engines to news sites and, of course, social media, personalization has become standard practice across the Internet, showing us the content these systems think we want to see, but not necessarily what we need to see.
In other words, we only get a single side of every story, resulting in unintended confirmation bias and algorithmic echo chambers.
How Confirmation Bias Manifests on Social Media
The #1 goal of any social media platform is to keep you scrolling. To do this, these companies rely on algorithms designed to do two things: closely study our interests and inundate our feeds with related content.
This is why a soccer fan is more likely to come across soccer highlights on their feed than tennis or basketball ones, for example.
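To make that loop concrete, here is a deliberately naive sketch in Python. It is not any platform’s actual system, and every name in it is hypothetical; it only illustrates the basic mechanism described above: score candidates by past interest, rank by score, repeat.

```python
from collections import Counter

def recommend(watch_history, candidates, k=10):
    # Score each candidate by how often its topic appears in the
    # user's watch history, then surface the highest scorers first.
    interests = Counter(video["topic"] for video in watch_history)
    return sorted(candidates,
                  key=lambda video: interests[video["topic"]],
                  reverse=True)[:k]

# A viewer whose history is mostly soccer...
history = [{"topic": "soccer"}] * 8 + [{"topic": "tennis"}] * 2
pool = [
    {"id": 1, "topic": "soccer"},
    {"id": 2, "topic": "tennis"},
    {"id": 3, "topic": "basketball"},
    {"id": 4, "topic": "soccer"},
]
print(recommend(history, pool, k=3))
# ...is served mostly soccer again: ids 1, 4, then 2.
```

Because the user’s history feeds the ranking and the ranking feeds the history, the top of the feed converges on whatever the user already watches most.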
Simply put, confirmation bias is why we’re drawn to social media in the first place.
Why Should We Care?
According to research on the structure of modern social networks, “echo chambers strengthen polarization and the divisions in our society.” Those findings become abundantly clear if you take a quick glance at Twitter over the course of an election cycle. Yikes.
The truth is, if these social media giants and their proprietary algorithms continue to over-filter our content, we not only cut ourselves off from voices, experiences, and perspectives different from our own, but creativity as we know it may cease to exist.
How can a creative be inspired by something different if everything’s the same?
How can anyone be, do, or think differently if our feeds do not challenge us?
How can we think outside of the box if our search preferences reinforce what we already know?
How Unfiltered Feeds Can Help
The way I see it, there are only two ways to go from here, and one is likelier than the other.
The first and less likely option is for social media companies to leave feeds unfiltered. In a perfect world, algorithms would offer users a content-tasting menu of different flavors made up of diverse voices, experiences, and perspectives.
Not only would this keep users’ feeds fresh, but it would also introduce people to creators they wouldn’t have known to follow and to interests they didn’t know they had, alongside the content that already matches their taste.
TikTok claims to do this as part of its push to safeguard and diversify recommendations. In a 2021 newsroom article, the company pledged to “interrupt repetitive patterns” and “intersperse diverse types of content along those people already know they love.”
The company also notes that researchers, ethics committees, and advisory boards are part of the ongoing conversation around safeguarding the For You Page.
In theory, if the social media giants committed to a prescribed minimum of different types of content, similar to Canadian content requirements for radio, this could solve all of our problems.
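For illustration only, here is one crude way to picture such a quota in code. This is not how TikTok or anyone else actually ranks a feed, and every name below is made up; the rule is simply that after every few engagement-ranked items, one item from outside the user’s usual topics gets slotted in.

```python
def intersperse(ranked_feed, outside_pool, every_n=4):
    # After every `every_n` engagement-ranked items, slot in one item
    # from outside the user's usual topics (the hypothetical quota).
    mixed, outside = [], iter(outside_pool)
    for i, item in enumerate(ranked_feed, start=1):
        mixed.append(item)
        if i % every_n == 0:
            extra = next(outside, None)
            if extra is not None:
                mixed.append(extra)
    return mixed

feed = [f"soccer_{i}" for i in range(8)]
print(intersperse(feed, ["pottery_1", "jazz_1"], every_n=4))
# ['soccer_0', ..., 'soccer_3', 'pottery_1',
#  'soccer_4', ..., 'soccer_7', 'jazz_1']
```

Even a rule this blunt guarantees a floor of unfamiliar content without discarding the engagement ranking, which is roughly what a content-quota requirement would ask of a recommender.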
Then again, if the nature of the game is to keep users glued to their apps for as long as possible, this option would be considered unprofitable.
The Responsibility of the Informed User
The second option requires us to mindfully regulate our online environments, which is a challenging task.
It’s simply part of our default programming to seek, interpret, and recall new information that validates our preexisting beliefs.
This means it’s up to us to challenge our human nature: to seek out opinions that contradict our own, to curate media from all sides, and to entertain both sides of the same story.
Until companies and their algorithms do things differently, it’s up to us to preserve our online culture’s integrity and its myriad flavors.