TSS Exclusive: Polarized Algorithms

You open a browser window or app to check the day’s headlines and trending topics on YouTube. A group of previously watched videos pops up along with a “Recommended” section, twice as big. You decide to give one of these videos a shot and click ‘play’. A few minutes later, it ends. Now, a new suggested video is queued up and ready to auto-play in a few seconds.

The videos seem normal at first, exactly what you wanted. The watch-suggest-autoplay cycle repeats several times. Yet, with each new video, the site pulls you further and further away from the mainstream content you came for and toward content designed, above all, to keep you watching.

Fringe content creeps in and, before you know it, you’re falling down the rabbit hole. Videos about the Earth being flat, when the world will come to an end, and why your political representative is a member of the Illuminati take over your recommendations.

On the Fringe.

Fringe theories are ideas, or collections of ideas, that depart considerably from the prevailing, mainstream view. They include pseudoscientific claims such as the assertions that vaccines cause autism, that aliens walk among us, and that the Large Hadron Collider would destroy the Earth.

Why have fringe theories spread so widely in recent years? Why are they discussed so much on news sites and social media? Why does it feel like the fabric that holds us together as a society is unraveling, leaving us more polarized and in opposition to one another? What exactly attracts viewers to certain videos over others and makes them want to click that mouse button?

Are algorithms the root cause…

From building a business and a fanbase to seeing the latest photographs of childhood friends and their families, sites such as Facebook, Twitter, Instagram, and YouTube have come to dominate our lives, and they will continue to do so.

In their infancy, social network newsfeeds were simple: they ran in chronological order and showed you what your friends and family were doing. Between 2011 and 2014, Facebook and its peers tinkered with algorithms in an attempt to re-prioritize content around each user’s interests. By 2015, most major platforms had moved to algorithmically ranked feeds.

What social media platforms learned is that, if a user’s newsfeed could be filled with clickable content, especially content that played on the user’s emotions, they could push more ads and sponsored posts, and make a hefty profit in return.
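To make that incentive concrete, a toy feed ranker might look something like the sketch below. This is not any platform’s actual code; the field names, weights, and “predicted” probabilities are hypothetical stand-ins for the kind of engagement signals described above.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click_prob: float     # hypothetical model output: chance the user clicks
    predicted_reaction_prob: float  # hypothetical: chance of a like, share, or angry react
    is_sponsored: bool

def engagement_score(post: Post) -> float:
    """Toy scoring rule: clickable, emotionally engaging posts rise to the top."""
    score = 0.6 * post.predicted_click_prob + 0.4 * post.predicted_reaction_prob
    if post.is_sponsored:
        score *= 1.1  # illustrative nudge for paid content, not a real platform weighting
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    # Most "engaging" content first; chronology plays no part.
    return sorted(posts, key=engagement_score, reverse=True)
```

Once a feed is sorted this way rather than chronologically, whatever provokes the strongest reaction, true or not, tends to float to the top.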

“Emotion is a big driver of what goes viral,” Jonah Berger told The New Republic. Berger is a professor at the Wharton School of the University of Pennsylvania and the author of the New York Times best seller Contagious: Why Things Catch On. “Whether something pulls on our heartstrings, makes us angry, or provokes controversy, the more we care, the more we share.”

Is this why clickbait sprang up? Is this why politically charged articles, one-sided videos, and “fake news” oozed across our newsfeeds? How could we change so rapidly because of these algorithms?

…or are social platforms to blame?

“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” former YouTube engineer Guillaume Chaslot said in an interview with The Guardian. “The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy. Watch time was the priority. Everything else was considered a distraction.”

Content aside, whether a video appears in a person’s “Up Next” or “Recommended” section has a huge impact on its view count. That placement is determined by what the user has liked in the past, what is currently trending, and each video’s like-to-dislike ratio, and it is further shaped by the user’s subscriptions and notification settings.

Not much is publicly known about YouTube’s exact algorithm, but Chaslot claims its most important output is the 20 or so videos served up in the recommended “Up Next” section. Those videos are chosen to entice viewers and keep them on the site longer.
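As a rough illustration of how such a list could be assembled from the signals mentioned above, consider the minimal sketch below. It is emphatically not YouTube’s algorithm, which is not public; the signal names, weights, and dictionary fields are assumptions made only to show how a “Recommended” shortlist might be scored and selected.

```python
def recommendation_score(video: dict, user: dict) -> float:
    """Hypothetical blend of the signals named above: similarity to past likes,
    trending momentum, like-to-dislike ratio, and channel subscription."""
    total_votes = video["likes"] + video["dislikes"]
    like_ratio = video["likes"] / total_votes if total_votes else 0.5

    similarity = video["similarity_to_liked"]  # 0..1, closeness to videos the user liked
    trending = video["trending_score"]         # 0..1, how quickly views are growing
    subscribed = 1.0 if video["channel_id"] in user["subscriptions"] else 0.0

    # Illustrative weights; a real system would tune these against watch time.
    return 0.4 * similarity + 0.25 * trending + 0.2 * like_ratio + 0.15 * subscribed

def up_next(candidates: list[dict], user: dict, n: int = 20) -> list[dict]:
    # Return the ~20 highest-scoring candidates, mirroring the recommended column.
    return sorted(candidates, key=lambda v: recommendation_score(v, user), reverse=True)[:n]
```

Note what such a scheme optimizes for: nothing in it asks whether a video is accurate, only whether it resembles what has kept this viewer, and viewers like them, watching.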

Keeping viewers watching may seem ideal from the platform’s perspective; in practice, however, it can push videos with extreme political messaging. Free speech matters, yes; but a serious problem for users and for society is that there is often no reliable way to tell whether a given message is true, and the recommendation algorithm makes no attempt to find out.

New problem-solving operations.

In the aftermath of Facebook’s Cambridge Analytica scandal, its “fake news” problems, and users leaving in droves, the company re-evaluated its algorithms this year, changing what content users see. It began to prioritize posts from friends, family, and groups, claiming the change was meant to foster friendships and to quiet the fake news stories making the rounds on the site.

While Facebook’s focus on building relationships between friends and family seems positive, some critics are skeptical that these changes adequately address the underlying issues. Will they achieve the desired result and help depolarize our society?

We, as social media users, are left wondering: Should corporations with no public accountability have control over speech online? How could sites like Facebook and YouTube be better designed to disseminate news? How should we as members of the public stay well-informed?

These questions are likely to only compound as social media continues to pervade our lives and the way we consume information.

The Starset Society
