
The Recorder – My Turn: Hate gone viral — how social media algorithms create domestic terror


What would motivate you to attempt to overthrow the power structures of your country, at serious risk to your life and well-being? What if you believed that every last facet of your society had been constructed to prey on the most vulnerable? Would that be enough?

For a growing portion of Americans, this exact situation is reality. It just isn’t a real one. In March of 2021, 14% of Americans expressed belief in QAnon, the conspiracy theory that a Satanic, child-eating sex-trafficking ring opposes the presidency of Donald Trump. That number rose to 17% by September of 2021, according to the Public Religion Research Institute, a nonpartisan think tank.

Since its birth in 2017, QAnon has evolved to include the claim that the results of the 2020 election were fraudulent, a belief shared by a far wider range of people — more than 40% of Americans, if one Axios poll from January of 2022 is to be believed. With numbers that high, it’s worth considering exactly how this disinformation is being spread. One major channel is social media.

Regardless of platform, social media companies like Facebook, Twitter, YouTube, and Instagram all make their money from ad revenue. Advertisers bid for every second you spend on social media. This profit incentive has led social media companies to design algorithms that keep users looking at their devices for as long as possible by suggesting the content each user will find most engaging.
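To make that incentive concrete, the core of such a system can be pictured as a scoring loop: estimate how much attention each candidate post will capture from a given user, then show the highest scorers first. The sketch below is a deliberately simplified illustration, not any platform’s actual code; the post attributes and weights are invented for the example.

```python
# A deliberately simplified sketch of engagement-driven feed ranking.
# The post attributes and weights are invented for illustration; real
# platforms use machine-learned models over thousands of signals.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float       # model's estimate of like probability
    predicted_watch_time: float  # estimated seconds of attention
    emotional_intensity: float   # 0..1, how strongly the post provokes
    is_accurate: bool            # present in the data, never consulted below

def engagement_score(post: Post) -> float:
    # Only predicted attention matters; truthfulness carries no weight.
    return (2.0 * post.predicted_likes
            + 0.5 * post.predicted_watch_time
            + 3.0 * post.emotional_intensity)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Serve the posts most likely to keep the user scrolling.
    return sorted(candidates, key=engagement_score, reverse=True)
```

Notice that is_accurate sits in the data but never enters the score. A false but outrage-inducing post can outrank a careful one on every signal the ranker actually sees, which is exactly the problem described below.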

As humans, we are powerfully drawn to things that make us feel like we belong or that evoke strong emotions. Unfortunately, out of everything available on the internet, conspiracy content is some of the best at pressing those psychological buttons. And because it’s so engaging, the algorithm encourages its growth by pushing users in its direction.

Let’s say that someone is on their favorite social media app, scrolling through content personalized for them, such as a “For You” or “Suggested” page, when they come across a news story. Say, for example, a headline claiming that new information has been discovered regarding the lab-leak theory of the coronavirus’s origin. The user doesn’t read the story, so they never learn that the sensational headline doesn’t match the article’s actual claim that the virus did not come from a lab; they just like the post. The algorithm picks up on this new interest, and soon the user’s feed is filled with misinformation about the pandemic, vaccine efficacy, and more. In the comments on these articles, they chat with other users falling down the same rabbit holes, and they join increasingly extreme online groups with similar beliefs. Situations just like this one are playing out constantly.
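That rabbit hole is a feedback loop, and a toy version of it fits in a few lines. The sketch below is an assumption-laden illustration, not a real recommender: the topic names and the update rule are made up, but the shape is the point. Each “like” nudges the user’s interest profile toward the liked post’s topics, and the next feed is drawn from whatever the profile now favors.

```python
# Toy model of the recommendation feedback loop described above.
# Topic names and the update rule are assumptions for illustration only.

from collections import defaultdict

interests: dict[str, float] = defaultdict(float)

def record_like(post_topics: list[str]) -> None:
    # Each like nudges the profile toward the liked post's topics.
    for topic in post_topics:
        interests[topic] += 1.0

def next_feed(catalog: dict[str, list[str]], size: int = 5) -> list[str]:
    # Recommend the posts whose topics best match the current profile.
    def match(topics: list[str]) -> float:
        return sum(interests[t] for t in topics)
    return sorted(catalog, key=lambda post: match(catalog[post]),
                  reverse=True)[:size]

# One like on a sensational "lab leak" headline...
record_like(["covid-origins", "conspiracy"])

feed = next_feed({
    "recipe video": ["cooking"],
    "lab leak expose": ["covid-origins", "conspiracy"],
    "vaccine 'truth' thread": ["conspiracy", "vaccines"],
})
# -> the conspiracy-tagged posts now rank first
```

Run this loop a few dozen times and the profile becomes self-reinforcing: the content a user has already engaged with scores highest, so it is increasingly all they see.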

On the internet, there is little to no favoring of trustworthy sources of information. Anything can be liked, picked up by the algorithm as trending, and sent out to the masses. There is no difference between what is true and what is false, other than that the latter is often far more engaging. Fake news outperforms real news online, and performance is all that matters to the algorithms. It is a perfect storm for creating a bubble of misinformation, in which users are primed to distrust anything that contradicts their worldview.

To return to the question posed at the beginning: a belief that action is desperately needed, reinforced by other users who share it, is quite the motivating factor. People believe that what they say and do on the internet has no consequences in the real world. They are wrong. At a pizza shop in Washington, D.C., in Christchurch, New Zealand, and in Buffalo, New York, individuals have committed acts of terrorism motivated by beliefs they either encountered or had reinforced online.

The internet has the power to connect us, regardless of nationality, age, language, or economic status. But that power is not without consequences. Left unchecked, social media companies have, by accident, created the greatest asset to decentralized terrorism in human history. If we want to stop the spread of misinformation and hate in this country, social media needs some work.

Zachary Rutherford is a recent graduate of Four Rivers Charter Public School in Greenfield. This piece was originally written as part of a civics class that covered media literacy in the digital age and how social media is impacting society. It is the second of three pieces written by Four Rivers students that will be published this week.




