Hard Truth

When Facebook (and all the others) decide what you see in your news feed, there are many thousands of things they could show you. So they have written a piece of code to automatically decide what you will see. There are all sorts of algorithms they could use—ways they could decide what you should see, and the order in which you should see them. They could have an algorithm designed to show you things that make you feel happy. They could have an algorithm designed to show you things that make you feel sad. They could have an algorithm to show you things that your friends are talking about most. The list of potential algorithms is long.

The algorithm they actually use varies all the time, but it has one key driving principle that is consistent. It shows you things that will keep you looking at your screen. That’s it. Remember: the more time you look, the more money they make. So the algorithm is always weighted toward figuring out what will keep you looking, and pumping more and more of that onto your screen to keep you from putting down your phone. It is designed to distract. But, Tristan was learning, that leads—quite unexpectedly, and without anyone intending it—to some other changes, which have turned out to be incredibly consequential.
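
To make that principle concrete, here is a minimal sketch, in Python, of what an engagement-driven ranker could look like. This is not Facebook's actual code; the `Post` fields, the `predicted_watch_seconds` score, and the `rank_feed` function are all hypothetical, invented only to illustrate the single idea the passage describes: the feed is sorted by whatever is expected to keep you looking.

```python
from dataclasses import dataclass

@dataclass
class Post:
    """A hypothetical feed item; the fields are illustrative, not Facebook's."""
    text: str
    predicted_watch_seconds: float  # a model's guess at how long you will linger on it

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order candidate posts purely by predicted time-on-screen.

    The ranker is indifferent to whether a post is calming or enraging;
    it optimizes one number: expected attention.
    """
    return sorted(candidates, key=lambda p: p.predicted_watch_seconds, reverse=True)
```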

Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That’s not its concern. It only cares about one thing: Will you keep scrolling? Unfortunately, there’s a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash. Scientists have been demonstrating this effect in different contexts for a long time—if they showed you a photo of a crowd, and some of the people in it were happy, and some angry, you would instinctively pick out the angry faces first. Even ten-week-old babies respond differently to angry faces. This has been known in psychology for years and is based on a broad body of evidence. It’s called “negativity bias.”
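
Continuing the hypothetical sketch above: if the dwell-time model has learned, from our own behavior, that we linger longer on outrage, then the calm post loses to the angry one automatically. No one ever has to write "show people upsetting things" into the code; the numbers below are invented purely to illustrate the mechanism.

```python
calm = Post("Neighbourhood bake sale raises funds for the library",
            predicted_watch_seconds=4.0)
outraged = Post("You won't BELIEVE what this politician just did",
                predicted_watch_seconds=11.0)  # negativity bias: we stare longer at this

feed = rank_feed([calm, outraged])
print([p.text for p in feed])  # the outraged post is ranked first
```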

There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as “hates,” “obliterates,” “slams,” “destroys.” A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are “attack,” “bad,” and “blame.” A study by the Pew Research Center found that if you fill your Facebook posts with “indignant disagreement,” you’ll double your likes and shares. So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it’s more enraging, it’s more engaging.

If enough people are spending enough of their time being angered, that starts to change the culture. As Tristan told me, it “turns hate into a habit.” You can see this seeping into the bones of our society. When I was a teenager, there was a horrific crime in Britain, where two ten-year-old children murdered a toddler named James Bulger. The Conservative prime minister at the time, John Major, responded by publicly saying that he believed we need “to condemn a little more, and understand a little less.” I remember thinking then, at the age of fourteen, that this was surely wrong—that it’s always better to understand why people do things, even (perhaps especially) the most heinous acts. But today, this attitude—condemn more, understand less—has become the default response of almost everyone, from the right to the left, as we spend our lives dancing to the tune of algorithms that reward fury and penalize mercy.

In 2015 a researcher named Motahhare Eslami, as part of a team at the University of Illinois, took a group of ordinary Facebook users and explained to them how the Facebook algorithm works. She talked them through how it selects what they see. She discovered that 62 percent of them didn’t know their feeds were filtered at all, and they were astonished to learn about the algorithm’s existence. One person in the study compared it to the moment in the film The Matrix when the central character, Neo, discovers he is living in a computer simulation.

I called several of my relatives and asked them if they knew what an algorithm was. None of them—including the teenagers—did. I asked my neighbors. They looked at me blankly. It’s easy to assume most people know about this, but I don’t think it’s true.

Johann Hari, “Stolen Focus: Why You Can’t Pay Attention–and How to Think Deeply Again” (Crown, January 25, 2022)

