Hard Truth

When Facebook (and all the others) decide what you see in your news feed, there are many thousands of things they could show you. So they have written a piece of code to automatically decide what you will see. There are all sorts of algorithms they could use—ways they could decide what you should see, and the order in which you should see them. They could have an algorithm designed to show you things that make you feel happy. They could have an algorithm designed to show you things that make you feel sad. They could have an algorithm to show you things that your friends are talking about most. The list of potential algorithms is long.

The algorithm they actually use varies all the time, but it has one key driving principle that is consistent. It shows you things that will keep you looking at your screen. That’s it. Remember: the more time you look, the more money they make. So the algorithm is always weighted toward figuring out what will keep you looking, and pumping more and more of that onto your screen to keep you from putting down your phone. It is designed to distract. But, Tristan was learning, that leads—quite unexpectedly, and without anyone intending it—to some other changes, which have turned out to be incredibly consequential.
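
To make that one driving principle concrete, here is a minimal illustrative sketch in Python of what "rank by whatever keeps you looking" amounts to. This is not Facebook's actual code; the Post fields, the predicted_seconds_viewed score, and the example numbers are all hypothetical, standing in for whatever engagement prediction the real system uses.

```python
# Illustrative sketch only: a feed ranker whose sole objective is
# predicted time-on-screen. All fields and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_seconds_viewed: float  # the platform's guess at how long you'll look

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order posts purely by predicted engagement; nothing else is considered."""
    return sorted(candidates, key=lambda p: p.predicted_seconds_viewed, reverse=True)

candidates = [
    Post("friend_a", "Look at this sunset", predicted_seconds_viewed=4.0),
    Post("page_b", "You won't BELIEVE what they did", predicted_seconds_viewed=11.0),
    Post("friend_c", "New job announcement", predicted_seconds_viewed=6.5),
]

for post in rank_feed(candidates):
    print(post.author, "-", post.text)
```

Nothing in a ranker like this asks whether a post is true, kind, or enraging; whatever the prediction model says will hold your gaze goes to the top.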

Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That’s not its concern. It only cares about one thing: Will you keep scrolling? Unfortunately, there’s a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash. Scientists have been proving this effect in different contexts for a long time—if they showed you a photo of a crowd, and some of the people in it were happy, and some angry, you would instinctively pick out the angry faces first. Even ten-week-old babies respond differently to angry faces. This has been known about in psychology for years and is based on a broad body of evidence. It’s called “negativity bias.”

There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as “hates,” “obliterates,” “slams,” “destroys.” A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are “attack,” “bad,” and “blame.” A study by the Pew Research Center found that if you fill your Facebook posts with “indignant disagreement,” you’ll double your likes and shares. So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it’s more enraging, it’s more engaging.
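
Taken at face value, the 20 percent figure from the NYU study compounds quickly. Here is a rough back-of-the-envelope sketch, assuming purely for illustration that each additional outrage word multiplies the expected retweet rate by about 1.2; that multiplicative reading is my assumption for the example, not something the study itself spells out.

```python
# Back-of-the-envelope illustration of the "20 percent per outrage word" figure
# quoted above. Assumes, for the sake of the sketch, that the boost compounds
# multiplicatively with each added word.

BOOST_PER_OUTRAGE_WORD = 1.20  # 20 percent, as reported above

def expected_retweet_multiplier(outrage_word_count: int) -> float:
    return BOOST_PER_OUTRAGE_WORD ** outrage_word_count

for n in range(6):
    print(f"{n} outrage words -> {expected_retweet_multiplier(n):.2f}x baseline retweets")
```

Under that assumption, three such words already put a tweet at roughly 1.7 times its baseline reach, and five at about 2.5 times, which is the mechanical sense in which "more enraging" becomes "more engaging."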

If enough people are spending enough of their time being angered, that starts to change the culture. As Tristan told me, it “turns hate into a habit.” You can see this seeping into the bones of our society. When I was a teenager, there was a horrific crime in Britain, where two ten-year-old children murdered a toddler named Jamie Bulger. The Conservative prime minister at the time, John Major, responded by publicly saying that he believed we need “to condemn a little more, and understand a little less.” I remember thinking then, at the age of fourteen, that this was surely wrong—that it’s always better to understand why people do things, even (perhaps especially) the most heinous acts. But today, this attitude—condemn more, understand less—has become the default response of almost everyone, from the right to the left, as we spend our lives dancing to the tune of algorithms that reward fury and penalize mercy.

In 2015 a researcher named Motahhare Eslami, as part of a team at the University of Illinois, took a group of ordinary Facebook users and explained to them how the Facebook algorithm works. She talked them through how it selects what they see. She discovered that 62 percent of them didn’t know their feeds were filtered at all, and they were astonished to learn about the algorithm’s existence. One person in the study compared it to the moment in the film The Matrix when the central character, Neo, discovers he is living in a computer simulation.

I called several of my relatives and asked them if they knew what an algorithm was. None of them—including the teenagers—did. I asked my neighbors. They looked at me blankly. It’s easy to assume most people know about this, but I don’t think it’s true.

Johann Hari, “Stolen Focus: Why You Can’t Pay Attention–and How to Think Deeply Again” (Crown, January 25, 2022)



31 thoughts on “Hard Truth”

  1. The negative voice is always louder – and perhaps easier to spread. Of all the vital things we could put such energy into, we track how much we hate. And this is where we are; this is what we have fomented with all our wisdom.


  2. Glad I read this, if only for an understanding of why I don’t see all the “crazy, mean and violent” things on FB, which is good. Or is it because only a handful of people see my content? I’ll choose the first. Here’s a funny (people are often surprised when I’m funny): why does a 61-year-old woman with no outward “sexiness,” plus a constant case of hip pain from toting a two-year-old, whose teeth are repaired with a “partial,” continue to be inundated by overly confident sneering men with heart emojis in their profiles? Curious, truly over this. Just a thought. 😊


  3. Algorithms (like less passive and more personal propaganda and steady misinformation) that keep our fury-heavy outrage dialed in are why I avoid Twitter and YouTube like the plague. We’re all vulnerable. Facebook, however, knows where to put cat-video automatic scrolling, and of course, there was that time I actively Like-d ONE photographer… 🤦


  4. This is most disconcerting. Even if I know there are algorithms out there and even when I try to stop the constant barrage of negativity, it feels like a hopeless battle. We are but pawns in their game, aren’t we? Of course, we could simply turn it off but it is everywhere. From Netflix to YouTube to Disney+. They are more than willing to send you more of what you have watched…


  5. I don’t use social media any longer. I used F. book for maybe 3 years…time waster…I left in the late fall of 2015…don’t miss it…I’ve noticed manipulative, algorithm-steered suggestions for years…I do have critical thinking skills!…I’ve known about the tons of tracking cookies for probably 15 years or longer…i.e.: a sister will email me a product link, for new flooring or a cast iron pan, and just because I click the link the tracking cookies think I want to purchase one. Wrong…the item will even show up on my hubby’s laptop (for weeks)…really overkill…Just like well-known retailers who send emails that say “I noticed you like this item, take a look again”…then a few days later, “an item you like is now on sale!” Pesky for sure…/// I miss the days when I was growing up, when the newspaper came in the morning and evening, mail was delivered twice a day, and milk (sometimes chocolate), butter, and eggs magically appeared in the milk box on the porch in the morning…I don’t have a smart phone either…A.I. & algorithms think their suggestions are the ones I am looking for…Wrong…I don’t have a cell phone, currently.


  6. I was trying to manage the content showing on my Fb feed by hiding and reporting things I didn’t want to see or found to be offensive or fake news. Unfortunately it is Sisyphus’ work: the algorithm keeps changing, and all the stuff I removed kept coming back after a while. So I decided to delete my FB account. The second reason was that people whom I considered intelligent and logical in real life quite often turned into conspiracy theory or fake news enthusiasts when online. Sometimes it is better to know your colleagues or the lovely lady at the local bakery only professionally and let them enjoy their private views and opinions, well … in private.

