What we need most at the end of the day…

In a recent Times article, the reporter Emma Goldberg wrote about how the rise of social media and influencer power has made it such that young people, in particular, find their livelihood, success and sense of self inextricably entwined with an online presentation. She wrote, “With personal branding, the line between who people are and what they do disappears. Everything is content.” A strange, exhausting new twist in being human is that each day, each of us must decide how much of ourselves, our family life, thoughts, work, photos and feelings we will share with strangers online. Goldberg quoted Tom Peters, a marketing writer, who explained that we are each “head marketer for the brand called You.”

To reduce ourselves to brands, however, is to do violence to our personhood. We turn ourselves into products, content to be evaluated instead of people to be truly known and loved. We convert the stuff of our lives into currency.

This new way of interacting with the world is driving institutional dysfunction, personal anxiety and the hollowing out of ourselves… Klein confessed that social media had made him hungry for validation. It offers us, he said, a steady drumbeat of “You exist. You are seen.” This longing to be seen and validated is universal, but this desire has been co-opted by technologists to capture more and more of our time and attention…

I have gotten letters from time to time from readers declaring me their pastor, and of course, I’m flattered and grateful. I hope to be of help to them, yet I cannot be their pastor. I cannot hold their hands and pray over them in the hospital. I cannot grieve with them after the loss of a loved one or rejoice when they land a job. A pastor and the work of local churches more broadly are tethered to a place, an institution and a particular people, with all the complexity, hilarity, struggle and mystery of their lives.

What we need most at the end of the day has nothing to do with influence or brands. We need quiet beauty and enduring truth that we share with those who walk this journey with us…

— Tish Harrison Warren, from “The Temptations of the ‘Personal Brand’” (New York Times, January 29, 2023). Tish Harrison Warren (@Tish_H_Warren) is a priest in the Anglican Church in North America and the author of “Prayer in the Night: For Those Who Work or Watch or Weep.”

Monday Morning Wake-Up Call

So when I came across Carr’s book in 2020 (“The Shallows: What the Internet Is Doing to Our Brains”), I was ready to read it. And what I found in it was a key — not just to a theory but to a whole map of 20th-century media theorists… who saw what was coming and tried to warn us. Carr’s argument began with an observation, one that felt familiar:

The very way my brain worked seemed to be changing. It was then that I began worrying about my inability to pay attention to one thing for more than a couple of minutes. At first I’d figured that the problem was a symptom of middle-age mind rot. But my brain, I realized, wasn’t just drifting. It was hungry. It was demanding to be fed the way the Net fed it — and the more it was fed, the hungrier it became. Even when I was away from my computer, I yearned to check email, click links, do some Googling. I wanted to be connected.

Hungry. That was the word that hooked me. That’s how my brain felt to me, too. Hungry. Needy. Itchy. Once it wanted information. But then it was distraction. And then, with social media, validation. A drumbeat of: You exist. You are seen…

…These are industries I know well, and I do not think it has changed them, or the people in them (myself included), for the better. But what would? I’ve found myself going back to a wise, indescribable book that Jenny Odell, a visual artist, published in 2019. In “How to Do Nothing: Resisting the Attention Economy,” Odell suggests that any theory of media must first start with a theory of attention. “One thing I have learned about attention is that certain forms of it are contagious,” she writes.

When you spend enough time with someone who pays close attention to something (if you were hanging out with me, it would be birds), you inevitably start to pay attention to some of the same things. I’ve also learned that patterns of attention — what we choose to notice and what we do not — are how we render reality for ourselves, and thus have a direct bearing on what we feel is possible at any given time. These aspects, taken together, suggest to me the revolutionary potential of taking back our attention.

I think Odell frames both the question and the stakes correctly. Attention is contagious. What forms of it, as individuals and as a society, do we want to cultivate? What kinds of mediums would that cultivation require?

This is anything but an argument against technology, were such a thing even coherent. It’s an argument for taking technology as seriously as it deserves to be taken, for recognizing, as McLuhan’s friend and colleague John M. Culkin put it, “we shape our tools, and thereafter, they shape us.”

There is an optimism in that, a reminder of our own agency. And there are questions posed, ones we should spend much more time and energy trying to answer: How do we want to be shaped? Who do we want to become?

— Ezra Klein, from “I Didn’t Want It to Be True, but the Medium Really Is the Message” (New York Times, August 7, 2022)

Hard Truth

When Facebook (and all the others) decide what you see in your news feed, there are many thousands of things they could show you. So they have written a piece of code to automatically decide what you will see. There are all sorts of algorithms they could use—ways they could decide what you should see, and the order in which you should see them. They could have an algorithm designed to show you things that make you feel happy. They could have an algorithm designed to show you things that make you feel sad. They could have an algorithm to show you things that your friends are talking about most. The list of potential algorithms is long.

The algorithm they actually use varies all the time, but it has one key driving principle that is consistent. It shows you things that will keep you looking at your screen. That’s it. Remember: the more time you look, the more money they make. So the algorithm is always weighted toward figuring out what will keep you looking, and pumping more and more of that onto your screen to keep you from putting down your phone. It is designed to distract. But, Tristan was learning, that leads—quite unexpectedly, and without anyone intending it—to some other changes, which have turned out to be incredibly consequential.

Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That’s not its concern. It only cares about one thing: Will you keep scrolling? Unfortunately, there’s a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash. Scientists have been proving this effect in different contexts for a long time—if they showed you a photo of a crowd, and some of the people in it were happy, and some angry, you would instinctively pick out the angry faces first. Even ten-week-old babies respond differently to angry faces. This has been known about in psychology for years and is based on a broad body of evidence. It’s called “negativity bias.”

There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as “hates,” “obliterates,” “slams,” “destroys.” A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are “attack,” “bad,” and “blame.” A study by the Pew Research Center found that if you fill your Facebook posts with “indignant disagreement,” you’ll double your likes and shares. So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it’s more enraging, it’s more engaging.

If enough people are spending enough of their time being angered, that starts to change the culture. As Tristan told me, it “turns hate into a habit.” You can see this seeping into the bones of our society. When I was a teenager, there was a horrific crime in Britain, where two ten-year-old children murdered a toddler named James Bulger. The Conservative prime minister at the time, John Major, responded by publicly saying that he believed we need “to condemn a little more, and understand a little less.” I remember thinking then, at the age of fourteen, that this was surely wrong—that it’s always better to understand why people do things, even (perhaps especially) the most heinous acts. But today, this attitude—condemn more, understand less—has become the default response of almost everyone, from the right to the left, as we spend our lives dancing to the tune of algorithms that reward fury and penalize mercy.

In 2015 a researcher named Motahhare Eslami, as part of a team at the University of Illinois, took a group of ordinary Facebook users and explained to them how the Facebook algorithm works. She talked them through how it selects what they see. She discovered that 62 percent of them didn’t know their feeds were filtered at all, and they were astonished to learn about the algorithm’s existence. One person in the study compared it to the moment in the film The Matrix when the central character, Neo, discovers he is living in a computer simulation.

I called several of my relatives and asked them if they knew what an algorithm was. None of them—including the teenagers—did. I asked my neighbors. They looked at me blankly. It’s easy to assume most people know about this, but I don’t think it’s true.

— Johann Hari, from “Stolen Focus: Why You Can’t Pay Attention–and How to Think Deeply Again” (Crown, January 25, 2022)
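
Hari’s description of the feed boils down to a single design choice: rank posts by predicted time-on-screen and by nothing else, and let negativity bias do the rest. For readers who want to see that mechanic laid bare, here is a deliberately crude sketch; the weights, the scoring function and the sample posts are all invented for illustration and are in no way Facebook’s actual code.

```python
# A toy feed ranker in the spirit of Hari's description. Everything here
# (the weights, the scoring function, the sample posts) is hypothetical,
# invented to illustrate the argument; it is not Facebook's actual code.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    outrage: float   # 0.0 = calm and pleasant, 1.0 = maximally enraging
    pleasure: float  # how much genuine enjoyment the post gives a reader


def predicted_seconds_on_screen(post: Post) -> float:
    """Negativity bias: outrage holds the eye far longer than pleasure does."""
    return 5.0 + 12.0 * post.outrage + 3.0 * post.pleasure


def rank_feed(posts: list[Post]) -> list[Post]:
    """The one consistent principle: show whatever keeps you looking."""
    return sorted(posts, key=predicted_seconds_on_screen, reverse=True)


feed = rank_feed([
    Post("Neighbour hands out flowers by the road", outrage=0.1, pleasure=0.9),
    Post("Pundit SLAMS and DESTROYS rival", outrage=0.9, pleasure=0.2),
])
for post in feed:
    print(post.text)  # the enraging post comes first, despite giving less pleasure
```

Notice that nothing in the ranker asks whether a post is true, kind or worth your while; the enraging post wins simply because it is stickier.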


Notes:

Monday Morning Wake-Up Call

The average human lifespan is absurdly, terrifyingly, insultingly short. Here’s one way of putting things in perspective: the first modern humans appeared on the plains of Africa at least 200,000 years ago, and scientists estimate that life, in some form, will persist for another 1.5bn years or more, until the intensifying heat of the sun condemns the last organism to death. But you? Assuming you live to be 80, you’ll have had about 4,000 weeks.

When I first made that calculation, I felt queasy; but once I’d recovered, I started pestering my friends, asking them to guess – off the top of their heads, without doing any mental arithmetic – how many weeks they thought the average person could expect to live. One named a number in the six figures. Yet, as I felt obliged to inform her, a fairly modest six-figure number of weeks – 310,000 – is the approximate duration of all human civilisation since the ancient Sumerians of Mesopotamia. On almost any meaningful timescale, as the contemporary philosopher Thomas Nagel has written, “we will all be dead any minute”.

And so distraction truly matters – because your experience of being alive consists of nothing other than the sum of everything to which you pay attention. At the end of your life, looking back, whatever compelled your attention from moment to moment is simply what your life will have been. When you pay attention to something you don’t especially value, it’s not an exaggeration to say that you’re paying with your life…

— Oliver Burkeman, from “At best, we’re on Earth for around 4,000 weeks — so why do we lose so much time to online distraction?” (The Guardian, August 7, 2021)
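
Burkeman’s numbers are easy to check. A quick back-of-the-envelope sketch, assuming the rounded figures he uses (an 80-year life, and roughly 6,000 years since the first Sumerian cities):

```python
# Back-of-the-envelope check of Burkeman's figures (assumed round numbers:
# an 80-year life, ~6,000 years since the first Sumerian cities).
WEEKS_PER_YEAR = 365.25 / 7  # about 52.18

lifetime_weeks = 80 * WEEKS_PER_YEAR
print(f"One 80-year life: ~{lifetime_weeks:,.0f} weeks")
# ~4,174 weeks: Burkeman's "about 4,000"

civilisation_weeks = 6_000 * WEEKS_PER_YEAR
print(f"All of civilisation: ~{civilisation_weeks:,.0f} weeks")
# ~313,071 weeks: close to his "fairly modest" 310,000
```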

…the hot breath of impending Armageddon

But something about Facebook brought out truly juvenile impulses…There was a bit of bad faith in smugly ridiculing these poor people. Posts tended toward selfies of rosacea-faced long-haired women in old-style prairie dresses and lots of pregnancy crowdsourcing about progesterone and wild yams. So what, if that is what they believe? Laughing at them was a shabby use of her time, but she knew part of what made Facebook — and the internet, really — addicting was simultaneously indulging your own obsessions while mocking (deriding, denouncing even) the obsessions of others from the safety of your screen. It was hard to resist, and indulging this impulse — even silently to yourself — made everything worse, made you worse, she was sure of it. … That led her to Twitter and back to Facebook, to wildly out-of-proportion, aggro throw downs between various vegan groups and carnivore groups, omnivores and fasters. Diet had apparently become the major battlefield for all the dispossessed (i.e., all of us). There was something quaintly nineteenth-century American about it all: the focus on health, the zealotry, the desire for perfection, and the hot breath of impending Armageddon. She clicked, she tapped, she followed, she liked. A few groups she joined, and always she lurked.

— Dana Spiotta, from “Wayward: A Novel” (Knopf, July 6, 2021)


Image & Book Review from Los Angeles Times: “Dana Spiotta’s novel of midlife female rage”.