Monday Morning Wake-Up Call

So when I came across Carr’s book in 2020 (The Shallows: What the Internet Is Doing to Our Brains), I was ready to read it. And what I found in it was a key — not just to a theory but to a whole map of 20th-century media theorists…who saw what was coming and tried to warn us. Carr’s argument began with an observation, one that felt familiar:

The very way my brain worked seemed to be changing. It was then that I began worrying about my inability to pay attention to one thing for more than a couple of minutes. At first I’d figured that the problem was a symptom of middle-age mind rot. But my brain, I realized, wasn’t just drifting. It was hungry. It was demanding to be fed the way the Net fed it — and the more it was fed, the hungrier it became. Even when I was away from my computer, I yearned to check email, click links, do some Googling. I wanted to be connected.

Hungry. That was the word that hooked me. That’s how my brain felt to me, too. Hungry. Needy. Itchy. Once it wanted information. But then it was distraction. And then, with social media, validation. A drumbeat of: You exist. You are seen…

These are industries I know well, and I do not think it has changed them, or the people in them (myself included), for the better. But what would? I’ve found myself going back to a wise, indescribable book that Jenny Odell, a visual artist, published in 2019. In “How to Do Nothing: Resisting the Attention Economy,” Odell suggests that any theory of media must first start with a theory of attention. “One thing I have learned about attention is that certain forms of it are contagious,” she writes.

When you spend enough time with someone who pays close attention to something (if you were hanging out with me, it would be birds), you inevitably start to pay attention to some of the same things. I’ve also learned that patterns of attention — what we choose to notice and what we do not — are how we render reality for ourselves, and thus have a direct bearing on what we feel is possible at any given time. These aspects, taken together, suggest to me the revolutionary potential of taking back our attention.

I think Odell frames both the question and the stakes correctly. Attention is contagious. What forms of it, as individuals and as a society, do we want to cultivate? What kinds of mediums would that cultivation require?

This is anything but an argument against technology, were such a thing even coherent. It’s an argument for taking technology as seriously as it deserves to be taken, for recognizing, as McLuhan’s friend and colleague John M. Culkin put it, that “we shape our tools, and thereafter, they shape us.”

There is an optimism in that, a reminder of our own agency. And there are questions posed, ones we should spend much more time and energy trying to answer: How do we want to be shaped? Who do we want to become?

— Ezra Klein, from “I Didn’t Want It to Be True, but the Medium Really Is the Message” (NY Times, August 7, 2022)

Hard Truth

When Facebook (and all the others) decide what you see in your news feed, there are many thousands of things they could show you. So they have written a piece of code to automatically decide what you will see. There are all sorts of algorithms they could use—ways they could decide what you should see, and the order in which you should see them. They could have an algorithm designed to show you things that make you feel happy. They could have an algorithm designed to show you things that make you feel sad. They could have an algorithm to show you things that your friends are talking about most. The list of potential algorithms is long.

The algorithm they actually use varies all the time, but it has one key driving principle that is consistent. It shows you things that will keep you looking at your screen. That’s it. Remember: the more time you look, the more money they make. So the algorithm is always weighted toward figuring out what will keep you looking, and pumping more and more of that onto your screen to keep you from putting down your phone. It is designed to distract. But, Tristan was learning, that leads—quite unexpectedly, and without anyone intending it—to some other changes, which have turned out to be incredibly consequential.

Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That’s not its concern. It only cares about one thing: Will you keep scrolling? Unfortunately, there’s a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash. Scientists have been proving this effect in different contexts for a long time—if they showed you a photo of a crowd, and some of the people in it were happy, and some angry, you would instinctively pick out the angry faces first. Even ten-week-old babies respond differently to angry faces. This has been known about in psychology for years and is based on a broad body of evidence. It’s called “negativity bias.”

There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as “hates,” “obliterates,” “slams,” “destroys.” A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are “attack,” “bad,” and “blame.” A study by the Pew Research Center found that if you fill your Facebook posts with “indignant disagreement,” you’ll double your likes and shares. So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it’s more enraging, it’s more engaging.

If enough people are spending enough of their time being angered, that starts to change the culture. As Tristan told me, it “turns hate into a habit.” You can see this seeping into the bones of our society. When I was a teenager, there was a horrific crime in Britain, where two ten-year-old children murdered a toddler named Jamie Bulger. The Conservative prime minister at the time, John Major, responded by publicly saying that he believed we need “to condemn a little more, and understand a little less.” I remembered thinking then, at the age of fourteen, that this was surely wrong—that it’s always better to understand why people do things, even (perhaps especially) the most heinous acts. But today, this attitude—condemn more, understand less—has become the default response of almost everyone, from the right to the left, as we spend our lives dancing to the tune of algorithms that reward fury and penalize mercy.

In 2015 a researcher named Motahhare Eslami, as part of a team at the University of Illinois, took a group of ordinary Facebook users and explained to them how the Facebook algorithm works. She talked them through how it selects what they see. She discovered that 62 percent of them didn’t know their feeds were filtered at all, and they were astonished to learn about the algorithm’s existence. One person in the study compared it to the moment in the film The Matrix when the central character, Neo, discovers he is living in a computer simulation.

I called several of my relatives and asked them if they knew what an algorithm was. None of them—including the teenagers—did. I asked my neighbors. They looked at me blankly. It’s easy to assume most people know about this, but I don’t think it’s true.

Johann Hari, “Stolen Focus: Why You Can’t Pay Attention–and How to Think Deeply Again” (Crown, January 25, 2022)


Notes:

Monday Morning Wake-Up Call

The average human lifespan is absurdly, terrifyingly, insultingly short. Here’s one way of putting things in perspective: the first modern humans appeared on the plains of Africa at least 200,000 years ago, and scientists estimate that life, in some form, will persist for another 1.5bn years or more, until the intensifying heat of the sun condemns the last organism to death. But you? Assuming you live to be 80, you’ll have had about 4,000 weeks.

When I first made that calculation, I felt queasy; but once I’d recovered, I started pestering my friends, asking them to guess – off the top of their heads, without doing any mental arithmetic – how many weeks they thought the average person could expect to live. One named a number in the six figures. Yet, as I felt obliged to inform her, a fairly modest six-figure number of weeks – 310,000 – is the approximate duration of all human civilisation since the ancient Sumerians of Mesopotamia. On almost any meaningful timescale, as the contemporary philosopher Thomas Nagel has written, “we will all be dead any minute”.

And so distraction truly matters – because your experience of being alive consists of nothing other than the sum of everything to which you pay attention. At the end of your life, looking back, whatever compelled your attention from moment to moment is simply what your life will have been. When you pay attention to something you don’t especially value, it’s not an exaggeration to say that you’re paying with your life…

— Oliver Burkeman, from “At best, we’re on Earth for around 4,000 weeks — so why do we lose so much time to online distraction?” (The Guardian, August 7, 2021)

…the hot breath of impending Armageddon

But something about Facebook brought out truly juvenile impulses…There was a bit of bad faith in smugly ridiculing these poor people. Posts tended toward selfies of rosacea-faced long-haired women in old-style prairie dresses and lots of pregnancy crowdsourcing about progesterone and wild yams. So what, if that is what they believe? Laughing at them was a shabby use of her time, but she knew part of what made Facebook — and the internet, really — addicting was simultaneously indulging your own obsessions while mocking (deriding, denouncing even) the obsessions of others from the safety of your screen. It was hard to resist, and indulging this impulse — even silently to yourself — made everything worse, made you worse, she was sure of it. … That led her to Twitter and back to Facebook, to wildly out-of-proportion, aggro throwdowns between various vegan groups and carnivore groups, omnivores and fasters. Diet had apparently become the major battlefield for all the dispossessed (i.e., all of us). There was something quaintly nineteenth-century American about it all: the focus on health, the zealotry, the desire for perfection, and the hot breath of impending Armageddon. She clicked, she tapped, she followed, she liked. A few groups she joined, and always she lurked.

Dana Spiotta, Wayward: A Novel (Knopf, July 6, 2021)


Image & Book Review from Los Angeles Times: “Dana Spiotta’s novel of midlife female rage”.

Picture is Worth…


Notes:

 

Truth

Crispin Sartwell’s half tongue-in-cheek defense of texting and Twitter as a “golden age of the written word” ignores all evidence of the opposite (“Texting and Twitter Make This a Golden Age for the Written Word,” op-ed, Sept. 23). Those of us who ban laptops in the classroom he labels “schoolmarms,” and he cites the old charge of they-hated-comic-books-too for the millionth time, and to equally empty effect.

He skips the fact that the SAT added a writing component in 2006, and scores have dropped every year save two when they were flat. A recent Hart Research Associates poll of employers found barely one quarter (27%) think that recent college grads are well-prepared in writing. The ACT’s college readiness scores in English have actually dropped six percentage points in the last five years.

All of this has happened while youths have texted away the hours. Mr. Sartwell calls it writing, but he doesn’t realize that tweeting and texting don’t make them better writers. They make them better tweeters and texters. To say, “Perk up, young people, and keep on texting,” as he concludes, isn’t whimsical or cute or provocative. It’s irresponsible.

~ Mark Bauerlein, “Texting Isn’t the Same as Writing, or Even Thinking” (Letter to the Editor, wsj.com, Sept. 27, 2017)


Photo: “Texting” by Noa agravante

 

All dust and flashing hooves


Certainly, being in the moment would seem impossible in our culture’s time-fissioning present, our iPhoned, Facebooked, Googled, Twittered restlessness, our desperate fear of missing the latest morsel of information, our attention never more than a nanosecond from seduction — our discontinuous, du jour present, a Smithsonian so densely packed with experiential exhibits that no lingering look, no settled examination, seems permitted. No sooner do we settle into a moment than another gallops by, all dust and flashing hooves.

~ Jerry DeNuccio, from “A Moment.” Just as you’re “in” the moment, another moment comes. What to do?


Notes: Quote – Thank you Beth at Alive on All Channels. Photo: Richard Baxter (Harcourt, Australia) with Spirit Dance

 

God @TheTweetOfGod


“I’ve lost control of the situation.”

~ God, @TheTweetOfGod


Notes: Quote Source – Beth @ Alive on all Channels. Photo: Tweets of God.

Stop the World


The truth is, I feel like yelling Stop quite a bit these days. Every time I hear about Twitter I want to yell Stop. The notion of sending and getting brief updates to and from dozens or thousands of people every few minutes is an image from information hell. I’m told that Twitter is a river into which I can dip my cup whenever I want. But that supposes we’re all kneeling on the banks. In fact, if you’re at all like me, you’re trying to keep your footing out in midstream, with the water level always dangerously close to your nostrils. Twitter sounds less like sipping than drowning.

The most frightening picture of the future that I’ve read thus far in the new decade has nothing to do with terrorism or banking or the world’s water reserves—it’s an article by David Carr, the Times’s media critic, published on the decade’s first day, called “Why Twitter Will Endure.” “I’m in narrative on more things in a given moment than I ever thought possible,” Carr wrote. And: “Twitter becomes an always-on data stream from really bright people.” And: “The real value of the service is listening to a wired collective voice … the throbbing networked intelligence.” And: “On Twitter, you are your avatar and your avatar is you.” And finally: “There is always something more interesting on Twitter than whatever you happen to be working on.”

This last is what really worries me. Who doesn’t want to be taken out of the boredom or sameness or pain of the present at any given moment? That’s what drugs are for, and that’s why people become addicted to them. Carr himself was once a crack addict. Twitter is crack for media addicts. It scares me, not because I’m morally superior to it, but because I don’t think I could handle it. I’m afraid I’d end up letting my son go hungry.

~ George Packer, Stop the World


Notes:

Wired



Source: See others in this series by Ajit Johnson

I was never completely where I was


David Roberts, Reboot or Die Trying: One Man’s Year of Digital Detox:

[…] There was no such thing as caught up; there was, at best, keeping up. To step away from e-mail, news feeds, texts, chats, and social media for even a moment was to allow their deposited information to accumulate like snow in the driveway, a burden that grew every second it was neglected.

I spent most of my daytime hours shoveling digital snow. The core of my job—researching, thinking, writing at greater-than-140-character length—I could accomplish only in the middle of the night, when things calmed down. I spent more and more hours working, or at least work adjacent, but got less and less done.

Meanwhile, my mind and body adapted to the pace of digital life, with its ceaseless ping ping ping of notifications and alerts. I got twitchy if I was away from my phone for more than a few seconds. I felt it vibrating in my pocket when it wasn’t there, took it with me to bed, even to the bathroom. (I got pretty good at tweeting while I peed, to my enduring discredit.)

All my in-between moments, the interstitial transitions and pauses that fill the cracks of a day, were crowded with pings. My mind was perpetually in the state that researcher and technology writer Linda Stone termed continuous partial attention. I was never completely where I was, never entirely doing what I was doing. I always had one eye on the virtual world. Every bit of conversation was a potential tweet, every sunset a potential Instagram […]

Don’t miss the rest of the story here: Reboot or Die Trying: One Man’s Year of Digital Detox.


Image: “Crackphone” from Saltywaffle.com

The Vacation


Excerpt from wsj.com: “Have You Twittered Away Your Summer?” by Danny Heitman:

“…As a veteran journalist, I’d be wary of following Twain’s example in disregarding an editorial deadline. But his larger point—that savoring the sheer joy of travel is more important than documenting it—resonates with special urgency these days, as Twitter, Facebook and Instagram compel us to chronicle every moment of a journey in real time. Can this kind of reportorial obsession destroy the very moment we’re trying to capture? Wendell Berry, writing a generation ago, thought that it could. In “The Vacation,” a poem published in his 1994 collection, “Entries,” Berry considers a tourist intent on faithfully recording his seasonal getaway:

Once there was a man who filmed his vacation.

He went flying down the river in his boat

with his video camera to his eye, making

a moving picture of the moving river

upon which the sleek boat moved swiftly

toward the end of his vacation. . . .

And so the poem continues, with Berry’s exacting traveler translating each fleeting moment of his sojourn into the comfortable permanence of videotape. He’s so busy filming his day, though, that he forgets to live it. “With a flick of the switch, there it would be,” Berry writes of this homemade travelogue. “But he would not be in it. He would never be in it…”

Read more @wsj.com: “Have You Twittered Away Your Summer?”


Image Source: Travel & Leisure. Photo courtesy of @danielkrieger: Halfeti along the Euphrates river in Turkey

Go cold turkey for cash? A tough call.



Source: themetapicture.com. Thanks Susan.

Like me. Like ME. LIKE ME DAMN IT.


Bruce Feiler, For the Love of Being ‘Liked’ – For Some Social-Media Users, an Anxiety From Approval Seeking:

Walking through an airport newsstand this year, I noticed a novelty…I quickly snapped a photo and sent out a tweet to my modest list of followers…Then I waited for the love. I checked the response before passing through security. Nothing. I glanced again while waiting for the plane. Still nothing. I looked again before we took off. Nobody cared. My little attempt to pass a lonely hour in an airport with some friendly interaction had turned into the opposite: a brutal cold shower of social isolation.

We are deep enough into the social-media era to begin to recognize certain patterns among its users. Foremost among them is a mass anxiety of approval seeking and popularity tracking that seems far more suited to a high school prom than a high-functioning society…

…it all begins to seem a bit, well, desperate.

…Time for a rewrite, Mr. Shakespeare. This above all: to thine others be true.

…“In a lot of ways, the addictive part is in the anticipation,”

…“I noticed I get in this puppet situation,” she said. “I get bored, and there’s something compelling about being able to put something online, and all of a sudden there’s instant gratification of ‘They like me!’”

…Maybe Warhol needs a rewrite, too: Today, everybody can be famous for 15 retweets.

…A growing body of research indicates how deeply our brains are wired to seek social approval.

Read full (and excellent) article at For the Love of Being ‘Liked’


Notes:

Related Posts:

Leap around like panicky jackrabbits


Mark Morford nails it again in: The Tragic Death of a Good Read

…You are not alone. Researchers say our brains are getting so heavily iTrained to leap around like panicky jackrabbits, any sentence that dares to contain more than eight words, any paragraph that contains multiple clauses, any long-form work that offers deep background info or long-winded, roundabout verbiage – AKA “literature” – merely leaves you sighing heavily and wishing for Candy Crush Saga

…English profs are reporting that their students are struggling more than ever to make it through the classics, because Henry James and Nathaniel Hawthorne don’t read like Gawker.

…It might be a small problem. It might be just a little indicative of a disturbing shift, a wicked sea change in the way we navigate not just books, not just magazines and media, but love, time, each other, the world.

…Have our insta-everything devices beaten the gracefulness out of our hearts and the patience out of our brains? And also the depth? And the meaning? Maybe.

Don’t miss reading the full post @ The Tragic Death of a Good Read


Image Credit

A Digital Detox Test: The 7 Day Digital Diet


Could I do it? Read the outcome of Patrick Leger’s test @ A Digital Detox Test: Unplug Twitter and Facebook. Put Off Email and Smartphone.

“So for one week in January… I unplugged…I disconnected during a regular workweek and, in lieu of tropical seclusion, enjoyed the subfreezing and proximal isle of Manhattan…I determined I would spend no more than 15 minutes in it each session and sign in just once over the weekend. I’d use the phone only from home and would wait until noon to turn it on. I would not initiate any text exchanges, and if I received a message, I would respond as tersely as possible or call the person back. I could not go on the Internet at all unless it was crucial, and certainly not on social media. No streaming or live TV, only DVDs. Handwritten calendar. And music only at home…”

She meant slowing things down often classes them up


“My mother was always lavish with advice, little of it original… “Count to 10 before you speak,” she frequently said, and she meant not just that you can’t take back what’s already been uttered. She meant that pauses are the spaces in which passions cool, civility gets its oxygen, and wisdom quite possibly finds its wings. She meant that slowing things down often classes them up….”

“What would she have made of the social media born long after she died? Of a world in which so many of us, entranced by the opportunity for instant expression and an immediate audience, post unformed thoughts, half-baked wit or splenetic reactions before we can even count to three?…I’m talking about a revved-up metabolism and roughened-up manners…That happens in part because the exchanges are disembodied: We don’t have to face whomever we’re lashing out at. But it’s also because they’re impulsive. Their timbre conforms to their tempo. Both are coarse…”

“Conversely, there was talk this year about the benefits of an activity that’s in some ways the antithesis of texting and tweeting with their rat-tat-tat rhythm. That activity is the reading of fiction. According to some researchers, people who settle into it are more empathetic — more attuned to what those around them think and feel — than people who don’t…” [Read more…]

SMWI*: Calories Burned Per Hour



*SMWI = Saturday Morning Workout Inspiration

Source: Ben Greenman

We are, in other words, one another’s virtual enablers


NY Times, Sunday, June 16, 2013: Facebook Made Me Do It (Excerpts)

…That feedback loop of positive reinforcement is the most addictive element of social media. All those retweets, likes and favorites give us a little jolt, a little boost that pushes us to keep coming back for more. It works whether or not we post the typical social media fodder of lush vacation pictures and engagement announcements or venture into realms that showcase our most daredevilish antics and risqué behavior.

…Our growing collective compulsion to document our lives and share them online, combined with the instant gratification that comes from seeing something you are doing or experiencing get near-immediate approval from your online peers, could be giving us more reason to act out online, for better or for worse.

…We are, in other words, one another’s virtual enablers.

…the vast amplification of the potential audience a single person can reach has raised the stakes for all online activity.

…“It’s performative.”


Source: The New York Times: Facebook Made Me Do It by Jenna Wortham, Technology reporter

A Blogger’s 33 Observations on Blogging

Baldur Bjarnason

Baldur Bjarnason @ Studio Tendra titles his post: 33 Observations On The Year 2012. Terrific post. Here are a few of my favorite observations:

#1) Doing good work is its own reward, while sharing it leads to suffering. Most of the time nobody will notice, so it’s hard to see why anybody should bother.

#5) The vast majority of those I encountered were incredibly nice and friendly, even when we disagreed.

#6) I have almost no readers but some of my work is read a lot. The number of people that will read every post of mine is minuscule. Most of the traffic comes from retweets or links. I have more than a thousand followers on Twitter, but of those only about ten will click on a link to a post of mine to read it…No matter how hard I work, the best I can hope for is to catch the attention of somebody more influential who will momentarily lend me some of their traffic. [Read more…]
