Ten Arguments for Deleting Your Social Media Accounts Right Now

by Jaron Lanier

Narrated by Oliver Wyman

Unabridged — 4 hours, 44 minutes

Audiobook (Digital) — $10.44 ($10.99 before the 5% discount); free with a B&N Audiobooks subscription.

Listen on the free Barnes & Noble NOOK app.
Overview

"Narrator Oliver Wyman brings his full complement of vocalizations to this polemic on social media and what it is doing to us. His cadence and delivery are spot-on..." - AudioFile Magazine

Ten Arguments for Deleting Your Social Media Accounts Right Now is a timely call-to-arms from a Silicon Valley pioneer.

You might have trouble imagining life without your social media accounts, but virtual reality pioneer Jaron Lanier insists that we're better off without them. In his important new audiobook, Lanier, who participates in no social media, offers powerful and personal reasons for all of us to leave these dangerous online platforms behind before it's too late.

Lanier's reasons for freeing ourselves from social media's poisonous grip include its tendency to bring out the worst in us, to make politics terrifying, to trick us with illusions of popularity and success, to twist our relationship with the truth, to disconnect us from other people even as we are more “connected” than ever, and to rob us of our free will with relentless targeted ads. How can we remain autonomous in a world where we are under continual surveillance and are constantly being prodded by algorithms run by some of the richest corporations in history, which have no way of making money other than being paid to manipulate our behavior? How could the “benefits” of social media possibly outweigh the catastrophic losses to our personal dignity, happiness, and freedom?

Lanier remains a tech optimist, so while demonstrating the evil that rules social media business models today, he also envisions a humanistic setting for social networking that can direct us toward a richer and fuller way of living and connecting with our world.


Editorial Reviews

AUGUST 2018 - AudioFile

Narrator Oliver Wyman brings his full complement of vocalizations to this polemic on social media and what it is doing to us. His cadence and delivery are spot-on, and his modulation of tone is especially effective when delivering the recurrent exhortation: “Delete your accounts.” Author Jaron Lanier, a pioneer in virtual reality and longtime Silicon Valley guru, has created a powerful screed aimed squarely against the giants of tech, whose products have created “think groups,” allowed bad actors to penetrate social media with harmful results, and legitimized nonsensical arguments like those against vaccination. Lanier dubs the issues with social media as “a bummer,” and its moving parts include attention acquisition, intrusion into everyone’s lives, the cramming of content down people’s throats, and the use of fake mobs and fake society. A.D.M. Winner of AudioFile Earphones Award © AudioFile 2018, Portland, Maine

The New York Times Book Review - Franklin Foer

Critics of the big technology companies have refrained from hectoring users to quit social media. It's far more comfortable to slam a corporate leviathan than it is to shame your aunt or high school pals—or, for that matter, to jettison your own long list of "friends." As our informational ecosystem has been rubbished, we have placed very little onus on the more than two billion users of Facebook and Twitter. So I'm grateful to Jaron Lanier for redistributing blame on the lumpen-user, for pressing the public to flee social media…Whatever the flaws of this short manifesto, Lanier shows the tactical value of appealing to the conscience of the individual. In the face of his earnest argument, I felt a piercing shame about my own presence on Facebook. I heeded his plea and deleted my account.

Publishers Weekly

04/30/2018
Virtual reality pioneer Lanier (Dawn of the New Everything: Encounters with Reality and Virtual Reality) tediously reiterates well-known pitfalls of social media, arguing that the major platforms are manipulating users’ thoughts, goading their inner trolls, tearing society apart, and just generally making everyone unhappy. Lanier, a Silicon Valley insider, spells out his arguments against social media in 10 breezy chapters with titles like “You Are Losing Your Free Will” and “Social Media Is Making Politics Impossible.” His underlying argument takes aim at the business models behind popular platforms like Facebook and Google that enable third-party actors—such as advertisers—to pay to modify users’ behavior using personalized, continuously adjusted stimuli. Unfortunately, his short treatise is overridden with shallow political commentary (as when he refers to Trump as a victim of Twitter) and scant analysis of critical issues (he’s quick to dismiss the role of social media in the #MeToo movement, Black Lives Matter, and the Arab Spring uprisings). Baseless generalizations and vague platitudes undermine the author’s case, which is particularly unfortunate given his experience and expertise in the world he skewers. (July)

From the Publisher

A WIRED "All-Time Favorite Book"
A Financial Times Best Book of 2018

“Profound . . . Lanier shows the tactical value of appealing to the conscience of the individual. In the face of his earnest argument, I felt a piercing shame about my own presence on Facebook. I heeded his plea and deleted my account.”
—Franklin Foer, The New York Times Book Review

“Mixes prophetic wisdom with a simple practicality . . . Essential reading.”
—The New York Times (Summer Reading Preview)

“The title says it all . . . Lanier advocates untethering from social media, which fosters addiction and anomie and generally makes us feel worse and more fearful about each other and the world . . . The experiment could be a useful one, though it will darken the hearts of the dark lords—a winning argument all its own.” —Kirkus Reviews

“Ten Arguments for Deleting Your Social Media Accounts Right Now is not anti-tech or even anti-phone. It is one of the most optimistic books about the Internet I’ve ever read because it dares to hope for better. Profoundly skeptical of the business model that undergirds social media, Lanier demonstrates the ways in which our social media accounts make us not consumer but product, our every connection monitored by unseen third parties who harvest our data, monetize our communication, and curate and manipulate our behavior. Another online life is possible, but first we have to destroy the one we’re trapped in. The great news is you don’t have to take to the streets—you don’t even have to leave your room. You can do it all by pressing one little key . . . A blisteringly good, urgent, essential read.” —Zadie Smith, author of Feel Free


Kirkus Reviews

2018-05-09
In a book whose title says it all, technoprophet Lanier (Dawn of the New Everything, 2017, etc.) weighs in against predatory technoprofit.

In a world of dogs, it's better to be a cat. So, in this brief polemic, writes the author, who uses the animal terms advisedly: Dogs are easily trained to respond to stimuli, as Ivan Pavlov knew; humans are as easily trained, à la B.F. Skinner, when given proper rewards. "Dog whistles," Lanier adds meaningfully, "can only be heard by dogs." Cats, on the other hand, live in the world while somehow not being quite of it, a model for anyone seeking to get out of the grasp of algorithms and maybe go outside for a calming walk. The metaphor has value. So does the acronym BUMMER, which Lanier coins to sum up the many pieces of his argument: "Behavior of Users Modified, and Made into an Empire for Rent." It's a little clunky, but the author scores points with more direct notes: "E," he writes, "is for Earning money from letting the worst assholes secretly screw with everyone else." As we're learning from the unfolding story of Cambridge Analytica, which just filed for bankruptcy, he's got a point. Lanier advocates untethering from social media, which fosters addiction and anomie and generally makes us feel worse and more fearful about each other and the world. Continuing the dog metaphor, it—Lanier uses "media" as a singular noun, which, considering its monolithic nature, may no longer send grammarians screaming—also encourages pack behavior, howling at strangers and sounds in the night. His central objection, though, would seem to be this: "We have enshrined the belief that the only way to finance a connection between two people is through a third person who is paying to manipulate them." If we accept that, then it's self-evident why one would want to unplug.

The experiment could be a useful one, though it will darken the hearts of the dark lords—a winning argument all its own.

Product Details

BN ID: 2940169197662
Publisher: Macmillan Audio
Publication date: 05/29/2018
Edition description: Unabridged

Read an Excerpt

CHAPTER 1

ARGUMENT ONE

YOU ARE LOSING YOUR FREE WILL

WELCOME TO THE CAGE THAT GOES EVERYWHERE WITH YOU

Something entirely new is happening in the world. Just in the last five or ten years, nearly everyone started to carry a little device called a smartphone on their person all the time that's suitable for algorithmic behavior modification. A lot of us are also using related devices called smart speakers on our kitchen counters or in our car dashboards. We're being tracked and measured constantly, and receiving engineered feedback all the time. We're being hypnotized little by little by technicians we can't see, for purposes we don't know. We're all lab animals now.

Algorithms gorge on data about you, every second. What kinds of links do you click on? What videos do you watch all the way through? How quickly are you moving from one thing to the next? Where are you when you do these things? Who are you connecting with in person and online? What facial expressions do you make? How does your skin tone change in different situations? What were you doing just before you decided to buy something or not? Whether to vote or not?

All these measurements and many others have been matched up with similar readings about the lives of multitudes of other people through massive spying. Algorithms correlate what you do with what almost everyone else has done.

The algorithms don't really understand you, but there is power in numbers, especially in large numbers. If a lot of other people who like the foods you like were also more easily put off by pictures of a candidate portrayed in a pink border instead of a blue one, then you probably will be too, and no one needs to know why. Statistics are reliable, but only as idiot demons.
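The correlation-without-understanding described above can be sketched as a toy nearest-neighbor vote. Everything here is invented for illustration (the trait names, the data, the similarity measure); it is not any platform's actual method, only the general shape of "people like you reacted this way, so you probably will too":

```python
# Hypothetical sketch: predict one user's reaction from the reactions of
# behaviorally similar users, with no model of *why* the correlation holds.

# Each profile: a set of observed traits, plus whether a pink-bordered ad
# put that person off (True) or not (False). All data is made up.
others = [
    ({"spicy_food", "late_night", "cat_videos"}, True),
    ({"spicy_food", "cat_videos"}, True),
    ({"mild_food", "early_riser"}, False),
    ({"mild_food", "late_night"}, False),
]

def predict(user_traits, profiles, k=3):
    """Majority vote among the k most similar users (similarity = trait overlap)."""
    ranked = sorted(profiles, key=lambda p: len(user_traits & p[0]), reverse=True)
    votes = [reaction for _, reaction in ranked[:k]]
    return votes.count(True) > len(votes) / 2

print(predict({"spicy_food", "cat_videos", "late_night"}, others))  # True
```

The prediction needs no theory of pink borders or spicy food: the statistics carry the weight, which is exactly what makes them "idiot demons" — reliable in aggregate, uncomprehending in every particular case.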

Are you sad, lonely, scared? Happy, confident? Getting your period? Experiencing a peak of class anxiety?

So-called advertisers can seize the moment when you are perfectly primed and then influence you with messages that have worked on other people who share traits and situations with you.

I say "so-called" because it's just not right to call direct manipulation of people advertising. Advertisers used to have a limited chance to make a pitch, and that pitch might have been sneaky or annoying, but it was fleeting. Furthermore, lots of people saw the same TV or print ad; it wasn't adapted to individuals. The biggest difference was that you weren't monitored and assessed all the time so that you could be fed dynamically optimized stimuli — whether "content" or ad — to engage and alter you.

Now everyone who is on social media is getting individualized, continuously adjusted stimuli, without a break, so long as they use their smartphones. What might once have been called advertising must now be understood as continuous behavior modification on a titanic scale.

Please don't be insulted. Yes, I am suggesting that you might be turning, just a little, into a well-trained dog, or something less pleasant, like a lab rat or a robot. That you're being remote-controlled, just a little, by clients of big corporations. But if I'm right, then becoming aware of it might just free you, so give this a chance, okay?

A scientific movement called behaviorism arose before computers were invented. Behaviorists studied new, more methodical, sterile, and nerdy ways to train animals and humans.

One famous behaviorist was B. F. Skinner. He set up a methodical system, known as a Skinner box, in which caged animals got treats when they did something specific. There wasn't anyone petting or whispering to the animal, just a purely isolated mechanical action — a new kind of training for modern times. Various behaviorists, who often gave off rather ominous vibes, applied this method to people. Behaviorist strategies often worked, which freaked everyone out, eventually leading to a bunch of creepy "mind control" sci-fi and horror movie scripts.

An unfortunate fact is that you can train someone using behaviorist techniques, and the person doesn't even know it. Until very recently, this rarely happened unless you signed up to be a test subject in an experiment in the basement of a university's psychology building. Then you'd go into a room and be tested while someone watched you through a one-way mirror. Even though you knew an experiment was going on, you didn't realize how you were being manipulated. At least you gave consent to be manipulated in some way. (Well, not always. There were all kinds of cruel experiments performed on prisoners, on poor people, and especially on racial targets.)

This book argues in ten ways that what has become suddenly normal — pervasive surveillance and constant, subtle manipulation — is unethical, cruel, dangerous, and inhumane. Dangerous? Oh, yes, because who knows who's going to use that power, and for what?

THE MAD SCIENTIST TURNS OUT TO CARE ABOUT THE DOG IN THE CAGE

You may have heard the mournful confessions from the founders of social media empires, which I prefer to call "behavior modification empires."

Here's Sean Parker, the first president of Facebook:

We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. ... It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology. ... The inventors, creators — it's me, it's Mark [Zuckerberg], it's Kevin Systrom on Instagram, it's all of these people — understood this consciously. And we did it anyway ... it literally changes your relationship with society, with each other. ... It probably interferes with productivity in weird ways. God only knows what it's doing to our children's brains.

Here's Chamath Palihapitiya, former vice president of user growth at Facebook:

The short-term, dopamine-driven feedback loops we've created are destroying how society works. ... No civil discourse, no cooperation; misinformation, mistruth. And it's not an American problem — this is not about Russian ads. This is a global problem. ... I feel tremendous guilt. I think we all knew in the back of our minds — even though we feigned this whole line of, like, there probably aren't any bad unintended consequences. I think in the back, deep, deep recesses of our minds, we kind of knew something bad could happen. ... So we are in a really bad state of affairs right now, in my opinion. It is eroding the core foundation of how people behave by and between each other. And I don't have a good solution. My solution is I just don't use these tools anymore. I haven't for years.

Better late than never. Plenty of critics like me have been warning that bad stuff was happening for a while now, but to hear this from the people who did the stuff is progress, a step forward.

For years, I had to endure quite painful criticism from friends in Silicon Valley because I was perceived as a traitor for criticizing what we were doing. Lately I have the opposite problem. I argue that Silicon Valley people are for the most part decent, and I ask that we not be villainized; I take a lot of fresh heat for that. Whether I've been too hard or too soft on my community is hard to know.

The more important question now is whether anyone's criticism will matter. It's undeniably out in the open that a bad technology is doing us harm, but will we — will you, meaning you — be able to resist and help steer the world to a better place?

Companies like Facebook, Google, and Twitter are finally trying to fix some of the massive problems they created, albeit in a piecemeal way. Is it because they are being pressured or because they feel that it's the right thing to do? Probably a little of both.

The companies are changing policies, hiring humans to monitor what's going on, and hiring data scientists to come up with algorithms to avoid the worst failings. Facebook's old mantra was "Move fast and break things," and now they're coming up with better mantras and picking up a few pieces from a shattered world and gluing them together.

This book will argue that the companies on their own can't do enough to glue the world back together.

Because people in Silicon Valley are expressing regrets, you might think that now you just need to wait for us to fix the problem. That's not how things work. If you aren't part of the solution, there will be no solution.

This first argument will introduce a few key concepts behind the design of addictive and manipulative network services. Awareness is the first step to freedom.

CARROT AND SHTICK

Parker says Facebook intentionally got people addicted, while Palihapitiya is saying something about the negative effects on relationships and society. What is the connection between these two mea culpas?

The core process that allows social media to make money and that also does the damage to society is behavior modification. Behavior modification entails methodical techniques that change behavioral patterns in animals and people. It can be used to treat addictions, but it can also be used to create them.

The damage to society comes because addiction makes people crazy. The addict gradually loses touch with the real world and real people. When many people are addicted to manipulative schemes, the world gets dark and crazy.

Addiction is a neurological process that we don't understand completely. The neurotransmitter dopamine plays a role in pleasure and is thought to be central to the mechanism of behavior change in response to getting rewards. That is why Parker brings it up.

Behavior modification, especially the modern kind implemented with gadgets like smartphones, is a statistical effect, meaning it's real but not comprehensively reliable; over a population, the effect is more or less predictable, but for each individual it's impossible to say. To a degree, you're an animal in a behaviorist's experimental cage. But the fact that something is fuzzy or approximate does not make it unreal.

Originally, food treats were the most common reward used in behaviorist experiments, though the practice goes back to ancient times. Every animal trainer uses them, slipping a little treat to a dog after it has performed a trick. Many parents of young children do it, too.

One of the first behaviorists, Ivan Pavlov, famously demonstrated that he didn't need to use real food. He would ring a bell when a dog was fed, and eventually the dog would salivate upon hearing the bell alone.

Using symbols instead of real rewards has become an essential trick in the behavior modification toolbox. For instance, a smartphone game like Candy Crush uses shiny images of candy instead of real candy to become addictive. Other addictive video games might use shiny images of coins or other treasure.

Addictive pleasure and reward patterns in the brain — the "little dopamine hit" cited by Sean Parker — are part of the basis of social media addiction, but not the whole story, because social media also uses punishment and negative reinforcement.

Various kinds of punishment have been used in behaviorist labs; electric shocks were popular for a while. But just as with rewards, it's not necessary for punishments to be real and physical. Sometimes experiments deny a subject points or tokens.

You are getting the equivalent of both treats and electric shocks when you use social media.

Most users of social media have experienced catfishing (which cats hate), senseless rejection, being belittled or ignored, outright sadism, or all of the above, and worse. Just as the carrot and stick work together, unpleasant feedback can play as much of a role in addiction and sneaky behavior modification as the pleasant kind.

THE ALLURE OF MYSTERY

When Parker uses the phrase "every once in a while," he's probably referring to one of the curious phenomena that behaviorists discovered while studying both animals and people. If someone gets a reward — whether it's positive social regard or a piece of candy — whenever they do a particular thing, then they'll tend to do more of that thing. When people get a flattering response in exchange for posting something on social media, they get in the habit of posting more.

That sounds innocent enough, but it can be the first stage of an addiction that becomes a problem both for individuals and society. Even though Silicon Valley types have a sanitized name for this phase, "engagement," we fear it enough to keep our own children away from it. Many of the Silicon Valley kids I know attend Waldorf schools, which generally forbid electronics.

Back to the surprising phenomenon: it's not that positive and negative feedback work, but that somewhat random or unpredictable feedback can be more engaging than perfect feedback.

If you get a piece of candy immediately every time you say please as a child, you'll probably start saying please more often. But suppose once in a while the candy doesn't come. You might guess that you'd start saying please less often. After all, it's not generating the reward as reliably as it used to.

But sometimes the opposite thing happens. It's as if your brain, a born pattern finder, can't resist the challenge. "There must be some additional trick to it," murmurs your obsessive brain. You keep on pleasing, hoping that a deeper pattern will reveal itself, even though there's nothing but bottomless randomness.

It's healthy for a scientist to be fascinated by a pattern that doesn't quite make sense. Maybe that means there's something deeper to be discovered. And it's a great tool to exploit if you're writing a script. A little incongruity makes a plot or a character more fascinating.

But in many situations it's a terrible basis for fascination. The allure of glitchy feedback is probably what draws a lot of people into crummy "codependent" relationships in which they aren't treated well.

A touch of randomness is more than easy to generate in social media: because the algorithms aren't perfect, randomness is intrinsic. But beyond that, feeds are usually calculated to include an additional degree of intentional randomness. The motivation originally came from basic math, not human psychology.

Social media algorithms are usually "adaptive," which means they constantly make small changes to themselves in order to try to get better results; "better" in this case meaning more engaging and therefore more profitable. A little randomness is always present in this type of algorithm.

Let's suppose an algorithm is showing you an opportunity to buy socks or stocks about five seconds after you see a cat video that makes you happy. An adaptive algorithm will occasionally perform an automatic test to find out what happens if the interval is changed to, say, four and a half seconds. Did that make you more likely to buy? If so, that timing adjustment might be applied not only to your future feed, but to the feeds of thousands of other people who seem correlated with you because of anything from color preferences to driving patterns.

Adaptive algorithms can get stuck sometimes; if an algorithm gets no further benefits from further small tweaks to its settings, then further small tweaks won't stick. If changing to four and a half seconds makes you less likely to buy socks, but five and a half seconds also makes sales less likely, then the timing will remain at five seconds. On the basis of available evidence, five seconds would be the best possible time to wait. If no small random change helps, then the algorithm stops adapting. But adaptive algorithms aren't supposed to stop adapting.

Suppose changing even more might improve the result? Maybe two and a half seconds would be better, for instance. But incremental tweaks wouldn't reveal that, because the algorithm got stuck at the five-second setting. That's why adaptive algorithms also often include a sparser dose of greater randomness. Every once in a while an algorithm finds better settings by being jarred out of merely okay settings.
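The tweak-plus-jump mechanism just described can be sketched in a few lines. Everything numeric here is invented — the engagement curve, the five-second starting point, the jump probability — and this is ordinary hill climbing with occasional random jumps, not any platform's actual code:

```python
import random

def engagement(delay):
    """Invented response curve with two peaks: a weak local peak at 5 s and a
    stronger peak at 2.5 s. Small tweaks alone stall at the weak 5 s peak."""
    return max(0.10 - 0.02 * abs(delay - 5.0),
               0.16 - 0.04 * abs(delay - 2.5),
               0.0)

def adapt(delay, rounds=2000, step=0.1, jump_chance=0.05, rng=None):
    """Keep any change that improves engagement: usually a small tweak,
    occasionally a large random jump out of a merely okay setting."""
    rng = rng or random.Random(42)
    for _ in range(rounds):
        if rng.random() < jump_chance:
            candidate = rng.uniform(0.0, 10.0)             # rare large jump
        else:
            candidate = delay + rng.choice([-step, step])  # incremental tweak
        if engagement(candidate) > engagement(delay):
            delay = candidate
    return delay

print(round(adapt(5.0), 1))  # settles near 2.5: a jump frees it from the weak peak at 5.0
```

Without the `jump_chance` branch, the loop would halt forever at five seconds, since both neighboring tweaks look worse; the occasional jump is what lets it discover that the two-and-a-half-second setting exists at all.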

Adaptive systems often include such a leaping mechanism. An example is the occurrence of useful mutations in natural evolution, which is usually animated by more incremental selection-based events in which the genes from an individual are either passed along or not. A mutation is a wild card that adds new possibilities, a jarring jump. Every once in a while a mutation adds a weird, new, and enhancing feature to a species.

Neuroscientists naturally wonder whether a similar process is happening within the human brain. Our brains surely include adaptive processes; brains might be adapted to seek out surprises, because nature abhors a rut.

When an algorithm is feeding experiences to a person, it turns out that the randomness that lubricates algorithmic adaptation can also feed human addiction. The algorithm is trying to capture the perfect parameters for manipulating a brain, while the brain, in order to seek out deeper meaning, is changing in response to the algorithm's experiments; it's a cat-and-mouse game based on pure math. Because the stimuli from the algorithm don't mean anything, because they genuinely are random, the brain isn't adapting to anything real, but to a fiction. That process — of becoming hooked on an elusive mirage — is addiction. As the algorithm tries to escape a rut, the human mind becomes stuck in one.

(Continues…)


Excerpted from "Ten Arguments for Deleting Your Social Media Accounts Right Now"
by Jaron Lanier.
Copyright © 2018 Jaron Lanier.
Excerpted by permission of Henry Holt and Company.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.
