In today's world, you'd have to be looking at your phone all the time not to notice that people are looking at their phones all the time.
Some device owners are so enamored of their digital companions that even crossing a busy street doesn't merit a little look-see at the 3-D world. Last year, when it came to playing Pokémon Go, driving a car or walking toward a cliff didn't rate some people's full attention, either.
Harris' campaign is starting to get a lot of media attention -- last month, "60 Minutes" ran a segment looking at the issue. That was followed by the comedian and tele-muckraker Bill Maher making it the subject of one of his HBO commentaries.
"The tycoons of social media have to stop pretending that they're friendly nerd-gods building a better world, and admit they're just tobacco farmers in t-shirts selling an addictive product to children," Maher opined. "Because let's face it, checking your 'likes' is the new smoking."
"Apple, Google, Facebook? They are essentially drug dealers."
While the comedian's view may seem over-the-top, KQED's Lesley McClurg recently reported the story of a middle-school girl who became so hooked on watching YouTube that her parents sent her to an actual addiction recovery clinic. The cost: $60,000, paid partly from their retirement accounts.
In reporting that story, McClurg interviewed Harris, the former Google ethicist. He called the practice of tech companies using scientific techniques that foster compulsivity "brain hacking." Harris now runs a nonprofit called Time Well Spent, whose home page invites people to "reclaim our minds from being hijacked by technology."
Speaking about the current power of Google, Facebook and Apple to command our collective gaze, Harris tells McClurg: "Never before in history have a handful of technology designers working at three tech companies ... influenced how a billion people spend their attention."
Harris says that good, ethical design is being trumped by the quest for profit.
"If your company’s goal and your stock price is based on how much attention they get from someone, it’s not really about ethics," he says. "They just have to do whatever it takes to get attention."
And that has created an eyeballs-seeking arms race.
"After you finish watching a YouTube video," Harris notes, "it auto-plays the next one right away, so you don't have to make a conscious choice. Let's say that creates a 5 percent lift in how much time people spend on YouTube. So Facebook is sitting there watching their traffic get siphoned away, and Facebook says we have to make our videos auto-play, too." (Neither Google nor Facebook returned a request for comment.)
Facebook, meanwhile, has every incentive to keep you mousing through its news feed so it can sell more ads. Harris says that's one reason the company uses continuous scroll, so that new content will keep opening up as you hit the bottom of the page. But he thinks a more ethical design would be to enable what an individual user wants to do at any given moment.
"Let's say your friend texts you that dinner's off," he says. "So there you are with no plans, and you open up Facebook. At that moment, Facebook has about 1,000 people whose job is to get you to just click and scroll and watch stuff on the news feed. And that will work. You'll probably end up sitting there, an hour later, just kind of having scrolled through the news feed."
At which point, says Harris, you will have fulfilled Facebook's mission, but perhaps not your own.
But what if Facebook actually asked you what you wanted to do, apart from just using Facebook? Like perhaps finding other people who have no plans?
"It’s about agency," says Harris. "Facebook would have to have some way, before you just get dropped in the news feed, to say, 'What do you want right now?'"
B.J. Fogg runs the Persuasive Technology Lab at Stanford, which teaches students to use these sticky techniques. Many employees of top tech companies, including a cofounder of Instagram, have participated in the lab.
Fogg says he got to know some of the early Facebook employees, and found them genuinely motivated by a desire to do good.
"The individual people at Facebook, the people that I met, really wanted to make the world more harmonious, bring people together, create empathy and so on," he says.
"Where I think Tristan (Harris) and I would agree a lot is that often their business goals can be at odds with the human-centered approach to design. There's a conflict there between what they need to do as an advertising company and what's going to be really good for people."
Amusing Ourselves to Death
The reason this matters so much is that technology is going to get more and more persuasive, says Harris.
"We're sitting at the very edge of what will become a virtual reality and augmented reality world. If those worlds are even more persuasive in getting us to spend our time there, where is human agency in that process?"
And then a warning.
"We have to have that conversation now because right now it's driving toward not a good direction."
He cited a 1985 book, "Amusing Ourselves to Death," by Neil Postman, that distinguished between two dystopian visions.
"There's one that most people already know: the '1984' Big Brother, surveillance future. We have all been trained to look out for that.
"But there is this subtler second vision of power, which was the Aldous Huxley vision in 'Brave New World,' that's so good at giving us amusement and little bits of trivia. In other words, it's not that we shouldn't be concerned about book burning, but we should be concerned about a society that distracts us from even wanting to read."
It's the Dopamine
Ramsay Brown is the co-founder of Dopamine Labs, which uses artificial intelligence and neuroscience to help app writers attract and retain users.
Dopamine Labs makes no bones about what it's trying to do in the promotional material on its website.
Dopamine is a neurotransmitter associated with rewards and addictive substances. The company is not just being glib when it says it will deliver the chemical to users. Dr. Elias Aboujaoude, the director of Stanford’s Obsessive Compulsive Disorder Clinic, told KQED's McClurg that dopamine and other feel-good brain chemicals spike in people who compulsively use the internet.
Brown told me that social media companies keep users clicking with a concept known as variable rewards, the same mechanism slot machines use to hook gamblers.
"The brain isn't particularly craving any one little feel-good signal as much as it does a really good rhythm and pattern," Brown said. Both he and Harris say Facebook and Instagram tailor the timing of the "notifications" they deliver to users -- the messages indicated by a number in red at the top right of the screen -- in order to deliver shots of dopamine at times determined by an algorithm.
"Sometimes there’s nothing waiting for you, sometimes there’s a friend request or someone wrote on your wall," Brown told me. "Sometimes there’s just kind of like filler crap. It’s not pertinent to your life, but Facebook's algorithms have figured out that showing it to you then is going to be slightly more surprising than not showing it to you at all or showing it to you later."
These patterns will keep you coming back.
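The variable-reward pattern Brown describes can be sketched in a few lines of code. This is a minimal illustration of a variable-ratio reward schedule, not Facebook's actual algorithm; the payout probability and outcomes here are invented for the example.

```python
import random

def variable_reward(check_count, reward_prob=0.3, seed=None):
    """Simulate a variable-ratio reward schedule: each 'check' of the
    feed pays off unpredictably, the same pattern slot machines use.
    The unpredictability, not the average payout, drives rechecking."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(check_count):
        if rng.random() < reward_prob:
            outcomes.append("notification")  # friend request, comment, like
        else:
            outcomes.append("nothing")       # or, as Brown puts it, filler
    return outcomes

# Ten checks of the feed; seeded so the run is repeatable.
results = variable_reward(10, seed=42)
print(results.count("notification"), "rewards in 10 checks")
```

Because the reward schedule is random rather than fixed, a user can never learn when checking is pointless, which is exactly what keeps the checking habit alive.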
I asked Brown how he knew that's what Facebook was doing.
"It's obvious to anyone who knows the techniques," he said.
In the "60 Minutes" segment, Larry Rosen, a professor of psychology at California State University, Dominguez Hills, who researches the psychology of tech, said typically, people check their phones every 15 minutes or less. They're not just craving dopamine; he said they're seeking relief from the stress hormone cortisol.
"Half of the time they check their phone, there’s no alert, no notification," said Rosen. "It’s coming from inside their head, telling them, 'Gee, I haven’t checked on Facebook for a while, I haven’t checked on this Twitter feed for a while. I wonder if someone commented on my Instagram post.' That then generates cortisol and it starts to make you anxious. Eventually your goal is to get rid of that anxiety, so you check in."
As Anderson Cooper of "60 Minutes" put it: "Their research suggests our phones are keeping us in a continual state of anxiety in which the only antidote is the phone."
Ramsay Brown says his own company uses this type of research to help only businesses or organizations it has determined are trying to do good.
"To break the habits we don't want in ourselves or make the habits we do want in ourselves," as he puts it.
To that end, Dopamine Labs created an app called "Space," intended to help users break troublesome online habits by creating a delay before certain apps will open.
Apple initially rejected the app from its App Store. Brown says an Apple rep told him the rejection came because any app that encouraged people to use other apps less was inappropriate for the store. After the "60 Minutes" segment aired, Apple accepted the app. (An Apple spokesperson said the rejection stemmed from a technical issue and that "The adjustment had nothing to do with whether the app discouraged people from using other apps or not.")
Even though Dopamine Labs might choose clients according to its own definition of doing good, I wondered whether applying techniques as powerful and potentially insidious as he and other researchers describe is justified, no matter what the product.
"We are in a bit of a Robert Oppenheimer moment," Brown said, citing the scientist who is often called the father of the atomic bomb, and who later expressed a deep ambivalence about his work.
"We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design."
"Which means that there's a deep ethical imperative for us to use it for good."
So What's the Harm?
As KQED's McClurg reported, addictions to social media, video games, texting, shopping and pornography are not officially listed disorders in the Diagnostic and Statistical Manual of Mental Disorders. A consensus is growing, however, that compulsive online behavior is doing real harm.
Hundreds of papers have been written on the negative consequences of using Facebook alone. While some studies have also shown positive effects, Jean Twenge, a professor of psychology at San Diego State University and the author of "Generation Me," says more rigorous research has come to a more negative conclusion.
“It’s pretty clear these days that spending more time on social media leads to a more negative mood,” she said.
Twenge says her research shows that the proliferation of the smartphone is having big effects on people born around 1995. She says that's when the millennial generation morphs into "iGen" -- also known as Generation Z.
She pointed to the plummeting employment rate of young men as one macro-development related to iGen, and cited the work of University of Chicago economist Erik Hurst. Last year, in a university profile, Hurst discussed his research on the dwindling percentage of young males without a college degree in the labor force and this trend's connection to leisure-time technology:
In the 2000s, employment rates for this group dropped sharply – more than in any other group. We have determined that, in general, they are not going back to school or switching careers, so what are they doing with their time? The hours that they are not working have been replaced almost one for one with leisure time. Seventy-five percent of this new leisure time falls into one category: video games. The average low-skilled, unemployed man in this group plays video games an average of 12, and sometimes upwards of 30 hours per week. This change marks a relatively major shift that makes me question its effect on their attachment to the labor market.
The Attention Economy
B.J. Fogg says a lot of the persuasion methods that Facebook and other tech companies use are not really new.
He gave the example of Facebook birthday reminders, which often draw people back into the site to wish someone happy birthday. "Reminders are not new," he said, citing Hallmark TV commercials about Mother's Day, for example. "What's different today is that machines are being created to use these techniques."
At this point, I am reminded that those of us who create text for public consumption have also been in the attention-grabbing business for quite some time. An excerpt we recently posted from Tim Wu's book "The Attention Merchants" is instructive: it takes you through the evolution and eventual mainstream adoption of clickbait -- those headlines that contain just the right words to whet your appetite for a click.
While we here at KQED have yet to hire an actual neuroscientist to help us craft the perfect syntactic arrangement to make our post on the California brown pelican count irresistible to a mass audience, we have sat through any number of workshops led by self-styled audience-whisperers. And we do use software tools to try to figure out what works and what doesn't in terms of getting people to read an article.
Isn't that, crudely, some form of brain hacking?
The point being: Everyone is in pursuit of your eyeballs. The question is how far people will go to get them. Where the line gets crossed from superior business model to dirty rotten trick is the subject of much debate, from the halls of government to academia to Thanksgiving brouhahas.
On the "60 Minutes" segment, Gabe Zichermann, who consults for companies on how to use "gamification" to make their digital products more appealing, argued that attracting an audience is simply the name of the game.
"Asking technology companies, asking content creators to be less good at what they do feels like a ridiculous ask," he said. "It feels impossible. And also it’s very anti-capitalistic. This isn’t the system that we live in."
The other side of that, expressed by Bill Maher in his televised rant:
"The moral rot in this country began when corporate America decided it wasn’t enough to just successfully sell your product; people needed to be addicted to it.”
Meantime, while you wait for society to figure this issue out, it's probably best to take matters into your own compulsively typing hands. If you've ever said "I wish I knew how to quit you" to your phone, see Lesley McClurg's post, 'Help! My Phone is Ruining My Life!' for eight tips on how to detach.