On the April 9 edition of 60 Minutes, Anderson Cooper sat down with former Google product manager Tristan Harris to discuss a troubling issue: how Silicon Valley exploits neuroscience to keep us addicted to technology.
In the segment, Harris describes how app designers exploit people’s desire for pleasure, stimulation, and social connection to keep them hooked on their devices for as long as possible. He says that, whether it’s intentional or not, Silicon Valley is programming users’ brains. Part of what makes this possible is that our interactions with smartphones trigger a response in a very primitive part of the brain.
“Every time I check my phone, I’m playing the slot machine to see, ‘What did I get?’” Harris tells 60 Minutes. He’s referring, of course, to social media notifications. The slot machine payout is the jolt of dopamine that rewards us when we do something worthy of an internal reward. And while actual slot machines are mostly limited to casinos, we can play the dopamine slot machine no matter where we are.
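The slot-machine dynamic Harris describes is what behavioral psychologists call a variable-ratio reward schedule: the payoff is unpredictable, and that unpredictability is what makes checking so compelling. A minimal sketch of the idea in code (the probability and the reward labels here are purely illustrative assumptions, not numbers from any real app):

```python
import random

def check_phone(reward_probability=0.3):
    """Simulate one 'pull' of the notification slot machine.

    A reward arrives only some of the time; it's the unpredictability,
    not the reward itself, that drives repeated checking.
    """
    if random.random() < reward_probability:
        return random.choice(["like", "comment", "retweet"])
    return None

# Over many checks, rewards land on an unpredictable schedule.
random.seed(42)
pulls = [check_phone() for _ in range(10)]
hits = [p for p in pulls if p is not None]
print(f"{len(hits)} rewarding checks out of {len(pulls)}")
```

Each check costs almost nothing, and any one of them might pay off, so the rational-feeling move is always to check one more time.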
This is the case because the human brain possesses a marvelously developed system that rewards us with good feelings when we do things that fulfill us, make us happy, or simply ensure our continued survival. Every time you check your phone and see a like, comment, or retweet, your brain releases a little bit of dopamine, the neurotransmitter associated with reward. It’s no coincidence that dopamine is the same chemical that floods a person’s brain when they use cocaine. But dopamine isn’t just associated with drugs or artificial stimulation. It’s always been a part of who we are.
The dopaminergic system is an ancient adaptation that scientists suspect drives many of our everyday behaviors. Dopamine rewards us when we eat sweets, when we act kindly towards other people, and when we receive recognition for achievements. It’s our brain’s way of telling us when we’re doing well. In the case of social apps like Facebook, Instagram, and Snapchat, our brains are rewarding us for fulfilling a basic human desire for connection.
“Somebody commented on something we posted. Somebody liked something. Somebody pinged us. It’s psychological triggers staggered on top of psychological triggers,” Patrycja Slawuta, the founder of Self Hackathon, told Inverse in 2016. “[Facebook] fulfills the basic human need to belong since we are wired to connect as social animals.”
And when something makes us feel good, like staying connected with other people, we tend to keep doing it. By exploiting our natural tendency to seek out things that make us feel good, smartphone app developers can foster compulsive use patterns. And Harris contends not only that developers can harness this neural machinery to create compulsive use and emotional investment, but that they actively do.
“There’s a whole playbook of techniques that get used to get you using the product for as long as possible,” Harris tells 60 Minutes.
Harris brings up the example of Snapchat’s “Streak” feature, which keeps a running count of how many days in a row you’ve interacted with each of your friends. It’s a way of keeping users invested in the app, of making sure they want to keep coming back. In this way, anxiety as much as reward keeps us returning to apps over and over. And Snapchat isn’t alone: Pokémon GO has its own version of streaks, rewarding users for multiple consecutive days of play.
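Part of why streak mechanics are so widespread is that they’re trivial to build. A hypothetical sketch of the core logic (the function name, the strict miss-a-day-and-reset rule, and the counting convention are assumptions for illustration, not Snapchat’s actual implementation):

```python
from datetime import date, timedelta

def update_streak(streak: int, last_active: date, today: date) -> int:
    """Update a consecutive-day streak counter.

    Interacting on consecutive days extends the streak; missing even
    one day resets it -- the loss-aversion hook that pulls users back
    every single day.
    """
    if today == last_active:
        return streak            # already counted today
    if today - last_active == timedelta(days=1):
        return streak + 1        # consecutive day: streak grows
    return 1                     # streak broken: today starts a new one

extended = update_streak(10, date(2017, 4, 8), date(2017, 4, 9))  # -> 11
broken = update_streak(10, date(2017, 4, 6), date(2017, 4, 9))    # -> 1
print(extended, broken)
```

The asymmetry is the point: a streak takes weeks to build and one missed day to lose, which converts a reward into something closer to an obligation.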
And these things work. We keep coming back to our phones for more little surges of dopamine. A 2016 study of smartphone use found that the average user has 76 phone sessions a day. These could be long sessions to respond to an email or have a text conversation, or they could be short sessions in which you turn on your home screen to check for notifications. Either way, they’re triggering little internal rewards each time. And Silicon Valley app developers know this.
“Inadvertently, whether they want to or not, they are shaping the thoughts and feelings and actions of people,” Harris tells 60 Minutes. “There’s always this narrative that technology’s neutral. And it’s up to us to choose how we use it. This is just not true.”