'Social Dilemma:' Netflix film director explains hidden cost of social media

Jeff Orlowski, director of Netflix's "The Social Dilemma," explains how companies like Facebook and Twitter have come to wield large-scale power.

Jeff Orlowski has stopped using social media.

The director of "The Social Dilemma," the Netflix documentary set for release on September 9, has spent the past two years speaking with executives and experts about the problems with Facebook, Twitter, and other social media sites.

“These companies have collected so much data and they know so much about us, through the billions of sensors that we’re carrying with us all day now on our phones,” Orlowski tells Inverse. “We don’t have transparency into how that data and power is being used.”

Inverse spoke with Orlowski to find out how this emerged and what the film reveals about our daily social media habits.

The movie focuses on Tristan Harris, who while at Google raised concerns about design ethics. Long-standing issues have gradually evolved into a complex network where information feeds are controlled by opaque algorithms – an issue also explored recently in Coded Bias. The resultant social networks have been accused of influencing democratic elections, worsening mental health, and spreading coronavirus disinformation.

Inverse: The start of the film notes that this is a hard-to-define problem. How did you start to bring this all together?

Orlowski: I knew Tristan from college; we both went to Stanford. In 2017, I started seeing him talk about how the technology was designed, the impact it’s having on people, and how it’s manipulating people. I’d never heard any critiques of social media, and I was really curious.

Having spent the past decade or so working on climate-related stories, it’s a similar problem. We have a resource that people discovered and thought everything is great and look at how profitable it can be and look at the great positives that it brings to society. Only later do we see the consequences and the real harms as the invisible problems are being brought to light.

With climate change, you have the hidden costs that don’t reveal themselves until later.

We all think [social media] is a free thing. It allows me to connect to people and blah, blah, blah. But little do we realize that it’s powered based on individual targeting, individual information. It puts us all into our own filter bubbles.

We’ve let the experiment run for a decade. Now we’re seeing it’s affecting people at the individual level with mental health and depression and suicide, and then it’s affecting us at the societal level, with misinformation running rampant, conspiracy theories, election manipulation. All of these things that we’re now seeing have huge consequences.

It’s toward the second half of the film where you start to realize what it’s been building up to.

It’s just becoming more evident. There’s a line in the film that I love from Roger McNamee. He says Russia didn’t hack Facebook, they just used Facebook. We’re seeing that again today with the news about how Russia is using Facebook and Twitter to divide Americans, hiring Americans to write stories that polarize us.

It’s fascinating that Tristan’s story starts back when he was at Gmail with his 2013 presentation, which seems like a totally different issue!

The way he was thinking about it at the beginning was, are we designing for the user’s interests or not? And I think that goes back to this big frame. If you’re not paying for the product, you are the product.

Tristan Harris (right). Matt Winkelmeyer/Getty Images Entertainment/Getty Images

There’s speculation that if we had to pay a penny for every email we sent, spam would basically disappear. But when we think about the problems of email, part of it is that this whole thing is free.

"I think at large the system is an exploitative and extractive system that has turned humans into resources."

At some point, [Google] transitioned [Tristan] into a role that they basically made for him, a design ethicist role. His job then was to research the implications of the design, and how code can affect society and people. That’s really what pushed him down the path of his thinking. We are building the digital infrastructure on which the world sits, and when you program that a certain way you literally are controlling what people do. Do I want you to look at this thing or that, for how long, when and where? There’s an immense and asymmetric power that the programmers have.

Google Maps has grown massively from what it originally was, and it’s so powerful now that it can actually dictate traffic flows. Google has so much data on where people are moving in their vehicles. They can route different people to different streets to increase the flow and efficiency of different roads, and so on. But nobody gave Google that power. There’s no government agency or individual that decided Google should be dictating where people move through the world.

You mentioned you basically stopped using social media while working on the film. Have you continued to cut back?

[Laughs] I posted the trailer as an explanation for why I stopped using social media. I don’t use it at all. I was a very heavy social media user. I would consider it an addiction. I had to use the practices of addiction recovery to wean myself off of it. I felt such a strong pull to go back to it every day. But at this point, I have no desire for it. I really had to actively work to get there. But I see how much this has transformed my life, like how much of my own time and mental energy has come back to me by removing these systems from my life.

I suppose you could justify your social media usage because you’re working in a public-facing role and looking to promote your work.

Oh, I could easily justify that. And to be clear, there are a lot of people who make their living on social media.

I’m not suggesting that people delete their accounts. I want people to be aware of how these systems work and to think about how they feel about using them. There are countless times where I am torn. Social media can be used as a platform to get ideas out to your community. Activism can happen there, and countless great things have come from social media platforms.

But bringing it back to the fossil fuel analogy: it’s awesome that I can drive myself from point A to B or fly across the ocean. Using fossil fuels for transport was a huge advance for civilization, right? But we’re now seeing the consequences of that. And we need to rethink it if we want to have a healthy and sustained civilization down the line.

Despite the short term immediate benefits that might come from it, I think at large the system is an exploitative and extractive system that has turned humans into resources. It doesn’t care about what’s good for society or for us individually. It has so warped the way we interact. There’s a phrase that former Facebook engineer [Chamath Palihapitiya] uses: it is “ripping apart the social fabric of how society works.” And that’s really how I think of this.

That’s one thing that stuck out to me, how Pinterest’s former president Tim Kendall said he restricted his children’s social media usage.

Many, many of the tech engineers that we’ve met, and I understand many of the executives, don’t let their kids use this stuff. If you’re not giving this to your own kids, there’s probably something wrong with it. And yet, they’re fine exploiting the public as a whole for their profits, but aren’t willing to give it to their own children.

Tim Kendall. Sportsfile/Getty Images

Was there anything else that surprised you during filming?

I think I had this impression of Facebook as a place where I can go and connect with friends, and Google as a place where I can get access to information. We have this image in our minds of what these things are, based on how they’ve been advertised to us.

The back-end business is really what’s driving their value. When you ask a question like, how is a free product worth hundreds of billions of dollars? It really makes you reevaluate what’s actually going on.

If a sneaker company wants to make more money, they have to sell more sneakers. If Facebook wants to make more money, they have to sell more eyeballs. They have powerful algorithms that can literally just do that. At the end of the financial quarter, one of these tech companies can just increase the ad load and turn a dial on an algorithm. Tim Kendall was saying they can make $100 million in a day or two. That is bonkers!

One of the counterpoints that was raised during the film was when Tristan was on a panel discussion. A panelist said that we’ve seen these criticisms before and it worked out fine.

It’s something we spent quite a bit of time thinking about. And my take on this now is that it’s not just another step forward. This is such a huge leap forward because it’s powered by machine learning. We have built computer programs that learn and get better on their own. That is not within the realm of human experience. We haven’t had that before.

"There’s a phrase that former Facebook engineer [Chamath Palihapitiya] uses: it is ‘ripping apart the social fabric of how society works.’ And that’s really how I think of this."

Machine learning algorithms can be used for great, positive advances in society. But that’s not how we’re seeing them applied on social media. Self-driving car technology is based on machine learning, on algorithms that will get better and better over time. The hope there is that it will decrease motor vehicle accidents, because hopefully the computers will be able to drive better than humans. That really has humanity’s interests at heart, in my mind. It’s also something that the public would be paying for: you pay for a Tesla and you get its Autopilot technology as part of the vehicle.

I am not paying for Facebook. I’m not the customer, somebody else is. The tools are designed for others, and not society’s interest.

What action do you hope to see emerge following this film?

We’re doing a campaign around how the technology is used, how it’s designed, and how it’s regulated. This is a very complex problem, just like climate change. There’s no silver bullet; it’s not like one individual changing their lightbulb is going to solve climate change. We need a huge, systems-wide approach to solving it. And likewise, we need that for solving our tech problems now, this tech crisis that we have created.

We’re hoping to have conversations with the public, the tech engineers, and with politicians to rethink: what does good technology look like that serves our interests as a society? And how do we build that? I think it’s going to require a ground-up rewrite of a lot of things. It’s going to require rebuilding and redesigning the code, but it’s something that we need to do. We need to.

This Q&A has been edited for brevity and clarity.
