'The Social Dilemma'

( Courtesy of Netflix )
[music]
Alison Stewart: This is All Of It, I'm Alison Stewart. The storming of Capitol Hill was largely organized on social media. The New York Times reported that groups like QAnon and the Proud Boys, bolstered by President Trump, have openly organized and recruited online. This summer, during the Black Lives Matter protests and movement for racial justice, Facebook took down hundreds of accounts associated with far-right extremist groups, including the Proud Boys and American Guard. Yesterday, Facebook announced its ban on President Trump's account would last at least until he leaves office.
Mark Zuckerberg, Facebook CEO, posted a statement saying, "We believe the risks of allowing the president to continue to use our service during this period are simply too great." Digital platforms connect us to friends, family, and strangers, but behind the scenes, these platforms are also influencing the very nature of our communication and interactions. A documentary posits that this is an existential threat. The Social Dilemma features industry insiders, critics, and whistleblowers who offer testimony and insight into how the business of Silicon Valley takes advantage of human psychology and affects our behavior both online and off.
As one person says in the film, "The only businesses that call clients users are social media companies and drug dealers." IndieWire's David Ehrlich went so far as to call it, "A horrifyingly good doc about how social media will kill us all." Director Jeff Orlowski joined us to discuss The Social Dilemma when it premiered on Netflix in September. I asked him what he thought about the reviews the documentary had received.
Jeff Orlowski: [unintelligible 00:01:45] I think really understanding what we're trying to say with the film. You know, our engineers in Silicon Valley aren't just programming software. They're actually at the stage where they're programming civilization, with code that's hiding on the other side of our screens, that the public doesn't understand, and that is shaping the way we see the world.
Alison: What was the genesis of this doc? Where did the original idea come from?
Jeff: Well, my team and I, we've been working on Chasing Ice and Chasing Coral, our past films, which were about climate change. We're always thinking about what are the big issues affecting humanity. A couple of years ago, a friend of mine from college, Tristan Harris, started talking about what he saw as a problem within the tech industry. He was at Google and he was publicly speaking about how the software is designed to manipulate us. He and I started talking more firsthand, and he connected me to other people, and as I started diving into this, I realized there's an [unintelligible 00:02:44] parallel with [unintelligible 00:02:45] climate change.
This is a climate change of culture. This is framing the way each and every person thinks and sees and gets their information. Because of the business model, because it's driven by advertising, they've set up these systems that give us what it thinks we're looking for, and we end up in these feedback loops where we're being fed the same information over and over, but it's a customized view of the world. It's moved us away from an objective truth, and it's been pushing us further and further apart.
Alison: One of the starkest examples that you give in the film, you showed that if you type a certain question into a search engine, depending on where you live, different suggestions come up. Can you explain that?
Jeff: This was one of the most startling slaps in the face for me. If you type in the phrase "climate change is" and see what autofills, what it thinks you're looking for, you get answers that are all across the board based on who you are, where you are, what Google knows about you, and the answers are completely separated from the truth. This is something that, as a team, we've spent so much time on, working on climate change and trying to raise awareness. We kept finding climate deniers all around the country. We'd go to film festivals and go on tour and find climate deniers everywhere.
It was always like, "Wait, how is this possible? Science is so clear. We understand this. We know what's going on. Why are so many people refusing to embrace the science?" It certainly felt connected the dots, it made sense. People are being fed different information through these platforms so it makes it impossible to have a shared consensus. It makes it impossible for us to move on climate solutions if a huge percentage of the public doesn't think that it's happening. That's why I look at this as a foundational issue.
An issue beneath all of our other issues because whatever issue you care about, you might be seeing a particular worldview via your social media platforms in your search engine, but there's somebody who's getting an opposing worldview reinforced to them that is counter to yours. We've been running this experiment like this for what, a decade now, and it makes it harder and harder to come together and have meaningful societal conversations around what we should do about big problems if everybody's coming to the table with a different set of facts.
Alison: That whole idea that search engines aren't objective is a little bit of a mindblower.
Jeff: Yes. That was really shocking to me. I fully recognize, look, Google is trying to organize all of the information on the Internet. If you're doing a search and you're looking for something, it gives you different results than it gives me. It doesn't need to be that way, but the personalization aspects of these platforms, they seem so innocent at the start, but they've really, really drifted into a different type of Frankenstein.
Alison: My guest is Jeff Orlowski. He's the director of The Social Dilemma. It is out on Netflix today. One of the great things about this documentary is that you interview and talk to people who have been in the trenches, have been in there since the beginning. Let's listen to a clip. These are some introductions of some of the people that are featured in the film, The Social Dilemma.
Speaker 1: I worked at Facebook in 2011 and 2012.
Speaker 2: I was one of the really early employees at Instagram.
Speaker 3: I worked at Google, YouTube.
Speaker 4: Apple, Google, Twitter, Palm.
Speaker 5: I helped start Mozilla Labs and then switched over to the Firefox side. [unintelligible 00:06:35]
Speaker 6: I worked at Twitter. My last job there was the senior vice president of engineering.
Speaker 7: I was the president of Pinterest. Before that, I was the director of monetization at Facebook for five years.
Speaker 8: While at Twitter, I spent a number of years running their developer platform and then became head of consumer product.
Speaker 9: I was the co-inventor of Google Drive, Gmail chat, Facebook pages, and the Facebook like button.
Speaker 10: This is why I spent like eight months talking back and forth with lawyers. It freaks me out.
Alison: The last person is freaked out, which leads me to the question of what made some of these people decide to come forward and reveal the inner workings of their companies?
Jeff: It took us a really long time to find all of these subjects and to find people who were willing to speak out. Maybe we'd get little hints here and there, somebody posted something that was a little critical: what's their stance? What are they thinking? Tristan was really one of the most outspoken people early on. I think in early 2018, he did an interview with Anderson Cooper on 60 Minutes. I saw that. It was about the manipulation side of the software, and that was one of the things that was a huge light bulb for me.
It was through that process that I started to meet more and more tech engineers, friends of friends, people that we could get connected to. It was just a deep exploration. For me, I was very curious about what the former employees had to say, because I felt like we'd been hearing what Facebook and Google and the companies themselves are putting out, what they're saying in front of Congress, what they have been advertising to the world around their actions. I really wanted to hear from people who were inside the companies and had something to say that they felt free and able to say now that they'd left the companies.
Alison: When you talk to these folks-- Like Tristan Harris, former Google designer, he's an advocate for ethical design and cofounder of the Center for Humane Technology. When they talk about trying to make change from the inside, what has that experience been like?
Jeff: I think Tristan, for one, really tried to make change from within Google for quite a while. From everything that he's told me, it just felt like it was falling on deaf ears. He did a big presentation that went viral within the company, got a lot of people talking about it, and went up to the CEO's desk. Then it just didn't go anywhere. I think in part, it's because it was fundamentally critiquing the DNA of these companies. They are built around this advertising business model. They're built around extracting attention, and they've figured out ways to do that that are incredibly profitable.
From their perspective, if they can keep their business model operating, that's their hope, their goal, because it's just so incredibly profitable for them.
Alison: The name is the Center for Humane Technology. That says a lot. It's a serious name. What makes this landscape inhumane as it exists today?
Jeff: That's a great question. There's a reference that Steve Jobs made around how a computer could be a bicycle for the mind. When you think about that, the idea is that a bicycle can take a human and let us move faster and more capably. It's a tool that's sitting there waiting for us to use it that can transform us and give us these superpowers. There are a lot of technologies that are really, really amazing positive technologies. FaceTime lets you talk to somebody virtually thousands of miles away instantaneously and you can connect and you can have a shared conversation.
The problem is that we've moved away from these tools that we were paying for and that were serving us, into this landscape where everything is run through a third party, where we get Facebook, Twitter, Google, and YouTube for free, but they're monetized in a different way and they don't have-- This is the thing: if you're not paying for the product, you are the product. With that worldview, these technology platforms have very different incentives; they serve their advertising customers. Based on those incentives, the experience shifts away from serving us, the general public.
We're seeing consequences in terms of teen mental health, we're seeing consequences in terms of increased rates of suicide amongst young girls, increasing rates of political polarization, conspiracy theories, misinformation running rampant. Fake news spreads on Twitter six times faster than the truth. That's not because Twitter decided, "What's the best way for us to have a meaningful conversation that's engaging?" It's because they have a business model that is optimized for quantity over quality. The more people see things, the more eyeballs, the more attention. That's just inherently the incentive of the business model.
Is the technology designed for the general public? Is it designed for us, the public, as the main objective and the main goal? I think with this advertising business model, the way that they're turning a free product into the richest industry in the history of money, it seems too good to be true, and it's turning out that it is too good to be true, that there are these really, really significant consequences that are happening that we're seeing now.
I make the comparison to the fossil fuel industry, where when we first discovered oil, it just seemed like this amazing resource that allows us to transport faster and farther, and we can fly, and only years later did we recognize that there were these huge consequences to using oil at scale as we do. We're seeing that same parallel now with our social media, where this free business model has built a system that incentivizes engagement, that gets people to come back and spend more time on the platforms, and that incentive is having these massive consequences at the individual and the societal level.
Alison: I was reading something recently, and I can't recall exactly where, but it was someone who said, and it wasn't someone from the tech industry, but it was a smart person who said, "I think in 20 years, we'll look back and say we actually let our kids have phones with apps on them like that, the way that pregnant women once upon a time could smoke." We'll look back in 25 years and be like, "Oh my God, we let our children do that?"
Jeff: I hope that's the case. I hope we have that shift because the alternative for me is very scary if we continue down the status quo. I think of it as like this is the decade of dopamine where we have all just been inundated with these feedback systems that just keep giving us what we want. It's like you're driving your car and you get to a red light and you have this instinct to pull your phone out just to get a little rush or something. I just need some information to fill the 30-second void in my life. That's a sad reality.
As we were working on this project, I used to be so addicted to social media. I was a very, very heavy user. I started weaning myself off of it as I was learning all of this stuff and seeing the techniques. Then I learned about the thing that they call a resurrection where if you use the platform and then you stop using it, the algorithm is going to try to figure out how do we get you back? How do we get this user back onto the platform? It's basically this huge fishing expedition to figure out what's going to get the user number 64839F to come back to the platform and spend more time on it.
I started to feel, "Oh, I'm getting these emails from Facebook," and, "Oh, that's interesting." Now, they're attaching photographs and, "Oh, I'm getting text messages now. I keep getting text messages," and, "Oh, here's a former relationship, a former partner that they're pinging me on, and look at the photo of the former relationship you were in." All of these things started to come on such a consistent basis and it just made me really see the company, it's for what they're trying to do. This isn't trying to serve my interests. It's not what's best for me. It's just trying to get me back onto the platform so I could see more ads. Really, this whole process has completely shifted the way I look at these platforms.
I'm very much a tech optimist. I love technology. I love the power and the capabilities of what technology can bring to humanity, but I've become so skeptical when there's a business model that misaligns the incentives. It's just not for us.
Alison: That's what you were saying about the resurrection and the luring you back, because I was on vacation for a couple of weeks and I took apps off my phone. I just said, "I don't want to have Twitter on my phone. I don't want to have all this on my phone." I did notice an increase in emails from Twitter telling me who tweeted what, as if to lure me back.
Jeff: Right. Absolutely. It's just trying to get you to come back. Once again, that's not a person sitting there thinking, "What email's going to work on her today?" That's an automated program. That's an algorithm. That's a machine-learning algorithm that is going to test things and see what works. Then if you click on any one of those emails, that's a huge data point: "Oh, this thing works, this piece of information." What works on you is completely different from the thing that works on me, versus my friend, versus my neighbor. We are all living in our own Truman Shows, where everybody is being delivered a customized version.
I did this with a friend one day. I took her phone and gave her my phone and I looked at her Facebook feed and she looked at mine. Like, these are completely different realities, completely different world. Her Facebook feed would not keep me addicted at all. I was scrolling through, "This is not interesting, not interesting. All right, fine. I'm done." Likewise, my feed would have no impact on her. Yet on me, my feed was glue. It just kept me coming back. I was curious about all of these different things. This is where it gets really sticky in my mind. We live in an era of customized news. The majority of Americans get their news now through social media platforms.
If we're getting our news through these platforms and they're completely personalized, we're moving away from objective truth and from a shared reality, and we are all being given our own set of facts that we're operating and navigating the world with. How do you have a society address challenges? How do you have a community come together and address a challenge and look for solutions if everybody's coming from a completely different perspective with no real empathy and understanding for each other's realities? That's the world that we are in and are continuing to go down the path of.
Alison: My guest is Jeff Orlowski, director of The Social Dilemma, which is out on Netflix. I do want to play one more clip before we wrap. We know that platforms make money by collecting our data for targeted ads, but let's listen to this clip where author Jaron Lanier describes how these transactions work. This is from The Social Dilemma.
Jaron Lanier: We've created a world in which online connection has become primary, especially for younger generations. Yet, in that world, anytime two people connect, the only way it's financed is through a sneaky third person who's paying to manipulate those two people. We've created an entire global generation of people who were raised within a context where the very meaning of communication, the very meaning of culture, is manipulation. We've put deceit and sneakiness at the absolute center of everything we do.
Alison: Jeff, from your experience and from this documentary and all the people you talked to, what is one thing that could change about the way we interact with the Web that would make a really big difference, that could really shift it from being sneaky and malevolent to something that realizes the possibilities of the Internet, all the good stuff about the Internet?
Jeff: That's a really great question, and there are so many thoughts to try to answer that. Our team is trying to work on a campaign now to help shift the way the technology is used, the way it's designed, and the way it's regulated. This is a very complex problem, just like climate change. There's no silver bullet solution. There's no "change your light bulb" and all of this is done. There's no "turn off your notifications"; that doesn't solve this. We need a huge shift in the way the tech industry relates to the public, the way they design their products, and the way that we are exporting these products out into the world.
My hope is that these changes will come from within the industry itself, that they'll recognize, we don't want to be the new fossil fuel industry, we don't want to be the new cigarette industry, and that the companies will say, "You know what, let's get off of this bad exploitative business model, let's change, and let's do the right thing for the public, for the users." It would be a ground-up reinvention of what Google, Facebook, YouTube, and Twitter are. I see that path forward. I think it's a very, very difficult path to get to because of the financial incentive to stay with the status quo. Just like with climate change, I think we need to make those shifts.
We need to get off of an exploitative and extractive business model that harms civilization. My hope is that the film is a wake-up call, both for the public, the tech industry, and for regulators, that we need to demand a change, that we don't want to participate in a system like this. It's an incredibly unfair burden for parents to have to regulate their children's usage of these social media platforms that are designed to be addictive, that are designed to get them to come back, with no regard for where a child is in their mental development, their age, their social connections, their learning.
We're training this generation in a completely different way. Anybody in their 30s and above, you remember what the world was like before social media, and you can still feel like, "Wait. This is so different right now, and different in a weird way," and, "What the hell is going on?" For a teenager today and for Gen Z, it's a different reality they were born into. The stories that I've heard, it is frightening to think that this is the way we're training a new generation of humans.
Alison: That was my conversation with Director Jeff Orlowski about his investigative documentary, The Social Dilemma available to watch on Netflix.
Copyright © 2020 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.