BOB GARFIELD This is On the Media, I'm Bob Garfield. Carole Cadwalladr enumerated a series of harms caused by Facebook: quieting dissent, ignoring incitement and profiting from distortion. But according to Harvard Professor Emerita Shoshana Zuboff, the bill of indictment can't merely be a list of harms. It must rather recognize a vast, sinister architecture that not only exploits markets and human frailty, but steals our inner selves as fuel for the machine. Like digital Soylent Green, the tech food supply is people. And yet society has been too much in awe to recognize the damage.
SHOSHANA ZUBOFF These companies were born into a series of historical windfalls. They like to play the game of the naturalistic fallacy, which is the idea that, hey, we're so successful, we must be right and we must be good. But their success really has nothing to do with being right or good. And all these companies were born into an era of neoliberal ideology, a time when regulation had been diminished and devalued in favor of this idea that somehow markets have a kind of natural genius to them. Markets always make the right decisions. And so you've got to leave these market actors to be as free as possible. You don't regulate them; you let them, quote, self-regulate. And that's how we get to the best outcomes.
BOB GARFIELD For a brief moment, at the turn of the millennium, governments seemed to have developed an appetite at least to regulate the unchecked collection of personal data. From 1997 to 2001, Zuboff says, the FTC began to stir. But then came 9/11, and the conversation moved from privacy protection to total information awareness. And so, she says, the stage had been set for the economic world we live in today: one of surveillance capitalism.
SHOSHANA ZUBOFF Historically, capitalism evolves by taking things that live outside of the market, bringing them into the market, turning them into what we call commodities, things that can be sold and purchased. In our digital era, capitalism has evolved according to the same age-old pattern, but with a dark and startling twist: all the easy commodities were already taken, the forests and the meadows and the rivers and the mountainsides. So what they figured out early in the digital century was that private human experience could be the new virgin wood, the new unblemished mountainside.
BOB GARFIELD So it's my supermarket loyalty card and it's Facebook and it's Waze tracking my every automotive movement. What are some of the less obvious examples?
SHOSHANA ZUBOFF It's getting easier to answer the opposite question, which is: what is still innocent of this process? But let me put it this way. Every product that begins with the word smart, every service that begins with the word personalized, everywhere that there is an Internet touch point. It could be your dishwasher, it could be your television set. It could be the thermostat in your bedroom; it could actually be the mattress on your bed. It could be your car dashboard. You're walking down the street and there are sensors and cameras. You're in a cafe and there are webcams. There are companies that specialize only in tracking your behavior as a renter. And then they make predictions about your behavior as a renter and sell them to landlords. That's a niche market. Multiply that over all the niche markets, and then, of course, one of the biggest markets growing now is health data. This is why everybody under the sun wants to have a wearable device, which will exceed even the smartphone as the most invasive supply chain interface of all. Facebook's A.I. hub is described by the company as ingesting trillions of data points every day and producing six million predictions of human behavior every second, Bob. So that's the kind of scale.
BOB GARFIELD Well, my heart rate is rising and I'm embarrassed to report Apple knows about it because they're monitoring it on my wrist. Now, one of the most obvious applications, I guess the first application of data collection is extremely targeted advertising. You know, I'm pregnant. I've told absolutely nobody but my partner, and yet I suddenly start getting ads for vitamins and crib accessories, which is creepy enough on the face of it. But on the creep-o-meter, that is nothing because you call ad targeting merely the leading edge of the human futures market. And what is that?
SHOSHANA ZUBOFF So what are these advertisers and marketing folks buying? You know, it started out they're buying the click-through rate, and what is the click-through rate except a tiny computational fragment that predicts a fragment of my behavior: what ad am I likely to click on and click through to that website? That is a prediction. Just like we have futures markets in pork bellies or oil prices, we have futures markets now in human behavior. And it turns out this surveillance economy has come into existence because all these companies, from Ford Motor all the way down the line, have figured out that, hey, we can make more money and attract more investment by selling predictions of our users' or our customers' behavior, based on all this data we can gather about them, than by actually selling a product or a service.
BOB GARFIELD Back in the day, oh, I don't know, 15 years ago, advertising was an attempt to get some awareness. You take your best guess as to who your ad will reach and then you try to get their attention by whatever means. If I understand this correctly, advertising is not that anymore. Advertising is a bet that if you reach this particular user with this particular message at this particular time, they will click and do business with you.
SHOSHANA ZUBOFF The real turning point in advertising came at Google, based on their invention of the click-through rate. Before the click-through rate, when an advertiser went to figure out where to put an ad, they looked for places where the context was somehow related to their brand values. They looked for places where they were likely to encounter people who were somehow related to their brand values. Google comes along and says, you know, you've been kind of flying blind, choosing keywords, trying to connect with customers who are interested in your brand values. But we have a completely new program. We have a black box. It's going to tell you where to place your ad. If you follow the advice of our black box, your ads are going to be more successful and you're going to sell more product. But don't ask us how we got to that prediction. Do not ask to see inside the black box, because that is off limits. That was the moment when advertisers had to make a choice: does advertising project a connection to our brand values, or is it just wherever we have a higher probability of making a buck, that's where we're going to go? This black box deal asked advertisers, and the manufacturers behind those advertisers, to sell their souls, and they did. Now, fast forward: in the weeks and months after the Cambridge Analytica story, suddenly we were hearing from advertisers who were all upset that my ads are showing on pages that tout anti-Semitism and white supremacy, and suddenly the advertisers are all up in arms and all indignant. Well, that was a bit of a show, that was performative, Bob, because really the advertisers had sold their souls two decades ago, and they lost the right to complain about where their ads appeared because they bought into the black box. So that was the huge turning point in advertising.
And of course, that turning point is what allowed these new human futures markets to become so big and so lucrative that nearly all of the market capitalization of Alphabet, approximately 88 percent at this point, and really all of Facebook's market capitalization, about 98 percent, derive solely from those advertising markets.
BOB GARFIELD Google discovered that data collection and the human behavior market far exceeded mere ad targeting, that they could make behavioral predictions about a whole host of things. That has long since metastasized into the total surveillance society that you describe.
SHOSHANA ZUBOFF Well, you know, we labor under the delusion that the data that the companies are collecting about us is the data that we have chosen to give them. In other words, I can choose my own degree of privacy based on a private calculation of how much information I choose to share with a company in return for their product or service. And often that's a free product or service, but not always. This is a delusion, and it's a delusion that has been nursed by the companies. So, Bob, it's not just that you're walking down Fifth Avenue. It's the stoop of your shoulders and your gait, the cadence and pacing and rhythm of your walking. It's not your face just to I.D. you. It's the hundreds of micro-expressions that the little muscles in your face produce, because those micro-expressions are great predictors of your emotions. And Bob, it turns out that your emotions are great predictors of your behavior, and that's, of course, what they are after.
BOB GARFIELD If we've learned anything about the failures of American capitalism, it is about the perverse inequality in the distribution of wealth and opportunity and social justice. To those yawning gaps, you add another, what you call epistemic inequality: the gap between what we know and what is known about us. This is what we're discussing here, is it not?
SHOSHANA ZUBOFF When we entered the digital century, this was supposed to be the golden age of democracy, the democratization of knowledge. This was going to allow us to finally solve our problems as individuals, to have commerce that was really oriented toward solving our deepest needs. It was supposed to create the data that was going to allow us to finally cure diseases that had been incurable, to finally create the solutions for climate cataclysm that have eluded us. Instead, surveillance capitalism has captured the entire domain of the digital. Surveillance capitalism owns and operates the Internet. And under surveillance capitalism, yes, it looks on the surface like there's democratization of knowledge. And don't get me wrong, it is a great gift to be able to get online or get on my phone and search and find information that before I would have had to go to one of the 40 volumes of my Encyclopedia Britannica or into my university library to find. This is a great gift; there is no belittling it. The problem is that on the back of that gift, these companies have built huge concentrations. In the past, we thought of industrial concentrations as concentrations of economic power. But now in the digital century, these concentrations are of knowledge, knowledge about us that comes from us but is not for us. It's taken from us to use for others' benefit, not to solve social problems, not to solve the Earth's problems, not to solve consumers' and users' problems, but to meet the needs of business customers in these human futures markets. And so from this gap, the difference between what I can know and what can be known about me, grows a new kind of power, which is the difference between what I can do and what can be done to me.
BOB GARFIELD The quid pro quo does not favor the consumer.
SHOSHANA ZUBOFF We are the objects now of global architectures of behavioral modification. Facebook talks about its ability to analyze moods and emotions so that it can alert advertisers to the best day, the best time of day, the kind of content of a message and the way the message should be delivered so that it will have maximum impact to get us to buy a product. For example, they learned that teenagers go through specific cycles of anxiety during the week, and anxiety rises as they approach the weekend. And they can see, for example, that, hey, Bob is getting really uptight about his date Friday night. His self-esteem is really being challenged and he's feeling vulnerable right now. So, hey, advertiser, if you have some kind of really sexy confidence-boost product, like that black leather jacket that you've got for sale, this is the time for you to quickly send him a message. Tell him you're going to discount that black leather jacket, tell him about its sex appeal. Tell him that you'll do free shipping and that he'll have it by 10:00 a.m. Friday morning so he can wear it on his date Friday night. And you will have maximized your ability to sell him that expensive black leather jacket. Now, what we learned in the year 2018, when Cambridge Analytica entered the global consciousness, was that these same methods and mechanisms, which are the bread and butter of every self-respecting surveillance capitalist, can be pivoted just a few degrees to political objectives.
BOB GARFIELD And hence Brexit, hence the Trump victory in 2016. Hence the pogroms in Myanmar. And the list is long and bleak.
SHOSHANA ZUBOFF We've learned that in 2016, the Trump campaign, just by using Facebook's political advertising capabilities to their fullest extent, was able to successfully target black citizens in swing states and suppress the black vote. And they were able to do that effectively without jackbooted soldiers turning up at anybody's door, without a single gunshot being fired. So this is what I call instrumentarian power. It's not the totalitarianism of the 20th century that learned how to control people through the threat of terror and murder. This is a different kind of power, Bob, and it comes to us on slippered feet. It comes to us whispering sweet nothings, holding a cappuccino, but it can get everything it wants, or at least that's the trajectory that it's on.
BOB GARFIELD So let's say the obvious ways to combat the excesses of industrial capitalism were and are regulation, including antitrust, collective bargaining, minimum wage, high marginal tax rates, capital gains taxes, workplace protections, environmental laws and so on. How are we to deal with epistemic inequality?
SHOSHANA ZUBOFF When you look at the origins of antitrust law, which go back to the late 19th century, it was clear to observers then, and it's been clear to historians since, that antitrust became really popular with ordinary folks not because monopoly was the only problem, or even the worst problem, but because they were so angry at these companies for having so much power over them, for making them feel like pawns in a system over which they had no control, for devaluing them, for demeaning them as citizens and as individuals and as workers and as consumers. All of those sources of anger and indignity were plowed into the antitrust rallying cry. And I think we've got something similar going on today. Antitrust: these are the laws that we have, this is the hammer that we have. Monopoly as conventionally understood, anti-competitive practices, these things are real. There's no question that companies like Google and Facebook and Amazon and Apple and Microsoft are ruthless capitalists in the most conventional sense, as well as being ruthless surveillance capitalists. The problem is, Bob, that if we only address their anti-competitive practices, we run the risk of leaving everything that you and I have been talking about intact. When we wanted to outlaw child labor, we didn't start having negotiations about how many hours a day a child could work in a factory. We said there are not going to be children in factories. Well, it's the same thing with data. We don't want to just be negotiating who owns it, or can I get it from the company and take it with me. Once we start talking about the data, we've already lost the battle on extraction. It means they've already taken our lives and turned them into data in the first place. We need to go upstream and start focusing on extraction. That's number one.
But there are also the markets where, ultimately, the predictions that come out of their computational factories get sold, and these markets that trade in predictions about our behavior are the source of the financial incentives. So let's outlaw those markets, the trade in human futures. We've done this before. We've outlawed, for example, markets that trade in human beings, even when there were whole economies based on those markets. And we did so because they were contrary to the principles of a democratic society. We can outlaw markets that trade in human futures. The moment that we do that, Bob, we have opened up a vast new landscape for competitors who want data to serve people, who want data to serve society, who want data to serve the earth, and who are going to find a way to do that that they can monetize. They may not become trillion-dollar companies in the space of 10 or 20 years, but they can make great profit, and they can do it without the overhang of surveillance capitalism. We are starting to get it. We know it. We see it. We can act on it.
BOB GARFIELD Shoshana, thank you very, very much.
SHOSHANA ZUBOFF Thank you for doing this.
BOB GARFIELD You'll have to excuse me now. I have to return a black leather jacket to Amazon.
BOB GARFIELD Shoshana Zuboff is professor emerita at Harvard Business School and author of The Age of Surveillance Capitalism.
DET. THORN They're making our food out of people. Next thing they'll be breeding us like cattle for food. You gotta tell 'em, you gotta tell 'em! Listen to me, Hatcher. You gotta tell 'em Soylent Green is people! [END CLIP]
BOB GARFIELD That's it for this week's show. On the Media is produced by Alana Casanova-Burgess, Micah Loewinger, Leah Feder, Jon Hanrahan and Eloise Blondiau, with help from Ava Sasani. Ava, sorry to say, leaves us this week and we wish her absolutely nothing but the best.
Xandra Ellin writes our newsletter, and our show was edited this week by executive producer Katya Rogers. Our technical director is Jennifer Munson. Our engineers this week were Sam Bair and Adrian Lilly. Bassist and composer Ben Allison wrote our theme. On the Media is a production of WNYC Studios. Brooke Gladstone will be back next week. I'm Bob Garfield.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.