Brooke Gladstone: This week, the CEO of OpenAI went to Congress and said, "Regulate us," but is the power of AI being overblown?
News clip: Generative AI was compared to the first cell phone, the creation of the internet, the Industrial Revolution, the printing press, and the atomic bomb.
Brooke Gladstone: From WNYC in New York, this is On the Media. I'm Brooke Gladstone. Also on the show, while the current writers' strike continues, myths persist from the last one in 2007, like the notion that the strike gave birth to reality TV.
Emily St. James: What really bugs me about it is, all it takes to prove it's not true is googling when Survivor debuted. It debuted in 2000.
Brooke Gladstone: Plus, in the third and final part of our series on the ensh****fication of the internet, we go in search of possible solutions.
Cory Doctorow: We could just make this a rule. We could say if you have some user's data and the user asks for the data, you got to give them the data.
Brooke Gladstone: It's all coming up after this. From WNYC in New York, this is On the Media. I'm Brooke Gladstone. On Tuesday, Sam Altman, the CEO of OpenAI, testified in front of the Senate Judiciary Committee about the dangers of artificial intelligence. The hearing opened with remarks from Senator Richard Blumenthal.
Senator Richard Blumenthal: We have seen how algorithmic biases can perpetuate discrimination and prejudice and how the lack of transparency can undermine public trust. This is not the future we want.
Brooke Gladstone: But wait!
Senator Richard Blumenthal: If you were listening from home, you might have thought that voice was mine, but in fact, that voice was not mine. The audio was an AI voice cloning software trained on my floor speeches. The remarks were written by ChatGPT when it was asked how I would open this hearing.
Brooke Gladstone: The stunt was somewhat underwhelming, but sure, point made about how good AI has gotten and the implications about where it might go. At the end of March, there was the open letter.
News clip: It's been signed by more than a thousand artificial intelligence experts and tech leaders past and present.
News clip: The artificial intelligence experts are calling for a six-month pause in developing large-scale AI systems, citing fears of profound risks to humanity.
Brooke Gladstone: Then almost three weeks ago, Geoffrey Hinton, the so-called "Godfather of AI," who we'd interviewed on the show, left his job at Google specifically so that he could-
Geoffrey Hinton: -blow the whistle and say we should worry seriously about how we stop these things getting control over us. It's going to be very hard and I don't have the solutions. I wish I did.
Brooke Gladstone: Apple co-founder Steve Wozniak also chimed in.
Steve Wozniak: It's going to be used by people for basically really evil purposes.
Brooke Gladstone: As did Microsoft founder Bill Gates.
Bill Gates: We're all scared that a bad guy could grab it.
Brooke Gladstone: OpenAI's Sam Altman basically said to Congress, "Regulate me."
Sam Altman: It is essential that powerful AI is developed with democratic values in mind, and this means that US leadership is critical.
Brooke Gladstone: Will Oremus writes about technology in the digital world for The Washington Post. He says the vibe of Tuesday's session was worlds away from the ones where lawmakers rake social media execs over the coals.
Will Oremus: This was very different. This was more like some of those low-key hearings you don't end up reading much about in the news, where they have some independent expert witnesses that are there to really educate them about an issue. That's how they were treating Sam Altman, who is the CEO of OpenAI.
Brooke Gladstone: What was his take? What was his demeanor? Was he the Cassandra of coders?
Will Oremus: He was there to issue warnings about how powerful this technology could be.
Sam Altman: My worst fears are that we cause significant-- we, the field, the technology, the industry, caused significant harm to the world. I think if this technology goes wrong, it can go quite wrong. We want to be vocal about that. We want to work with the government to prevent that from happening.
Will Oremus: He was also there to present himself as an ally in making sure that the worst fears aren't realized, but here's the thing. Altman's the one who's building it.
Brooke Gladstone: Yes, I know. That is the thing. [laughs]
Will Oremus: No company has done more to push this particular form of AI. You can call it generative AI or large language models, foundational models, the stuff that underpins something like ChatGPT. They're the ones who released ChatGPT to the public and forced the hand of big companies like Google and Microsoft to respond with their own AI chatbots.
Brooke Gladstone: What's he playing at here? Senator John Neely Kennedy, the Republican of Louisiana, even asked if Altman himself might be the one to lead a federal regulatory body overseeing AI.
Senator John Neely Kennedy: Would you be qualified if we promulgated those rules to administer those rules?
Sam Altman: I love my current job.
Senator John Neely Kennedy: Cool. Are there people out there that would be qualified?
Sam Altman: We'd be happy to send you recommendations for people out there, yes.
Brooke Gladstone: That was weird, no? Like asking the fox to guard the henhouse.
Will Oremus: Yes, it raises the question: is it still regulatory capture if you don't have to capture anything, if they just hand you the keys to the regulations?
Brooke Gladstone: What were the ideas that were proposed?
Will Oremus: Broadly, there are two ways of thinking about the threats posed by this generation of AI. One way of thinking about it is the way that was prevalent at this hearing. It's that speculative, far-off, what if AI gets too smart? How do we make sure that it doesn't go rogue and kill us all? That's sometimes called "AI safety." There is another framework sometimes called "AI ethics," that looks more at problems of, "How can AI tools be misused by humans or how could they deceive people?" This hearing really focused more on the speculative harms and less on the questions like, what if companies start delegating decision-making to AI today and the AI makes bad decisions at a huge scale?
We don't know why because it's a black box, because we don't know exactly how it works or what data it's been trained on. What if tons of people lose their jobs and then we realize it was all a big mistake, or we realize that they've been replaced by these machines that have embedded really insidious biases? Another way of thinking about those two sets of concerns is, on the one hand, you're concerned that AI is going to get too smart. On the other hand, you're concerned that AI today is too dumb, for lack of a better word, that people are going to overestimate its intelligence and use it for things it's not really cut out for.
Brooke Gladstone: Like what things?
Will Oremus: For instance, if you talk to a doctor about people using the internet for medical research, they'll laugh ruefully about Dr. Google.
Brooke Gladstone: The University of Google.
Will Oremus: Right, and it's not always the most reliable information. That said, when I google my symptoms, I know that there are certain sites that are maybe more reliable than others. I can go to those and I can take it with a grain of salt because I know whose site I'm on. Now, think about how Google and Microsoft want to build chatbots like ChatGPT into their search engines. In fact, they're already doing it. What about when people start asking medical questions to a chatbot? What data was that trained on? Was that trained on WebMD or was that trained on some conspiracy quack's blog? We don't know.
Brooke Gladstone: The ideas that came up for regulating this nascent industry included things like licensing AI models, scoring them on certain benchmarks, ensuring that the AI is identified as AI and can't pose as humans. Did you see anything in this that could address some of these short-term, immediate present concerns?
Will Oremus: There were a lot of ideas floated. Some of them, I think, do address some of the shorter-term issues. There were some calls for the AI companies to not train their models on the copyrighted works of artists without those artists' consent. Senator Marsha Blackburn of Tennessee, representing the Nashville country music industry, wanted to know, "Can you train your model on Garth Brooks's music and then have a program that can make songs that sound just like Garth Brooks but he doesn't get any royalties?"
Altman downplayed that concern, but what he didn't say was that they've already done it. They've already trained their models on copyrighted works without consent. There is no unbaking that bread. It's in there. Even though the headlines from this hearing were that Sam Altman is inviting regulation, there are certain types of regulation he was definitely not inviting, and that was one of them.
Another one was, one of the expert witnesses was Gary Marcus, a professor at NYU who's been a long-time observer, an expert on AI. He repeatedly said we need some transparency about what are the data sources for these models so that we can even begin to research what the biases might be. That is something that OpenAI does not support and that Altman sidestepped in the hearing. Then one other line of regulatory attack on these AI models would be around liability.
When a chatbot says something that turns out to lead to harm, maybe an AI could give bad medical or financial advice that leads to someone's ruin or even death, will the AI companies be held responsible for that? Altman doesn't think that the AI companies should be held liable if their models steer somebody wrong. Altman says, "Regulate me. I'm making something dangerous," but there were things that he doesn't want the government's help on, things that would be problematic for the way that OpenAI is doing business.
Brooke Gladstone: The headline from this hearing was that here's Sam Altman warning that AI could cause great harm to society. That's pretty catchy.
Will Oremus: I think it would be foolish to sit here and say six months after ChatGPT came out that there couldn't be serious harms from a super smart AI someday.
Brooke Gladstone: You have Stephen Hawking being terrified of it before he dies.
Will Oremus: Yes, but the more you focus on how smart AI could be someday, the less you focus on all the ways it falls short today. That's crucial because we're in a moment when almost every industry is looking for, how can we make use of AI? How can we show our investors that we're at the forefront? Could we weather this difficult financial period by laying off humans and putting AI in charge of some things?
We're already seeing media companies doing that. I've talked to BuzzFeed and I've talked to CNET and its parent Red Ventures and they say, "Well, yes, we're investing heavily in AI and we're going to let AI write articles now and, yes, also we disbanded BuzzFeed News and laid off all the reporters and we did layoffs at CNET of humans, but those two are entirely unrelated." Nobody's coming out and saying, "We're firing humans and replacing them with AI." If you connect the dots, it is already happening.
Brooke Gladstone: This hearing really is one of the best AI marketing campaigns ever.
Will Oremus: Right. If you're Sam Altman and you get a whole press cycle that says, "First of all, my technology is so powerful that it could destroy the world. Second of all, I'm here to help. Regulate me and I'll do whatever I can to prevent that from happening," that's a hero pose for him. It's worth noting that they did not include some of the people who first brought the warnings about large language models to the public's attention. A few years ago, there was a big hubbub around Google's ethical AI team. It stemmed from a research paper that members of that team had co-authored.
They warned of everything from the impacts on the climate, from building ever bigger models that require evermore computing power and energy to run. They warned about the biases built into the data that these models were being trained on. Google wouldn't let them publish that paper. Those people are still very active as critics of AI, but they were not invited to be part of this hearing. Instead, you got the industry folks who were asked to come in and inform lawmakers about what the harms might be.
Brooke Gladstone: What sort of lens should the concerned media consumer put on this story and the coverage of it?
Will Oremus: I would just say to consumers of the news, be wary of the hero narrative. Be wary of the idea that this guy who's building the leading AI systems is also the guy to save us from them. There are many other voices out there with different things to say than Altman about the risks posed by AI. I think it's really important that those voices get heard and listened to when they speak up. We need to be aware of its limitations in order to have any hope of using it for good.
Brooke Gladstone: Will, thank you very much.
Will Oremus: Thanks for having me.
Brooke Gladstone: Will Oremus writes about the ideas, products, and power struggles shaping the digital world for The Washington Post. Coming up, the writers are restless. This is On the Media.
Brooke Gladstone: This is On the Media. I'm Brooke Gladstone. On Tuesday, we entered the third week of one of the largest entertainment strikes in recent years.
News clip: This is the first time in 15 years that TV writers are on strike.
News clip: They say that they're calling for fairer contracts after not being able to reach an agreement on negotiations last night.
News clip: We're also looking to the future and hoping to stave off an industry takeover by artificial intelligence and chatbots.
Brooke Gladstone: The action by the Writers Guild of America has shut down shows from Abbott Elementary to SNL to The Tonight Show, and many others. Big names have joined the little ones on the picket line: Tina Fey, Seth Meyers, Rob Lowe, Jason Sudeikis, and a very impassioned Mandy Patinkin.
Mandy Patinkin: You guys make millions and millions of dollars. For God's sake. Without the writers, we're nothing. They create the stories that make our hearts beat.
Brooke Gladstone: Those writers have definitely shown their skills over the course of the strike.
News clip: Hollywood writers now penning one-liners on picket signs.
News clip: A fair offer they apparently can't refuse.
News clip: Jokes are hard. Paying us is easy.
Brooke Gladstone: Not to mention, a staff favorite: Without writers, Logan Roy would be alive. At the heart of the strike are concerns about the risks to writer pay and career development posed by the rise of streaming. For instance, streamers don't pay writers residuals, the cut of cash they'd normally get each time their show was rerun on TV. Now, they're more likely to be paid just for the number of days they work on any given show, and streaming shows often have shorter seasons. While streaming is new-ish, the fundamentals behind the strike are perennial. For 15 years, Emily St. James covered TV at outlets such as Vox and The AV Club, including during the 100-day 2007-2008 writers' strike. Now, she's a TV writer herself.
Emily St. James: Yes, yes, yes, I feel very strange saying I'm a TV writer when I worked one day in that job and immediately went on strike.
Brooke Gladstone: We spoke to Emily fresh from the picket line about some of the less-than-true notions she's seen in the coverage about this strike and the last one. For instance, the notion that the 2007 strike supercharged the growth of reality TV.
Emily St. James: All it takes to prove it's not true is googling when Survivor debuted. It debuted in 2000.
Jeff Probst: 39 days, 16 people, 1 survivor.
Emily St. James: Survivor is generally noted as the dawn of reality competition TV. It's not the first example of it, but it's the first really huge hit that American TV had. Of course, from there, we have all sorts of copycats. We have shows like American Idol and shows like The Bachelor. All of those shows debuted before 2005. The strike is in 2007. When you get to the top 30 TV shows of the 2007-2008 season, which is the one affected by the strike, yes, the top five shows are all reality.
There are two installments of American Idol and the three installments of Dancing with the Stars each week. Those were the top five shows for the TV season before and the TV season after. If you look at the rest of that top 30 list, you see all kinds of scripted shows. You see Grey's Anatomy and Lost and Desperate Housewives and CSI and NCIS and et cetera. These networks were not canceling scripted programming to make reality shows.
Brooke Gladstone: How about the collapse of civilization as we know it, by which I mean the lasting influence of the 2007-2008 strike on The Apprentice?
Donald Trump: You've been lazy. You've been nothing but trouble. Now, you cut them off as they're fighting each other for who should be fired? Michael.
Michael: Yes, sir?
Donald Trump: You're fired.
Brooke Gladstone: The story goes that the Donald Trump vehicle, which debuted in 2004, was saved from cancellation by the 2007-2008 strike, thus allowing Trump to continue having a primetime platform that would lead him to the presidency.
Emily St. James: We can debate all day long the role of The Apprentice in Trump's ascent to the presidency. I think there are valid cases on both sides.
Brooke Gladstone: He was a New York schnorrer before he became an international sensation.
Emily St. James: I think that what is not accurate is that The Apprentice was saved from cancellation by the strike. What happened was in May 2007, it was not on NBC's fall schedule and everyone was like, "Does that mean The Apprentice is canceled?" because its ratings had been slumping to that point after being an enormous hit for its first two seasons. At the time, the NBC brass were very clear that, "No, we're just trying to work out a new deal with Trump and with Mark Burnett," who's the producer of that show.
They eventually did within a few weeks and they went into production on the first season of what became The Celebrity Apprentice that summer. For whatever reason, whether it's because it did debut in that strike period when there wasn't a lot else on or because it had celebrities, its ratings did go up a little bit and then it very quickly fell off. I do not think Donald Trump is president because there was a writers' strike. I think that's way too many butterflies causing hurricanes.
It is true that The Celebrity Apprentice debuted in the middle of that strike period.
Brooke Gladstone: Why does the idea of a reality TV boom in the last strike persist?
Emily St. James: We jokingly talked about the collapse of civilization a few minutes ago, but if you look at writing about reality television in the 2000s, people wrote about it as though it was just this extreme, pernicious evil. One of the things about unscripted programming, whether you're looking at the TV or the film world, is that it's always been there. One of the first big hits was The $64,000 Question.
Brooke Gladstone: I was thinking more Candid Camera might be one.
Emily St. James: Sure, yes, Candid Camera is one. Queen for a Day, This Is Your Life, all of these shows that are playing off like, "Yes, they're very different from the reality shows we have today," but they are unscripted programming of the time. They were actually written about in similar ways. During the game-show craze of the '50s, you can find all sorts of TV criticism that is like, "These shows are just going to be the end of all of us." It's a way that people have always thought about this kind of programming.
Brooke Gladstone: Let's talk about the impact of the strike. I was thinking of the 2007 strike when the life of Jesse Pinkman was saved.
Emily St. James: [chuckles] The beloved series, Breaking Bad. Bryan Cranston as the chemistry teacher who becomes a drug kingpin. The strike happened. If you've ever wondered why there are only seven episodes of Breaking Bad Season 1, that's why. Now, originally, Season 1 was supposed to end with the death of Jesse Pinkman, played by Aaron Paul, a moment that would underline the consequences of Walter White's choice to start cooking meth, but the writers had come to really like his performance. As they were on strike, Vince Gilligan and a few of them started thinking, "Do we really need to kill this guy?" If you've watched all five seasons of Breaking Bad, it is impossible to imagine that show without Jesse. He is that show's soul in so many ways.
Jesse Pinkman: I am not turning down the money. I am turning down you. Ever since I met you, everything I've ever cared about is gone.
Emily St. James: Yet, if the strike hadn't happened, he would have died. I think creative people continue to be creative even when they are not working on these shows actively. Creative people are going to come out of this strike with ideas that will make some of your favorite shows even better. How many Jesse Pinkmans will be saved because of this strike?
Brooke Gladstone: You've said that every time there's a strike, it's really about new technology. Every innovation seems to shortchange the writers. You found that this goes all the way back to the writers' strike of 1960.
Emily St. James: The writers' strike of 1960 was predominantly designed to get screenwriters some sort of residuals for movies that were made before the dawn of television, so movies from before 1948. The Writers Guild eventually did win that after a long bruising strike. That is just the basis of every strike thereafter. There's always some new technology that's coming in that is not strictly covered by the previous agreements made by these unions. Then studios are trying to find a way to find loopholes because of the new technology, and then a strike is usually about closing those loopholes up.
Brooke Gladstone: In the 1980s, it was about VHS.
Emily St. James: There was also the question of what happens when something is aired on cable. In the 2000s, there was so much fighting about DVD and DVR. Every single person in Hollywood is affected when these new technologies come in.
Brooke Gladstone: The technology people are worried about is technology that's been around for about 20 years. That's streaming. It's also about AI, right?
Emily St. James: Yes, as my friend Alissa Wilkinson, my former colleague at Vox, has pointed out, the fear is less that an AI can be better than a writer, because everybody acknowledges that's not true. The fear is that an AI could write something just good enough that enough audiences would fall for it. Again, I have a lot of faith in TV audiences. I don't think that would happen. A decade from now, maybe there's a lot of stuff that is cheaply generated by AI. That's one of the reasons the Writers Guild is striking around this issue.
Brooke Gladstone: The point is that if AI gets used, no matter how it gets used, the union is asking for some boundaries?
Emily St. James: Yes. Basically, the idea is that if you are going to use AI as a tool, it is there to supplement or bolster. It is not there to create. There is a world in which AI becomes a tool that can be helpful. It's the question of making it the tool rather than the thing that is generating from the first.
Brooke Gladstone: You say that people have expected this strike for about three years. I guess you could assume, given that the strike was anticipated, that the networks and streaming services were prepared?
Emily St. James: Yes, probably someone like Netflix has stuff banked up. Eventually, there's going to come a point where they're going to realize, "Oh, this is not going to continue past this point because production has shut down." Now, Netflix has a lot of international production. The next season of Squid Game proceeds apace, presumably, so that is a wrinkle this time around that wasn't there in 2007. You're already seeing that late-night shows are not airing. Probably in the fall, some of the big shows are going to be delayed.
When Stranger Things finally comes back, those kids are all going to be on pension plans. There is this looming deadline. June 30th is the expiration of the contracts that AMPTP, the group that represents all the studios, has with the DGA, the Directors Guild, and SAG, the Screen Actors Guild. There is an equal amount of irritation and frustration and anger among members of those guilds. Now, will that result in a strike? That is hard to say. The Writers Guild has historically been the union that is most likely to go on strike, but I would be very intrigued to see what happens if all three of the major creative unions go on strike.
Brooke Gladstone: Don't you think that the coverage of strikes has been a lot more positive in recent years than in years past?
Emily St. James: You look at polling for how positively Americans feel about labor unions, and it's at its highest rate since 1965.
Brooke Gladstone: I know some people are a little irritated by the working-man's-hero narrative that the WGA sometimes strikes. The fact is that the pandemic, and the attention of the American public, however brief, on the role of essential workers in our lives, have had an impact on the perception of work in general and on inequality.
Emily St. James: Yes, I think there has never been a period in American history when people have been more aware of the idea that a small number of people have a disproportionate amount of the wealth. Everyone can see the way that that affects whatever industry they're in regardless of that industry. The work that I do is not a blue-collar job.
It is very, very white-collar, but it's a similar situation where I am getting paid what is ultimately a pittance compared to what is being made by the people who are my vast, enormous corporate overlords. That was true when I was at Vox. That was true when I was at The AV Club before that. I think it is a situation that has united a lot of people who traditionally maybe would not have been united. It's been fascinating to watch that sort of solidarity develop among the various groups who do very different things here.
Brooke Gladstone: Thank you so much.
Emily St. James: You're welcome.
Brooke Gladstone: Emily St. James is a former TV critic-turned-TV writer.
Emily St. James: For one day.
Brooke Gladstone: For a day. Her latest article for Vanity Fair was called Can We Really Blame Trump (and the Reality Boom) on the 2007 Writers Strike? Coming up, the third and final part of our exploration with Cory Doctorow into why big digital has gone to the dogs. This is On the Media.
This is On the Media. I'm Brooke Gladstone with the third and final part of our discussion with the great Cory Doctorow, journalist and novelist and special advisor to the Electronic Frontier Foundation, about the process whereby big platforms go bad, a phenomenon he calls "ensh****fication." In part one, we went through the three steps taken by big digital platforms like Facebook, Amazon, TikTok, and Twitter to get richer and get worse.
One, lose money to win customers. Two, benefit big suppliers and squeeze the small ones. Three, squeeze everyone but the shareholders, making everyone miserable but not too miserable to leave. In part two, we discussed how and why this happens and whether big digital is maybe just different from other earlier monopolies. This is the solution section, both uplifting and deeply problematic. The problem, for instance, of passing common-sense regulation, which is hobbled by confusion over how the internet works, abetted by platform designers, and also by the fact that those designers are rich and thus effective lobbyists.
Cory Doctorow: It's not that they're rich. It's that they're rich and united.
Brooke Gladstone: A crucial distinction. One only has to look back at the early days of this century.
Cory Doctorow: Tech at the time was like 150 squabbling small and medium-sized companies that all hated each other's guts and were fighting like crazy, and so lawmakers heard contradictory messages from tech. The consolidation of tech into what Tom Eastman calls five giant websites filled with screenshots of text from the other four, that produced a common playbook, right? Now, we get a lot of tech laws that are very bad that tech has pushed for because tech is able to sing with one voice.
Brooke Gladstone: Congress offers bad solutions because they don't get the internet.
Senator Ted Stevens: The internet is not something that you just dump something on. It's not a big truck. It's a series of tubes. If they're filled, when you put your message in, it gets in line--
Cory Doctorow: You don't have to get the technology in-depth to be able to make good policy. The last time I checked, there weren't any microbiologists in Congress, and yet we're not all dead from drinking our tap water. What you need to be able to do is hold a hearing in which the truth emerges from a truth-seeking exercise where adversarial entities counter one another's claims and an expert regulator who isn't captured by industry is able to evaluate those claims. That's how you get good rules.
Brooke Gladstone: Instead, we end up with regulations that are simply unworkable?
Cory Doctorow: Since the 1990s, every couple of years, like a bad penny, someone proposes that we should make cryptography that works when criminals and foreign spies and stalkers are trying to break into it, but doesn't work when police officers or our own spies are trying to break into it. Bill Clinton had something called the "Clipper chip." Right now in the UK, there is a proposal about this for instant messaging. It happens all over the world.
Brooke Gladstone: I'm not an expert. It sounds to me like if you're going to try and create an encryption system that will protect you from crooks but not from the cops, you're not going to get an encryption system.
Cory Doctorow: Nailed it right there on the head in one. We have a name for what lawmakers do when we point this out. They say, "Nerd harder. We have so much confidence in your incredible genius as a sector. Surely, all you need to do is apply yourself." Sometimes they're right. Sometimes there's a dazzling act that goes on from tech where they say, "This is impossible," and what they mean is we'd rather not do it. That would be things like, "Can you have a search engine that doesn't spy on you?" They're like, "That's like having water that's not wet."
Brooke Gladstone: Which brings us to the first of Doctorow's three prime solutions to ensh****fication, fixing the problem of user privacy. Platform designers say their services can't run without using our data. They rarely say how or why. Why not begin the fix by returning to a form of advertising we had two decades ago, ads based on context rather than behavior?
Cory Doctorow: Let's start with how a behavioral ad works. You land on a webpage and there is a process where the webpage, the publisher takes all the information they have about you that they've gathered through this ad tech surveillance system.
Brooke Gladstone: Which includes what?
Cory Doctorow: Everything you've bought, everywhere you've gone, everything you've looked at, all the people you know, your age, your demographics, your address, everything. They say, "I have here one Brooke Gladstone, NPR host and proud New Yorker who last week was thinking about buying an air conditioner for her apartment." They say, "What am I bid for this Brooke Gladstone?"
That goes off to one of these ad tech platforms. The ad tech platform asks the advertisers, the buy-side platform. They say, "Who among you will pay me for Brooke Gladstone?" There is a little auction that takes place. If you've ever noticed that the page lags when you're loading it, that's the surveillance lag, right? That's the auctions. Dozens of them taking place at once.
Brooke Gladstone: What? How do I not know this?
Cory Doctorow: Yes, it's terrible, right? Bandwidth gets faster, pages get slower, and it's the surveillance lag that's doing it. All this busy marketplace stuff happening in the background.
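The real-time auction Doctorow describes can be sketched in a few lines of code. This is a toy illustration, not any real ad-tech API: the bidder functions, prices, and profile fields are all made up. The point it captures is that every bidder, winner or loser, gets handed the surveillance profile.

```python
# Toy sketch of a behavioral-ad auction. Illustrative only: the bidders,
# prices, and profile fields are invented for this example.

def run_auction(user_profile, bidders):
    """Ask each demand-side bidder what it will pay to show this user an ad."""
    bids = []
    for bidder in bidders:
        # Every bidder sees the full surveillance profile, even if it loses.
        price = bidder(user_profile)
        if price is not None:
            bids.append((price, bidder.__name__))
    if not bids:
        return None
    bids.sort(reverse=True)
    # Real exchanges often charge the winner roughly the second-highest bid.
    winner_price = bids[1][0] if len(bids) > 1 else bids[0][0]
    return (bids[0][1], winner_price)

def appliance_advertiser(profile):
    # Bids high because the profile shows recent air-conditioner shopping.
    return 2.50 if "air conditioner" in profile.get("recent_searches", []) else 0.10

def generic_advertiser(profile):
    return 0.25

profile = {"city": "New York", "recent_searches": ["air conditioner"]}
print(run_auction(profile, [appliance_advertiser, generic_advertiser]))
# → ('appliance_advertiser', 0.25)
```

Dozens of these auctions resolving on page load is the "surveillance lag" Doctorow describes.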
Brooke Gladstone: Even if an ad company fails to win your behavioral ad auction, the process still gives them a lot of insight into your behavior. Whereas with context ads, they mostly have access to what's relevant and obvious.
Cory Doctorow: You are reading an article about the great outdoors. They look at your IP address and they go, "This is someone in New York." They say that you're using an iPhone, so it's someone who has $1,000 to buy a phone. They say to the marketplace, "Who wants to advertise to someone in New York who's reading about the great outdoors?" The same thing happens and you get an ad, but the ad is not about you. It's about what you're reading.
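A context ad, by contrast, needs only the page topic and coarse request data. A minimal sketch, with entirely invented ad inventory and matching rules:

```python
# Toy sketch of context-based ad selection: the only inputs are the page
# topic and a coarse, IP-derived region. Inventory here is invented.

ADS = {
    ("outdoors", "New York"): "Catskills camping gear",
    ("outdoors", None): "Hiking boots brand",
    ("cooking", None): "Cast-iron skillet",
}

def pick_context_ad(page_topic, region=None):
    """Match on what the reader is reading, not who the reader is."""
    return ADS.get((page_topic, region)) or ADS.get((page_topic, None))

print(pick_context_ad("outdoors", "New York"))  # → Catskills camping gear
print(pick_context_ad("outdoors", "Ohio"))      # → Hiking boots brand
```

No purchase history, no search history, no address book: the ad is about the article, not the reader.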
Brooke Gladstone: The advertiser will know what the publisher of the article knows, not your Google searches or your health concerns or what's in your email address book.
Cory Doctorow: If Congress says, "We are going to pass a comprehensive privacy law," the industry would have to respond with context ads.
Brooke Gladstone: That's one potential privacy fix, but we need more than that. The legislative focus seems to be on children's privacy. Do we have a model in the Children's Online Privacy Protection Act?
Cory Doctorow: We could if we ever bothered to enforce it. COPPA says that you can't gather data on people who are under 13. If you recall when poor Shou Chew, the CEO of TikTok, was being grilled by Congress, there was a congressman from Georgia who was just weirdly horny for whether or not pupils were being measured.
Congressman Buddy Carter: Can you say with 100% certainty that TikTok does not use the phone's camera to determine whether the content that elicits a pupil dilation should be amplified by the algorithm? Can you tell me that?
Shou Chew: We do not collect body, face, or voice data to identify our users. The only face data that you'll get that we collect is when you use the filters to have, say, sunglasses on your face, we need to know where your eyes are.
Congressman Buddy Carter: Why do you need to know where the eyes are if you're not seeing if they're dilated?
Shou Chew: That data is stored on your local device and deleted after use if you use it for facial. Again, we do not collect body, face, or voice data to identify our users.
Congressman Buddy Carter: I find that hard to believe. It's our understanding that they're looking at the eyes. How do you determine what age they are then?
Shou Chew: We rely on age-gating as our key age assurance.
Congressman Buddy Carter: Age?
Shou Chew: Gating, which is when you ask the user what age they are. We have also developed some tools where we look at the public profile to go through the videos that they post to see whether--
Congressman Buddy Carter: Well, that's creepy. Tell me more about that.
Shou Chew: It's public--
Cory Doctorow: He's just baffled as he should be. Rather than the congressman from Georgia saying, "Wait, this is what everybody does? That's terrible," he says, "We're not here to talk about your American competitors. We are here to talk about what you're doing for Xi Jinping." You know what? They're all doing that.
Brooke Gladstone: Congress has settled on another unsatisfying measure. The Child Protection Act doesn't really do anything?
Cory Doctorow: Does it? Can anyone with a straight face look at Congress's legislative intent in passing a rule? On the one hand, they pretty definitely don't mean measure people's pupils and do some kind of digital phrenology to figure out if they're over 13. On the other hand, they didn't mean give everyone a box that says, "I'm over 13." There is another way of thinking about this, which is to say, "Don't spy on anyone in case they might be under 13."
Brooke Gladstone: Congress is reaching back for some old-school, antitrust-style legislation.
Cory Doctorow: Mike Lee has got a bill right now that both Elizabeth Warren and Ted Cruz have sponsored. It says that, at a minimum, the ad tech business should be broken up so that you can be a company that provides a marketplace where people buy and sell ads, or you can be a company that represents publishers in that marketplace, or you can be a company that represents advertisers in that marketplace.
You cannot, in the mode of Google and Meta, be a company that is the marketplace that represents the buyers and represents the sellers, and somehow, even though you claim that this is a very clean arrangement, somehow the share of money going to publishers keeps going down, the cost to advertisers keeps going up, and your margins keep increasing. We could say that you can have a platform or you can use the platform. If you own a platform, you can't own one of the teams.
Brooke Gladstone: Facebook vowed not to spy on us when it started down its road of broken promises. Now, the big platforms claim that reining in their surveillance would break the internet. They say the same thing about taking step two in Doctorow's program to pull big digital off the dung heap. Step two is interoperability. Consider this.
Cory Doctorow: When you buy a pair of shoes, you can wear anyone's socks with them. When you buy a car, you can plug any charger into the cigarette lighter. In theory, when you buy an iPhone, you could run anyone's software on it. In fact, it is much easier to do that with an iPhone than with a car cigarette lighter. There is this latent computer science bedrock idea that is very important but esoteric, I apologize in advance, called "Turing completeness," named for Alan Turing, the great hero of computer science.
Turing completeness says that the only computer we know how to make is one that can run all the programs we know how to write. You could hypothetically write a program that would allow you to install a different operating system on your iPhone or a different app store on your iPhone. It's not the technical challenge alone that stops it. The real thing that prevents it is that if you tried it, Apple would destroy you with lawsuits. They'll drum up a thousand excuses under what, today, we call IP, which colloquially just means anything that allows me to control the conduct of my competitors, my critics, or my customers.
Brooke Gladstone: Of course, in fairness to big digital, they're not alone here. If you buy a John Deere tractor, literally only John Deere can fix it.
Cory Doctorow: Yes, it's a thing they call "VIN-locking." VIN is vehicle identification number. Computers are now so cheap that they can put a little microchip in every part. After you install the part, the microchip asks the central computer in the engine, "Do you know who I am?" [chuckles] If the central computer in the engine says, "No, I've never seen you before. You're a new part," it says, "Well, I'm just not going to work until the manufacturer sends an unlock code."
What that means is that if you're a farmer with your $500,000 piece of heavy farm equipment that you paid for with your money that you need to bring in the crops before the hailstorm comes and destroys them and you swap in the new part as farmers have done since tractors began and since plows began, your tractor says, "No, you've got to pay a John Deere technician a couple of hundred bucks to show up and just type an unlock code."
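The VIN-locking handshake Doctorow describes can be sketched as a simple pairing check. This is purely illustrative, not John Deere's actual protocol: the class names, the "dealer secret," and the unlock scheme are all invented for the example.

```python
# Toy sketch of "VIN-locking": a replacement part stays inert until the
# engine's central computer has been fed a manufacturer unlock code.
# Entirely illustrative; not any real manufacturer's protocol.

class EngineComputer:
    def __init__(self):
        self.known_parts = set()

    def recognizes(self, part_serial):
        return part_serial in self.known_parts

    def enter_unlock_code(self, part_serial, code):
        # Only the manufacturer's technician can produce a valid code.
        if code == hash(("DEALER_SECRET", part_serial)):
            self.known_parts.add(part_serial)

def part_will_operate(engine, part_serial):
    # The farmer's freshly swapped-in part refuses to run until "paired."
    return engine.recognizes(part_serial)

engine = EngineComputer()
print(part_will_operate(engine, "fuel-pump-123"))  # False: installed but locked
engine.enter_unlock_code("fuel-pump-123", hash(("DEALER_SECRET", "fuel-pump-123")))
print(part_will_operate(engine, "fuel-pump-123"))  # True: after the paid service call
```

The part works mechanically the whole time; the only thing standing between the farmer and the harvest is the unlock code.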
Brooke Gladstone: They got rid of that, right?
Cory Doctorow: They keep making feints towards it. What they've never said and what I don't think you'll ever hear them say is that if you want to just bypass the thing that makes sure that a John Deere technician has overseen the repair, that's your right. They're never going to say that. They are going to continue to claim that even though it's your property that the manufacturer's cold, dead hand rests upon it and that your use of your property is forever subject to their whim. If they decide to be generous with you, that's fine. As Darth Vader says in his MBA course, "I've altered the deal. Pray that I do not alter it further."
Brooke Gladstone: John Deere is doing what big digital does, interposing itself between the customers and their purchased products creating uncertainty by turning knobs by-
Cory Doctorow: -what I call "twiddling." Twiddling is the ability to change the business rules very quickly. There are no real policy constraints on twiddling.
Brooke Gladstone: Cory says three factors gave rise to the new world of big digital and made ensh****fication inevitable.
Cory Doctorow: The first is no competition. For 40 years, we let these companies buy their competitors. We let them do predatory pricing. We let them violate the antitrust law that was on the books because Ronald Reagan said that we shouldn't enforce it the way it was written, and all of his successors until Biden said, "That sounds like a good idea to me too." So, on the one hand, there's just nowhere else to go. Then, second, digital is different. The platforms can play this high-speed shell game because there's no rules on how they can change the rules.
There's no rules on how they can alter your experience or harvest your data or do other things that are bad for you. Then, finally, we can't use the intrinsic property of computers, this universality, this Turing completeness, to step in where Congress has failed and put limits on their twiddling ourselves by changing the technology so it's twiddle-resistant, so that when they try to spy on us, our computer says, "I'm sorry. No, I belong to Brooke, not Mark Zuckerberg, and even though you've requested that private data, I am not going to furnish you with it."
Brooke Gladstone: No competition and unbridled twiddling are the first two factors kettling users. The third taps into feelings many of you have had. You can't stay, but you can't go. Because if you do, you leave your community behind, your history. You can't take it with you.
Cory Doctorow: You can't take your community with you and re-establish the connection with them. There's no right to exit.
Brooke Gladstone: Though some companies have tried to make it possible.
Cory Doctorow: There was a company called Power Ventures. If you gave it your Facebook login and your logins for all the other services you used, it would put them all into one inbox that you could manage centrally. You could send LinkedIn messages and Twitter messages and Facebook messages. It wouldn't do it in a way that would allow those services to surveil you. They could see the message, but they couldn't see all the things you did leading up to the message and after leaving it and so on. It was a great tool.
Facebook argued that it violated the Computer Fraud and Abuse Act. Facebook gives you this kind of Sophie's choice, where either you do what's best for Brooke, the individual, or you do what's best for Brooke, the member of a community, because if you leave, you leave the community behind. Now, we could just make this a rule. We could say, as we do with most data-protection regimes like the California Consumer Privacy Act and the European General Data Protection Regulation, "If you have some user's data and the user asks for the data, you got to give them the data."
Then we could say to a company like Twitter that is just cruisin' for a bruisin' from consumer protection agencies and is probably going to be operating under a new consent decree, "Hey, your consent decree, now that you've abused your users, is you've got to support this standard so that users can leave but continue to send messages to Twitter. They can take their followers with them if they leave and they can take their followees with them when they leave."
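The portability rule Doctorow proposes has a very small technical core: everything the service holds about a user, handed over in a machine-readable form on request. A minimal sketch, with an invented in-memory store and invented field names:

```python
# Toy sketch of a data-portability export: if the service holds a user's
# data and the user asks for it, the service hands it over as portable
# JSON. The store and its fields are invented for this example.
import json

USER_DATA = {
    "brooke": {
        "followers": ["alice", "bob"],
        "messages": [{"to": "alice", "text": "hi"}],
    }
}

def export_user_data(username):
    """The mandated takeout: everything the service holds, as JSON."""
    record = USER_DATA.get(username)
    if record is None:
        raise KeyError(username)
    return json.dumps(record, indent=2)

print(export_user_data("brooke"))
```

The hard part is never the export code; it's the legal obligation to run it on demand, which is exactly what a consent decree could impose.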
Brooke Gladstone: To recap: to reverse the degradation of our online experience, we wrest back some control of our privacy by insisting on ads that use only context rather than every known morsel of information about our earthly lives; then sue for interoperability, the right to use what we own, from books to tractors, and slap away the seller's cold, dead hand. Finally, we lay claim to our right to exit, the simple idea that signing out of social platforms should be as easy as signing up.
All of these would take lots of public pressure, but all are possible. In fact, all of them were normal parts of our online experience before being thrown on the dung heap along with mission statements vowing to give people the power to build community, protect the user's voice, and not be evil. Big digital's current mission statement should be, "It gets worse before it gets worse." I wonder, has anyone ever stopped the process in its tracks? Have users ever rebelled before a platform or a service went south?
Cory Doctorow: Well, I've got some good news for you, Brooke, which is that podcasting has thus far been very ensh****fication-resistant.
Brooke Gladstone: Really?
Cory Doctorow: Yes, it's pretty cool. Podcasting is built on RSS.
Brooke Gladstone: I know that. It stands for Really Simple Syndication, and it lets pretty much anyone upload content to the internet that can be downloaded by anyone else. The creators of RSS were very aware of how platforms could lock in users and built their tech to combat that. In turn, podcasts are extremely hard to centralize.
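The openness is visible in the format itself: an RSS feed is just public XML that any app can fetch and parse with no platform login or API key. A minimal sketch using Python's standard library; the feed contents here are made up.

```python
# Minimal sketch of why podcasting resists capture: an RSS feed is plain,
# public XML any client can parse. The example feed below is invented.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Podcast</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="https://example.com/ep1.mp3" type="audio/mpeg"/>
    </item>
  </channel>
</rss>"""

def episodes(feed_xml):
    """Return (title, audio_url) pairs; no platform account required."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.find("enclosure").get("url"))
            for item in root.iter("item")]

print(episodes(FEED))  # → [('Episode 1', 'https://example.com/ep1.mp3')]
```

Because the audio URL sits right there in the feed, no single app can interpose itself between the listener and the show.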
Cory Doctorow: Which isn't to say that people aren't trying.
Brooke Gladstone: Like Apple?
Cory Doctorow: Oh, my goodness, do they ever. YouTube. Spotify gave Joe Rogan $100 million to lock his podcast inside their app. The thing is that once you control the app the podcast is in, you can do all kinds of things to the user: you can spy on them, you can stop them from skipping ads.
The BBC for a couple of decades has been caught in this existential fight over whether it's going to remain publicly funded through the license fee or whether it's going to have to become privatized. It does have this private arm that Americans are very familiar with, BBC Worldwide and BBC America, which basically figures out how to extract cash from Americans to help subsidize the business of providing education, information, and entertainment to the British public.
Brooke Gladstone: The BBC created a podcast app called BBC Sounds?
Cory Doctorow: That's right. One of my favorite BBC shows of all time is The News Quiz.
Game Show Host: Welcome to The News Quiz. It's been a week in which the culture secretary suggested that BBC needs to look at new sources of funding, so all of this week's panelists will be for sale on eBay after the show.
Cory Doctorow: You can listen to it as a podcast on a four-week delay. [chuckles] You can hear comedians making jokes about the news of the week a month ago, or you can get it on BBC Sounds. From what I'm told by my contacts at the Beeb, people aren't rushing to listen to BBC Sounds. Instead, they're going, "There is so much podcast material available, more than I could ever listen to. I'll just find something else," and that's what happened with Spotify too.
Brooke Gladstone: Spotify paid big bucks, hundreds of millions of dollars, to buy out production houses and big creators like Alex Cooper and Joe Rogan in an attempt to build digital walls around their conquests' popular shows, just to see those hard-won audiences say, "I'll pass."
Cory Doctorow: Now, Spotify is making all those pronouncements, "We are going to, on a select basis, move some podcasts outside for this reason and that." Basically, what's happening is they're just trying to save face as they gradually just put all the podcasts back where they belong on the internet instead of inside their walled garden.
Brooke Gladstone: Maybe it's because of the abundance of content or because like the news business, people are used to getting it for free. Podcasting seems resistant even though no medium is safe from what Doctorow is describing. Ensh****fication sits at the intersection of some of our country's most powerful players, entrenched capitalist values, and the consumer's true wants and needs. How do you see our future?
Cory Doctorow: I have hope, which is much better than optimism. Hope is the belief that if we materially alter our circumstance even in some small way that we might ascend to a new vantage point from which we can see some new course of action that was not visible to us before we took that last step. I'm a novelist and an activist and I can tell the difference between plotting a novel and running an activist campaign. In a novel, there's a very neat path from A to Z. In the real world, it's messy.
In the real world, you can have this rule of thumb that says, "Wherever you find yourself, see if you can make things better, and then see if, from there, we can stage another climb up the slope towards the world that we want." I got a lot of hope pinned on the Digital Markets Act. I got a lot of hope pinned on Lina Khan and the Federal Trade Commission's antitrust actions, the Department of Justice antitrust actions, the Digital Markets Act in the European Union, the Chinese Cyberspace Act, the Competition and Markets Authority in the UK stopping Microsoft from doing its rotten acquisition of Activision.
Cory Doctorow: I got a lot of hope for people who are fed up to the back teeth with people like Elon Musk and all these other self-described geniuses and telling them all to just go to hell. I got a lot of hope.
Brooke Gladstone: Thank you for taking me on this journey with you. I am inspired. [laughs]
Cory Doctorow: Thank you for coming on it.
Brooke Gladstone: Journalist, activist, novelist Cory Doctorow. His most recent novel is called Red Team Blues.
Brooke Gladstone: On the Media is produced by Micah Loewinger, Eloise Blondiau, Molly Schwartz, Rebecca Clark-Callender, Candice Wang, and Suzanne Gaber. Our technical director is Jennifer Munson. Our engineer this week was Andrew Nerviano. Katya Rogers is our executive producer. On the Media is a production of WNYC Studios. I'm Brooke Gladstone.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.