The Social Media Addiction Trials Begin

News clip: This is social media literally on trial.

Brooke Gladstone: In a Los Angeles courtroom, Meta and Google are accused of engineering addiction to hook kids on their social media platforms.

Madlin Mekelburg: They're saying that you intentionally designed your products to be harmful in this way and keep users on there as long as possible.

Brooke Gladstone: Could this spell the end of the Infinite Scroll? From WNYC in New York, I'm Brooke Gladstone.

Micah Loewinger: And I'm Micah Loewinger. Also on this week's show, how some of the efforts to make social media safer for kids are affecting the rest of us.

Julia Angwin: All of a sudden, we're in a world where you have to show some sort of identification to go onto a lot of the portions of the Internet.

Brooke Gladstone: Plus a look at the billion-dollar industry turning the news into a game.

Judd Legum: What is the price of "the Democrats will take control of the House"? Is it $0.50? Is it $0.60? Is it $0.75?

Micah Loewinger: It's all coming up after this. From WNYC in New York, this is On the Media. I'm Micah Loewinger.

Brooke Gladstone: And I'm Brooke Gladstone. This week, a landmark lawsuit against Big Tech went to trial.

News clip: This is social media literally on trial, platforms like Instagram and YouTube, accused of being harmful to children and designed to be addictive.

Brooke Gladstone: The now 20-year-old plaintiff, alleging the apps targeted her as a minor with harmful and depressive content, leading to self-harm.

Julia Angwin: This is the first trial of several coming this year.

Brooke Gladstone: To be precise, this is one of around 2,500 personal injury lawsuits being brought at the state level. At the federal level, there are another 800 or so, as well as a slew of cases being brought by attorneys general and school districts that claim they've experienced harm as a result of their students being addicted to social media. But this one, the one that went to trial this week, is called a bellwether because-

Madlin Mekelburg: The resolution of this case here is going to help inform lawyers, litigants, how they're going to resolve all of these other cases.

Brooke Gladstone: Madlin Mekelburg is a legal reporter at Bloomberg who spent the week in that LA courtroom. When I spoke to her Thursday, she began by describing Kaley, the plaintiff, who was a minor when the alleged harm happened and so is only being identified by her first name.

Madlin Mekelburg: What we've heard from her lawyers is that as a result of her addiction to social media, she's experienced depression, anxiety, body dysmorphia. In fact, she's so anxious and having such a hard time with this that they're not allowing her to appear in the courtroom at all until she's called to testify.

Brooke Gladstone: Now she's signing on to something they call a master complaint.

Madlin Mekelburg: Yes, one big master complaint compiled by these lawyers. Individual plaintiffs can sign on to the master complaint.

Brooke Gladstone: A bit like the opioid cases across the country.

Madlin Mekelburg: They're getting a lot of comparison to those kinds of cases, to big tobacco cases, because they're being handled in the same way, and it's around this same issue alleging individual injury because of a product.

Brooke Gladstone: You've observed that they're using a novel legal theory. What makes the plaintiff's legal argument so novel?

Madlin Mekelburg: Social media companies in past litigation have cited a federal law that shields them from liability when it comes to content that's produced on their platform.

Brooke Gladstone: Ah, the infamous Section 230.

Madlin Mekelburg: Exactly. That's what's so unique about this lawsuit. They're trying to sidestep Section 230 completely by saying this is not about the content; this is about the design of the platform and, more specifically, the algorithm. They're able to talk about how they've designed this tool that is able to predict what people want to see, give them content that's similar to content they're viewing already, and, in that way, able to hook users and keep them on the platform for longer. It's a product liability lawsuit. They're saying that you intentionally, knowingly, willfully designed your products to be harmful in this way and keep users on there as long as possible.

Brooke Gladstone: We've covered Section 230 before on this show many times. It has the dubious distinction of being both pivotal and boring. [laughter] Basically, the tech companies are saying, "Look, we are not a publisher. We're a common carrier like the phone companies." You're not going to come after the phone companies because somebody made an actionable phone call, but this time it isn't boring because there really seems to be growing support from lawmakers across the spectrum for reforming or getting rid of Section 230, and then the companies could be accountable for whatever gets posted, a slur, a naked picture, hate speech, which sounds kind of wonderful, but it's also kind of chilling at the same time, for free speech. This is a very big deal.

Madlin Mekelburg: The way that Section 230 applies in this case is that the lawyers are not allowed to argue about the content. It's not about whether the content on the platform itself contributed to her mental health harms, but about whether the platform and the way it is designed fed into these issues for Kaley.

Brooke Gladstone: Tell me about the plaintiff's lawyer. He's a kind of razzle-dazzle guy, Mark Lanier, one of the most renowned plaintiffs' attorneys in the country. He was involved in the opioid litigation. Give me the style of the guy and the approach.

Madlin Mekelburg: Mark Lanier has this folksy charm. Two of his daughters are actually attorneys with him trying this case, sitting at the table with him, and he makes frequent references to them during the case. He's very personable, speaking directly to the jury, and one thing that I found noteworthy about his presentation is that he used a lot of props, more props than I've seen a lawyer use in a courtroom previously.

Brooke Gladstone: Like Gallagher, the comedian.

Madlin Mekelburg: Exactly. [laughter] At one point during his opening arguments, he's talking about what this case boils down to, and he says, "It's as easy as ABC," and he pulls out three little children's blocks that have the lettering ABC on them, and he says, "Addicting the Brains of Children, that's the ABCs of this case," and similarly, while he's been questioning witnesses, he makes great use of this little overhead document camera that allows jurors and everyone in the room to see what he's writing on a piece of paper.

Brooke Gladstone: What he's writing is projected onto a screen?

Madlin Mekelburg: Exactly, taking notes of what the witness is saying while they're talking, and at the beginning of their testimony, he puts up a paper that has a drawing of a road and some signs on it, and he'll say, "This is the road that we're going to go on during our testimony, and these are the stops that we're going to make along the way," to try to help the jury understand the information that they're hearing.

Brooke Gladstone: It seems some of the most damning evidence against Google and Meta is these internal documents written by their employees.

Madlin Mekelburg: Right. The plaintiffs are going to be relying a lot on emails and instant messaging between employees at these companies, talking about the impact of their product on young users. During opening arguments, Lanier was talking about a chain of messages between a researcher at Meta and one of her colleagues. One of them said, "Instagram is a drug." Another one replied, "LOL, I mean all social media, we're basically pushers."

Brooke Gladstone: Then the response is, "Seriously, it is. We are causing reward deficit disorder because people are binging on IG so much, they can't feel reward anymore. Their reward tolerance is so high," and then the other person says, "Yes, I was starting to think the same thing yesterday when you made that gambling reference. It's kind of scary," and then there's a sad face emoji. How do the companies respond?

Madlin Mekelburg: Lanier has indicated that he plans to call executives from all of these companies. We heard already in court this week from the head of Instagram, Adam Mosseri. We're also anticipating testimony from Mark Zuckerberg. There's a lot of interest in that day of testimony from the CEO of YouTube, Neal Mohan. Those are the big names, but there's been indications that we're going to hear from other executives, maybe a little bit further down the food chain, who might have different insights to provide about what the company was thinking on a given issue.

Brooke Gladstone: Adam Mosseri is the head of Instagram. He was questioned already. What happened?

Madlin Mekelburg: A lot of the questioning from Mr. Lanier was about an email exchange that happened internally regarding Instagram's decision about whether to ban photo filters on its platform that include effects that can mimic cosmetic surgery. What we saw in some of these emails back and forth between executives as they were weighing this question is concern that these filters have a disproportionate impact on particularly vulnerable users, which would be teen girls who use photo filters more than anybody on the platform.

There's a lot of references to body dysmorphia, how this might impact the self-view, but they ultimately decided to lift any ban of them except for those that explicitly promote plastic surgery on the platform.

Brooke Gladstone: Weren't there also some documents that showed some executives acknowledging that they were targeting children, trying to keep them on their sites as long as possible?

Madlin Mekelburg: We got a little bit of a preview of some internal documents from both Meta and Google talking about the importance of teens and younger users on their platform. Some of that has been in connection to these youth-oriented versions of their product that they've started offering, and that's something that we heard from the companies as kind of a defense to some of these claims is that "We're creating these products geared specifically for young users, and they're meant to be more safe," but there were documents that were shown talking about how the earlier that a young person is on the platform, the more likely they are to stay on the platform. It's definitely a topic that was discussed by leaders of these companies at some point.

Brooke Gladstone: What is going to be the defense's main argument?

Madlin Mekelburg: Their argument is that the mental health experiences that Kaley was having were not caused by their platforms. They talked about medical records from Kaley, from treatment she's had with therapists. We got to see excerpts from some of those. Kaley had an incredibly challenging life. She had complicated family dynamics at home.

She witnessed domestic violence. What the lawyers for these companies have been arguing is that if you took YouTube away from the equation, took Instagram away, would she still be suffering from the same mental health struggles that she is right now? They argue that the answer is yes. One thing that we heard from the lawyers about consistently as their defense was how little time they say that she was actually spending on these platforms.

We heard from a lawyer for YouTube that, looking at her viewership data, which they have access to going back to 2020, she was, on average, viewing YouTube every day for 29 minutes. She watched an average of about four minutes of videos that were suggested by Autoplay, which is one of the features that her lawyers are taking issue with as being addictive, and then she also averaged about a minute spent watching YouTube shorts, which are those newer, vertically oriented videos that you see on YouTube.

Brooke Gladstone: That's an incredibly short amount of time, and this is over years. That was the average.

Madlin Mekelburg: That's the average over the past five years. That's what the lawyer for YouTube said, but then we also heard from Mr. Lanier, who said that her highest usage day on Instagram was recorded in March of 2022, and that was 16.2 hours in a single day spent on the platform. It's going to be up to the jury to decide, "What do those numbers actually mean? What value do they have in determining addiction and the culpability of these platforms?"

Brooke Gladstone: Getting back to the defense, they also talked a lot about safety features. Yes, you have all the constant notifications and the liking affecting the dopamine receptors in your brain. "You can just turn it off. We have parental controls that can limit these features," they said, right?

Madlin Mekelburg: That's right. What we heard from lawyers for these companies is that these tools are incredibly effective, but you have to use them. They were highlighting what they say are failures by Kaley’s mom to use these parental controls at all. They say there's no evidence that they were being used.

Brooke Gladstone: Our show has covered a lot of moral panics sparked by the rise of new technologies. Did you know that novels were once considered very dangerous for young women because reading them could spur madness in their delicate minds? Comics spurred congressional hearings, all terrible for the young, but the lawyers representing Kaley are claiming addiction. Is it akin to cocaine, or is it more like gambling? Both kinds of addictions, chemical and behavioral, can affect parts of the brain. What kind of addiction will the plaintiff's lawyers be arguing is being fostered by social media?

Madlin Mekelburg: We're definitely talking about the clinical addiction category. To be clear, this does not represent all users. They're talking about a really specific subset of individuals that they say have become addicted to the platform, like Kaley, who they say were particularly vulnerable because of trauma she's experienced in her past; people who had difficult experiences are more susceptible to addiction. That's one of the things that we heard from this expert. This case is intended to be narrowly tailored to just these people that they say were suffering and experiencing this clinical addiction to social media.

Brooke Gladstone: You've been covering this issue for a while. What do you think's the most important thing that's going to come out of it?

Madlin Mekelburg: The verdict, and I don't say that to be cute. This case is the first of its kind to go to trial. There's going to be interesting testimony. There's going to be interesting documents that give us some insight into what these companies were thinking, but the most important thing here is going to be what the jury says. This is an issue that we've been talking about for years as a society, as a culture, but this is the first time that it's actually going to be decided officially in court, and that's going to have impact on all these other cases and future efforts to address this issue or make changes in this space.

Brooke Gladstone: Madlin, thank you so much.

Madlin Mekelburg: Thank you for having me. It was such a pleasure to talk with you.

Brooke Gladstone: Madlin Mekelburg is a legal reporter at Bloomberg. Before we speculate on the future of the monopolies that own and tweak social media and the millions who cheerfully use them, abuse them, or are abused by them, let's detour briefly into the meaning of addiction as applied to social media. First, how do you know you're addicted? Maybe you just have a bad habit.

Ian Anderson: People generally use what's called a symptom-based approach.

Brooke Gladstone: Ian Anderson is a neuroscience researcher at the California Institute of Technology.

Ian Anderson: Feelings of withdrawal or becoming restless or troubled if you've been prohibited from using the app, or compulsive urges to use it. Those symptoms of addiction have to be experienced quite frequently for something to qualify as a behavioral addiction. Those things aren't part of a habit. Habits are generally performed, especially when they become very strong, in response to the context in which they've been repeated in the past.

For example, a smartphone, which is constantly with you all the time, the smartphone itself, or the presence of the app icon on the screen, or notifications can trigger a scrolling habit or a posting habit because they activate in response to cues. These are really strong habits, resistant to willpower, and it's quite likely that feeling like you can't control your behavior with willpower leads people to say, "Oh, I must be addicted to the app," even if their behavior is something that wouldn't rise to a clinical standard of social media addiction.

Brooke Gladstone: Anderson recently published a study of roughly 400 users of Instagram, about 18% of whom said they were pretty sure they were addicted. Only 2% actually were at risk of addiction. Why did so many users overestimate their problem? Part of it may have something to do with this.

Ian Anderson: We did a media analysis in our paper as well, and we found that social media addiction is mentioned about a hundred times more in media articles than the phrase "social media habit." If "addiction" is the term that you hear all the time, it starts to become normative, and that not only leads to people using the term, but maybe makes people also feel like that's the right term.

Brooke Gladstone: Thus, uneasiness about staring way too much at your phone, plus muddled media messages, inflate a bad habit into an addiction, and that's not good.

Ian Anderson: Our first study was looking at that overestimation. The second study was looking at what happened when people internalized the addiction narrative. We had people read a text basically saying, "Social media can be addictive," based on the US Surgeon General's warning that was released around the time we did the study, and what we found is that when you internalize this addictive narrative, it makes you feel less like you can control it, less like you're going to be able to control it in the future, and also it made people recall more failed attempts to control it in the past, which is really interesting.

On top of that, it also made them blame themselves more for overusing the app. Framing all heavy use as addiction is a potential harm, and also might divert people from useful solutions that are based on habit science that we know can work.

Brooke Gladstone: Nevertheless, serious habits are seriously hard to break. All the research points to a real problem and not an unintended one.

Ian Anderson: What can be said certainly is that social media companies do design in order to build very strong habits of use. When you're building very strong habits, there is the potential for people to develop behavioral addiction, or at least the risk of it.

Brooke Gladstone: He doesn't want his research findings misconstrued.

Ian Anderson: The concern is that because we're arguing addiction is quite rare, they might try to use it as kind of a way to sell, "Oh, these products aren't harmful at all." I want to see the companies be held accountable for the harms that the designs of their platforms do. Habits can also have pretty negative consequences, and certainly people can be led to undesirable usage patterns, or dark patterns, as user interface designers often call them, even if they don't necessarily rise to the level of clinical social media addiction. That's something that I don't want to get lost in the discussion.

Brooke Gladstone: Ian Anderson is a neuroscience researcher at the California Institute of Technology.

Micah Loewinger: Coming up, maybe you should grayscale your phone. All the kids are doing it.

Brooke Gladstone: This is On the Media.

Micah Loewinger: This is On the Media. I'm Micah Loewinger.

Brooke Gladstone: And I'm Brooke Gladstone. As we've just heard, whether it's addiction or not addiction to social media isn't really the question. The whole point in trying to name the problem, whether habit or addiction or nothing, is to fix it. Julia Angwin, investigative journalist and founder of Proof News, a nonprofit journalism studio, recently wrote about how she got a grip on her big small-screen problem, and we launched our conversation with a description of one of the big fixes she found.

Julia Angwin: I use something called grayscale, where I basically turn color off on my phone. It changed my behavior so dramatically that it's actually somewhat embarrassing. [laughter] I was spending more than eight hours a day on my phone, and I'm not proud to admit that. When I switched to grayscale, it immediately dropped to four hours, and I've stayed on grayscale since, which is almost four or five months now that it hasn't gone back up.

It just changed my relationship with my phone in the most positive way I could've imagined. I pick it up when I need to do something, but I have no feeling like I want to pick it up. I didn't really understand how strong that feeling had been until it went away. I felt like I had just been released from an addiction. I understand why people use the language of addiction because it did emotionally feel that way.

Brooke Gladstone: Anderson told us that one of the signs of addiction is experiencing withdrawal.

Julia Angwin: Yes. I didn't feel any withdrawal; I felt relief. I felt embarrassed. I can't believe I was a color addict. I realized there's something it does in your brain, which I don't totally understand, and now what happens is when I turn color on, I feel grossed out by it, and I have to turn the color off immediately.

Brooke Gladstone: What do you do with the four hours you just got back each day?

Julia Angwin: Well, of course, Brooke, I'm extremely productive.

[laughter]

Brooke Gladstone: I want to discuss, though, two other studies that address whether social media, if not addictive, is harmful. Australian researchers followed over 100,000 Australian adolescents across three years and found that the relationship between social media use and wellbeing is not linear; it's actually U-shaped. Children who use social media moderately have the best outcomes, while those who use it very much or not at all have worse outcomes.

Julia Angwin: I think that really speaks to something that I am always trying to remind people of. There are huge benefits to social media. People who are alone in a community, don't know how to find other people with similar views, can use social media to find their group, and we have forgotten that, because there are so many toxic things that can happen also on social media.

Brooke Gladstone: Yes, I remember those communities being created, and it was wonderful, and then you also realized that a lot of neo-Nazis who would be snubbed by their fellows and unable to express it and just had to sit in a bar crying over their beer, suddenly found a group of like-minded neo-Nazis.

Julia Angwin: Many types of communities have been able to form on the Internet, and we're in the midst of a rise in right-wing populism around the world and authoritarianism, and you cannot really unlink that from this new medium, but I'm not sure I would want to go back to a time where we couldn't find people to communicate with online who are like-minded.

Brooke Gladstone: Now, the second study comes from researchers at the University of Manchester. They followed 25,000 11- to 14-year-olds over three school years and concluded that screen time spent on social media or gaming doesn't cause any mental health problems in teenagers whatsoever.

Julia Angwin: This is a really amazing finding because it really speaks to the fact that we are very unspecific when we talk about problematic phone usage, and the reality is people do a lot of different things on phones, right? They play games. They message their friends. They doomscroll. When I was a teenager, I spent hours and hours and hours on the phone with my friends, which is probably the same maybe as somebody who's texting and messaging their friends all night on their cell phone.

Nobody was saying there was a problematic piece to it, although my parents were really upset with the overuse of the phone line. When we collapse it all into one thing called screen time, we are not looking at the underlying behaviors, and some behaviors are more problematic than others.

Brooke Gladstone: Most of us can recognize when both kids and adults overuse social media; it does cause harm, whether it's addiction or ingrained habit, or just being in a perpetual state of rage. Let's talk about the tools, the remedies being used across the globe. In December, Australia instituted a social media ban for all kids under 16. I guess they ignored the results of that vast Australian study. It's been going on for about two months now, and there are heaps of reports of teens circumventing the ban within minutes via VPNs, and now the UK, France, some other countries are considering following Australia's lead. What's at stake with using such a blunt-force tool?

Julia Angwin: A ban based on age is, in fact, a requirement for everyone to prove their age, right? In order to do that, you often have to show some sort of identity, your driver's license or something. All of a sudden, we're in a world where you have to show some sort of identification to go onto a lot of the portions of the Internet, and that is not a world that is very safe because the idea of browsing the Internet anonymously, reading whatever you want without any consequences, is actually fundamental to the freedom that the Internet promised.

There's this empowerment of being able to seek all this knowledge, just like we do in a library. When you go to a library and look at books, there's no record, even if you check it out. Every state in the United States has a library confidentiality law saying that they won't turn the records over. I want that same level of privacy for the things I read online, and these kinds of laws that Australia has passed threaten that in a very dangerous way.

Brooke Gladstone: Let me bring up another increasingly popular remedy, and that's phone bans in schools. 31 states and the District of Columbia have already instituted them. It seems like getting smartphones out of schools improves academic performance, and I saw a poll that found even the kids like it, or at least they don't hate it. What do you think about that?

Julia Angwin: Teachers and the studies I've seen of students say that the learning experience in the classroom is better without phones. Let's say that that's true. I still don't think that that means we should be passing laws about this, and here's the reason: schools should be setting behavior policies that don't involve law enforcement. We already have too much of the school-to-prison pipeline, where students in vulnerable communities are over-policed for "bad behavior in schools," and then that gets them into the law enforcement system, and so I don't know why we have to pass laws for this. Schools should just have a policy, which is fine.

Brooke Gladstone: You've said that ICE goes into schools and seizes teachers, and you want kids filming that, but to do that, they need their phones.

Julia Angwin: Yes. Right now, our best weapon as citizens against state violence committed by ICE is filming them.

Brooke Gladstone: I wanted to bring that up, but I still think if these surveys reflect the truth, that it would be, on the whole, better if their phones were put away when they're in school.

Julia Angwin: Yes, I absolutely agree, but it's just a question of under what conditions are they put away, are they confiscated, put into weird bags that the kids can't get access to, or is it just in the kids' locker where they can go get it at the end of the day? There's also a difference in terms of the rules under which the school can access kids' phones. There are schools where they've had enlightened policies like, "We're not going to just take your phone to see if you were using it and scroll through it," and then there are schools where they're just taking them and doing inspections.

Really, what we need are really good school policies that are enlightened and take into account the Fourth Amendment issues about searching people's phones, the First Amendment issues about your ability to film the police, right? I think those things need to be taken into account in these school policies.

Brooke Gladstone: Most of these tools, age verification before you can access social media, phone bans in schools, they are intended to protect kids from social media overuse, but you've argued, and I've thought this myself, that you're less worried about your teenage kids than about your parents.

Julia Angwin: I have had to prevent my dad from sending thousands of dollars in cash to somebody on the phone who convinced him, right? This is a man with a PhD. Seniors are being scammed so aggressively over the Internet, and this harm is actually really well quantified. We know it, and the weird thing to me is that we're having this moral panic about kids while ignoring this huge population that is absolutely suffering right now. If we're going to think about how to mitigate harm online, we need to think about seniors as much as we talk about the kids. One thing I've noticed about my kids: they are much smarter.

Brooke Gladstone: Tell me about that. You've said that when you told them about going to grayscale, they sort of snorted and said they'd done that a while ago. [laughs]

Julia Angwin: I know. It was actually really embarrassing. I had just learned about grayscale, and they were like, "We literally did that years ago." [laughter] It wasn't months; it was years. They have little apps to limit how much time they spend on different apps. I think there's a lot more we can do, but I would also say that we as individuals only can do so much. These companies that make these tools also have really abdicated their responsibility. There's things that we could demand of them if we had a functioning legislature that would be really helpful, too.

Brooke Gladstone: What would you start putting in the hopper to get these companies to be accountable?

Julia Angwin: Well, we need to break them up. We kind of got close in the last administration with some of the efforts by the Justice Department.

Brooke Gladstone: In one case, the decision went for Google.

Julia Angwin: Well, that case was complicated. Google was ruled a monopoly, but then the judge said that AI was going to provide enough competition that there was not a remedy needed, like a structural breakup, but it's also worth noting that these battles take years. It took a long time for the public and courts to come around to breaking up Standard Oil. I think we're still in the midst of that fight, and it's still a really important goal. Because these companies control our entire information ecosystem, we have to break them up so that they have incentive to compete on quality. Right now, they don't have that incentive.

There was a time when AT&T prevented us from having any innovations in phones, and we managed to break them up. I am kind of an irrational optimist that I think we will get there eventually, even though I agree with you, there aren't that many signs on the ground right now.

Brooke Gladstone: Let's talk about Infinite Scroll. It's one of the design elements that almost everyone says these apps ought to get rid of. EU regulators have said that this feature is "most likely illegal." How does it work, and how do you get around it?

Julia Angwin: TikTok is sort of the canonical example, or Instagram, where you look at your feed, you see a video, it just goes to the next one and the next one, and there's never an end. You just keep scrolling, scrolling, scrolling. They want to keep you on the platform as long as possible to show you as many ads as possible. There's really no healthy reason why you would have that feature other than that.

When we first started on social media, we just saw posts from our friends, and we had essentially chosen that feed. I really think another intervention that would be super helpful for reclaiming our agency is to have our ability to tune our feeds. On Bluesky, you can choose. I have a feed of just Tech News. I have a feed of cat pictures, I have a feed called Happy News, but I also think we should have the ability to just click a button on a post, like, "I don't want to ever see something like this again," which, by the way, sometimes they offer you, but then it's not true.

I keep going into my son's feed and clicking on the Manosphere stuff like, "Don't show me again," and then it's there the next day, but a meaningful control over our algorithmic feeds, I feel like, is also a really important intervention that we need.

Brooke Gladstone: As you've observed, there are potential consequences to some of these remedies, like age verification or social media bans. They can lead to an increase in unauthorized data collection, censorship, or surveillance. What's at stake for the future of the Internet?

Julia Angwin: Right now, we're in a very dangerous time for the Internet because it's such a powerful medium that we use to access basically every part of our life. All of that is gatekept by a few companies, who can really turn off or turn on our ability to participate based on their whims. I don't know if you remember when Zoom cut off a Chinese dissident from being able to use Zoom because they were giving a favor to the Chinese government.

We have these companies that can really mediate our citizenship, and that is very dangerous, especially as these companies have become more comfortable being very political, and so it's important for our democracy to regain control over our ability to communicate with each other without it being basically suppressed by one of these tech platforms.

Brooke Gladstone: Thank you so much, Julia.

Julia Angwin: Thank you. This was a treat.

Brooke Gladstone: Julia Angwin is an investigative journalist and the founder of Proof News, a nonprofit journalism studio.

Micah Loewinger: Coming up, Brooke, I'm going to bet a dollar that our next segment is about prediction markets.

Brooke Gladstone: You win. This is On the Media.

Brooke Gladstone: This is On the Media. I'm Brooke Gladstone.

Micah Loewinger: And I'm Micah Loewinger. While lawyers and the courts are sorting through the long-term effects of so-called social media addiction, there's another beckoning glow coming from our screens.

Judd Legum: Yesterday was Super Bowl Sunday, and some billion or so dollars was bet by Americans.

Micah Loewinger: A big day for big bets for sure, but a smaller news item lurked behind the headlines. The Nevada Gaming Control Board reported the lowest dollar amount wagered with the state's sportsbooks since 2016, and that's potentially because there are some new players in Vegas, companies like Kalshi and Polymarket. They're known as prediction markets and claim that they're more like the stock market than traditional betting, since you're wagering against other people rather than the house. And you can bet on all kinds of things, from the order of songs that Bad Bunny would play at the Super Bowl to tomorrow's temperature to a word that might come out of somebody's mouth.

Trevor Noah: Welcome back to the Grammys. Potato.

Micah Loewinger: Trevor Noah.

Trevor Noah: If you had me saying potato on Polymarket, you just made a ton of money. Congratulations, Noah_22, whoever that is.

Micah Loewinger: At a White House press briefing in January, 98% of traders wagered that Karoline Leavitt would speak for more than 65 minutes, only for her to say, with a quick glance at the clock-

Karoline Leavitt: Thank you all very much. It's great to be back with you.

Micah Loewinger: She hustled off the podium with just 27 seconds to spare. Sports betting exploded in the United States after a 2018 Supreme Court decision allowed states to legalize it. Polymarket and Kalshi sprang up in the aftermath, and recently, both companies have made partnership deals with news organizations. Judd Legum is author of Popular Information, an independent newsletter dedicated to accountability journalism. After tuning into this brave new world of news gambling, he says that the Kalshi ticker has become a strangely normal part of CNN coverage.

Judd Legum: In addition to whatever discussion is going on, you will see something at the bottom of the screen saying, "What is the current state of the market on Kalshi? What is the price of 'Yes, the Democrats will take control of the House'? Is it $0.50? Is it $0.60? Is it $0.75?" and integrated into the discussion, they'll talk about the movements in these prices and what the Kalshi market is predicting.

Micah Loewinger: Harry Enten, who was a kind of polling whiz kid in his former life at FiveThirtyEight, now will pull up Kalshi odds and report on them as if they have explanatory power when talking about serious news events. Here he is in January talking about Trump's attempts to strong-arm control of Greenland.

Interviewer: What about the prediction markets?

Harry Enten: Yes. Are people in the public taking it seriously? The people who are putting their money where their mouth is, they're absolutely taking it seriously. Take a look here. The chance that Trump buys any of Greenland by the end of his term. On Friday, it was just 12%. Whoa, way up there now to 36%, a tripling in less than a week!

Micah Loewinger: They are promoting gambling on the seizure of a sovereign nation, in this case, right? Not only is that just trivializing what should be treated as an absolutely insane news story, but journalistically speaking, why would a news organization privilege the opinions of a group of random people?

Judd Legum: I think that this is part and parcel, especially in Trump's second term, of the effort by news organizations to avoid accusations of bias at all times. If you were to talk about, "Well, what are the implications of either invading or strong-arming Denmark to give up some of Greenland, how would that impact the world?" you might be seen as biased, but if you're simply discussing the numbers, just the data, you can insulate yourself from that.

Micah Loewinger: Of course, CNN is not alone. The Wall Street Journal, CNBC, Yahoo Finance, Sports Illustrated, and Time, they have all inked deals with sites like Polymarket and Kalshi. What is in it for them to push a gambling addiction onto their viewers?

Judd Legum: Well, why have the sports leagues adopted this so enthusiastically? Because I think they understand that if you have even $20 wagered on a game, you are more likely to tune in. I think news organizations that have seen their readership or viewership decline see this as a way of creating more loyal viewers or readers.

Micah Loewinger: You quote Kalshi CEO Tarek Mansour, who said, "Kalshi is replacing debate, subjectivity, and talk with markets, accuracy, and truth." In effect, you write, "Kalshi's betting market being treated as news is based on the efficient market hypothesis." What do you mean by that?

Judd Legum: There is a school of thought, something that has been adopted by Peter Thiel and some of the funders of these betting markets, that in fact, it is the marketplace that is the clearest path to discovering what truth is.

Micah Loewinger: As you write in your piece, a lot of these markets, although they may appear to carry a hefty price tag, we're talking about market caps of millions of dollars, they're nothing close to, say, the size of the stock market.

Judd Legum: That's right. Probably the presidential market, who's going to win in 2028, that's going to be a fairly large market, but as you go down to a congressional race, or as you go down to some of these more obscure news events, or even things like, "Will the announcer during this sporting competition say a particular word?" these are going to be very small markets, and what the price of a certain outcome is really has no relationship to truth or even probability; it just relates to how much a handful of individuals who have access to wealth have decided to put down in this market.

Micah Loewinger: Okay, but if we had Kalshi's CEO sitting with us here, he may say something like, "Well, look at our track record. Prediction markets picked Trump to win the 2024 presidential race. They supported Zohran Mamdani's odds for the New York City mayoral race in 2025." Is there not some predictive power on display here, or are you skeptical of the examples and data that have been cited thus far?

Judd Legum: I don't think that there is nearly enough data. These sites are so new that you can't draw any conclusions about their accuracy. Look, they deal with binary questions. They're either right or wrong. They're either predicting an outcome or not, and so they're going to be right some of the time. Does that mean they're better at predicting something than some other methodology, maybe traditional polling or other things? That I don't think we could possibly know, but I think the larger question, especially when it comes to news, is why do we need to predict these things in the first place?

Micah Loewinger: Because we have to know.

[laughter]

Judd Legum: Why is it necessary to have a prediction market on "Will the government be funded by next week?" Next week will come, and we will know. Isn't it more important to spend that time talking about "What are the issues that people are talking about that need to be resolved before the government can be funded?"

Micah Loewinger: You argue that these betting markets are still small enough to be disrupted by individuals. How so?

Judd Legum: If you look at Kalshi, there's $50 billion in annual volume, that's about $135 million per day, but if you break that down further, this is spread across thousands of different markets. Once you look at, for example, a market about a special election in Tennessee, it's just going to be a few million dollars at most, probably less than that in many cases, and therefore anyone who's interested in changing the narrative in the context of elections on any level, this can be done now, on Kalshi for tens of thousands, even thousands of dollars.

Micah Loewinger: Our producer, Becca, actually observed this example while looking at Kalshi before one of Fed Chair Jerome Powell's press conferences in January. There was a mention market, so betting on what words he would use during his remarks and in the following Q&A. A commenter in Kalshi's chat posted, a few minutes before the conference went live, that there was a script out and that Powell was going to utter the phrase "national debt," and that fellow bettors should hammer it.

Needless to say, Powell didn't end up uttering those words, but perhaps someone made a killing betting against the chance that he would say it. Even in small ways, it's not hard to see how easy it is to make a profit by manipulating the people you're betting against.

Judd Legum: Yes, and that example illustrates the problem where you have markets where there are a number of people who know exactly what the outcome will be in advance, and although Kalshi has rules against insider trading, it's far from clear that they have any way to enforce those rules.

Micah Loewinger: Yes. Recently, MS NOW host Chris Hayes posted a video to social media talking about the betting market that emerged around what topics he would bring up in his interview on Stephen Colbert's show. NPR's Bobby Allyn alerted him.

Chris Hayes: That market, which when Bobby first sent it to me was like a $22,000 market, goes to $800,000, almost $900,000, like close to a million-dollar market.

Micah Loewinger: As Hayes discussed in his video, the interview with Stephen Colbert was prerecorded, meaning Chris Hayes, someone in the studio audience, a producer working on the CBS segment, anyone with inside knowledge could've made a killing in this market by just betting before the interview aired. Just another example of potential insider trading, an anonymous bettor-

News clip: -bet $32,000 that the Venezuelan President would be out of power by January 31st. The following morning, at 4:21 AM Eastern, the President announced Maduro was in custody, and that user cashed out, making more than $400,000.

Micah Loewinger: Another user made over a million dollars betting on what the top Google searches would be in 2025, which raised suspicions that this was an internal Google employee with prior knowledge of the data, but aren't there mechanisms and a regulatory system aimed at combating this exact kind of fraud?

Judd Legum: Well, that's right. There is a mechanism by the SEC that is aimed at preventing insider trading. Unfortunately, the SEC is not involved in regulating these markets at all. It's the CFTC, which regulates commodities and futures markets, and they have no history of regulating insider trading. In fact, those markets traditionally are used by insiders to hedge their risk in the future, and it's completely acceptable.

Micah Loewinger: It's legal and even normal to act on inside information?

Judd Legum: Yes. Let's say that I'm Frito-Lay, and I'm going to be coming up with a new, what I suspect might be a very popular version of Doritos, which is going to require a lot of corn, and I know I'm going to need to buy an immense amount of corn six months from now to make these new Doritos. I can then use the corn futures market to lock in the current price to hedge against any disruption that my own purchases might make, and that's seen as a legitimate and prudent use of the futures markets.

It's a totally different mindset than the SEC, where using that kind of insider information, trading your own stock because you know of some new initiative that your company's going to do that you think is going to be very profitable, is prohibited.

Micah Loewinger: Because Kalshi and Polymarket are regulated by the Commodity Futures Trading Commission and not the SEC, there isn't really a mechanism for enforcing insider trading?

Judd Legum: Well, there's still prohibitions against fraud, right? You could argue that you're committing fraud against the other people in the market by using insider information because you're participating as if you are on the same level ground as everyone else. The real issue becomes the fact that the CFTC has no experience, no staff, no ability to really enforce those kinds of laws, unlike the SEC, which has been doing this for decades and decades and decades.

Micah Loewinger: Of the three big issues with this industry, you list manipulation, insider trading, and trivialization, this feeling that betting on very serious current events kind of flattens the news. I agree with you that there's something a little sick about making money on these events, but hasn't that ship already sailed?

Judd Legum: I think there's always been two aspects to news. There's certainly always been the entertainment and voyeurism aspect. We want to know where celebrities are vacationing. We want to know about affairs, and that's not a new thing. That's been going on probably since the invention of news, but the question is, is there a place anymore for another kind of news?

Because certainly there's another tradition where news is an essential part of people's ability to inform themselves as citizens, to decide who they might want to vote for, who they might want to donate to, what they might want to protest, how they might want to live their lives in light of what's going on around them, and what you have with Kalshi is it is essentially invading those spaces and turning the news into this game to essentially deemphasize the part of news that is probably the most important, which is that it's about informing yourself as a citizen.

Micah Loewinger: Judd, thanks so much.

Judd Legum: Thanks for having me.

Micah Loewinger: Judd Legum is the author of Popular Information, an independent newsletter dedicated to accountability journalism.

[MUSIC - Sex Pistols: Pretty Vacant]

Micah Loewinger: That's it for this week's show. On the Media is produced by Molly Rosen, Rebecca Clark-Callender, and Candice Wang. Travis Mannon is our video producer.

Brooke Gladstone: Our technical director is Jennifer Munson, with engineering from Jared Paul. Eloise Blondiau is our senior producer, and our executive producer is Katya Rogers. On the Media is produced by WNYC. I'm Brooke Gladstone.

Micah Loewinger: And I'm Micah Loewinger.

[MUSIC - Sex Pistols: Pretty Vacant]


Copyright © 2026 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.

New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.
