Two Verdicts Find Fault With Social Media Giants
[MUSIC]
Brigid: It's The Brian Lehrer Show on WNYC. Good morning again, everyone. I'm Brigid Bergin, sitting in for Brian today. On Wednesday, the verdict came down in a social media addiction trial in a Los Angeles County court. Meta and Google, owners of Instagram and YouTube, respectively, were found liable for harming a teenager with their apps' addictive features. Court watchers are saying the landmark ruling could open up social media companies to even more lawsuits from users who say they were harmed.
Both Meta and Google say they plan to appeal the decision. The verdict came just one day after another jury, this time in New Mexico, found that Meta violated the state's child safety laws. Joining us now to break down what each trial was about and what it could signal for the future of companies like Meta and Google is Bobby Allyn, NPR technology correspondent. Bobby, welcome to WNYC.
Bobby: Hey, Brigid.
Brigid: Listeners, as we lay out these two cases, we want to consider what harms social media companies have caused to young people. How have you seen it? How have they put them in danger of child predators? Is there any of this resonating for you? Especially for our younger listeners who might have grown up with Instagram and YouTube, was there a moment when things shifted for you, when the platforms were harder to put down or made you feel negatively about your own body image?
Parents of young people, we want to hear from you, too. What issues have you encountered with social media that you feel harmed or could potentially harm your child? Is there anything else you'd like to ask our guest, NPR technology correspondent Bobby Allyn? The number is 212-433-WNYC, that's 212-433-9692. You can also text that number. Bobby, you were in the courtroom for that trial in Los Angeles, so let's start there. Before we get into some of the legal arguments, can you just first explain what the case was about for our listeners who maybe haven't been following the story as closely?
Bobby: Sure. For years, people have tried to bring lawsuits against social media companies over the content that we all see, so toxic or harmful stuff that we see while we're scrolling TikTok or Instagram. Those usually have been thrown out of court pretty quickly because of a law known as Section 230 that makes it really hard to sue tech companies over content. What this case in Los Angeles did, and it was a sort of novel legal strategy, was to not sue over content, but about the design itself.
The lawsuit was essentially saying, "Instagram and YouTube, you built these systems that have enabled toxic content to be sent to kids and teens, and that has led to all sorts of adolescent mental health issues." After seven weeks of testimony from Zuckerberg, from therapists, from technical experts, the jury found that the very architecture of social media, features like infinite scroll, autoplay, push notifications, the algorithm, was defectively designed to addict children, and the jury awarded the woman at the center of the case $6 million in damages.
Brigid: Bobby, you covered a lot there. I want to go back for a moment to what you mentioned in terms of the federal law, Section 230. Since 1996, websites have been protected from liability for content they host on someone else's behalf. I know you've reported on this federal law and how it's protected tech companies for decades. The legal arguments in this Los Angeles case circumvented that law. Can you talk a little bit more about how?
Bobby: Sure. Section 230, as you said, has been around for decades, and it's been a really pivotal legal shield for Silicon Valley. Anytime anyone tries to say, "Hey, you said something mean about me online," or, "Hey, I saw this piece of content on Facebook, on Instagram, on TikTok, Pinterest," or something, this law has basically said the tech companies are not responsible for third-party user content, so what you and I post. I've been covering the tech industry for years, and you see this all the time.
There's a lawsuit filed against a tech company, they invoke Section 230, the case is thrown out of court. For years, lawyers have been trying to figure out how to get around this seemingly impenetrable legal shield. Again, this case in LA is one of more than 2,000 that have been consolidated around this legal theory. What the lawyers thought is, let's look at social media like a defective toaster, or like cigarettes, or like a defective airbag. Let's see Meta and Google as the manufacturers of a product that is defective.
They went in with that theory because, again, it's not about the content, but it's about how the engineers in Silicon Valley built these products in a way that was negligent and in a way that did not warn users about the potential harms. What we saw in New Mexico and what we saw in Los Angeles confirms and really validates for the first time this legal theory.
Brigid: Bobby, we're talking about companies that are worth trillions with a T of dollars. What kind of damages were awarded in this Los Angeles case?
Bobby: The damages were only $6 million, and for multi-trillion dollar companies, this is a rounding error, of course. Remember, this is a bellwether case, so this was a test case to try to see how juries would wrestle with the facts at the center of the argument that social media companies are defectively designed. We're going to see a number of other bellwether cases around the country going to trial to see what other juries think of and conclude when they hear all the evidence about whether or not social media companies have negligently designed these platforms.
Then there's going to be talks about settling this massive, massive consolidated litigation, which, if the plaintiffs win, could mean many billions of dollars in damages. It's a little too early to speculate about what that may look like, but this was really crucial and really a landmark proceeding in Los Angeles, because, again, this was the first time we saw a jury say, "Hey, you know what, we think this legal theory has real momentum. We're going to side with the plaintiff." The plaintiff's lawyers, Brigid, are saying, "We're really sort of heartened by this decision." They think it is a sign that all of this litigation is going to go their way. We just don't know, but that's their hope at this point.
Brigid: I want to play some tape from the lawyer for the young woman in this case, who's identified as Kaley or KGM. Here's her main lawyer, Mark Lanier, about what he says is a huge message this verdict sends to Meta and Google.
Mark Lanier: We've sent a message with this that you will be held accountable just because of the features alone that drive addiction. That's a huge message for these companies.
Brigid: Bobby, as we have mentioned, of course, this case is being appealed. I also want to play a clip from Meta's lawyer. This is Ashly Nikkole Davis speaking to reporters after the verdict.
Ashly Nikkole Davis: Teen mental health is profoundly complex and cannot be linked to a single app. We will continue to defend ourselves vigorously as every case is different. We remain confident in our record of protecting teens online.
Brigid: Bobby, can you talk through some of the tension that we're hearing there? We've got, on one side, the big message, and on the other, what we will presumably hear in the appeal: that they are not responsible for what has happened to this particular plaintiff.
Bobby: Repeatedly throughout this trial, I was sitting in the courtroom and watching lawyers from Meta and Google tell the jury that there is no scientific proof that social media has a causal link to mental health distress, that it doesn't cause mental health issues. They also have argued repeatedly that they are basically being scapegoated here for really profound complexities that are often at the root cause of adolescent mental health issues, and that there's a lot to blame here when we look at the youth mental health crisis, but social media should not be taking the fall.
In response to that, the plaintiff's lawyers have said, "No, no, no. We have internal documents here that show that when engineers were building social media, they were thinking about casinos. They built this after studying slot machines. These products were engineered to exploit the developing brains of kids and teenagers, and you cannot say that this has not contributed or at least made worse the mental health issues of many millions of people."
Ultimately, the jury, again, believed the plaintiff over the tech companies, but the tech companies are going to appeal. We will see where that all lands, but that is the central tension over whether or not social media companies ought to be blamed for depression, anxiety, body image issues, and a host of other mental health woes.
Brigid: I want to bring in one of our listeners, Camille in Brooklyn. Camille, you're on WNYC. What's your experience been with social media?
Camille: Hi. Thanks for taking my call. I'm a professional dancer, and I'm 21 years old, and I grew up in the 2016/2018 Instagram era, where it was really used for us to communicate and talk to each other and get to know each other. Now that I'm also a middle school dance teacher, I see how it's evolved into this kind of insidious, almost blockade on girls' self-esteem to even express themselves in person. It's really interesting because I'm also in club spaces, and you see how social media even creates boundaries with that as well.
It's like everybody's hyper aware of the surveillance that's going on, whereas I feel like in the past, originally, why Instagram was so successful is that people regarded it as a, I guess, social stimulant. Now I see how that kind of tactic was so insidious because they let us adapt to it gradually, and now we see how it has taken over and reshaped the way that we connect with each other, especially how we feel our bodies in space, and just how free we are with interacting with each other.
I'm curious because I know we were talking about how social media was modeled off of casinos and how this addiction was intentional. How can we inform the public on this addiction, and how can we normalize it? Because I'm addicted, I think most people are addicted, and so how can we actually address that problem?
Brigid: Camille, thank you so much for sharing that experience. Bobby, I'm curious if what Camille just described there was some of what you heard during that trial, some of the impact on Kaley's life and the lives of others who've said that this addiction has changed their experience growing up.
Bobby: Kaley, KGM, the plaintiff in the Los Angeles trial, took the witness stand and talked about being in school and running off to the bathroom during class so she could check the number of likes and comments that she got on Instagram posts. She talked about almost this literal physical rush that she couldn't resist that made it nearly impossible to concentrate on her studies. She just wanted more likes, more validation, more messages. She was so hooked she couldn't put her phone down.
On one level you could say, yes, she's really connecting with friends, yes, she's kind of staying on top of what's going on in her social universe. What the plaintiff's attorneys argued is that compulsive use can quickly morph into addiction, and that she specifically got really into these beauty filters that smooth out people's skin, that can make people appear thinner than they are, and was using them at a really, really young age, as a preteen.
That completely warped her sense of self and led to suicidal thoughts. It made her think differently about her body, body dysmorphia, and really became a big problem for her. I think a lot of people who are using social media all the time are finding this out, that where is the line between using it a lot, using it compulsively, and being addicted? Is that addiction leading to making you feel worse about yourself? Many people are increasingly saying yes.
Brigid: Bobby, some legal analysts and media pundits are calling this the "Big Tobacco" moment for social media. Can you describe what that really means, put that in some context for listeners who might not be familiar?
Bobby: Yes, sure. It's a reference to the reckoning that the Big Tobacco industry saw after a legal crusade in the late '90s led to a settlement that forced Big Tobacco to stop using Joe Camel and flavored cigarettes as a way to specifically target minors. It's an interesting parallel, because in that litigation back in the '90s, there was a slate of internal documents that really showed the public statements of tobacco executives were not what they were saying internally, that the internal documents were indicating that they did have a strategy of aggressively courting young people. We saw that in the Los Angeles case.
Mark Zuckerberg and Adam Mosseri from Meta, Instagram, have said repeatedly publicly that they value the protection of minors and online safety for kids, but they had internal documents where they were saying themselves, "We got to hook these users as tweens. We need to focus on making sure 11 and 12-year-olds are on the platform, are stuck on the platform, and develop habits so that they never put these products down."
When people are making the comparison between Big Tobacco litigation and this, I think what they're saying is this is a huge moment of reckoning, and perhaps eventually it will lead to systemic platform-wide changes that really reconfigure both our relationship with social media and the kind of protections in place.
Brigid: Listeners, if you're just joining us, this is The Brian Lehrer Show on WNYC. I'm Brigid Bergin, filling in for Brian today. I'm speaking with Bobby Allyn, NPR technology correspondent, about two major rulings in Los Angeles and New Mexico that legal analysts are saying could be a watershed moment to hold social media companies like Instagram and YouTube accountable for the impact they're having on kids.
Bobby, I want to get to the New Mexico case in just a moment, but I want to share one text from a listener. This comes from a parent. The listener writes, "Thanks for discussing this case. Although I agree that social media can be addictive, I staunchly disagree with the verdict. Lots of parents like myself accept the responsibility of working with our kids to prevent addiction." Bobby, in the Los Angeles trial, how much of that was something that came up, this idea of the role of parents in helping maybe coach or oversee how their kids are using these platforms?
Bobby: It was a major theme in the trial and part of the defense because Meta and YouTube really shifted the blame away from their platforms and instead put the spotlight on the troubled home life that Kaley experienced. They said that the deterioration of her mental health was not due to using social media, but was due to the sort of stormy life that she had at home. I think most reasonable people can look at the facts and say, "It can be both, can't it? I don't think there's just one thing that causes this kind of distress."
If you took a look at the verdict sheets that the jurors had and were studying for more than eight days, it did not say, "Did Instagram, did YouTube directly cause all of these mental health problems?" It said, "Was social media a substantial factor?" That was the question that they were weighing. Was it a substantial factor? In the end, they thought it was, but this whole personal responsibility defense was a big part of the tech companies' rebuttal in the courtroom.
Brigid: Let's shift gears, Bobby. In a separate trial in New Mexico, just one day before this Los Angeles verdict, a judge ordered Meta to pay $375 million in civil penalties for failing to protect young users from child predators on Instagram and Facebook. Talk about the contours of this case.
Bobby: Sure. In New Mexico, instead of a case brought by a private citizen, like the plaintiff KGM, this was a state action pursued by the attorney general's office. They launched an investigation in which they created several decoy accounts posing as children under the age of 14, and almost immediately, these accounts were flooded with sexually explicit messages and solicitations. Through this, investigators were able to arrest, I think, three men who used Meta's platforms to target kids. Two of them were, I think, caught after arriving at a motel expecting to meet a 12-year-old girl. The details are really, really distressing.
The primary finding of the sting was that Meta's recommendation algorithms, which the company says are designed to connect people with shared interests, were effectively connecting predators with vulnerable minors. The evidence was so compelling and so damning that we did get that huge $375 million verdict. I say huge because it's quite a bit larger than what we saw in Los Angeles. Again, the caveat: for multi-trillion dollar companies, it's not much, but still, it's a historic amount.
Brigid: Bobby, I understand that the trial's second phase begins on May 4th, and this is not with a jury. This is just with a judge. That's to examine the public nuisance portion of the claim. Can you walk us through what the trial judge will be deciding on then?
Bobby: This will be the second phase of the trial. If the first phase was in front of a jury and was looking at what Meta did or did not do to keep children safe from predators on the platform, this public nuisance stage is really going to be about future prevention. They're going to be looking at remediation, and that could mean a lot of different things. Is it going to mean court-mandated product changes, like new age verification systems, doing more to crack down on predators, or the deletion of accounts that are flagged for predatory behavior? Is there going to be a fight over encryption?
The state lawyers in New Mexico, they specifically targeted Meta's end-to-end encryption on Instagram Messenger. They said it basically shields bad actors there. Are they going to try to undo that? We don't know, but this is going to be a really important first instance where we're going to see what kind of changes are going to be asked for, what kind of changes the judge is going to order. A big question, I think, Brigid, is whatever happens, will Meta just unroll these changes in the state, or is this going to become a new federal standard?
Brigid: Let's hear a little bit. I have a clip from New Mexico Attorney General Torrez on CNBC on Wednesday explaining what the state will be seeking from Meta in that May trial. Let's take a listen.
Raúl Torrez: We will be asking for more financial relief for the state of New Mexico to remedy that, to help support our kids and create a safe digital environment. More importantly, we're going to be asking for injunctive relief. That means changes to the design features of the platform itself, real age verification, changes to the algorithm, an independent monitor to oversee those changes, and fundamentally a demand that they do business differently in New Mexico.
Brigid: Again, that was New Mexico Attorney General Raúl Torrez on CNBC. Bobby, what are you hearing in those demands? To what extent are we seeing companies like Meta being asked to change their practices on a state-by-state basis? I know here in New York state, there are laws governing social media access for minors. Is that something we're going to see more of?
Bobby: I think so. There has been sort of a patchwork of various rules emerging, raised first at the state level, while Congress dithers when it comes to data privacy and real tech regulations. For the companies, they constantly say they just want one national standard. They also say repeatedly that they believe in essentially self-policing: trust us, we have staff, we have systems, we do everything we can to try to keep young people safe online.
One thing that's important to think about in all of these discussions about tech policies on social media is the First Amendment is used by these companies quite a bit to defeat almost any kind of age verification proposal or regulation that would put up new guardrails on social media, because the companies say, once you do that and it's initiated by the government, that's going to lead to a certain kind of censorship. Some people's voices will be suppressed. Because of that, any kind of change that is likely to happen in New Mexico will probably be instantly challenged on First Amendment grounds. A lot of legal experts say it probably will have a serious shot in court, and that's this real dilemma here.
How do we regulate social media in a way that does not infringe on the companies' First Amendment rights? There have been appellate court decisions confirming that algorithmic choices, so the content that you see on your feed, are actually a type of protected speech, that the companies themselves have a First Amendment right to curate and to push content to users. It's part of a big debate, but I think one thing is clear: there is real, real demand and thirst for change on social media. Maybe these trials will set some of that in motion, but it's going to be a fight.
Brigid: So interesting, Bobby. Our segment just before this, I was talking with my colleague, Jessica Gould, who covers education locally for us. All of our conversation was about the use of AI in schools, digital bathroom passes in schools, and how to regulate that balance between what feels like surveillance, what feels like tools to help students and teachers, and how do we keep them safe? It feels like a conversation that's cutting across multiple disciplines. Before we let you go, I want to sneak in a caller, John in Bridgewater, New Jersey. John, thanks for joining us here on WNYC.
John: Hello. Good afternoon. Two quick things. One, everyone's been focusing on these internal documents talking about modeling the social media app after slot machines. Hasn't there been at least a smattering of research showing the dopamine effects that Facebook and Instagram have, and how this can very much foster some sort of compulsive behavior, if not addiction? My second point is to what your guest just said: these companies want to have it both ways. They want to bandy about the First Amendment, but don't they also prohibit certain kinds of speech as well? All they're concerned about is the bottom line.
Brigid: John, thanks so much for those questions. Bobby, any reaction there?
Bobby: There was a lot of testimony at trial that Instagram and YouTube know exactly what they're doing when it comes to delivering a digital drug every time you pick up your phone and you get a notification and you get content. It's this potent delivery system for dopamine that the lawyers in the case said is essentially a digital casino, and Meta and Google have studied this. What we learned in the trial is that some of the internal studies surfaced the harmful effects, and they buried it instead of talking about it publicly. That was certainly something that came up during the trial.
Brigid: It's so interesting to me, Bobby, to think we are talking about social media platforms right now. We are making allusions to digital casinos, and yet you can't watch a sporting event at this point and not see ads for literal digital casinos, all the online gambling apps. That's a conversation for another day, but certainly a conversation worth having. Bobby Allyn is NPR's technology correspondent. Thank you so much for coming on today and explaining all of this for us. Much more to talk about going forward.
Bobby: Anytime. Thanks, Brigid. Thanks for having me.
Copyright © 2026 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.
