Social Media and the Charlie Kirk Killing
[MUSIC]
Brian Lehrer: Brian Lehrer on WNYC. The assassination of Charlie Kirk at Utah Valley University last week was obviously incredibly disturbing and horrible in its own right, but also very disturbing was how quickly graphic footage of his final moment spread across social media platforms. Just moments after Kirk was fatally shot in front of a live audience of thousands, multiple videos from various angles of the event, as many of you know, went viral on nearly every social media platform.
If you happened to be casually scrolling through X on Wednesday afternoon, you were likely served the gory video unprompted amongst the mundane content that you typically consume. That's just one example. We're not even singling out X here. This resulted in what some are describing as a mass traumatic event that's raised urgent questions about content moderation and our relationship with violence online.
While questions still remain about the shooter's motives and the long-term consequences this act of political violence will have on the nation like we were discussing in our previous segment, we will look now at the ubiquity of violence on social media, and specifically how many children, among others, wound up seeing this awful murder without the platforms doing anything preventative that, arguably, they could have done without stifling free speech. We'll do this with Adam Clark Estes, senior technology correspondent at Vox. Adam, thanks for joining us for this. Welcome to WNYC.
Adam Clark Estes: Thank you so much.
Brian Lehrer: How quickly did what go online and who saw it?
Adam Clark Estes: The videos went online immediately. The assassination happened in front of a crowd of people who were already carrying smartphones, recording on them, and uploading content. Some people were live-streaming. Essentially, as soon as it happened, it went online. I think there's a good chance that a lot of these social media platforms knew the video was online and spreading right away, too.
Brian Lehrer: Well, how quickly could they have known? Because it's a pretty routine thing these days, right? Something happens, somebody uploads a video right away to whatever platform, and it starts to spread. How quickly could content moderators or anybody at any of these platforms have intervened in this?
Adam Clark Estes: Well, this is assuming that the content moderators are still at their desks. At all of these platforms, content moderation has been scaled back from what it once was. Often, that's done in the name of free speech. That was definitely the case with X after Elon Musk bought Twitter. You could argue he bought Twitter in order to turn it into a free speech platform. That became X, and the content moderation budget got slashed.
The same thing has happened at the Meta platforms. Mark Zuckerberg announced earlier this year that he wanted the Meta platforms to encourage more free speech, so they cut back on fact-checking. There are still people employed at these platforms that look for content that violates their policies, but they're taking that content down a lot less. They're doing a lot less to stop the spread of it. I think we saw that happen very violently in real time last week.
Brian Lehrer: Listeners, I wonder if anybody out there would like to discuss your personal experiences of this, or maybe your children's personal experiences of this, and how you're having to mediate this. Was it a mass traumatic event in our nation that has touched you or someone you know personally? 212-433-WNYC, 212-433-9692, for Adam Clark Estes, senior technology correspondent at Vox.
I think it is worth saying out loud how you might have seen this horrific video of Charlie Kirk being assassinated, and whether you think there should have been anything in place to make it harder to see. Anybody, as I've already asked, with kids who may have been exposed to it, having to have a conversation with them about that, or any other experience related, or a question for Adam. 212-433-WNYC, 212-433-9692. You can call or you can text. Adam, were there major differences in how the video spread across different platforms? Were users on some platforms more likely to encounter it than others?
Adam Clark Estes: In this case, it was pretty prevalent everywhere, which I think is what was so striking about this. This was a major news event. Like I said, it was happening in front of an audience that was already streaming. People saw their influencer on TikTok, who might have been at the event, live-streaming from the event. They might have seen it on X. They might have seen it on Instagram or YouTube. It really went wide on all platforms.
Now, different platforms took slightly different approaches to dealing with the video once it was so widely spread. To give you an idea of how widely and how quickly the video was distributed: from the time that Charlie Kirk was shot until two hours later when he died, the video was streamed about 11 million times, according to The New York Times. Then it kept going, because people copy the video and reupload it. It becomes a game of whack-a-mole for the platforms that are trying to limit the spread.
I do know that YouTube and the Meta platforms eventually put a warning label on it that you would have seen before seeing the video. The worst stories I've heard are from people on X where videos autoplay when you scroll. People are scrolling, just trying to get news updates, and then they see the video, and probably the most violent moments of it before they get a chance to click away.
Brian Lehrer: Adam Clark Estes, senior technology correspondent at Vox, who has written about what became a mass and unexpected traumatizing event for many people in the US, as the video of Charlie Kirk's assassination spread so quickly, with very little content moderation, to many kids as well as adults. Christine in Ridgewood, New Jersey, is calling in with a story from her family. Christine, you're on WNYC. Thank you for calling.
Christine: Hi, how are you? Thanks for having me. [chuckles]
Brian Lehrer: I see you have a story about your own son.
Christine: Yes, I do. He came home from school that day, the day that Charlie Kirk passed away. He showed me the video that was going around on X. He was showing me the close-up where the bullet was in his neck and everything. I was shocked. I knew the video was going to be out there. I just didn't realize. The high-schoolers at this time, they all have smartphones, right? They're not in class with smartphones.
As soon as this happened, they come out of class. We have free periods in our town. It's an open campus, and the kids are looking at their phones all day. He came home, and he was showing me the video from X, and the close-up of-- It was wild. I didn't realize he had his hands on it, and he said, "Mom, look at this. I can't believe it." He's like, "I know this guy." I was like, "Oh." I didn't know Charlie at the time. I had no idea anything about him. He was like, "I can't believe this happened. I can't believe this happened."
Then a few hours later, after school, he took a nap. My son never takes naps. He's a busy buddy. He's doing after-school sports. There's no way he takes naps. We had to wake him up. Finally, we woke him up. I'm like, "Chris, are you okay? Are you okay?" He's like, "Yes, Mom, did you see the video?" I'm like, "Yes, you told me about this already." In that moment, I was like, maybe this is him being traumatized from something that he shouldn't have been seeing. He's not a little kid. He's not in elementary school, right?
Brian Lehrer: High school, you told the screener, right?
Christine: These kids have seen a lot. Yes, I just didn't realize the impact it made on him. I think even today, he and his friends share things here and there about something, and they're witnessing it in real time. It's hard for these kids to watch, but it's out there, right? As many parental controls as you can put in place, it's out there. It's very sad. When I heard the story on the radio, I was just hearing it, yes.
Brian Lehrer: Thank you for adding your voice. Did you wind up having to process it with him or trying to process it with him further after Wednesday, like over the weekend?
Christine: Yes, we did speak about it, generalized stuff about violence, how no one should ever get shot, "I really don't want you seeing this stuff." Like I said, it's hard to monitor this stuff as the kids get older. They're not babies anymore, and they are going to see it. He's okay now. He's in school. He's thriving. He's doing well. It didn't impact him really that much beyond that. I hope it didn't, but I know that it's out there. Anyway, thank you for--
Brian Lehrer: Christine, thank you. Thank you for making that call. We really appreciate it. Here's another mom. Kate in Westchester, you're on WNYC. Hi Kate, thank you for calling in.
Kate: Hi Brian, thank you for taking the call. I have a 15-year-old and a 13-year-old. We were emailed by our school superintendent to give us a heads-up about all of this circulating online to make sure that we were talking to our kids about it. By the time they got home from their cross-country practice and their swim practice, the cat was really out of the bag. All of their peers had looked at these things now. I will say that my girls have been taught since they were three years old, "Be careful little ears what you hear, be careful little eyes what you see."
They have years and years of just being taught to be cautious, and they didn't look at it. They declined to look at it. That came from them, that regulation, that self-discipline. I am proud to say, they showed it was in place in their little hearts, and they didn't look at it. Now, they know there are things that you can't unsee. They try not to look. When it's all around them and social media is just completely unregulated and unmoderated by the adults in the room, it's the water in which they're all swimming, and the mental health, the anxiety, everything affects them, too.
Brian Lehrer: Can I ask how your kids knew not to click on the videos? Did they come with titles at least that said, "See Charlie Kirk having a bullet go through his neck," or something like that, and then they had the opportunity to at least screen that way before they decided to click? Do you know?
Kate: I don't know. I can tell you that my girls waited until they were 13 to be given cell phones. They were only given cell phones after we took them out to lunch and sat them down and made them sign a three-page contract that you can find on Common Sense Media. We went through point by point, the contract. We've had a very, very cautious and deliberate, intentional relationship with technology from the very beginning. I don't really know what the nuts and bolts of their decisions were, but it worked, thank God.
Brian Lehrer: It sounds like you're probably in the top 5% of American families in media literacy, so congratulations on that. Thank you very much for calling. Adam, any reaction to the two moms we just heard from?
Adam Clark Estes: Yes, this is something that I think about a lot. Media literacy, especially digital media literacy. I have a daughter who's turning two next month. I've been trying to get ahead of all of this because, as someone who grew up before the internet was what it is today and before we had smartphones, I'm really aware of how ubiquitous technology is, of how these things aren't going away, and will probably become more prominent in my kids' lives than even in my own now.
I want them to be smart about their tech use. I'm not just talking about watching too much YouTube or screen time in general. It's moments like these when kids can make a decision about how they're going to consume media, whether they're going to watch that video. I think it's really remarkable that the mom from Westchester got out in front of that and really talked to her kids about the fact that they do have a choice when doing stuff online. At the same time--
Brian Lehrer: Not every kid has a choice because of autoplay.
Adam Clark Estes: That's right. At the same time, she mentioned the adults in the room. In this case, I think we're talking about executives in Silicon Valley who have changed policies, who have made it easier for kids to see things like this. They've done so, I would argue, for a lot of reasons, some of which are political and some of which are just economic. It's cheaper not to have as much content moderation. Content moderation is really hard. It's really expensive to employ a lot of people to keep track of what's going on on these platforms and take it down when it becomes harmful.
Brian Lehrer: I want to read two texts and then take a phone call that relates to these two texts that have different points of view on this whole topic. One listener writes, pulling this back up, "We live in a violent society. Why hide it? It's all over the society." The other one says, "The social media industry has convinced us of the same backwards logic that the gun industry has. Good social media is the only thing that stops bad social media. Like a good person with a gun is the only thing that stops a bad person with a gun. This is wrong," writes that listener. "These industries must be regulated and reformed. This is not a speech issue. It's a product issue." To that first text, "We live in a violent society. Why hide it? It's all over society." Let me take Sarah in Springfield, New Jersey, on this. You're on WNYC, Sarah. Thank you very much for calling in.
Sarah: Hi, Brian. I'm a longtime listener. I think this is my third time calling in, but, yes, I wanted to talk about how this killing of Charlie Kirk and the mass spread of this video is not necessarily something that I feel like is new. I'm Gen Z, so I've grown up seeing a lot of really graphic videos online. I think that we see many, many graphic videos of people of color being killed. Specifically, Jordan Neely, and the graphic video of him being choked to death on the F train for many, many minutes.
It was played on TV, the whole entire video. It was spread all over X and all types of social media outlets. I think the video of Charlie Kirk getting killed is producing a lot of conversation about it. However, the many bodies that I've seen, many Black bodies and brown bodies that I've seen on social media have really been a topic of conversation between me and my friends for a very long time.
Brian Lehrer: Where does that make you land on what, if anything, should be done about it? George Floyd, obviously, is another one. Where does this make you land on whether the content should be moderated? I'll tell you, we have another caller who we're not going to have time for, who says it's even older than what you've experienced in your life as a Gen Z person. That caller says, "We all saw the JFK assassination. We all saw 9/11 victims jumping off the towers over and over, or Vietnam coverage." Where do you land on what should be the single standard, if you've had any thoughts about that?
Sarah: I don't think that we should watch anybody be killed so graphically. I think that violence is a pervasive part of the society that we live in. I think that we should have more of a conversation about who we are okay with watching being killed, because I think that many of the people who have seen Charlie Kirk get shot in the neck are very outraged by it. However, I will say, at least on X, there are many, many violent videos of random people being killed. It's not just Charlie Kirk, but we have less outrage about that. I think that nobody should watch that type of content. However, we should have a larger conversation about why we feel so strongly about certain people being killed.
Brian Lehrer: Sarah, thank you very much. Great contribution. Please keep calling us. One other example of this, before we begin to run out of time, Adam. Yes, people are outraged that this was shown, but there's also been the constant stream of gory videos coming out of Gaza. We often hear arguments that showing that graphic reality is necessary or a good thing for bearing witness to atrocities and holding power accountable. Again, it complexifies the conversation and makes us ask, "Well, what's the single standard, if there is one?"
Adam Clark Estes: It's really hard to say what a single standard would even look like. I do think that having a layer of human interaction between videos like this and how they get disseminated can be helpful. Just speaking of violent videos that have left an impression, I used to cover breaking news. In doing so, I covered the Arab Spring and saw a video from 2009 of a young woman, Neda Agha-Soltan, who was shot and died in the video.
That video got spread widely. It was shared by news organizations, including my own at the time. I was at The Atlantic. The person who took the video ended up getting a George Polk Award for journalism for sharing this moment. It spread a lot of awareness about what was happening in Iran at the time. It's hard to distinguish the difference between a video like that and the assassination of a political figure in the US.
It's hard for a human to do it. I would say it's impossible for an algorithm to do it, but that's what the tech platforms have resorted to doing. They're spreading videos and information that get people to engage, that get people upset. It helps their business model to keep people paying attention to what they're doing. There's not necessarily a higher calling or greater purpose in doing that, according to the tech companies.
Brian Lehrer: Adam Clark Estes, senior technology correspondent at Vox, thanks for talking about this with us. We really appreciate it, difficult as it is.
Copyright © 2025 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.
