Andy: 00:00:00 Hi. This is Andy calling from Bowen Island in British Columbia. Radiolab is supported by the Chamber Music Society of Lincoln Center. Presenting Summer Evenings, three concerts featuring beloved works by Mozart, Brahms and more, running from July 10th through 17th. Tickets at chambermusicsociety.org.
Jason Studstill: 00:00:21 Hi. I'm Jason Studstill in Seattle, Washington. Radiolab is supported by Mount Sinai Health System and the expert care its physicians provide. Which hospital you choose can make all the difference in the world. More information is available at mountsinai.org.
Jad Abumrad: 00:00:38 Okay. Should we try to just improvise something?
Brooke G.: 00:00:40 Sure.
Jad Abumrad: 00:00:41 Hey, this is Jad. Before we launch into this week's podcast, I want to make you aware of, well, our friends down the hall at On the Media. Brooke Gladstone, my friend, is here with me. They have a new show coming out that I think I might be in, that I'm excited to be in, but that I'm certainly excited about. So, what is it?
Brooke G.: 00:01:03 You are going to be in it? It's an episode. Although you don't want to say show because, on Radiolab, that generally means that you're launching a whole new enterprise.
Jad Abumrad: 00:01:11 Yeah, episode.
Brooke G.: 00:01:12 Yeah. This is an episode on Twitch, which some people know as if it were part of their family.
Jad Abumrad: 00:01:29 And other people are like, what?
Brooke G.: 00:01:29 Yeah. Is that something you need medication for? So it depends where you're situated. But what we are going to do is examine how it came to be and how it points to the future of where our culture is going.
Jad Abumrad: 00:01:43 And for the people who don't know, what is twitch?
Brooke G.: 00:01:46 Most people, if they've ever heard of it, know it's about watching and commenting in real time on people playing video games.
Jad Abumrad: 00:01:57 And you guys, from what I hear, profile one of these Twitch superstars.
Brooke G.: 00:02:07 The main character in that story is Ninja, who makes something like half a million dollars a month.
Jad Abumrad: 00:02:12 What?
Brooke G.: 00:02:12 But-
Jad Abumrad: 00:02:13 On Twitch? Jesus.
Brooke G.: 00:02:14 Yes. Through contributors who say, "Say this out loud," or, "Point to me, mention my name." And at the same time he's playing the game, he's even giving advice to young boys with girlfriend problems.
Audio clip: 00:02:35 Don't act like, you know, don't make it seem it's like her ball. You just need to be like, "Babe, we need to have a talk," and be like, "Don't worry. Everything's fine." You know, reassure that. But then you're just going to be like, "I really feel like I'm trying, putting in more effort in this relationship than I feel like you are." Then just be like, "I just want to make sure that you're just as invested in this as I am."
Brooke G.: 00:02:55 I think it's something new.
Jad Abumrad: 00:02:58 Oh, I want to hear that.
Brooke G.: 00:03:00 Well, thank you.
Jad Abumrad: 00:03:01 Okay.
Brooke G.: 00:03:01 And thank you for letting me in at the top of your show.
Jad Abumrad: 00:03:04 Absolutely. So, Twitch, On the Media, at onthemedia.org. Check it out. I'm in there. It's going to be amazing, but not just because of me, because of you, Brooke. Thanks for dropping in.
Brooke G.: 00:03:14 Yeah. And Bob.
Jad Abumrad: 00:03:14 And Bob.
Brooke G.: 00:03:15 Bye.
Jad Abumrad: 00:03:15 Bye. Oh wait, you're ...
Brooke G.: 00:03:18 Okay?
Jad Abumrad: 00:03:18 All right.
Announcer: 00:03:25 You are listening-
Announcer: 00:03:26 To Radiolab from-
Announcer: 00:03:29 WNY-
Announcer: 00:03:30 C.
Announcer: 00:03:30 Yeah.
Jad Abumrad: 00:03:36 Hey, I'm Jad Abumrad.
Robert Krulwich: 00:03:37 I'm Robert Krulwich.
Jad Abumrad: 00:03:38 This is Radiolab.
Robert Krulwich: 00:03:39 And today we have a story about what we can say.
Jad Abumrad: 00:03:42 And what we (bleep) can't. And by the way, there's going to be a smattering of curse words here that we're not going to bleep, which I think makes sense given the content of this story. And also, there are some graphic scenes, so if you've got kids with you, you may want to sit this one out.
Robert Krulwich: 00:03:57 Yeah. Anyway, the story comes to us from producer Simon Adler.
Simon Adler: 00:04:01 So, let's start. Can we start in 2008?
Robert Krulwich: 00:04:02 Sure.
Simon Adler: 00:04:02 How about with a song?
Robert Krulwich: 00:04:03 Yes, please.
Audio clip: 00:04:05 Rise up. Rise up. Demand Facebook cease oppressive ways. Rise up. Rise up and put back our pictures right now.
Simon Adler: 00:04:18 December 27th, a sunny Saturday morning, this group of young to middle-aged women gathered in downtown Palo Alto.
Audio clip: 00:04:25 Will fight for our cyberspace freedom. Let's all go join riseup.net.
Simon Adler: 00:04:31 They're wearing these colorful hats and are singing and swaying directly in front of the glass-doored headquarters of-
Audio clip: 00:04:38 Yet Facebook says we are pornographic.
Simon Adler: 00:04:42 -Facebook.
Audio clip: 00:04:42 Is their so-called service so free?
Stephanie Muir: 00:04:47 Yes. It was a humble gathering of a few dozen women and babies.
Simon Adler: 00:04:51 That right there-
Audio clip: 00:04:53 Are you the organizer of this event?
Simon Adler: 00:04:54 -is one of the organizers of the gathering.
Stephanie Muir: 00:04:56 I'm Stephanie Muir.
Audio clip: 00:04:58 What are you calling the event?
Protester: 00:04:59 It's a Facebook Nurse-In.
Simon Adler: 00:05:01 Nurse-in, as in breastfeeding.
Stephanie Muir: 00:05:03 The intent was really just to be visible and be peaceful and make a quiet point.
Jad Abumrad: 00:05:11 What point were they trying to make?
Simon Adler: 00:05:13 Stephanie and this group of mothers, they were on Facebook, as many people were, and occasionally they'd have photos taken of themselves breastfeeding their babies. They wanted to share with their friends what was going on, so they would upload those photos to Facebook. And these pictures would get taken down, and they would receive a warning from Facebook for-
Stephanie Muir: 00:05:33 Uploading pornographic content. And people were really getting their backs up over this.
Simon Adler: 00:05:39 They wanted Facebook to stop taking their photos down. To say that while nudity is not allowed-
Stephanie Muir: 00:05:44 Breastfeeding is exempt, period.
Audio clip: 00:05:51 Rise up and put back our pictures right now.
Simon Adler: 00:05:53 Now what Stephanie couldn't have known at the time was that this small, peaceful protest would turn out to be-
Audio clip: 00:06:02 This morning, a face off on Facebook.
Simon Adler: 00:06:05 -one of the opening shots-
Audio clip: 00:06:06 Facebook triggered a hornet's nest.
Simon Adler: 00:06:08 -in what would become a loud-
Audio clip: 00:06:10 Fuck you, Facebook.
Audio clip: 00:06:11 Fuck you, Facebook.
Simon Adler: 00:06:12 -raucous-
Audio clip: 00:06:13 Fuck you Facebook. Fuck you.
Simon Adler: 00:06:16 -and global battle.
Audio clip: 00:06:17 Embattled Facebook CEO ...
Audio clip: 00:06:18 Facebook today playing defense.
Simon Adler: 00:06:20 And now I'm not talking about all the things you've recently heard about, Russian interference, election meddling or data breaches, but rather something that I think is deeper than all of those: free speech.
Audio clip: 00:06:34 Facebook has been accused of facilitating violence against Rohingya Muslims ...
Simon Adler: 00:06:38 What we can say and what we can't say. What we can see and what we can't see-
Audio clip: 00:06:45 They'd let Mueller rape kids in front of people.
Simon Adler: 00:06:49 -on the Internet.
Audio clip: 00:06:53 Fuck you. You're a fucking piece of shit.
Audio clip: 00:06:58 Thank you, Mr. Chairman. Mr. Zuckerberg, I've gotta ask you, do you subjectively prioritize or censor speech?
Audio clip: 00:07:09 Congresswoman, we don't think about what we're doing as censoring speech. There are types of ...
Simon Adler: 00:07:16 But what really grabbed me was discovering that underneath all of this is an actual rule book: a text document that dictates what I can say on Facebook, what you can say on Facebook, and what all 2.2 billion of us can say on Facebook.
Robert Krulwich: 00:07:37 For everyone on the entire globe who's on Facebook.
Simon Adler: 00:07:39 For everyone, one set of rules that all 2.2 billion of us are expected to follow.
Jad Abumrad: 00:07:45 Is it an actual document?
Simon Adler: 00:07:46 It's a digital document. But, yes, it's about 50 pages, if you print it off, and in bullet points and if-then statements. It spells out sort of a first amendment for the globe, which made me wonder, like, what are these rules? How were they written?
Jad Abumrad: 00:08:03 And can you even have one rule book?
Simon Adler: 00:08:05 Right, exactly. So I dove into this rule book and dug up some stories that really put it to the test.
Robert Krulwich: 00:08:15 I'm interested in that.
Jad Abumrad: 00:08:15 How many stories are we going to hear?
Simon Adler: 00:08:16 Three-ish.
Jad Abumrad: 00:08:17 Okay. Cool.
Robert Krulwich: 00:08:19 I'm particularly interested in the ish, but let's go ahead with the first one.
Simon Adler: 00:08:23 Let's start back on that morning in 2008, the morning that, you could argue, started it all.
Audio clip: 00:08:27 Rise up, rise up.
Simon Adler: 00:08:30 Because in the building, right behind those protesting mothers, there was a group of Facebook employees sitting in a conference room trying to figure out what to do.
FB Employee: 00:08:39 Cool. So I'm just going to switch this ...
Simon Adler: 00:08:43 I was able to get in touch with a couple of former Facebook employees, one of whom was actually in that room at that moment. Neither of these two were comfortable being identified, but they did give us permission to quote them extensively.
FB Employee: 00:08:56 How's that? Will that work for you?
Simon Adler: 00:08:57 It sounded great.
FB Employee: 00:08:58 Cool.
Simon Adler: 00:08:59 Just so we have it, let's ...
Simon Adler: 00:08:59 What you're going to hear here is an actor we brought in to read quotes taken directly from interviews that we did with these two former Facebook employees. All right. Ready.
FB Employee: 00:09:10 At the time when I joined them, there was a small group, 12 of us-
Simon Adler: 00:09:16 Mostly, recent college grads.
FB Employee: 00:09:17 -who were sort of called the Site Integrity Team.
Simon Adler: 00:09:20 Again, keep in mind, this was back in the mid-2000s.
Audio clip: 00:09:22 Seismic changes this week in the Internet hierarchy.
Simon Adler: 00:09:25 This was like the deep dark past.
Audio clip: 00:09:27 MySpace.com is now the most visited website in the U.S.
Simon Adler: 00:09:31 Facebook had somewhere in the neighborhood of 10 million users.
FB Employee: 00:09:35 We were smaller than MySpace.
Simon Adler: 00:09:36 The vast majority of them college kids. And so, in those early days, those 12 people, they would sit around in a sort of conference-like room with a big, long table, each of them in front of their own computer.
Kate Klonick: 00:09:48 And things would come up onto their screen, flagged to Facebook and-
Simon Adler: 00:09:53 Flagged, meaning like, "I, a user, saw something that I thought was wrong."
Kate Klonick: 00:09:56 Exactly. Like reporting a piece of content that you think violates the community standards.
Simon Adler: 00:10:00 This is Kate Klonick. She's a professor of law at St. John's and she's spent a lot of time studying this very thing. She says in those early days what would happen is a user would flag a piece of content and then that content, along with an alert, would get sent to one of those people sitting in that room. It would just pop up on their screen.
FB Employee: 00:10:20 Most of what you were seeing was either naked people, blown off heads, or things that there was no clear reason why someone had reported, because it was like a photo of a golden retriever and people are just annoying.
Simon Adler: 00:10:32 And every time something popped up onto the screen, the person sitting at that computer would have to make a decision whether to leave that thing up or take it down. And at the time, if you didn't know what to do-
FB Employee: 00:10:43 You would turn to your pod leader, who was somebody who had been around nine months longer than you, and ask, "What do I do with this?" And they would either have seen it before and explain it to you or you both wouldn't know and you'd Google some things.
Kate Klonick: 00:10:56 It really was just kind of an ad hoc approach.
Robert Krulwich: 00:10:59 Was there any sort of written standard or any common standard?
Simon Adler: 00:11:02 Well, kind of.
Kate Klonick: 00:11:03 They had a set of community standards that, at the end of the day, was just kind of ... it was one page long and it was not very specific.
Simon Adler: 00:11:10 Sorry. The guidelines were really one page long?
Kate Klonick: 00:11:13 They were one page long.
Simon Adler: 00:11:15 And basically, all this page said was, "Nudity is bad, so is Hitler."
Kate Klonick: 00:11:20 And if it makes you feel bad, take it down.
Simon Adler: 00:11:23 And so when one of the people sitting in that room would have a breastfeeding picture pop up on the screen in front of them, they'd be like, "I can see a female breast. So I guess that's nudity." And they would take it down until-
Audio clip: 00:11:38 Rise up. Rise up.
Robert Krulwich: 00:11:38 Rise up. Fight for the rights for breastfeeding ... Anyway.
Simon Adler: 00:11:42 Now, a dozen or so people singing in front of their offices on a Saturday probably wasn't causing Facebook too much heartache, but-
Stephanie Muir: 00:11:49 I thought, "Hey, we have an opportunity here with over 10,000 members in our group."
Simon Adler: 00:11:54 According to Stephanie Muir, those protesters were just a tiny fraction of a much larger online group who had organized, ironically enough, through Facebook.
Stephanie Muir: 00:12:04 So to coincide with the live protest, I just typed up a little blurb encouraging our members that were in the group to do a virtual nurse-in.
Simon Adler: 00:12:14 A virtual nurse-in?
Stephanie Muir: 00:12:15 Right. What we did ...
Simon Adler: 00:12:17 They posted a message asking their members-
Stephanie Muir: 00:12:19 To, for one day, change their profile avatar to an image of breastfeeding and then change their status to the title of our group, Hey Facebook, Breastfeeding is Not Obscene.
Simon Adler: 00:12:33 And?
Stephanie Muir: 00:12:33 It caught on.
Audio clip: 00:12:34 A social networking website is under fire for its policy on photos of women breastfeeding their children.
Simon Adler: 00:12:40 Big time.
Stephanie Muir: 00:12:40 Twelve thousand members participated and the media requests started pouring in.
Audio clip: 00:12:45 The Facebook group called Hey Facebook, Breastfeeding is Not Obscene.
Stephanie Muir: 00:12:48 I did hundreds of interviews for print, Chicago Tribune, Miami Herald, Time magazine, New York Times, Washington Post.
Audio clip: 00:12:57 You know, the Internet is an interesting phenomenon.
Stephanie Muir: 00:12:59 Dr. Phil. It was a media storm. And eventually, perhaps as a result of our group and our efforts, Facebook was forced to get much more specific about their rules.
Simon Adler: 00:13:15 For example, by then nudity was already not allowed on the site. But they had no definition for nudity. They just said no nudity. And so the Site Integrity team, those 12 people at the time, they realized they had to start spelling out exactly what they meant.
Stephanie Muir: 00:13:28 Precisely. All of these people at Facebook were in charge of trying to define nudity.
FB Employee: 00:13:33 So, I mean, yeah, first cut at it was visible male and female genitalia and then visible female breasts. And then the question is, well, okay, how much of her breast needs to be showing before it's nude? And the thing that we landed on was if you could see essentially the nipple and areola, then that's nudity.
Simon Adler: 00:13:51 And it would have to be taken down, which, theoretically at least, would appease these protesters because now when a picture would pop up of a mother breastfeeding, as long as the child was blocking the view of the nipple and the areola, they could say, "Cool, no problem."
Kate Klonick: 00:14:07 Then you start getting pictures of women with just their babies on their chest, with their breasts bare. Like, for example, maybe the baby was sleeping on the chest of a bare-breasted woman and not actively breastfeeding.
Simon Adler: 00:14:19 Okay. Now what? Like, is this actually breastfeeding? No, it's actually not breastfeeding. The woman is just holding the baby and she has her top off.
Jad Abumrad: 00:14:27 Yeah, but she was clearly just breastfeeding the baby.
Simon Adler: 00:14:31 Well, I would say it's sort of like kicking a soccer ball. Like a photo of someone who has just kicked a soccer ball: you can tell the ball is in the air, but there's potentially no contact between the foot and the ball in that moment. So although it is a photo of someone kicking a soccer ball, they are not in fact kicking the soccer ball in that photo.
Jad Abumrad: 00:14:49 That's a good example.
Simon Adler: 00:14:50 This became the procedure, the protocol, the approach for all of these things: we have to base it purely on what we can see in the image.
Kate Klonick: 00:14:59 And so they didn't allow that to stay up, under the rules, because it could be too easily exploited for other types of content, like nudity or pornography.
FB Employee: 00:15:08 We got to: the only way you could objectively say that the baby and the mother were engaged in breastfeeding is if the baby's lips were touching the woman's nipple.
Simon Adler: 00:15:15 So they included what you could call like an attachment clause. But as soon as they got that rule in place, like you would see, you know, a 25-year-old woman and a teenage-looking boy. Right? And like, what the hell is going on there?
Kate Klonick: 00:15:29 Oh, yeah. It gets really weird if you start entering into child age. And I wasn't even going to bring that up because it's kind of gross.
Simon Adler: 00:15:35 It's like breastfeeding porn.
Jad Abumrad: 00:15:39 Is that a thing?
Robert Krulwich: 00:15:39 Are there sites like that?
Simon Adler: 00:15:39 Apparently. And so, this team, they realized they needed to have a nudity rule that allowed for breastfeeding but also had some kind of an age cap.
FB Employee: 00:15:48 So then we were saying, "Okay. Once you've progressed past infancy, then we believe that it's inappropriate."
Simon Adler: 00:15:53 But then pictures would start popping up on their screen and they'd be like, "Wait, is that an infant?" Like, where is the line between infant and toddler?
FB Employee: 00:16:02 And so the thing that we landed on was, if it looked like the child could walk on his or her own, then too old.
Simon Adler: 00:16:07 Big enough to walk, too big to breastfeed.
Robert Krulwich: 00:16:09 Oh, that could be 18 months.
Jad Abumrad: 00:16:10 Yeah, that's like a year old in some cases.
Simon Adler: 00:16:12 Yeah. The World Health Organization recommends breastfeeding until 18 months or two years, which meant there were a lot of photos still being taken down.
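To make the logic concrete, here is a minimal sketch in Python of the decision procedure as it stood at this point, assuming hypothetical field names; Facebook's actual rule text isn't public in this form. Note that every check is limited to what is visible in the image itself.

```python
# A minimal sketch of the early nudity rule plus the breastfeeding
# exception, as described in this episode. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class ImageReport:
    nipple_or_areola_visible: bool   # the visibility test for nudity
    lips_touching_nipple: bool       # the "attachment clause"
    child_looks_able_to_walk: bool   # the walk-test proxy for an age cap

def moderate(img: ImageReport) -> str:
    """Decide keep/remove using only what is visible in the image."""
    if not img.nipple_or_areola_visible:
        return "keep"    # no nudity under the working definition
    # Nudity is present; it stays up only as breastfeeding, which the
    # rules operationalize as attachment plus an infant-aged child.
    if img.lips_touching_nipple and not img.child_looks_able_to_walk:
        return "keep"
    return "remove"
```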
Kate Klonick: 00:16:23 Within days, we're continuing to hear reports from people that their photographs were still being targeted.
Simon Adler: 00:16:31 But ...
Audio clip: 00:16:32 Facebook did offer a statement saying ...
FB Employee: 00:16:34 You know, that's where we're going to draw the line.
Audio clip: 00:16:36 The Facebook isn't budging on its policy.
Simon Adler: 00:16:39 And keep in mind through this whole episode ...
Audio clip: 00:16:42 Is this perhaps the next big thing and the Facebook.com ...?
FB Employee: 00:16:45 The company was growing really, really fast.
Audio clip: 00:16:47 It seems like almost everyone is on it.
Kate Klonick: 00:16:50 And there just got to be a lot more content.
Audio clip: 00:16:53 When we first launched, we were hoping for, you know, maybe 400, 500 people and now we're at 100,000. So, who knows where we're going now?
Simon Adler: 00:17:00 Thousands more people are joining Facebook every day-
Audio clip: 00:17:03 Sixty million users so far, with a projection of 200 million by the end of the year.
Audio clip: 00:17:08 And now more people on Facebook than the entire U.S. population.
Simon Adler: 00:17:11 Not just within the United States, but also-
FB Employee: 00:17:13 It was growing rapidly more international.
Kate Klonick: 00:17:16 You know, you were getting stuff from India and Turkey.
Audio clip: 00:17:20 Facebook.
Audio clip: 00:17:21 Facebook is going to win.
Simon Adler: 00:17:23 It's getting big throughout the EU.
Audio clip: 00:17:25 Korea has joined the Facebook.
Simon Adler: 00:17:28 So they have more and more content coming in from all these different places, in all these different languages.
FB Employee: 00:17:34 How are we going to keep everybody on the same page?
Kate Klonick: 00:17:39 And so, once they saw that this was the operational method for dealing with this, creating this nesting set of exceptions and rules, these clear things that had to be there or had to not be there in order to keep content up or take it down, that, I think, became their procedure.
Simon Adler: 00:17:55 And so this small team at Facebook got bigger and bigger, jumped up to 60 people and then a hundred, and they set out to create rules and definitions for everything. Can we go through some of the sort of ridiculous examples?
Robert Krulwich: 00:18:09 Yes. That's why we're here.
Simon Adler: 00:18:09 Okay. So, gore.
Robert Krulwich: 00:18:11 Gore. You mean violence kind of gore?
Simon Adler: 00:18:13 Yes. So, the gore standard was, headline-
FB Employee: 00:18:16 We don't allow graphic violence and gore.
Simon Adler: 00:18:18 And then, the shorthand definition they used was-
FB Employee: 00:18:21 No insides on the outside.
Robert Krulwich: 00:18:22 No guts, no blood pouring out of something.
Simon Adler: 00:18:25 Blood was a separate issue. There was an excessive blood rule. They had to come up with rules about bodily fluids.
FB Employee: 00:18:32 Semen, for example, would be allowed in like a clinical setting, but like, what does a clinical setting mean? And, does that mean if someone is in a lab coat?
Kate Klonick: 00:18:41 One of my favorite examples is, like, how do you define art?
Simon Adler: 00:18:44 Because, as these people are moderating, they would see images of naked people that were paintings or sculptures come up. And so what they decided to do is say, "Art with nakedness can stay up."
Kate Klonick: 00:18:57 It stays up if it is made out of wood, made out of metal, made out of stone.
Simon Adler: 00:19:03 Really?
Kate Klonick: 00:19:03 Yeah. Because how else do you define art? You have to just be like, is this what you can see with your eyeballs?
Simon Adler: 00:19:10 And so from then on, as they run into problems ...
Kate Klonick: 00:19:12 Those rules just constantly get updated.
Simon Adler: 00:19:15 Constant amendments.
Kate Klonick: 00:19:16 Yeah, constant amendments.
Simon Adler: 00:19:18 New problem, new rule. Another new problem, updated rule. In fact, at this point, they are amending these rules up to 20 times a month.
Robert Krulwich: 00:19:27 Wow.
Jad Abumrad: 00:19:29 Really?
Simon Adler: 00:19:29 Yeah. Take for example those rules about breastfeeding. In 2013, they removed the attachment clause. So, the baby no longer needed to have its mouth physically touching the nipple of the woman. In fact, one nipple and/or areola could be visible in the photo.
Jad Abumrad: 00:19:49 But not two.
Simon Adler: 00:19:50 Only one. Then, 2014, they make it so that both nipples or both areola may be present in the photo.
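Read as code, those amendments look like successive versions of the same exception. This encoding is hypothetical, paraphrased from the changes described here rather than taken from Facebook's actual rule text:

```python
# The breastfeeding rule across its amendments, as paraphrased in this
# episode (hypothetical encoding).

def breastfeeding_photo_allowed(year: int, visible_nipples: int,
                                mouth_touching_nipple: bool) -> bool:
    if year < 2013:
        # Original "attachment clause": the baby's mouth must be
        # physically touching the nipple.
        return mouth_touching_nipple
    if year < 2014:
        # 2013 amendment: attachment clause removed; one nipple
        # and/or areola may be visible.
        return visible_nipples <= 1
    # 2014 amendment: both nipples/areolae may be visible.
    return visible_nipples <= 2
```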
Robert Krulwich: 00:19:58 This is what happens in American law all the time, this very thing.
Simon Adler: 00:20:01 Yes.
Kate Klonick: 00:20:01 Yeah. It sounds a lot like common law.
Simon Adler: 00:20:06 Common law is this system dating back to early England where individual judges would make a ruling, which would sort of be a law, but then that law would be amended or evolved by other judges. So the body of law was sort of constantly-
Kate Klonick: 00:20:18 Fleshed out in the face of new facts.
Simon Adler: 00:20:22 Literally, every time this team at Facebook would come up with a rule that they thought was airtight, ka-plop, something would show up that they weren't prepared for, that the rule hadn't accounted for.
FB Employee: 00:20:35 As soon as you think, yeah, this is good, the next day, something shows up to show you, yeah, you didn't think about this.
Simon Adler: 00:20:40 For example, sometime around 2011, this content moderator is going through a queue of things.
Audio clip: 00:20:47 Accept. Reject. Accept. Escalate. Accept.
Simon Adler: 00:20:52 And she comes upon this image.
Audio clip: 00:20:54 Oh, my god.
FB Employee: 00:20:55 The photo itself was a teenage girl, African by dress and skin, breastfeeding a goat, a baby goat.
Simon Adler: 00:21:03 The moderator throws her hands up and says-
FB Employee: 00:21:05 "What the fuck is this?" And we googled breastfeeding goats and found that this was a thing. It turns out it's a survival practice.
Simon Adler: 00:21:13 According to what they found, this is a tradition in Kenya that goes back centuries. That, in a drought, a known way to help your herd get through the drought is, if you have a woman who is lactating, to have her nurse the kid, the baby goat, along with her human kid. And so there's nothing sexual about it.
Robert Krulwich: 00:21:35 Just good farming.
Jad Abumrad: 00:21:36 It's good for business.
Simon Adler: 00:21:38 Theoretically, if we go point by point through this list: it's an infant. It sort of could walk, so maybe there's an issue there. But there is physical contact between the mouth and the nipple.
FB Employee: 00:21:51 But obviously, breastfeeding, as we intended anyway, meant human infants.
Simon Adler: 00:21:56 And so, in that moment, what they decided to do is remove the photo.
FB Employee: 00:22:01 There was an amendment, an asterisk, under the rules stating animals are not babies. We added that so in any future cases, people would know what to do.
Soren Wheeler: 00:22:09 They removed it? They discovered it was culturally appropriate and a thing that people do, and they decided to remove the photo?
Simon Adler: 00:22:15 Yeah.
Jad Abumrad: 00:22:15 That outraged individual is our editor, Soren Wheeler.
Simon Adler: 00:22:18 Why?
FB Employee: 00:22:18 Why didn't we make an exception? Because when a problem grows large enough, you have to change the rules. If not, we don't. This was not one of those cases. The juice wasn't worth the squeeze.
Simon Adler: 00:22:29 And like, if they were to allow this picture, then they'd have to make some rule about when it was okay to breastfeed an animal and when it wasn't okay.
FB Employee: 00:22:37 This is a utilitarian document. It's not about being right 100% of the time. It's about being able to execute effectively.
Simon Adler: 00:22:48 In other words, we're not trying to be perfect here and we're not even necessarily trying to be 100% just or fair. We're just trying to make something that works.
Aurora A.: 00:23:00 One, two, three, four, five, six, seven, eight.
Simon Adler: 00:23:04 When you step back and look at what Facebook has become, from 2008 to now, in just 10 years-
Aurora A.: 00:23:10 Simon, I've just arrived at the Accenture Tower here in Manila. I don't know how many floors it is. One, two-
Simon Adler: 00:23:19 The idea of a single set of rules that works that can be applied fairly-
Kate Klonick: 00:23:24 That's just a crazy, crazy concept.
Aurora A.: 00:23:26 Fifteen, 16, 17, 18.
Simon Adler: 00:23:28 Because they've gone from something like 70 million users to 2.2 billion.
Aurora A.: 00:23:32 It's hard to count, but I would say it's about 30 floors.
Simon Adler: 00:23:36 They've gone from 12 folks sitting in a room deciding what to take down or leave up to somewhere around 16,000 people.
Aurora A.: 00:23:42 There's a floor in this building where Facebook's outsourced content moderators supposedly work.
Simon Adler: 00:23:49 Around 2010, they decided to start outsourcing some of this work to places like Manila where you've just heard reporter Aurora Almendral as well as-
Garrett Stack: 00:23:58 I mean, I would guess that there are thousands of people in this building.
Simon Adler: 00:24:00 -Dublin where we sent reporter Garrett Stack.
Garrett Stack: 00:24:04 Oh, I can see in to where their delicious Facebook treats are cooked. Everybody's beavering away.
Simon Adler: 00:24:08 We sent them there to try to talk to some of these people, who for a living sit at a computer and collectively click through around a million flagged bits of content that pop up onto their screen every day.
Jad Abumrad: 00:24:18 Wow. I'm just curious, what's that like?
Simon Adler: 00:24:22 Well ...
Aurora A.: 00:24:22 Hello. Can I ask you some questions?
Male: 00:24:25 Sorry, [foreign language 00:24:26].
Simon Adler: 00:24:26 We found out pretty quickly-
Aurora A.: 00:24:29 Who do you work for?
Simon Adler: 00:24:29 -none of these folks were willing to talk to us about what they do.
Aurora A.: 00:24:33 There's a lot of running away from me happening.
Garrett Stack: 00:24:36 Hey, sorry to bother you, do you guys work at Facebook? Do you happen to work in Facebook by any chance?
Female: 00:24:40 No, I don't.
Garrett Stack: 00:24:41 Hi. Sorry to bother you, do you work inside?
Female: 00:24:42 No. Sorry.
Garrett Stack: 00:24:44 Do you work in Facebook?
Male: 00:24:45 No.
Garrett Stack: 00:24:46 I mean, you just came out of there. I know you're lying.
Simon Adler: 00:24:50 In fact, most people wouldn't even admit they work for the company.
Jad Abumrad: 00:24:54 Is there something wrong about being in the ...
Robert Krulwich: 00:24:56 Is there like an NDA that they signed?
Simon Adler: 00:24:58 Well, yeah. So, when I finally did find someone willing to talk to me ... Do you want to be named or do you not want to be named?
FB Employee: 00:25:06 I'd rather not.
Simon Adler: 00:25:09 That's totally fine.
FB Employee: 00:25:10 You know, I'm still in the industry, I don't want to lose my job.
Simon Adler: 00:25:14 He explained that he and all the other moderators like him were forced to sign non-disclosure agreements stating that they weren't allowed to admit they work for Facebook, and that they're not allowed to talk about the work they do.
FB Employee: 00:25:26 My contract prohibited me from talking about what content moderation was.
Robert Krulwich: 00:25:31 Why?
Simon Adler: 00:25:32 Several reasons. One is that up until recently, Facebook wanted to keep secret what these rules were so that they couldn't be gamed. At the same time, it creates a sort of separation between these workers and the company, which, if you're Facebook, you might want-
FB Employee: 00:25:47 Yeah. I knew I signed up to monitor graphic images.
Simon Adler: 00:25:50 -just given the nature of the job.
FB Employee: 00:25:52 But, you know, I didn't really ... You know, you don't really know the impact that's going to have on you until you got through it.
Simon Adler: 00:25:59 This guy I talked to ... He got his first contract doing this work several years back, and for the duration of it, about a year, he'd show up to his desk every morning, put on his headphones-
FB Employee: 00:26:08 Click, click, click, click, click.
Simon Adler: 00:26:09 Ignore, delete, delete.
FB Employee: 00:26:11 Case by case by case by case, 5,000 cases every day. It's just image and decision. Image, decision, image, decision.
Simon Adler: 00:26:18 Five thousand a day you just said?
FB Employee: 00:26:22 Yeah. It was a lot of cases. Yes.
Simon Adler: 00:26:22 He said, basically, he'd have to go through an image or some other piece of content every three or four seconds.
Jad Abumrad: 00:26:28 Wow. All day long?
Simon Adler: 00:26:30 All day, eight hours a day.
Jad Abumrad: 00:26:38 Wow.
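The arithmetic behind those two figures is worth a beat. Five thousand decisions spread over an eight-hour shift averages out to roughly one every six seconds, which is consistent with the three to four seconds of actual looking once breaks and loading time are subtracted. A back-of-envelope check, using the episode's numbers:

```python
# Back-of-envelope throughput check for the figures quoted above.
shift_seconds = 8 * 60 * 60          # an 8-hour shift = 28,800 seconds
cases_per_day = 5_000
print(f"{shift_seconds / cases_per_day:.1f} seconds per decision")
# ~5.8 s averaged over the whole shift; closer to 3-4 s of looking
# once breaks and load time come out.
```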
Simon Adler: 00:26:38 Well, if I can ask, what kind of things did you see?
FB Employee: 00:26:41 I don't know if this is even radio-worthy. I think it's too X-rated.
Simon Adler: 00:26:50 Clicking through, he came across unspeakable things.
FB Employee: 00:26:53 From heads exploding to people being squashed by a tank, to people in cages being drowned, to a 13-year-old girl having sex with an eight-year-old boy, and it's not just once it's over and over and over.
Simon Adler: 00:27:15 Well, did this keep you up at night?
FB Employee: 00:27:17 Absolutely. Absolutely, 100%. It kept me up at night.
Simon Adler: 00:27:24 He'd catch himself thinking about these videos and photos when he was trying to relax. He had to start avoiding things.
FB Employee: 00:27:29 Yeah, there were specific, like, movies that I couldn't watch. There was one, I think it was a Quentin Tarantino one, my wife wanted to see it. I was like, "Okay." I turned it on. It was like heads were exploding. I was like, nope, nope. I have to walk away. I just had to. It was too real. I saw that. It's classic PTSD.
Simon Adler: 00:27:55 A different moderator I spoke to described it as seeing the worst side of humanity. You see all of the stuff that you and I don't have to see because they are going around playing cleanup.
Male: 00:28:08 Yeah. What a job! Wow.
Simon Adler: 00:28:10 Yeah. And it's worth noting that more and more of this work is being done in an automated fashion, particularly with content like gore or terrorist propaganda. They're getting better-
Robert Krulwich: 00:28:21 You can automate that?
Simon Adler: 00:28:22 Yeah. Through computer vision, they are able to detect hallmarks of a terrorist video or of a gory image. With terrorist propaganda, they now take down 99% of it before anyone flags it on Facebook.
Robert Krulwich: 00:28:39 Wow.
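As a rough picture of what flag-free, automated takedown might look like, here is a hypothetical sketch: a computer-vision model scores each upload and routes it before any user reports it. The thresholds and function names are illustrative stand-ins, not Facebook's actual system.

```python
# Hypothetical sketch of proactive detection: a classifier scores each
# upload before any user flags it. Thresholds are illustrative only.

def proactive_review(image_bytes: bytes, score_fn,
                     remove_at: float = 0.9, review_at: float = 0.5) -> str:
    score = score_fn(image_bytes)   # e.g., a vision model's confidence
                                    # that this is terror propaganda or
                                    # graphic gore
    if score >= remove_at:
        return "remove"             # taken down before anyone flags it
    if score >= review_at:
        return "human_review"       # borderline: queue for a moderator
    return "allow"
```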
Simon Adler: 00:28:40 But, moving on to our second story here, there is a type of content that they are having an incredibly hard time not just automating, but even getting their rules straight on, and that's hate speech.
Jad Abumrad: 00:28:57 Oh, good. Some more laughs coming up.
Simon Adler: 00:28:59 Well, there will be laughter.
Robert Krulwich: 00:29:01 Oh, really?
Simon Adler: 00:29:01 There will be comedians. There will be jokes.
Robert Krulwich: 00:29:03 Comedians.
Jad Abumrad: 00:29:04 Hey.
Robert Krulwich: 00:29:04 All right. Shall we take a break and then come right back?
Jad Abumrad: 00:29:07 No, I think we're going to keep going.
Robert Krulwich: 00:29:08 Okay.
Intern: 00:29:09 Testing. One, two, three, four, five. Testing. One, two, three, four, five. I'm Simon Adler.
Simon Adler: 00:29:13 A couple months back-
Intern: 00:29:14 I think it's working.
Simon Adler: 00:29:15 -we sent our pair of interns.
Intern: 00:29:18 On the left, 60 feet.
Simon Adler: 00:29:20 Carter Hodge-
Intern: 00:29:22 Here we go at the standing room.
Simon Adler: 00:29:23 -and Liza Yeager.
Male: 00:29:23 Hello. Do you guys have tickets for tonight?
Intern: 00:29:23 I think we're on the guest list.
Male: 00:29:24 Okay.
Simon Adler: 00:29:31 To this cramped, narrow little comedy club. The kind of place with, like-
Intern: 00:29:37 It's super expensive.
Intern: 00:29:37 I know.
Simon Adler: 00:29:39 -$18 smashed rosemary cocktails.
Intern: 00:29:44 None of that.
Intern: 00:29:45 We do not need to get another drink. It's fine.
Simon Adler: 00:29:47 High top tables.
Intern: 00:29:48 The AC is dripping on me.
Simon Adler: 00:29:50 But still kind of a dive.
Intern: 00:29:52 That feels good. Yeah. All right. You guys are here. Now ...
Simon Adler: 00:29:56 We sent them there to check out someone else who'd found a fault line in Facebook's rule book.
Intern: 00:30:02 It's exciting. We're going to keep moving right along to the next comedian to come to the stage. Please give it up for Marcia Belsky.
Marcia Belsky: 00:30:12 Thank you. Yes. I get so mad. I feel like my first time to the city, I was such a carefree brat. You know, I was young and I had these older friends, which I thought was like very cool, and then you just realize that they're alcoholics, you know.
Simon Adler: 00:30:25 She's got dark curly hair, was raised in Oklahoma.
Marcia Belsky: 00:30:33 I was raised Jewish. When you're raised Jewish, you read about Anne Frank a lot, you know a lot, a lot. When you read about Anne Frank ... This will get funny. She-
Simon Adler: 00:30:46 How did you decide to become a comedian?
Marcia Belsky: 00:30:48 You know, it was kind of the only thing that ever clicked with me. And especially political comedy, you know, I used to watch the Daily Show every day.
Simon Adler: 00:30:56 And back in 2016, she started this political running bit that I think can be called sort of absurdist, feminist comedy.
Marcia Belsky: 00:31:05 Now a lot of people think that I'm like an angry feminist, which is weird. This guy called me a militant feminist the other day and I'm like, "Okay. Just because I am training a militia of women in the woods."
Marcia Belsky: 00:31:22 At first, I just had this running bit online on Facebook and Twitter.
Simon Adler: 00:31:26 She was tweeting, posting jokes.
Marcia Belsky: 00:31:28 You know, like we have all the Buffalo Wild Wings surrounded, you know, things like that.
Simon Adler: 00:31:32 Eventually took this bit on stage, even wrote some songs.
Marcia Belsky: 00:31:36 "All older men should die but not my dad. No, no, not my dad."
Simon Adler: 00:31:49 Anyhow, so about a year into this running bit, Marcia was bored at work one day and logs on to Facebook. But, instead of seeing her normal newsfeed, there was this message that pops up.
Marcia Belsky: 00:32:01 It says, "You posted something that discriminated along the lines of race, gender, or ethnicity group."
Simon Adler: 00:32:08 "And so we've removed that post."
Marcia Belsky: 00:32:11 And so I'm like, "What could I possibly have posted?" I really, I thought it was like a glitch.
Simon Adler: 00:32:16 But then she clicked Continue and there, highlighted, was the violating post. It was a photo of hers.
Simon Adler: 00:32:23 What is the picture? Can you describe it?
Marcia Belsky: 00:32:25 The photo is me as what can only be described as a cherub: cute little seven-year-old with big curly hair, and she's wearing this blue floral dress, her teeth are all messed up.
Simon Adler: 00:32:35 And into the photo, Marcia had edited in a speech bubble-
Marcia Belsky: 00:32:39 That just says, "Kill all men." And so it's funny, you know, because I hit ... I hit ... It's funny, you know, trust me ... whatever. I thought it was ridiculous because I ...
Simon Adler: 00:32:49 She searched through her library of photos and found that kill all men image.
Marcia Belsky: 00:32:53 And I post it again.
Simon Adler: 00:32:55 Immediately after?
Marcia Belsky: 00:32:57 Yeah, and it got removed again.
Simon Adler: 00:32:59 And this time there were consequences.
Marcia Belsky: 00:33:01 I got banned for three days after that.
Simon Adler: 00:33:04 Then after several other bans-
Marcia Belsky: 00:33:05 Shoot forward, this is months later.
Simon Adler: 00:33:08 -a friend of hers had posted an article and underneath it, in the comment section, there were guys posting just really nasty stuff.
Marcia Belsky: 00:33:14 So I commented underneath those comments, "Men are scum," which was very quickly removed.
Simon Adler: 00:33:23 How long did you get banned for this time?
Marcia Belsky: 00:33:25 Thirty days.
Simon Adler: 00:33:26 Wow.
Marcia Belsky: 00:33:27 Yeah. I was dumbfounded.
Soren Wheeler: 00:33:30 So there's a rule somewhere that, if I type "men are scum," you take it down?
Simon Adler: 00:33:36 Yes.
Marcia Belsky: 00:33:37 I'm like, "What could it be?"
Simon Adler: 00:33:41 And so Marcia called on her "militia of women"-
Marcia Belsky: 00:33:44 Exactly.
Simon Adler: 00:33:44 -to find out, like, is this just me?
Marcia Belsky: 00:33:46 Female comedians who are sort of mad on my behalf started experimenting, posting "men are scum" to see how quickly it would get removed and if it would be removed every time. And it was.
Simon Adler: 00:33:59 So they started trying other words.
Marcia Belsky: 00:34:02 Yeah.
Simon Adler: 00:34:03 To find out where the line was.
Marcia Belsky: 00:34:04 My friend put "Men are da scum," that got removed. "Men are the worst."
Simon Adler: 00:34:09 Removed and banned.
Marcia Belsky: 00:34:10 This one girl put "Men are septic fluid." Banned.
Simon Adler: 00:34:15 But-
Marcia Belsky: 00:34:16 We're only at the middle of the saga.
Simon Adler: 00:34:17 It doesn't end there.
Marcia Belsky: 00:34:19 Because there's no ...
Simon Adler: 00:34:19 Now she's really like, "What the hell is going on? Is it sexism?"
Marcia Belsky: 00:34:23 So I just started doing the most bare minimum amount of investigating.
Simon Adler: 00:34:29 She's googling around trying to figure out what these policies are. And pretty quick, she comes across this leaked Facebook document.
Marcia Belsky: 00:34:36 So this is when I lose my mind. This is when Mark Zuckerberg becomes my sworn nemesis for the rest of my life.
Female: 00:34:44 Because what she'd found was a document Facebook used to train their moderators. Inside of it, in a section detailing who Facebook protected from hate speech, there was a multiple choice question that said, "Who do we protect, white men or black children?" And the correct answer was white men, not black children.
Marcia Belsky: 00:35:05 Not even kidding.
Jad Abumrad: 00:35:09 White men are protected, black children are not. That's not a good look.
Marcia Belsky: 00:35:12 It's racist. Something's going on here. There is absolutely some sort of unaddressed bias or systematic issue at Facebook.
Monika Bickert: 00:35:23 Hi.
Simon Adler: 00:35:24 Hello.
Monika Bickert: 00:35:24 How are you?
Simon Adler: 00:35:25 I'm doing well. Thank you so much for being on the air.
Monika Bickert: 00:35:26 Yeah, I know.
Simon Adler: 00:35:28 So not long after sitting down with Marcia, Facebook invited me to come out to their offices in California and sit down with them.
Monika Bickert: 00:35:36 I'm going to eat one cookie and then more. Oh, they're little? I think I'm going to get two.
Simon Adler: 00:35:43 All right. Can I just get your name and your title?
Monika Bickert: 00:35:45 I'm Monika Bickert and I lead the policies for Facebook.
Simon Adler: 00:35:48 Monika Bickert is in charge of all of Facebook's rules, including their policies on hate speech. And so I asked her, like, why would there be a rule that protects white men, but not black children?
Monika Bickert: 00:36:02 We have made our hate speech policies ... Let me rephrase that. Our hate speech policies have become more detailed over time, but our main policy is you can't attack a person or group of people based on a protected characteristic, a characteristic like race, religion, or gender.
Simon Adler: 00:36:21 So this takes a couple of beats to explain, but the gist of it is that Facebook borrowed this idea of protected classes straight from U.S. anti-discrimination law. These are the laws that make it so you can't refuse to hire someone based on, say, their religion, their ethnicity, their race. And so, on Facebook, you can't attack someone based on one of these characteristics. Meaning you can't say, "Men are trash," nor could you say, "Women are trash," because essentially you're attacking all men for being men.
Soren Wheeler: 00:36:53 Is it the "all"? Can I say, "Bob is trash?"
Simon Adler: 00:36:56 Yeah. You can say, "Bob is trash," because, as my sources explained to me-
FB Employee: 00:37:00 The distinction is that, in the first instance, you were attacking a category. In the second instance, you were attacking a person, but it's not clear that you're attacking that person because they are a member of a protected category.
Jad Abumrad: 00:37:12 So Bob might be trash for reasons that have nothing to do with him being a man.
Simon Adler: 00:37:15 Yeah.
Jad Abumrad: 00:37:16 He just might be annoying.
Simon Adler: 00:37:17 Right?
Jad Abumrad: 00:37:18 Okay. So that explains why you take down "men are scum." But why would you leave up "black children are scum"? Why would that not get taken down?
Monika Bickert: 00:37:27 Traditionally, we allowed speech once there was some other word in it that made it about something other than a protected characteristic.
Simon Adler: 00:37:35 In Facebook jargon, these are referred to as "non-protected modifiers."
Robert Krulwich: 00:37:41 That just means literally nothing to me. Give us an example of this.
Monika Bickert: 00:37:46 Traditionally, if you said, "I don't like [this religion] cab drivers"-
Simon Adler: 00:37:51 Cab driver would be the non-protected modifier, because employment is not a protected category. And so what the rule stated was, when you add this non-protected modifier to a protected category, in this case the cab driver's religion-
Monika Bickert: 00:38:10 We would allow it, because we can't assume that you're hating this person because of his religion. You actually just may not like cab drivers.
Jad Abumrad: 00:38:18 So in the case of black children, "children" is modifying the protected category of black. And so, children trumps black?
Simon Adler: 00:38:27 Age is a non-protected category.
Jad Abumrad: 00:38:30 Okay.
Simon Adler: 00:38:30 And so children becomes a non-protected modifier, and their childness trumps their blackness: under that rule, you could say whatever you wanted about black children. Whereas in the case of white men, you've got gender and race, both protected; you can't attack them.
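Stated as code, that traditional rule reduces to a subset check. The category list below is a hypothetical stand-in; Facebook's internal materials used their own taxonomy:

```python
# A minimal sketch of the traditional rule: an attack counts as hate
# speech only if EVERY attribute of the targeted group is a protected
# characteristic. One non-protected modifier (age, occupation, ...)
# removes the protection. Hypothetical category list.

PROTECTED = {"race", "religion", "gender", "ethnicity",
             "sexual orientation", "national origin"}

def is_hate_speech(target_attributes: set) -> bool:
    return bool(target_attributes) and target_attributes <= PROTECTED

# "white men"              -> {"race", "gender"}           -> True  (remove)
# "black children"         -> {"race", "age"}              -> False (allow,
#                             under the pre-update rule)
# "[religion] cab drivers" -> {"religion", "occupation"}   -> False (allow)
```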
Jad Abumrad: 00:38:53 That's just a bizarre rule. I would think you'd go the other direction, that the protected class would outweigh the modifier.
Simon Adler: 00:39:00 Well, they made this decision, as they explained to me, because their default was to allow speech. They were really trying to incorporate or nod to the American free speech tradition.
FB Employee: 00:39:12 And so there's a whole lot of stuff out there that none of us would defend as a valuable speech, but didn't rise to the level of stuff that we'd say, "This is so bad. We're going to take it down."
Simon Adler: 00:39:22 In this case, their concern was-
FB Employee: 00:39:24 We're all members of at least half a dozen protected categories. We all have gender, we all have sexual orientation.
Simon Adler: 00:39:31 If the rule is that any time a protected class is mentioned it could be hate speech, then what you are doing is opening up just about every comment that's ever made about anyone on Facebook to potentially be hate speech.
FB Employee: 00:39:48 Then you're not left with anything, right?
Monika Bickert: 00:39:50 No matter where we draw this line, there are going to be some outcomes that we don't like. There are always going to be casualties. That's why we continue to change the policies.
Simon Adler: 00:39:59 In fact, since Marcia's debacle, they've actually updated this rule. So now black children are protected from what they considered the worst forms of hate speech.
Monika Bickert: 00:40:09 Now our reviewers take how severe the attack is into consideration.
Simon Adler: 00:40:15 But despite this, there are still plenty of people-
Marcia Belsky: 00:40:17 That is flawed because you are a social network-
Simon Adler: 00:40:20 Including Marcia, who think this still just isn't good enough.
Marcia Belsky: 00:40:24 There are not systematic efforts to eliminate white men in the way that there are for other groups. That's why you have protected groups.
Simon Adler: 00:40:32 She thinks white men and heterosexuals should not be protected.
Marcia Belsky: 00:40:36 Protect the groups who are actually victims of hate speech.
Jad Abumrad: 00:40:40 Makes sense.
Simon Adler: 00:40:41 Well, yeah. In thinking about hate speech, there's this idea of privileged versus historically disadvantaged groups, and that those historically disadvantaged groups should have more protection precisely because of that history. And the challenge with that, as it was presented to me, was, okay-
Audio clip: 00:41:03 By the thousands new Japanese reinforcements poured-
Simon Adler: 00:41:06 In the 1940s-
Audio clip: 00:41:08 -to cut off the Chinese in Zhabei and Zhengzhou.
Simon Adler: 00:41:10 -you had Japanese soldiers-
Audio clip: 00:41:14 Shot and beheaded tens of thousands of Chinese civilians.
Simon Adler: 00:41:15 -killing millions of Chinese during World War II. At that same time, you had Japanese American citizens-
Audio clip: 00:41:24 More than a hundred thousand persons of Japanese ancestry, all of them would have to move.
Simon Adler: 00:41:28 -being put into internment camps.
FB Employee: 00:41:30 And so we had to ask ourselves a question like, are the Japanese a historically advantaged or disadvantaged group?
Simon Adler: 00:41:37 Japanese-Americans, pretty easy to make a case that they were disadvantaged. But in China, it's a totally different story. And this happened at the exact same moment. So you've got two different places, two different cultural stories. When you have a website like Facebook, this transnational community, they decided that ideas of privilege are so geographically bound that there is no way to effectively weigh and consider who is privileged over whom, and therefore that they were not going to allow historical advantage or historical privilege into the equation at all.
Simon Adler: 00:42:22 I think it's very important to keep in mind here-
Audio clip: 00:42:24 I hate Americans-
Simon Adler: 00:42:26 -these moderators only have like four or five seconds-
Audio clip: 00:42:28 Republicans are scum.
Simon Adler: 00:42:30 -to make a decision.
Audio clip: 00:42:34 [Inaudible 00:42:34].
Simon Adler: 00:42:35 In those four seconds, is there enough time to figure out where in the world someone is, particularly, given IP addresses can easily be masked?
Audio clip: 00:42:44 Go back where you came from.
Simon Adler: 00:42:46 Is there enough time to figure out a person's ethnicity?
Audio clip: 00:42:48 White children are better than black children.
FB Employee: 00:42:51 On top of that, we often don't know an individual's race.
Audio clip: 00:42:55 Straight people suck.
FB Employee: 00:42:57 Other categories are even less clear, like sexual orientation.
Simon Adler: 00:43:01 They just realized it would be next to impossible for anybody to run these calculations effectively.
Monika Bickert: 00:43:07 When we were building that framework, we did a lot of tests, and we saw sometimes that it was just too hard for our reviewers to implement a more detailed policy consistently. They just couldn't do it accurately. So we want the policies to be sufficiently detailed to take into account all different types of scenarios, but simple enough that we can apply them consistently and accurately around the world. And the reality is, anytime the policies become more complicated, we see dips in our consistency.
Simon Adler: 00:43:41 What Facebook's trying to do is take the First Amendment, this high-minded, lofty legal concept, and convert it into an engineering manual that can be executed every four seconds for any piece of content from anywhere on the globe. And when you've got to move that fast, sometimes justice loses.
Monika Bickert: 00:44:05 That's the tension here. And I just want to make sure I emphasize that these policies, they're not going to please everybody. They often don't please everybody that's working on the policy team at Facebook. But if we want to have one line that we enforce consistently, then it means we have to have some pretty objective black and white rules.
Jad Abumrad: 00:44:56 When we come back, those rules-
Robert Krulwich: 00:44:59 They get toppled.
Danny: 00:45:09 This is Danny from Denver, Colorado. Radiolab is supported in part by the Alfred P. Sloan Foundation, enhancing public understanding of science and technology in the modern world. More information about Sloan at www.sloan.org.
Jad Abumrad: 00:45:25 Hey, this is Jad. Radiolab is supported by IBM. What kind of tech company does the world need today? One that applies smart technologies at scale with purpose and expertise, not just for some, but for all. With AI, blockchain, and quantum technology, IBM is developing smart, scalable technologies that help businesses work better together. Let's expect more from technology. Let's put smart to work. Visit ibm.com/smart to learn more.
Robert Krulwich: 00:45:55 Hi, I'm Robert Krulwich. Radiolab is supported by Capital One. With the Capital One Saver Card, you can earn 4% cash back on dining and entertainment. That means 4% on checking out that new French restaurant and 4% on bowling with your friends. You also earn 2% cash back at grocery stores and 1% on all other purchases. Now when you go out, you cash in. Capital One. What's in your wallet? Terms apply.
Jad Abumrad: 00:46:24 Jad.
Robert Krulwich: 00:46:24 Robert.
Jad Abumrad: 00:46:25 Radiolab.
Robert Krulwich: 00:46:25 Back to Simon Adler.
Jad Abumrad: 00:46:27 Facebook.
Robert Krulwich: 00:46:27 Free speech.
Simon Adler: 00:46:28 As we just heard before the break, Facebook is trying to do two competing things at once. They're trying to make rules that are just, but at the same time can be reliably executed by thousands of people spread across the globe in ways that are fair and consistent. And I would argue that this balancing act was put to the test April 15th, 2013.
Audio clip: 00:46:50 That's right.
Audio clip: 00:46:50 Hey, Carlos, I'm so sorry. We have some breaking news, otherwise I wouldn't cut you off so abruptly. Carlos ...
Simon Adler: 00:46:58 Monday, April 15th, 2013, just before 3:00 in the afternoon. Two pressure cooker bombs rip through the crowd near the finish line of the Boston Marathon. And as sort of the dust begins to settle-
Audio clip: 00:47:29 Oh, my god.
Simon Adler: 00:47:31 -people start springing into action. This one man in a cowboy hat sees this spectator who's been injured, picks him up, throws him in a wheelchair. And as they're pushing him through the sort of ashy cloud, there's this photographer there, and he snaps this photo. And the photo shows the man in the cowboy hat and these two other people pushing this man whose face is ashen from all of the debris. His hair is sort of standing on end, and you can tell that the force of the blast, and then the particles that got in there, are actually holding it in this sort of wedge shape. And one of his legs is completely blown off, and the second one is blown off below the knee, nothing but the bone sticking out and then sort of skin and muscle and tendons. It's horrific. Meanwhile ...
Audio clip: 00:48:25 From the CBS Bay Area studio.
Simon Adler: 00:48:27 ... on the other side of the country.
Audio clip: 00:48:29 KPIX 5 News.
FB Employee: 00:48:30 I remember snippets of the day.
Simon Adler: 00:48:33 Facebook employees clustered around several desks, staring at computer screens, watching the news break.
Audio clip: 00:48:39 This has occurred just in the last half hour or so.
FB Employee: 00:48:42 I have memories of watching some of the coverage.
Audio clip: 00:48:45 Chilling new images just released of the Boston bombings.
FB Employee: 00:48:49 I remember seeing the photo published online. And it wasn't long after that, someone had posted on Facebook.
Simon Adler: 00:48:56 From the folks I spoke to, the order of events here is a little fuzzy, but pretty quickly this photo's going viral.
FB Employee: 00:49:06 And we realized we're going to have to deal with it.
Simon Adler: 00:49:09 This image is spreading like wildfire across their platform. It appears to be way outside the rules they'd written, but it's in this totally new context. So they got their team together and sat down in a conference room.
FB Employee: 00:49:22 I don't know, there was probably eight or 10 people thinking about, like, should we allow it?
Simon Adler: 00:49:26 Or should they take it down according to their rules?
FB Employee: 00:49:29 Yeah. So if you recall the "no insides on the outside" definition that we had in place, meaning you can't see people's organs or that sort of thing; and if you can, then we wouldn't allow it. And in this photo, you could definitely see bone.
Simon Adler: 00:49:46 And so by the rules, the photo should obviously come down.
FB Employee: 00:49:49 Yep.
Simon Adler: 00:49:49 However, half the room says no.
FB Employee: 00:49:52 The other people are saying this is newsworthy.
Simon Adler: 00:49:57 Essentially, this photo's being posted everywhere else. It's important. We need to suspend the rules. We need to make an exception, which immediately received pushback.
FB Employee: 00:50:07 Well, I was saying that what we'd prided ourselves on was not making those calls, and there are no exceptions. There are either mistakes or improvements.
Simon Adler: 00:50:17 We made the guidelines for moments like this. To which the other side shoots back ...
FB Employee: 00:50:21 "Oh my god, are you kidding me? Like, the Boston Globe was publishing this all over the place and we're taking it down? Are you fucking kidding me?"
Simon Adler: 00:50:27 Damn the guidelines. Let's have common sense here. Let's be humans. We know that this is important.
FB Employee: 00:50:31 And, yeah, they're kind of ... They're right, but the reality is, if you say, "Well, we allowed it because it's newsworthy," how do you answer any of the questions about any of the rest of the stuff?
Simon Adler: 00:50:48 In other words, this is a Pandora's box. In fact, for reasons that are probably clear by now, Team Consistency, Team Follow-the-Rules, eventually wins the day: they decide to take the photo down. But before they can pull the lever, word starts making its way up the chain.
FB Employee: 00:51:03 And internally within Facebook-
Simon Adler: 00:51:05 According to my sources, an executive under Zuckerberg sent down an order.
FB Employee: 00:51:09 -we were essentially told, "Make the exception."
Simon Adler: 00:51:16 I don't care what your guidelines say, I don't care what your reason is, the photo stands, you're not taking this down.
FB Employee: 00:51:24 Yes. Yes, that's what happened.
Robert Krulwich: 00:51:28 This decision means that Facebook has just become a publisher. They may not think they have, but they've made a news judgment, and just willy-nilly they've become CBS, ABC, the New York Times, the Herald Tribune, the Atlantic Monthly, and all these other things. All at once, they've just become a news organization.
Simon Adler: 00:51:47 Yeah. And this brings up a legal question that's at the center of this conversation about free speech. Is Facebook a sort of collective scrapbook for us all? Or, is it a public square where you should be able to say whatever you want? Or, yeah, is it now a news organization?
Audio clip: 00:52:05 Adds transparency-
Audio clip: 00:52:06 Let me get, I'm sorry to interrupt, but let me get to one final question that kind of relates to what you're talking about in terms of what exactly Facebook is.
Simon Adler: 00:52:14 And this question has been popping up a lot recently. In fact, it even came up this past April when Zuckerberg was testifying in front of Congress.
Audio clip: 00:52:22 I think about 140 million Americans get their news from Facebook. So, which are you, are you a tech company? Are you the world's largest publisher?
Audio clip: 00:52:34 Senator, this is a ... I view us as a tech company because the primary thing that we do is build technology and products.
Audio clip: 00:52:41 You said you're responsible for your content, which-
Audio clip: 00:52:43 Exactly.
Audio clip: 00:52:44 -makes you a kind of a publisher, right?
Audio clip: 00:52:45 Well, I agree that we're responsible for the content, but I don't think that that's incompatible with fundamentally at our core being a technology company where the main thing that we do is have engineers and build products.
Simon Adler: 00:52:57 Basically, Zuckerberg and others at the company are arguing, no, they're not a news organization.
Jad Abumrad: 00:53:01 Why? What would be the downside of that?
Simon Adler: 00:53:04 Well, Facebook currently sits on this idyllic little legal island where they can't be held liable for much of anything and they're subject to very few regulations. However, were they to be seen in the eyes of the court as a media organization, that could change. But setting that aside, what really strikes me about all of this is, here you have a company that, really, up until this point has been crafting a set of rules that are both as objective as possible and can be executed as consistently as possible. And they've been willing to sacrifice rather large ideas in the name of this. For example, privilege, which we talked about: they decided it was too geographically bound to allow for one consistent rule. But if you ask me, there's nothing more subjective or geographically bound than what people find interesting or important, what people find newsworthy.
Simon Adler: 00:54:05 I'll give you a great example of this that happened just six months after the Boston Marathon bombing, when this video starts being circulated out of northern Mexico. It's a video of a woman being grabbed and forced onto her knees in front of a camera, and then a man with his face covered grabs her head, pulls it back, and slices her head off right in front of the camera. And this video starts being spread.
Shannon Young: 00:54:33 I can't count how many times, like just reading my Twitter feed, I've been like, "Ahhh," you know.
Simon Adler: 00:54:37 One person who came across this video, or at least dozens of others like it, was Shannon Young.
Shannon Young: 00:54:42 My name is Shannon Young. I am a freelance radio reporter. I've been living here in Mexico for many years now.
Simon Adler: 00:54:49 Her beat is covering the drug war. And in doing so, years back, she noticed this strange phenomenon.
Shannon Young: 00:54:54 It first caught my attention in early 2010.
Simon Adler: 00:54:57 She'd be checking social media.
Shannon Young: 00:54:59 You're scrolling through your feed and you'd see all these posts where people would say, "Man, there was this three-hour gun battle and intense fighting all weekend long."
Simon Adler: 00:55:07 Folks were posting about clashes between drug cartels and government forces. But then when Shannon would watch the news that night-
Simon Adler: 00:55:20 -she'd see reports on the economy and soccer results, but-
Shannon Young: 00:55:23 The media wasn't covering it.
Simon Adler: 00:55:25 There'd be no mention of these attacks.
Shannon Young: 00:55:27 Nothing to do with the violence.
Simon Adler: 00:55:29 And so she and other journalists tried to get to the bottom of this.
Shannon Young: 00:55:32 Reporters in Mexico City would contact the state authorities and public information officer and they'd be like-
Simon Adler: 00:55:37 "Shootings, bombings, what are you talking about?"
Shannon Young: 00:55:40 "Nothing's going on. We have no reports of anything. These are just internet rumors."
Simon Adler: 00:55:44 The government even coined a term for these sorts of posts.
Shannon Young: 00:55:46 The famous phrase at the time was "collective psychosis." These people are crazy.
Simon Adler: 00:55:51 Because they didn't want the situation to seem out of control. But then, a video was posted. It opens looking out the windshield of a car on a sunny day. The landscape is dry and dusty, and the video itself is shaky, clearly shot on a phone.
Simon Adler: 00:56:12 And then the woman taping starts talking.
Shannon Young: 00:56:20 And this woman, she just narrates as they drive along this highway.
Simon Adler: 00:56:29 She pans the phone from the passenger window to the windshield, focusing in on these two silver, destroyed pickup trucks.
Shannon Young: 00:56:40 And she's saying look at these cars over here shot up and-
Shannon Young: 00:56:45 "Ooh, look here, look here. This 18-wheeler is totally abandoned. It got shot up."
Simon Adler: 00:56:51 At one point, she sticks the phone out the window to show all of the bullet casings littering the ground.
Shannon Young: 00:56:58 And she just turned the official denial on its head.
Simon Adler: 00:57:03 The government was saying there's no violence. Here were cars riddled with bullets. It was impossible to dismiss.
Shannon Young: 00:57:11 And from then on, you had more and more citizens, citizen journalists, anonymously uploading video of the violence.
Simon Adler: 00:57:26 These low-fi, shaky shots of-
Shannon Young: 00:57:28 Shootouts, dismemberments, beheadings. I mean, bodies hanging, dangling off of overpasses to prove to the world that this was really happening and say, "We're not crazy."
Robert Krulwich: 00:57:58 It's a cry for help.
Simon Adler: 00:57:59 Yeah. Which brings us back to that beheading video we mentioned a bit earlier.
FB Employee: 00:58:05 Yeah. That video of the beheading, a lot of people were uploading it, condemning the violence of the drug cartels.
Simon Adler: 00:58:11 And when it started showing up on Facebook, much like with the Boston Marathon bombing photo, this team of people, they sat down in a room, looked at the policy, weighed the arguments.
FB Employee: 00:58:20 And my argument was, it was okay by the rules during the Boston bombing, why isn't it okay now?
Simon Adler: 00:58:26 Particularly, given that it could help.
FB Employee: 00:58:28 Leaving this up means we warn hundreds of thousands of people of the brutality of these cartels. And so we kept it up. However-
Audio clip: 00:58:38 It's fucking wrong. It's wrong.
Audio clip: 00:58:40 I think it's utterly irresponsible and in fact quite despicable of them to put-
Simon Adler: 00:58:44 When people found out-
Audio clip: 00:58:44 I'm talking I have little neighbor kids that don't need to see shit like that.
Simon Adler: 00:58:48 -backlash.
Audio clip: 00:58:49 Is there really any justification for allowing these videos?
Simon Adler: 00:58:52 People as powerful as David Cameron weighed in on this decision.
Audio clip: 00:58:56 Today, the prime minister strongly criticized the move.
Simon Adler: 00:58:58 Saying we have to protect children from this stuff.
Audio clip: 00:59:01 David Cameron tweeted, "It's irresponsible of Facebook to post beheading videos."
FB Employee: 00:59:06 Yes. People were really upset because of what it was showing.
Simon Adler: 00:59:09 And so, according to my sources, some of the folks involved in making this decision to leave it up were once again taken into an executive's office.
FB Employee: 00:59:17 And so we went up, and there was a lot of internal pressure to remove it. And I go to my boss and say, "Hey, look, this is the decision we made. I recognize this is controversial. I want to let you know why we made these decisions."
Simon Adler: 00:59:30 And they made their case.
FB Employee: 00:59:31 There are valid and important human rights reasons why you would want this to be out there, to show the kind of savagery. And she vehemently disagreed with that.
Simon Adler: 00:59:40 They took another approach arguing that if we take this down-
FB Employee: 00:59:43 You're deciding to punish people who are trying to raise awareness.
Simon Adler: 00:59:46 Again, she wasn't budging.
FB Employee: 00:59:48 And we just didn't get past that. Ultimately, I was overruled and we removed it, just because there was pressure to do so.
Simon Adler: 01:00:00 The same people who, six months prior, told them to leave it up because it was newsworthy said, "Take the video down."
Audio clip: 01:00:07 Facebook this week reversed the decision and banned a video posted to the site of a woman being beheaded.
Audio clip: 01:00:12 In a statement, Facebook said, quote, when we were ...
Robert Krulwich: 01:00:15 If you want the one from Boston in, you probably should have the one from Mexico in.
Simon Adler: 01:00:19 Right.
FB Employee: 01:00:20 It was a mistake.
Simon Adler: 01:00:22 Yeah, I think it was a mistake.
FB Employee: 01:00:26 Because I felt like, why do we have these rules in place in the first place? And it's not the only reason, but decisions like that are the thing that precipitated my leaving.
Simon Adler: 01:00:43 Leaving?
FB Employee: 01:00:44 Yeah. Not too long after that incident, a few members of the team decided to quit.
Simon Adler: 01:00:50 What I think this story shows is that Facebook has become too many different things at the same time. So Facebook is now sort of a playground. It's also an R-rated movie theater. And now it's the front page of a newspaper.
Robert Krulwich: 01:01:09 It's all those things at the same time.
Simon Adler: 01:01:11 It's all those things at the same time. And what we, the users, are demanding of them is that they create a set of policies that are just. And the reality is justice means a very different thing in each one of these settings.
Robert Krulwich: 01:01:22 Justice would mean that the person in Mexico gets told the truth in Mexico by Facebook and the little boy in England doesn't have to look at something gory and horrible in England. But you can't put them together because they clash.
Simon Adler: 01:01:36 Exactly.
Robert Krulwich: 01:01:37 So how do you solve that?
Simon Adler: 01:01:40 I don't know. I think it's important to keep in mind that even if you have the perfect set of policies that somehow managed to be just in different settings and that can be consistently enforced, the people at the end of the day making these decisions, they're still people, they're still human beings.
Simon Adler: 01:02:04 Is this working or no?
Marie: 01:02:05 I can hear it, yeah. Yeah.
Simon Adler: 01:02:07 Great. Okay. At long last we figured it out, huh?
Marie: 01:02:11 Yeah. Clearly.
Simon Adler: 01:02:13 I spoke to one woman who did this work for Facebook.
Marie: 01:02:15 I just want to be anonymous. I don't want them to even know that I'm doing it because they might file charges against me.
Simon Adler: 01:02:23 We'll call her Marie. She's from the Philippines where she grew up on a coffee farm.
Marie: 01:02:28 Yeah. That was my father's and grandpa's. I didn't know that coffee was only for adults.
Simon Adler: 01:02:36 She said many afternoons while she was growing up, she and her mother would sit together outside sipping their coffee and tuning into their short wave radio.
Audio clip: 01:02:49 This is the Voice of America, Washington, DC.
Simon Adler: 01:02:49 And they'd sit there-
Marie: 01:02:50 Listening to the Voice of America.
Simon Adler: 01:02:52 -in silence.
Audio clip: 01:02:54 I'm going to ask that we all bow our heads in prayer.
Simon Adler: 01:02:57 She said one of her favorite things to catch on the Voice of America was Billy Graham's sermons.
Marie: 01:03:01 Billy Graham, one of the great evangelists.
Audio clip: 01:03:05 Our Father, we thank thee for this love of God that reaches around the world and engulfs all of mankind.
Simon Adler: 01:03:15 But then fast forward 50 years to 2010 and Marie is consuming a very different sort of American media.
Marie: 01:03:27 The videos were the ones that affected me. There were times when I felt really bad that I am a Christian and then I looked into these things.
Simon Adler: 01:03:36 She became a content moderator back in 2010 and was actually one of the first people in the Philippines doing this work.
Marie: 01:03:42 I usually had the night shift, in the early morning or at dawn, from 2:00 AM to 4:00 AM.
Simon Adler: 01:03:51 She worked from home and, despite it being dark out, she'd put blankets up over the windows so no one could see what she was looking at. She'd lock the door to keep her kids out.
Marie: 01:04:01 I have to drive them away, or I would tell them that it's an adult thing, that they cannot watch.
Simon Adler: 01:04:07 And she and the other moderators on her team who live throughout the Philippines, they were trained on the guidelines on this rule book.
Marie: 01:04:14 There were policies that we have to adhere to, but some of us were just clicking pass, pass, pass, even if it's not really pass, just to finish.
Simon Adler: 01:04:25 Just to get through the content fast enough. And in some cases, she thinks-
Marie: 01:04:29 A number of the moderators are doing it as a form of retaliation for the low rate.
Simon Adler: 01:04:35 People were pissed at the low pay.
Simon Adler: 01:04:37 If I can ask, how much were you making an hour doing this?
Marie: 01:04:42 As far as I remember it, we were paid like $2.50 per hour.
Simon Adler: 01:04:49 Marie wouldn't say whether or not this low wage led her to just let things through. But she did say ...
Marie: 01:04:56 Based on my conservative background, there are things that I cannot look objectively at, so I reject many of the things that I think are not acceptable.
Simon Adler: 01:05:11 Really?
Marie: 01:05:13 Of course.
Simon Adler: 01:05:14 She said whether something was outside the rules or not, if her gut told her to, she just took it down.
Marie: 01:05:20 Whenever it affects me a lot, I would click the button of, like, it's a violation, because if it's going to disturb the young audience, then it should not be there. Like, if there's a nude person-
Simon Adler: 01:05:35 Whether it was a breastfeeding photo, or an anatomy video, or a piece of art.
Marie: 01:05:40 I would consider it as pornography, and then click. Right away, it's a violation.
Simon Adler: 01:05:49 You took the law into your own hands. You went vigilante.
Marie: 01:05:58 Yeah, or something. So, yeah, I have to protect kids from that evil side of humankind.
Robert Krulwich: 01:06:50 Where does that leave you feeling? Does that leave you feeling that this is just, at the end, this is just ... undoable?
Simon Adler: 01:06:59 I think they will inevitably fail, but they have to try, and I think we should all be rooting for them.
Jad Abumrad: 01:07:38 This episode was reported by Simon Adler with help from Tracie Hunte, and produced by Simon with help from Bethel Habte.
Robert Krulwich: 01:07:45 Big thanks to Sarah Roberts, whose research into commercial content moderation got us going big time and we thank her very much for that.
Jad Abumrad: 01:07:52 Thanks. Also to Jeffrey Rosen who helped us in our thinking about what Facebook is.
Robert Krulwich: 01:07:57 To Michael Chernus, whose voice we used to mask other people's voices.
Jad Abumrad: 01:08:01 To Carolyn Glanville, Ruchika Budhraja.
Robert Krulwich: 01:08:05 Brian Dogan, Ellen Silver, James Mitchell, and Guy Rosen.
Jad Abumrad: 01:08:06 Of course, to all the content moderators who took the time to talk to us.
Simon Adler: 01:08:10 Do you want to sign off?
Jad Abumrad: 01:08:12 Yeah, I guess we should.
Robert Krulwich: 01:08:12 We should. Ready? Do you want to go first?
Jad Abumrad: 01:08:15 Yeah, I'm Jad Abumrad.
Robert Krulwich: 01:08:17 I'm Robert Krulwich.
Jad Abumrad: 01:08:18 Thanks for listening.
Voice machine: 01:09:30 To play the message, press 2. Message one, Kate Klonick from Brooklyn, New York. Radiolab was created by Jad Abumrad and is produced by Soren Wheeler. Dylan Keefe is our director of sound design. Maria Matasar-Padilla is our managing director. Our staff includes Simon Adler, Maggie Bartolomeo, Becca Bressler, Rachel Cusick, David Gebel, Bethel Habte, Tracie Hunte, Matt Kielty, Robert Krulwich, Annie McEwen, Latif Nasser, Malissa O'Donnell, Arianne Wack, Pat Walters, and Molly Webster, with help from Shima Oliaee, Carter Hodge and Liza Yeager. Our fact checker is Michelle Harris. End of message.