BROOKE GLADSTONE: During the course of preparing this hour, it struck us that the giant companies that rule the internet are in dire need of a code of ethics, not for their users but for themselves. Why don’t they have one? Often, when we have a thorny problem like this, we put in a call to technology writer, commentator and entrepreneur Paul Ford. Welcome back to the show, Paul.
PAUL FORD: It’s great to be here, thank you.
BROOKE GLADSTONE: This is a week where Facebook is getting slammed for the speech it allowed on its platform and also for the speech it's apparently suppressing. The episode that caught your eye recently involves posts from Rohingya activists in Burma. What happened there?
PAUL FORD: The Daily Beast reported on it and what happened is that in Burma the Rohingya activists were having their accounts erased and posts erased and things they were putting up erased. And it’s not really clear what's going on. Some actor, whether it's automated or whether it's a human being, is saying, get rid of that person, get rid of that speech, that's not good for our platform. And Facebook, being kind of semi-robotic, often goes, oh yeah, all right.
BROOKE GLADSTONE: The Muslim ethnic minority is being persecuted and maybe ethnically cleansed.
PAUL FORD: Mm-hmm. [AFFIRMATIVE]
BROOKE GLADSTONE: Facebook, though, responded to criticism by saying, we work hard to strike the right balance between enabling expression, while providing a safe and respectful experience.
PAUL FORD: I mean, that’s a canned message, right? Like, the wrong spokesperson essentially got in there and was like, oh, I know what to say. But this is ethnic cleansing and so it requires a, a far more serious response to a journalist’s inquiry. It just makes it seem like a giant faceless organization that doesn't care. And, at some point, seeming like and being is a, is a really tricky boundary.
This is a problem with these organizations. They get really big, really fast and they can't grow up at the speed at which they become a fundamental utility that controls how millions or billions of people get their information. And so, you end up in the situation over and over again where the giant internet organism did this damn thing. And we’re going, like, why would it do that damn thing?
BROOKE GLADSTONE: Well, let's talk about why. The economics of the business is part of it.
PAUL FORD: Look, when you go to a computer and you say, hey, give me that thing over there and it goes, got it, a millisecond and it’s in your hands, right, that thing could be an advertisement, it could be a blog post, it could be a hateful ethnic screed about how everyone should be murdered. The computer doesn’t care, it’s really, really fast. And then what you're saying here is what needs to happen is you have to add friction to it.
BROOKE GLADSTONE: In order to apply ethics, you have to slow things down. You have to throw sand in the gas tank.
PAUL FORD: Sand in this form is human beings. [AUDIBLE BREATH] You know, I’ll give you an example. There’s been a lot of coverage of this. There are vast places in, like, the Philippines and other spots where content that might be pornographic, that might be offensive, there are people looking at it and reviewing it and saying, this is offensive, this is not. There are people who review things for the German market because the laws there are very specific about what you can and can't say about Nazism and anti-Semitism, right? And so, law becomes a source of, like, okay we better do it.
BROOKE GLADSTONE: But in the US there are very few restrictions on speech. It’s built into the core of the nation. Hate speech is legal.
PAUL FORD: Well, now you’re into the fundamental question of all this stuff, right, because Facebook is its own entity. It might have the power of a nation state, ‘cause it's at that scale, but it's its own entity, and so, it has the choice, as does Twitter, as does Google, as to what kind of content will flow through its system.
BROOKE GLADSTONE: Right, so you have them reflexively taking down a site posted by Rohingya activists who are in imminent danger of death and want the world to know --
PAUL FORD: Right.
BROOKE GLADSTONE: -- and, reflexively, through the same highly efficient technology, selling targeted anti-Semitic ads.
PAUL FORD: Sure. The things that are supposed to help you control abusive speech are probably the things that are cutting off the Rohingya activists.
Sheryl Sandberg is, well, the C-something of Facebook, the second in command to Mark Zuckerberg and sometimes the first in command, I think.
BROOKE GLADSTONE: Mm-hmm.
PAUL FORD: And, and she just put it up on Facebook, you know, what they’re going to do about the anti-Semitic advertising demographic categories. She’s like, yeah, okay, we got people reviewing everything and then we turned it off, we looked through everything, so now we’re turning it back on. The thing is, is they should have seen this coming. They should have been in charge of their own ad product --
BROOKE GLADSTONE: Mm-hmm.
PAUL FORD: -- and they should have had people on it. This is one of the major revenue-driving products for a bazillion-dollar company, and it let you buy demographic information by like people who want to burn Jews. Like, that is a disaster.
BROOKE GLADSTONE: Now, for Facebook to really get into their content, they have to start acting like a media company, and the minute they start acting like a media company, no longer just a tech interface, they’re subject to all kinds of regulation and that could play havoc with their business model. So it has to admit to being, in fact, what it really is in order to deal with these ethical problems that have such a broad impact.
PAUL FORD: See, this is incredibly tricky for an organization like Facebook and, and Google and some others, because they are media companies at one level, right? They produce content that people read. Tech companies never want to align themselves with media because it, it's a terrible non-profitable industry. Like, they think --
BROOKE GLADSTONE: So they can't be ethical.
PAUL FORD: They have to pretend that they're not media. They can. But the thing is, is you don't have the definition around tech ethics in the same way you do around media.
BROOKE GLADSTONE: Mm-hmm. [AFFIRMATIVE]
PAUL FORD: Aside from a few thinkers, there isn’t like some giant academic discipline that they can just go to and say, hey, what should we do? Media ethics, I can go read two books and then I kind of know how I need to behave as a journalist. There's nothing like this.
BROOKE GLADSTONE: But the stakes are incalculably high.
PAUL FORD: Right, and you have a relatively small number of people trying to process and react to that, and I think that's your hugest failure point, right? Like, someone wasn't looking and didn't think to themselves, hey, we should make sure that our ad product isn't really critical of Jews.
BROOKE GLADSTONE: What about an ad product that is explicitly paid for by the Russians, targeted towards voters?
PAUL FORD: Sure. At one point, the Trump organization literally said, like, you know, Facebook’s ad product was really critical for us in those last days.
BROOKE GLADSTONE: What’s the solution, Paul?
PAUL FORD: Basically, what’s happening is that they’re embarrassed by the press and they’re creating an ethos on the fly as a form of PR. They have zillions of dollars. They’re going to work this out. I don't know if it's going to be good or bad for our culture, but there’s no easy path here.
BROOKE GLADSTONE: Okay, so to bring you up to speed on our episode this week, we just heard from the COO of Gab, which is a Twitter-like platform with a free-speech bent that hosts a range of voices but is mostly notable for the hate stuff.
PAUL FORD: Sure.
BROOKE GLADSTONE: He said that with the liberal establishment decade after decade pushing its ethics, its morality down the throats of people who have different sets of ethics -- for instance, he, he referred to the gay marriage debacle.
PAUL FORD: A debacle?
BROOKE GLADSTONE: Yeah. By us doing that -- I’m putting us in that category -- we are creating such a profound resentment that we are the generators of those people's hate.
PAUL FORD: So what it comes down to is are Google and Facebook, you know, when they limit speech, when these giant platforms limit speech and don’t include you readily in their search index, then are they exercising monopolistic control? And I can tell you that, as someone who builds technology platforms, I don't want what I see as hate speech on those platforms, don’t want it. People I -- other people I know who do it, don’t want it. At a certain size, I think it's completely easy to say that and be like, nope, sorry you’re banned.
But then when you have 100 million people using your system and it’s a central utility, your definition of hate speech becomes a, a really big issue, and individuals can't really be trusted, and you need the government or some sort of larger ethical system to say, this is truly toxic and cannot be allowed or everything must be allowed, or whatever.
In Europe, they -- there’s a -- the idea of the right to be forgotten, which is that if you cheat on your taxes in your 20s and now you’re in your 40s and 20 years later that’s still the first thing that comes up when somebody searches for your name, you have the ability to petition and ask for Google to remove that from their search results so it doesn’t come up. Google fought that. They did not want that at all. It’s now part of law. People apply it.
BROOKE GLADSTONE: There, not here.
PAUL FORD: Not here, but in Europe Google does that. And so, you can have that kind of change. It's totally possible. If you decide that anti-Semitic speech is not tolerable on your platform, which is something that, you know, essentially Germany did as a culture, I think that that's a right that you can exercise as a platform owner and controller, especially ethically, if you feel that it will damage the overall experience for most people.
There's a black woman academic that I follow, and I look at how people react and respond to her. Her life on Twitter is exhausting, just like essentially, you know, you don't deserve to exist, over and over and over again.
BROOKE GLADSTONE: So what sort of process or system could you imagine that could impose the right kind of ethical sensibilities on this massively concentrated, highly efficient machine-run universe, which is essential to the way we communicate and has incalculable power to influence us?
PAUL FORD: I think that, you know, ultimately the, the community and the users are going to lead, right, and I think at a certain point, unless you’re really committed or it’s necessary for your job, if you just find something miserable or depressing or you just don't want to do it anymore, you're gonna get the hell off the platform. And so, the platforms are gonna -- what’s gonna make them respond is if growth becomes negative. If growth is positive, they're going to keep doing what they’re doing and wait for the press to call them out. But if growth is negative, you'll see them adapting to all kinds of user needs.
BROOKE GLADSTONE: So fundamentally, what you're saying is humans created this problem, they'll have to solve it by signing off.
PAUL FORD: I don’t think this life lived in public is that much fun but, you know, for right now this is where we’re at, and these giant companies run the world of online information and we’re all dependent on them and they’re kind of dependent on us, as statistical noise that we might be, and --
BROOKE GLADSTONE: And where does that leave ethics?
PAUL FORD: [AUDIBLE EXHALE] It leaves ethics as an annex to public relations, as a response. That’s the lousy part.
BROOKE GLADSTONE: You know, public relations is how you respond to public opinion --
PAUL FORD: Mm-hmm. [AFFIRMATIVE]
BROOKE GLADSTONE: -- right? Generated by individuals, amplified through the media so, in a way, that's not as unhealthy as it sounds.
PAUL FORD: No, it’s just how it works. The media pokes and the giants go, ow, it hurt my toe, and then the PR person says, wait, we’re dealing with that issue. Sheryl Sandberg writes a nice post and says, absolutely, we shouldn’t say we failed, so this was a fail --
BROOKE GLADSTONE: Mm-hmm.
PAUL FORD: -- right, ‘cause that’s, that’s -- it’s 2017. There’s no failure anymore.
BROOKE GLADSTONE: Well, she said that seeing the words “how to burn Jews” in a person’s profile disgusted and disappointed her, “disgusted by these sentiments and disappointed that our systems allowed this.”
PAUL FORD: That is a strong ethical response to a failure of a system that she manages and controls.
BROOKE GLADSTONE: “We have long had a firm policy against hate. Our community deserves to have us enforce this policy with deep caution and care.”
PAUL FORD: It’s good to have a policy against hate.
On the side -- no, seriously. NBC can say it. You know, IBM can say it. So, Facebook can say it. They just have to act on it in ways that other ones might not have to.
BROOKE GLADSTONE: Paul, thank you very much.
PAUL FORD: Always a privilege, thank you!
BROOKE GLADSTONE: [LAUGHS] A privilege? Paul Ford is a tech writer, commentator, entrepreneur and our pocket ethicist.
[MUSIC UP & UNDER]
Okay, all right.
PAUL FORD: I can’t wait to have ethicists say, what the hell is your credential?
BOB GARFIELD: That’s it for this week’s show. On the Media is produced by Alana Casanova-Burgess, Jesse Brenneman, Micah Loewinger and Leah Feder. We had more help from Jon Hanrahan and Monique Laborde. And our show was edited -- by Brooke. Our technical director is Jennifer Munson. Our engineers this week were Sam Bair and Terence Bernardo.
BROOKE GLADSTONE: Katya Rogers is our executive producer. Jim Schachter is WNYC’s vice president for news. On the Media is a production of WNYC Studios. I’m Brooke Gladstone.
BOB GARFIELD: And I’m Bob Garfield.
* [FUNDING CREDITS] *