Facebook's Global Influence

(Richard Drew / AP Photo)
[music]
Brian Lehrer: It's The Brian Lehrer Show on WNYC. Good morning, everyone. Today begins with Facebook in the crisis of its life over its role in the crisis of other people's lives. The crisis at Facebook has been caused by internal documents, being called the Facebook Papers, that have been shared with a consortium of news organizations by whistleblower Frances Haugen. Here are a few of the headlines the Facebook Papers have generated in the last few days. From The New York Times: Internal Alarm, Public Shrugs: Facebook's Employees Dissect Its Election Role. From HuffPost: Facebook Froze as Anti-Vax Comments Swarmed Users. From Vox: Wall Street Doesn't Care About the Facebook Leaks. Mark Zuckerberg Does. From USA Today, just this morning: Instagram's Dangers to Children Unite Liberal and Conservative Lawmakers Who Agree on Little Else.
A few more from The Times: In India, Facebook Grapples With an Amplified Version of Its Problems, and Instagram Struggles With Fear of Losing Its 'Pipeline': Young Users. With me now is the writer of all those New York Times articles, Sheera Frenkel, Times technology reporter and co-author of the book An Ugly Truth: Inside Facebook's Battle for Domination. Sheera, thanks for coming on today. Welcome back to WNYC.
Sheera Frenkel: Thank you so much for having me.
Brian: I want to walk you through some of the stories that you tell in these articles. I think the listeners will find it really interesting to go point by point through a couple of them. Let's begin with your article, Internal Alarm, Public Shrugs: Facebook's Employees Dissect Its Election Role. It begins with a Facebook employee who opened an experimental account in 2019, and she quickly discovered something alarming. Want to start with that story?
Sheera: Sure. This Facebook employee is a long-term researcher who likes to run these kinds of experiments. She joins Facebook as a new user and then sees where Facebook's algorithms take her. In this case, she wanted to be a new user who was a Christian woman based somewhere in the United States, and that was essentially all the information she gave Facebook. What she found was that within just a matter of weeks, Facebook's algorithms were pushing her towards extreme content. By that, I mean fringe conspiracy theories like QAnon, militia groups. The type of stuff that Facebook says it bans and doesn't want to push people towards, it was serving to this person who had just joined the platform.
Brian: Then you go on to another Facebook employee raising an alarm, that first one was in 2019, this one was on November 5th of last year, three days after Election Day. What did that employee see?
Sheera: What they start to see is that the amount of election misinformation is just overwhelming, and that the safeguards the company said it had put in place, the "break glass" measures, as Facebook was calling them, were not working. They weren't effective. I think it was interesting for us as reporters to see that here we are, right after an incredibly important election in the United States, and this employee is sounding the alarm very early on to say, something's going wrong here, our measures are not working.
Brian: Let's keep going. Just four days later, November 9th of last year, yet another Facebook employee, one of their data scientists, reported up the chain that an alarming 10% of all US views of political material on Facebook were of posts claiming the election results were fraudulent. What did Facebook do with any of that information?
Sheera: What did they do? [chuckles] They didn't do much outside of their plan. They did enact some additional break-glass measures that they had prepared. There was a lot of brow furrowing and employees asking one another why their company was being used in that way. But you don't see them go into crisis mode, you don't see them taking some of the more extreme steps that those same researchers were proposing to them.
Brian: One more story before we dig into what Facebook could have done, according to its employees. From your article, In India, Facebook Grapples With an Amplified Version of Its Problems. That article also starts with a Facebook researcher setting up an account just to see what would happen, this time in Kerala, India, and with a really simple and interesting game plan: follow all the recommendations generated by Facebook's algorithms and see where they lead. What happened?
Sheera: We don't note it in our articles, but this is the same researcher. The same researcher who set up the profile of a young American woman goes and runs the same experiment in India. She sets up this account with, again, very little information, she just says that she's based in Kerala, India, and she wants to see what Facebook tells her to do, what groups it tells her to join, what pages it tells her to follow. Within a matter of weeks, she is inundated with violent content, hate speech, misinformation. This is a time when India-Pakistan tensions are running very, very high; there had been a suicide attack along the border there.

What she finds is that just by following the recommendations Facebook surfaces for her, she's driven to believe all these conspiracies about the suicide bombing that happened along the India-Pakistan border. She could see how people could be fed incredibly divisive and polarizing content if they were just brand new to Facebook and joining the platform looking for information.
Brian: Listeners, we're going to try to take a global angle on the phones for this. If you have ties to India or any other country that's not the US and you have a story to tell or an impression of how Facebook matters politically or culturally in that country, call in. Let's see if we get some callers with ties to different countries, 212-433-WNYC, 212-433-9692. Certainly, this can include India, which is Facebook's biggest market and which our guest Sheera Frenkel from the New York Times has just reported on. People with ties to India, help us report that part of the story with some of your experiences of Facebook's presence in national life there, 212-433-WNYC, 433-9692, or anyone else with any ties to any other countries and then we'll bring it back to the US as we go. 212-433-9692. Let's see how global we can be on the phones at the beginning of this.
Sheera, you note that India is Facebook's largest market and I know India has a billion people, second largest country in the world by population, but some people might still be surprised to hear that Facebook is so prevalent there. Why India?
Sheera: Facebook was really aggressive about entering the market in India, they made these deals with local telecom carriers, which made it pretty much free to get Facebook. If you were getting a phone, you would have a version of the Facebook app on your phone which was great for messaging friends, for doing business. It became incredibly prevalent because it was so inexpensive. This is a country where you often have to pay for data on your phone. It can be very prohibitive for the average person to have to pay for all that data. If you are given a cheap alternative, you can see how people were drawn to it.
Facebook also did a lot of marketing in India; it really wanted to be in this country. I think it's interesting now reading their internal documents about how much they struggled to secure the elections in India, how complicated they found India's voting system. This is a voting process that lasts for over a month. It is a country which has 22 official languages. This is not an easy environment to regulate or monitor content in. Yet Facebook pushed very aggressively into that market and then found itself struggling, only able to monitor content in, I think, three of the 22 languages, or four if you include English.

You see a company in love with the idea of doing business in India, in love with the idea of this billion-person market that it can potentially capture, but, at least from what these documents show, not really able to keep up with the sheer volume of what it has taken on.
Brian: As we debate the extent of Facebook's role in allowing disinformation in this country, and allowing this country to be torn apart over falsehoods about the election and vaccines and other things, how is this affecting politics or other aspects of life in India, which you were just describing?
Sheera: We see time and again that the kinds of conspiracies that we worry about here in the United States are amplified tenfold when they reach other countries. When I say that, I'm really thinking specifically about the pandemic, about the start of the pandemic and what COVID was and how it spread, and then all the early narratives about the vaccine that anti-vaccine activists wanted to spread. I was doing some research for this India story that I reported for the Times, and I was just astounded at the anti-vaccine content that I found being spread to Facebook's user base in India.

I think a big reason for that is, and this is another startling figure that came out of these documents, that Facebook dedicates 87% of its misinformation budget to combating misinformation here in the United States. That's an umbrella that covers hate speech and conspiracies and everything else. So 87% of that budget goes to the United States, and 13% goes to the entire rest of the world. Let's put that figure in context: Facebook has more people using its platform in India than in the United States, and India is just one country in the rest of the world.

You can see how here in America they might struggle, but they are somewhat effective in taking down conspiracies about the vaccine, allegations that the vaccine isn't effective, and all the rest of that stuff. How can they possibly be effective in the rest of the world when they're spending just 13% of their budget on it?
Brian: Any other countries you might want to add to the US and India? I hear those whopping statistics you just gave with respect to India and, of course, the US, but where else are similar things taking place?
Sheera: I think about Myanmar a lot, Sri Lanka, Tunisia, Ethiopia. Just around the time that we had our elections here in the United States, Ethiopia saw an outbreak of violence, partially, it seems, spurred by rumors that were spread on Facebook. I think you'd be hard-pressed to find a country in the world that has not been affected by misinformation spread on Facebook.
Brian: Let's take a phone call. Here is Molly in New Hyde Park. You're on WNYC. Hello, Molly.
Molly: Hi, Brian Lehrer. I'm an ardent listener of yours. What I wanted to say is, I live in New York and I have lots of friends all over the country, but they are very active on WhatsApp and Facebook. Especially a couple of years ago during the election, they were watching all kinds of videos and sending them to groups, and then it keeps spreading. They strongly believe that the Muslims are going to take over the Hindu religion and that Christianity is going to take over the Hindu religion, that they are going to eradicate all the Hindus, and they talk about rapes happening all over the country. It is very disturbing to listen to that.

Of course, I don't have WhatsApp. I do have Facebook, but I don't really go on it very often. It's really endemic over there, and people strongly believe it, and if I say anything against it, they think that I'm not truly Indian, of Indian origin, so they turn against me. I just stay away from it because what they believe is really disturbing. Some of them say they can't sleep at night because this is happening all over the country, and they get messages every day, all day, very derogatory, disturbing, fake, violent messages about the other ethnicities in India.
Brian: Many of these are on WhatsApp, you say, which is a Facebook property, as is Instagram. Your experience of this, Molly, is that these are divisions that pre-existed Facebook in Indian society, but they're being amplified and made worse through the way people are using Facebook and WhatsApp.
Molly: The problem is that some really believe it's happening.
Brian: Molly, thank you so much for your call. We really appreciate it, disturbing as that is. Bhavesh in North Brunswick. You're on WNYC. Hi, Bhavesh.
Bhavesh: Hello, Brian. Thanks for having me on.
Brian: Your call is about COVID misinformation, I see, right?
Bhavesh: Yes. We had a family member in India who started receiving misinformation through WhatsApp messages as well. It started off as, like, Facebook links, so it was very easy to go from Facebook to WhatsApp. Then, when there's a large group with a bunch of people who are either, like, holistic doctors or have some other qualifications, they start sharing this misinformation. It got to a point where the links that they clicked on Facebook basically contributed to health conditions that were almost life-threatening. We saw how something on one social media platform, like Facebook, can easily be amplified through another, like WhatsApp, which is not exactly social media, but it was very quickly amplified into something much worse.
Brian: Bhavesh, thank you very much. Let's do one more in this set, Sujata in Summit, you're on WNYC. Hi, Sujata.
Sujata: Hi, thanks for having me on. I just wanted to reiterate what the previous caller said, that WhatsApp conspiracy theories are just a dime a dozen in India and circulated endlessly, everyone in India uses WhatsApp, but I'm not sure if you had a question for me.
Brian: Oh, well it sounds like you told our screener, this includes some of your family members, right?
Sujata: Yes, my husband's family did, he just sent me one on mine, but it's just insane conspiracy theories that are circulated, even issues with things that are known issues. For instance, in the US, it's the mask, like wear a mask and that's that. Here, Diwali, the Indian festival, is coming up, and usually, to control pollution, there's a narrative saying, "Hey, don't burst crackers because it just pollutes the air." Now it's become, this is a very anti-Hindu thing to say. They've removed the pollution aspect of it and centered it on religion, saying that people are being anti-Hindu when they say you shouldn't be bursting firecrackers. That's one of the examples.
Brian: Encouraging division, basically.
Sujata: 100%, and I was telling someone, the lady who was taking my call earlier, I read this article on slate.com yesterday, which spoke about how the head of public policy for Facebook in India basically was very, you know, comfortable with the ruling party and all of its members just saying whatever it is that they wanted to say, without enforcing any restrictions on them. She stepped down from her role. She was not fired, she resigned eventually, but they've got another person, and again, the article states that this person is similar. Facebook gets a lot of its revenue in India from members of the ruling party.
It's almost like they don't want to rock that boat because why would you, it's a money-making machine. People say all kinds of hideous and crazy things to people that get amplified and circulated and there's money all over it.
Brian: Sujata, thank you so much for your call, as we continue with Sheera Frenkel, New York Times technology reporter and co-author of the book An Ugly Truth: Inside Facebook's Battle for Domination. Sheera, one thing, did you notice how Sujata disowned her in-laws there? I said, "Oh, like you told our screener, this is happening in your family." She said, "Well, my husband's family," so there go the in-laws.
Sheera: We all have someone. I have people in my family who I find spreading misinformation and conspiracies, and it's always a funny conversation. I actually remember, around the time of the US elections, I had to get in touch with a family member because they were spreading a conspiracy that I had personally spent time debunking at the New York Times. I sent that person my article and said, "Hey, just FYI, I've actually written about this," and we still had a very lengthy argument about it. I don't think they believed me. It's always hard when it's your own family.
Brian: Oh, "Why should I believe the New York Times over Joey from Alabama?" What were you thinking as you listened to that set of calls of Indian Americans?
Sheera: Well, everything they're saying is just so telling, and I noticed two of your callers brought up WhatsApp. That's something we haven't been able to fully explore in our articles, but I think it is such an important part of this, because whatever happens on Facebook is incredibly viral and shareable, and the minute it makes its way into those WhatsApp groups, it's gone. It's done for, it's in the ether, because Facebook has only some level of metadata on WhatsApp.

They can see, for instance, if two phones are messaging each other regularly, but no one can see the contents of WhatsApp messages; they can't, it's encrypted. You can't actually see what people are sending each other. That makes it a hotbed for misinformation. Especially in countries like India, where people like to join big groups, they're much more popular there than they are here in the United States, you'll join a WhatsApp group with hundreds of people in it from your local neighborhood or your local soccer club or whatever. That is where a lot of this misinformation is going hugely viral. With the click of a button, you can share one piece of misinformation to all your WhatsApp groups.
Brian: We have a few callers coming with ties to other countries. Let's sample a few of those, Ali, originally from Iran, Ali, you're on WNYC. Hello.
Ali: Brian, longtime listener, this is the fourth time I'm calling you, thank you very much for taking my phone call. Your reporter should check on Iran. The Iranians, they use WhatsApp, 82 million people. During the election, they were saying that they believed the election was stolen, my brother, my cousin, my friend of 50 years, because WhatsApp was passing around a lot of links. Your reporter is right. WhatsApp is the biggest, worst thing that Facebook owns. I don't know how the Department of Justice approved such a big purchase. I have no idea why Wall Street was pushing Facebook to buy WhatsApp.

WhatsApp is worse than Facebook. As your reporter said, it's encrypted, people talk nonsense. I had an argument with them. I said, "Are you kidding me? We have elections, we check IDs." They said, "No, the election was stolen," and the Iranians, 70% of them or more, wanted Trump to win, because they thought Trump was doing a lot of good things for them, but they forgot Trump was doing business with the Iranian Revolutionary Guard in Dubai, at his hotel.
Brian: Ali, I'm going to leave it there for time because we want to get a few more people on and I know Sheera has to go in not too long a time to pick up her kid from preschool, I've been told, but I won't pass that on to everybody. Ali, thank you. We're going to go to Arvy in Queens. You're on WNYC. Hi, Arvy.
Arvy: Yes, sir. My name is Arvy, and I'm a journalist from Guyana. I want to say, Brian, I listen to your show every day and I try to emulate some of the things that you do. You make everything so simple, thank you very much for that. My concern with WhatsApp is pretty much the same as your previous caller's. We had a disputed, well, it wasn't a disputed election. We had an election in Guyana last year where the party that was in power lost but refused to give up power. The entire world, all the reputable organizations, came together and came out against the government, saying that it had lost the election and should concede.

It took five months before they actually gave up power, from the time the elections were held to the day the new president was sworn in. Facebook was the main culprit in the disinformation. Sir, you should have seen some of the comments and some of the commentators on Facebook, how similar their arguments were to those of Donald Trump. Yet still, our people in Guyana, well, those that supported that party, believed every single thing that came out, against all the organizations throughout the world, the Commonwealth, the CARICOM.

We had so many independent organizations observing the elections, and all came out to say that the ruling party lost, but because of the commentators on Facebook, they were able to spread that disinformation and cause people to believe it. Now, the two main political parties in my country are predominantly of Indo-Guyanese and Afro-Guyanese descent. Imagine the division that they caused in my country, the racial division that they caused, because of the commentators that use Facebook to peddle this disinformation.
Brian: Wow. Arvy, thank you very much for calling up. I had no idea about Guyana in this respect and most of our listeners probably did not. That's very revealing. One more, Bari in Manhattan, originally from another country, Bangladesh. Hi, Bari. You're on WNYC.
Bari: Hi, this is Bari Khan. Thanks so much for putting me on. I just wanted to tell your listeners that there's something curious going on also. I'm Muslim, and I get a lot of friend requests from very suggestively posed women who are Hindu and Chinese, especially, I've noticed, whenever I share a story of a Muslim getting lynched in India or something like that. When I click on their profiles, I see they're clearly fake and extremely suggestive, usually Photoshopped pictures of semi-clad women and everything. I think that's another thing that Facebook is doing to sway people, and I've noticed a few of my friends have actually befriended these fake profiles, because it'll say you have two friends in common, and they're clearly fake.

I've also reported them, and what's also funny, now on WhatsApp I've been getting a lot of messages saying, "Hi, how are you? Do you want to meet up?" Then there'll be a picture of, usually, a Hindu or an Asian woman. That's a very weird hookup [unintelligible 00:24:32] [crosstalk] to say that that's happening on social media too.
Brian: Thank you very much. Sheera, your articles frame an uncertainty, I think it's fair to say, over whether Facebook tried to contain disinformation, at least about the election in this country, and it's just really hard to do when you have 3 billion individual users, or whether Facebook didn't really try to contain the disinformation because it was profiting from the extra user engagement that all the election activity was bringing, user engagement being at the heart of their business model. But you say the documents you've obtained show that Facebook's own employees believe the company could have done more. What do these employees believe it could have done?
Sheera: These employees that we look at in these documents, they're an interesting group, because a lot of them work for Facebook's Civic Integrity Unit. This is their specialty. This is what they were hired to do, and of those, quite a few have a background in research. Some of them are trained as data scientists. They're looking at this purely from the point of view of given the tools Facebook has at its disposal, what could it do to make things better? I think what they show time and time again is, "Here's a step we can take, for instance, on vaccine misinformation."
They find that if you disable all comments on posts related to the vaccine, you'll reduce vaccine misinformation, because Facebook does not seem to have the ability to adequately monitor content in its comments, and yet they're told by executives, "Well, no, we can't do that because the moment we do that, we get less engagement on Facebook. People are less interested in spending time on our platform, fewer eyeballs mean less ad revenue." It means less data on people, and so the company executives who are thinking about this as a business are coming at it from one point of view while this civic integrity team is really just thinking about, I think the health of democracy and the health of society.
I think one thing that was telling for me when I was working on the book was that one of Mark Zuckerberg's earliest speechwriters tells the story of how he used to end company meetings by saying, "company over country." I just wonder, he is the founder of that company, the chief executive, he is completely in control, and if that is the mentality going in, company over country, you can see how so many of these decisions would be made downstream of that.
Brian: Now, on that Vox headline I cited in the intro, Wall Street Doesn't Care About the Facebook Leaks. Mark Zuckerberg Does. The Wall Street part of that is that Facebook's stock does not seem to be taking a hit because of the Facebook Papers, but the story includes a clip of Zuckerberg on Monday largely blaming the media and critics with an agenda, rather than the company's own policies, for the hits its reputation is taking. Listen.
Mark Zuckerberg: Good faith criticism helps us get better, but my view is that what we are seeing is a coordinated effort to selectively use leaked documents to paint a false picture of our company. The reality is that we have an open culture where we encourage discussion and research about our work so we can make progress on many complex issues that are not specific to just us.
Brian: Mark Zuckerberg made that comment during a public presentation of quarterly earnings on Monday. Sheera Frenkel is still with us from the New York Times. Sheera, do you give Zuckerberg's argument there any credence? Is there a complexity that maybe some of the reporting, not yours, but some of the reporting in simpler news organizations, is missing in favor of an overly simplistic narrative of Facebook coddling disinformation for financial gain?
Sheera: I'll be honest, I haven't been able to read all the articles that have come out, because there have been dozens and dozens of them. Of the articles I have read, in the Washington Post, the Wall Street Journal, the AP, I have seen some really fantastic reporting that has delved into these documents. Is it complete? Does it have the 360-degree view that I imagine Mark Zuckerberg has? No. And I imagine it's not black and white, because in all the reporting we've done, we haven't seen any examples where Mark Zuckerberg or Sheryl Sandberg or anybody else at the top of the company makes a decision out of malice. I think you could say they're not evil people.

They're not sitting there, I forget the name of the character from Austin Powers, twiddling their thumbs and saying, "How do we destroy the world?" That's not what they're doing as a company, and I think they do take some of the measures that are advised by these researchers. However, I think he's trying to portray it in a certain way. You can also look at these documents and say, if they're given a one-through-five option, where five is the most effective but loses them users, and one is the least effective but doesn't lose them any users, they're not going to choose the most effective and risk losing those users. They're not going to risk hitting their bottom line. That is not how they're operating as a company.
Brian: The USA Today headline I cited in the intro says, Instagram's Dangers to Children Unite Liberal and Conservative Lawmakers Who Agree on Little Else. Assuming that common ground, how much is Washington divided on the main Facebook platform along the lines of, you know, Democrats saying Facebook doesn't stop disinformation enough, and Republicans saying their politics of the right are singled out for censorship?
Sheera: For a very long time, Democrats and Republicans have been divided along the very lines that you just described. It has become a free speech argument, and as we know, America cares very much about the First Amendment. It is a very complicated argument, and this is actually a really great issue for Facebook to have people divided on. If we're busy talking about the First Amendment, and what Facebook should monitor, and which specific pieces of content are appropriate to remove versus those that aren't, you can see how you would get bogged down in that for decades, really. But I think what we're starting to see now, and what I certainly saw in the last Senate hearing, was members across the aisle looking at this in a different way.

They're not asking the question of, should you be allowed to say something on Facebook? They're asking the question of, should Facebook be promoting it? If you want to share a conspiracy and say that the earth is flat, for instance, fine. Say that, say the earth is flat. But should Facebook push people into groups that try to convince them that the earth is flat? That is where senators are now focusing their attention, and that's something that Republicans and Democrats can both get behind, because we're talking about Facebook's decisions as a company, over-promoting things, over-pushing things to people.

I sometimes think of this as the equivalent of soda companies. Should Pepsi, Coca-Cola, whatever, be allowed to sell their product? Sure, but should they place it in the hallways of schools, where elementary school kids are encouraged to buy it? Probably not; that's something we've moved against. We've said it's not a great idea to put it right there, accessible to small kids. We're focusing on the decisions Facebook makes as a business and what it recommends, rather than the decisions of individuals and what they post.
Brian: Do all these politics add up to any likely new laws or regulations that could actually be helpful? Even Mark Zuckerberg says, "Regulate me, please."
Sheera: Well, he wants regulation, but he wants a specific kind of regulation. He wants regulation of the first type I've described. If members of Congress go down that path and say, "Okay, here are very specific rules about what you must take down," that makes Facebook's life really easy, because all it needs to do is follow those rules. As we all know, the nature of misinformation and disinformation warfare is that it's always changing. If you make a rule saying, "Take down these five conspiracies," within weeks or months new conspiracies will arise which are not part of that list, and then Facebook will be able to shrug its shoulders and say, "Well, Congress didn't tell me to take that down."

"Take it up with Congress. It's their responsibility to tell me what to take down." This is a great model for Facebook, and I think it is something they have been open to because it's easy for them. I think it's very, very different if lawmakers do head down the path of, what should Facebook be recommending to people, what should its algorithms be pushing?
Brian: Well, I think we've established that Mark Zuckerberg is not Dr. Evil from Austin Powers, but short of that, there's a lot of room for debate. Sheera Frenkel, New York Times technology reporter and co-author of the book An Ugly Truth: Inside Facebook's Battle for Domination. We'll keep following your reporting. It's been fascinating, coming out almost day by day as you get to dig deeper into these Facebook Papers leaked by employees. Thank you so much for sharing this with us.
Sheera: Thank you so much for having me.
Brian: Brian Lehrer on WNYC, much more to come.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.