Rep. Malinowski On Regulating Social Media And Foreign Policy

(Noah K. Murray / AP Images)
[music]
Brian Lehrer: It's The Brian Lehrer Show on WNYC. Good morning, everyone. An interesting person in Congress meets an interesting development in digging out from the Trump era. The development, as some of you just heard on the BBC, is the news today that the Biden Administration is taking steps toward negotiating with Iran about rejoining the nuclear deal. The person is Democratic Representative Tom Malinowski of New Jersey, who has been elevated in the new Congress to Vice-Chair of the House Foreign Affairs Committee because of his unique experience in this area.
As the bio on his website puts it, Malinowski was born in communist Poland during the height of the Cold War. When he was six, he and his mother fled to the United States and he grew up in Central Jersey. Malinowski served as a senior director on President Clinton's National Security Council where he worked to end conflicts around the globe. He then served as the Chief Advocate for Human Rights Watch, where he led the bipartisan campaign to end the use of torture by the Bush Administration. Later, he served the Obama Administration as Assistant Secretary of State for Democracy, Human Rights, and Labor, where he helped lead America's fight for human rights around the world.
Now he's in Congress, and Vice-Chair of the Foreign Affairs Committee. President Biden is looking for a way to get back into the Iran nuclear deal and improve it at the same time, dealing with Iran's sponsorship of terrorism. Congressman Malinowski is also very focused right now on fighting terrorism at home after what he saw on January 6th. He's also been named to the Homeland Security Committee and is among those pressuring the social media giants the hardest to make it harder for violent extremism to spread. Congressman Malinowski, always good to have you on with us. Welcome back to WNYC.
Congressman Malinowski: Thanks so much. Happy to be on.
Brian Lehrer: Let's begin with terrorism at home, and let's begin with a little of your own story. I see in an NJ Spotlight story, I did not follow this during the fall, that you were the victim of a false conspiracy theory spread during your re-election campaign that resulted in death threats against you. NJ Spotlight refers to the National Republican Congressional Committee falsely alleging that you lobbied to protect sexual predators, which was enough to prompt so-called QAnon rants posted to various message boards that resulted in death threats against you. Can you tell a little more of that story?
Congressman Malinowski: Well, that's basically what happened. They chose that line of attack for obvious reasons. The central QAnon conspiracy is that there is this cabal of powerful people in government who are somehow engaged in protecting sexual trafficking or the kidnapping of children. It's a modern version of the old antisemitic blood libel. I think that we have evidence that they poll-tested that message in our district and then came out with an ad that drove it home.
The ad ran for more than a month, which is very unusual in political advertising to put out a message for that long. Eventually, it was noticed by the QAnon people themselves. Coincidentally, I was leading an effort in the House of Representatives to condemn QAnon. I was the lead sponsor of a bipartisan resolution condemning this horrific conspiracy mongering movement. The Q supporters put those things together and came after me.
Brian Lehrer: I see you have a bill designed to hold the tech giants accountable for their algorithms that wind up amplifying extremist content. How do you write a law to regulate algorithms, not just people?
Congressman Malinowski: We can't tell the tech companies how to design their networks, but we also don't have to protect them from liability under Section 230 of the Communications Decency Act, this old law that's gotten a lot of attention recently. Big tech companies like Facebook, Twitter, and YouTube are simply not legally responsible for anything that's posted on their websites, even though they write these algorithms to promote this content, to make it spread wildly across the internet.
These algorithms work in a very pernicious way. They know everything we do online. They follow every single click, every website that we visit, every video that we watch, every post that we linger over. They then feed us more intense versions of what we've already indicated we like, what we've already shown them by our behavior online that we fear, that we hate, that triggers our emotions.
They know that if they feed us more and more of that stuff, we will stay on their platforms longer. We'll see more ads, we'll buy more stuff. If you like cute kitten videos, you'll get more cute kitten videos. If you're susceptible to extremism, to antisemitism, to messages of hate, you will be fed more and more and more until you're deep down that rabbit hole. Our legislation basically says, if you do that, if you promote that sort of content with your algorithms and it leads to real-world violence, like what happened at the Capitol on January 6th, then you can be held liable for it.
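[Editor's note: a minimal sketch of the engagement-driven feedback loop the congressman describes, for readers who want to see the mechanics. Everything here is an illustrative assumption; the names, signals, and weights are invented for the example and are not any platform's actual code.]

```python
# Hypothetical sketch of an engagement-optimized feed ranker.
# All names, signals, and weights are illustrative assumptions,
# not any platform's actual algorithm.

from dataclasses import dataclass, field

@dataclass
class Post:
    topic: str          # e.g. "kittens", "conspiracy"
    intensity: float    # how emotionally charged the post is, 0..1

@dataclass
class UserProfile:
    # Running affinity per topic, learned from clicks, watch time, lingering.
    affinity: dict = field(default_factory=dict)

    def observe(self, post: Post, engagement: float) -> None:
        """Update the profile from one interaction (click, dwell, share)."""
        prev = self.affinity.get(post.topic, 0.0)
        self.affinity[post.topic] = prev + engagement * post.intensity

def rank_feed(user: UserProfile, candidates: list[Post]) -> list[Post]:
    """Rank purely by predicted engagement: affinity times intensity.

    Because the most intense posts on an already-engaging topic score
    highest, the feed drifts toward ever more extreme versions of what
    the user already reacts to, whether the topic is kittens or
    conspiracies. Nothing in the objective distinguishes the two.
    """
    def score(post: Post) -> float:
        return user.affinity.get(post.topic, 0.0) * post.intensity

    return sorted(candidates, key=score, reverse=True)
```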
Brian Lehrer: Ironically and maybe weirdly, Trump and Republicans who support him also want to repeal that Section 230 provision that gives the tech giants immunity from liability for things people do on their platforms. Here is Ted Cruz at a Senate hearing on this issue in December.
Ted Cruz: Do you really want to submit total control of the public debate to a handful of Silicon Valley billionaires, modern-day oligarchs with money and power and no accountability?
Brian Lehrer: Ted Cruz live from his beach chair in Cancun. No, I'm kidding, that was in December. To you, Congressman, the issue is protecting America from violent radicalization, most of which is coming from the right. To Ted Cruz, the issue was stopping the tech companies from coding a liberal bias into their algorithms that cancels conservative speech. Can one policy do both of those things?
Congressman Malinowski: It's very hard to argue with Ted Cruz and Donald Trump on this issue because they completely don't understand how Section 230 works. If Donald Trump and Ted Cruz got their way and we completely eliminated Section 230, which I'm not proposing at all, the tech companies would be even more eager to censor their posts, because suddenly they could be sued for any lie that Ted Cruz or Donald Trump tell on the internet if they're promoting some fake cure for COVID that people then take and die.
Then Twitter could be sued, Facebook could be sued, and so the likelihood of someone like Cruz being banned from Twitter would go up exponentially if he got his way and Section 230 was repealed. It's hard to debate with them because they're just so confused. Whereas what we're trying to do deals with the reality of how the law works. I'm not proposing repealing it. It's just a very targeted reform that says, "If you are promoting with your algorithms content that can be linked to an act of terrorism, to a violent hate crime, then you at least have to answer for it potentially in court."
What we're trying to do is incentivize them to change the algorithms to basically dampen the spread of extremist content, on the left as well as on the right, by the way. If you're conservative and you think that Antifa is the biggest threat to our country, I don't agree with you, but you should love my legislation, because the same thing I'm fighting is also pushing people on the left further to the extreme left just as it pushes people on the right further to the extreme right.
Brian Lehrer: Now, listeners, we can take some phone calls for New Jersey Congressman Tom Malinowski on how to fight domestic terrorism. Also, on the new Biden steps toward potentially rejoining or reforming the Iran nuclear deal, which we'll get to, and human rights and US foreign policy generally, 646-435-7280, 646-435-7280. You can tweet a question @BrianLehrer.
Picking up on what you just said, Congressman, there are also progressives, not just Ted Cruz, who worry about regulating social media speech too much. The focus today may be on the right, but eventually, it'll come back to where it tends to be in this country: criminalization of Black and brown people fighting for their rights, or your other main concern in general, human rights.
Congressman Malinowski: Yes. Look, I don't think the government can tell tech companies to regulate speech, and that's not what this is about. Look, I have the right to say anything I want under the First Amendment. I don't have a right to come on your show, right? I'm on your show because you invited me, and you have a right not to invite me. By the same token, I can say whatever I want, but I don't have a right to expect that Facebook, this giant powerful company, will take what I say and promote it to millions of people around the world using their algorithms.
There's a difference between freedom of speech and freedom of reach. I have the former, I don't have the latter. I have to earn the latter. All we're really saying here-- We're not telling Twitter or Facebook to ban anybody. Government has no business doing that. That is the private business of those private companies. All we're saying is that, if they are promoting content that leads to acts of terrorism because they deliberately designed these algorithms to foster divisions in our society, then they should at least answer for it in the way that any other company, including you, would have to answer in court if somebody were to allege that you were inciting violence.
Brian Lehrer: Désirée in Park Slope, you're on WNYC with Congressman Tom Malinowski. Hi, Désirée.
Désirée: Good morning. As you know, I'm the librarian, I say every time I call. Sir, algorithms are not sentient. It's not that the algorithm is promoting things that are negative, or promoting things that will radicalize people; algorithms provide you with more videos like the one you just watched. It's very basic.
The thing is, you actually should be focusing on something else instead of trying to censor people, which is what you're doing. You say that it's not what you're doing, but it is going to be the outcome. The outcome is going to be that anybody who is saying anything that anybody else could find offensive is the person who's going to be flagged and whose videos are going to be removed. An algorithm can't stop you from liking something that you already like.
Again, you gave the example yourself, you like kitten videos, it's going to show you more kitten videos. Do you know what else it does? If I'm struggling with my sexuality, and I watched a video about that, it will show me more of that. If I'm a woman who's going through breast cancer, and I watch videos about that, it's going to show me more videos about that.
Legislating against an algorithm won't just affect the videos that you don't like, it will affect all speech that could possibly offend people. Even right now, with the human beings who are on the monitoring boards for YouTube and Facebook, they are struggling with these issues as human beings because they don't have the same values. It's not an easy thing where you can just get rid of an algorithm or elect a board of people to look at videos and decide if they're offensive or dangerous or not.
It is more about information literacy of the users, and of the people who put videos on YouTube. Those videos don't pop up on YouTube by themselves. What you should be focusing on is the human beings, the actual live human beings who are putting the videos up, who are promoting that message, not just on YouTube, Facebook, Twitter, but also in other areas of the world. They always find that out after the fact. My point is, information literacy is the issue, not trying to make an algorithm turn into some kind of sentient being that can predict what people are going to be radicalized by.
Brian Lehrer: Désirée, thank you so much. Congressman Malinowski, what do you say?
Congressman Malinowski: I agree with all of that, I just don't think there's one solution to the problem. We absolutely have to invest in digital literacy, we have to do better at educating kids and adults in this country about how to discern truth from falsehood, but that's a 20- or 30-year effort in terms of the effect that it's going to have. At the same time, I do think there's a problem here with the way the social networks are designed.
As I've stressed, I'm not trying to get rid of algorithmic promotion, because there's a lot of it that we like, that absolutely exposes us to things that expand our understanding of the world. I love it when I listen to a piece of music on YouTube and it recommends some other piece of music I might never have discovered, but which I'm going to like. Our legislation doesn't stop that, it doesn't disincentivize that. It only incentivizes the companies to crack down on the aspects of their algorithmic promotion that lead to violence in our society, and they know how to do it.
Let me give you a really great example of something Facebook experimented with but discontinued. Last year, they took a subset of their users. They asked them to rate every post they saw on their newsfeed as either good for the world or bad for the world. The algorithm learned from what the users themselves said would be good for the world. It then gave them a newsfeed that emphasized things that they thought were good, rather than things that they themselves thought were bad.
It was a great, successful experiment because it resulted in what people at Facebook referred to as the nice newsfeed: there was less propaganda, there was less anger, there was less hate, there was more of what we love from Facebook, pictures of our friends, and vacations, and kids. There was just one problem with the experiment: that same subset of users spent less time on the platform.
They were getting more of what they thought was good, and yet, they spent less time on Facebook, because human nature being what it is, what really glues us to the screen is when we see things that make us angry. Facebook discontinued the experiment because what they're really interested in is keeping us addicted to the screen, so that again, we see more ads and buy more stuff. The purpose of my legislation is to incentivize those kinds of experiments becoming permanent.
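[Editor's note: for illustration, here is a hedged sketch of the kind of reranking experiment just described, blending the usual engagement score with a "good for the world" score learned from user ratings. The function names, parameters, and blend weight are assumptions made for the example, not Facebook's implementation.]

```python
# Hypothetical sketch of a "good for the world" reranking experiment.
# The blend weight and scoring functions are illustrative assumptions.

def rerank(posts, engagement_score, good_for_world_score, blend=0.5):
    """Order posts by a blend of predicted engagement and a predicted
    'good for the world' rating (learned from user surveys).

    blend=0.0 reproduces the pure engagement feed; blend=1.0 is the
    'nice newsfeed'. The trade-off described above is that raising the
    blend tends to lower time-on-platform, which is why the experiment
    was discontinued.
    """
    def score(post):
        return ((1 - blend) * engagement_score(post)
                + blend * good_for_world_score(post))
    return sorted(posts, key=score, reverse=True)
```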
Brian Lehrer: My guest is Central Jersey Congressman Tom Malinowski, now elevated to Vice-Chair of the House Foreign Affairs Committee and also placed on the Homeland Security Committee. His background, if you're just joining us, includes being an Assistant Secretary of State for Human Rights and Labor in the Obama administration.
Before we get on to some of the foreign policy stuff, that article that I mentioned about you in NJ Spotlight cites a Washington Post-ABC News poll that found 8% of Americans support the actions of the people who stormed the Capitol. I'm curious how you view that number. I could take it as pretty encouraging, just 8% would make that a very fringe group of militia members, supporters, and explicit white supremacists, not like Trump has a third of the country ready to start a new Civil War as some people feared. Then again, 8% would still be millions of people who could cause incredible carnage if they really act on it. How do you see that number?
Congressman Malinowski: Yes, glass half full, glass half empty. What really disturbed me about what happened on the sixth was that, okay, maybe it wasn't millions of people storming the Capitol, but it was thousands. This was not one of those cases where you had a lone gunman who walks into--
Brian Lehrer: Wait. Just to be accurate, was it thousands, or was it hundreds?
Congressman Malinowski: Well, I think around 800 people actually fought their way into the Capitol Building, but there were many more outside. We're used to horrific tragedies when it's one or two people walking into a shopping mall or synagogue or church and opening fire. It's bad enough, and those people are radicalized online, but maybe you can dismiss it as a mental health problem, somebody is deeply disturbed. Here, we had hundreds or thousands of people who looked like they were ordinary Americans.
All of them were absolutely sincerely convinced that this deviant thing that they were doing, rampaging through the US Capitol, smashing things, saying, "Hang Mike Pence," they were completely convinced that it was normal, that it was patriotic, that they were defending the Capitol, not destroying it. They were convinced that millions and millions of Americans supported them, that everybody supported them, because, again, they are, like many of us, living in a filter bubble online, in which everybody they interact with feels the same way and reinforces their anger, reinforces the lies that they have been told.
This was not happening 20 years ago, 30 years ago. That's something new and it has resulted in these extreme acts committed by larger and larger groups of people, and that's what we've got to do something about.
Brian Lehrer: Christian in Irvington, New Jersey, you're on WNYC with Congressman Tom Malinowski. Hi, Christian.
Christian: Hi, good morning, Brian. Thank you for taking my call. I wanted to point out something, from listening to your conversation about censoring, specifically the racial, class, and gender biases that already exist within social media platforms. For the last, I think one, two, three, four, five years, BIPOC, Black, Indigenous and people of color organizers, grassroots organizers, activists, have been constantly getting censored on Instagram; our accounts have been taken down or posts have been taken down.
We have been shadow banned. All because we challenge the status quo. We challenge aspects of colonialism, aspects of capitalism, aspects of the patriarchy that are really harmful, and when we speak out about it, our accounts get censored. I just found it really disturbing that we're talking about how we have to censor white supremacists, and just like, why haven't they been getting censored before now?
This just points out the biases that exist within the algorithm, because of who designed the algorithm, which was probably white people. It has been designed by cis hetero white people in a way that whenever their perspectives or their ideals get challenged by, I don't know, usually BIPOC organizers, we get censored and our accounts get taken down.
Brian Lehrer: What do you think is the solution in terms of regulation, if any?
Christian: Well, I think it's important to distinguish free speech from hate speech, how, as I feel it's been talked about here, one comment can really spread, can ignite violence. I think those are pretty obvious. I just think we haven't been able to have the conversation to decide on those things. The people that are making the decisions are usually people that don't have any different racial, gender, or class perspective that allows them to see beyond a white-centric perspective. To me, it could be addressed in a way that's more ethical, in a way that BIPOC organizers are not being shut down for challenging the status quo.
Brian Lehrer: Really, they need inclusive leadership in the tech companies.
Christian: Yes, I agree.
Brian Lehrer: Congressman, briefly on this, and then I want to get to Iran before we run out of time.
Congressman Malinowski: Well, there are two different issues. One is, the tech companies enforcing their own so-called terms of service, the rules that govern what can be posted and what can't be posted. That's a subjective and very controversial process, because they're deciding what can be taken down, what needs to be flagged, what needs to be censored based on the rules that they set.
The algorithms are a little bit different, the algorithms don't have any political bias. They just know what your bias is and they reinforce it. If you have a tendency towards racism, or antisemitism, the algorithm will feed you more and more content that reinforces those views. That's the problem that happens automatically. It's not somebody at Facebook deciding to promote white supremacy, or QAnon, or anything else. It's just something that is inherent to the design of the network. These are two different things. What gets censored is one question. What gets promoted by the algorithm is something different.
Brian Lehrer: Christian, thanks for your call, please call us again. Let's go on to today's news about Biden taking the first steps toward potentially rejoining the Iran nuclear agreement. This is relevant to you as new Vice-Chair of the House Foreign Affairs Committee, and to your past roles as Assistant Secretary of State for Human Rights under President Obama, and a leader of the group Human Rights Watch. Explain to people what Biden is doing, as you understand it, and tell us what you would like to see happen?
Congressman Malinowski: My understanding is that President Biden wants both Iran and the United States to return to compliance with the so-called JCPOA, the agreement that we and other countries signed with Iran during the Obama administration to stop Iran's development of nuclear weapons. The idea here is, Iran poses all kinds of threats to people and countries in the Middle East, conventional threats, support for terrorism, development of missiles, and all those threats will be easier to deal with if we don't also have to worry about Iran developing a nuclear weapon.
The Trump administration, of course, withdrew from the JCPOA. After a while, Iran itself started breaking its commitments, citing America's withdrawal as the reason. The idea here is for both countries to come back into compliance.
Brian Lehrer: It wasn't just Trump who opposed the original Iran deal. Chuck Schumer, for example, voted against it because he said it didn't do enough to stop Iran's aggression using conventional weapons in the region, and gave them more financial ability to do so by lifting economic sanctions. One of the Biden people's goals, I see from this morning's news reports, is to actually broaden the deal to not just limit nuclear weapons development, but also to get Iran to agree to stop making conventional warfare in certain ways and violating human rights and sponsoring terrorism in some of the ways that it does.
If the Biden people are talking like this, does it mean Trump and Schumer and others were always right about the shortcomings of the deal?
Congressman Malinowski: There were shortcomings of the deal. Again, the argument was that getting the nuclear question off the table, guaranteeing that Iran never develops a nuclear weapon, increases our ability to then challenge them on other things, because if Iran had a nuke, it would be able to deter us and our allies in the region from confronting it over these other issues.
I think that it's certainly fair, it's certainly true that the nuclear deal itself did nothing to address Iran’s support for Hezbollah, a terrorist group that is threatening Israel. It certainly did nothing to improve respect for human rights inside Iran. It's important to remember, the nuclear issue is not the only issue. There are many other threats emanating from this regime that are killing a lot more people right now in real time that we have to address. The Biden administration gets that. They want to come back into compliance, they want Iran to come back into compliance, but that's just the foundation for addressing these other problems.
Brian Lehrer: What leverage does the United States have to make Iran go further in a new version of the deal than we were able to do in the original version?
Congressman Malinowski: If both countries come back into compliance, there is a little bit of sanctions relief. Some of the sanctions that the Trump administration put into place do get eased, but there are still considerable sanctions on the Iranian economy that would remain in place. On top of that, even if there is sanctions relief on paper, I think most multinational companies, banks, et cetera, will be extremely wary of doing business in and with Iran unless they have assurance that the conflict between Iran and the outside world is resolved. Iran will still have a tremendous incentive to go further.
Brian Lehrer: Also in the Middle East, to me, one of the most underreported reversals the Biden administration has made from Trump is withdrawing support for Saudi Arabia's war in Yemen. Iran is on the other side of that. Many Yemenis are victims, many caught in the middle. Most Americans don't know a thing about this and couldn't find Yemen on a map. What's significant here from a human rights standpoint, and from the standpoint of the United States breaking with Saudi Crown Prince Mohammed bin Salman over the Khashoggi murder, after Trump and MBS were becoming close allies?
Congressman Malinowski: Yemen is the worst humanitarian crisis in the world. Human suffering on a scale that's hard to imagine, not just people killed by weapons of war, but suffering from malnutrition, the spread of cholera, and other diseases. Every side in that war is to blame. The movement that has taken over much of Yemen is certainly partly to blame, but so is Saudi Arabia, which has indiscriminately bombed civilians in Yemen ever since the war began, unfortunately with material support from the United States.
The Obama administration tried to cut it off. One of Trump's first decisions when he was elected was to restore arms sales to the Saudi military to enable them to continue this war. President Biden made absolutely the right decision to say we're not going to help you continue to bomb and starve the people of Yemen. It's a humanitarian calamity. It actually helps Iran because it keeps the Saudis tied down in this endless, unwinnable war. That's Iran's intention and they're succeeding.
I'm glad they're doing it. I'm glad that they're confronting Mohammed bin Salman more broadly. There's bipartisan support for this in Congress. We have been demanding it, Republicans and Democrats. We're expecting, in the coming days, the Biden administration to release a previously classified report that establishes who was responsible for the murder of Jamal Khashoggi, a Saudi journalist who lived in the United States. All of this is good. It'll set this relationship with Saudi Arabia on a more balanced and honest footing.
Brian Lehrer: Do you agree that one of the most threatening things about a potential second Trump term, if he had been reelected, was that the United States' alliances around the world would have continued to shift from being with other democracies to being with authoritarian-led regimes that happened to support Trump?
Congressman Malinowski: Yes. Trump's foreign policy was basically, "I like people who like me," that was it. He sensed that our traditional allies, Germany, the United Kingdom, Japan, South Korea, Australia, that they disapproved of him, and so he wanted nothing to do with them. I think he absolutely was very, very clear in his desire to withdraw American troops from Europe, in the defense of our NATO allies. He was trying to withdraw our troops from Japan and South Korea.
In his first term, he was surrounded by people like General Mattis and McMaster and even Mike Pompeo who restrained him. I think there's no doubt that in the second term, those people would have been gone, he would have been unrestrained, and our allies would have been Putin and Kim Jong-un, and Xi Jinping of China, not the democracies that have fought and bled with us for decades in defense of the values we share with them.
Brian Lehrer: Last question, what's the proper role of human rights in US foreign policy, as you see it today, with all your experience in that field with Human Rights Watch, and as Assistant Secretary of State for Human Rights? President George W. Bush said he was all about promoting democracy and human rights at the same time as defending our interests as a rationale for launching the disastrous Iraq war. Neither party wants something like that again, and sometimes upholding human rights in another country is against the interests of the United States, like when we've supported human rights violators, even pre-Trump, who supported us.
Can you make a single standard that doesn't leave us being hypocrites eventually as a superpower?
Congressman Malinowski: I think America is a country with a conscience. It's something that distinguishes us from every other major power in the world. I think it's a comparative advantage. It's one reason people around the world despite all of our failings, despite the last four years, still look to us to be a leader. If we turn our backs on that, we are turning our backs on one of our greatest advantages in the world.
Our true friends around the world are not governments in places like Saudi Arabia, or Iran, or China, or Russia, they're people who want a better life for themselves and their kids, who want to live in freedom, who want to live in countries where the rule of law is respected, where corruption is punished. The more we stand with folks like that, the stronger our position in the world will be.
Brian Lehrer: Central Jersey Congressman Tom Malinowski, now the Vice-Chair of the House Foreign Affairs Committee, and now placed on the House Homeland Security Committee. Thanks for joining us, Congressman. We always appreciate it.
Congressman Malinowski: Thank you so much, Brian.
Copyright © 2020 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.