How Wikipedia May Be the Antidote to Trumpism

[theme music]
Matt Katz: It's The Brian Lehrer Show on WNYC. Welcome back, everybody. I'm Matt Katz, keeping the big seat warm for Brian today. We'll end the show with a look at Wikipedia. Opinions on the site have long been divided. Some say it makes information more accessible. Opponents argue that because anyone can edit it, it's not a reliable source. It is true that anyone in the world can edit the online encyclopedia, but supporters claim that this strengthens the website's credibility. A free source of factual information sounds like it would benefit everyone, but this is an age, as we know, where facts are becoming increasingly political.
Some people on the right, like Elon Musk, have accused Wikipedia of left-leaning bias. Recently, the Heritage Foundation, the organization behind Project 2025, threatened to leak the information of individual editors over their contributions to articles on Palestine. With Wikipedia in the far right's crosshairs, we'll take a look at how the site works, who it's run by, and why it might prove pretty resilient in the age of Trump's attacks on news sources. Joining us now is Margaret Talbot, staff writer at The New Yorker. Her latest piece is titled 'Elon Musk Also Has a Problem with Wikipedia.' Margaret, welcome back to the show.
Margaret Talbot: Thank you. Nice to be here.
Matt Katz: For those who aren't familiar, can you just give us a quick summary of how the Wikipedia editing process works? What goes on there?
Margaret Talbot: It's pretty remarkable because, in many ways, you could say that it really functions as the Internet dreamers hoped the Internet would. It's collaborative, it's democratic, it's crowdsourced, it's run by volunteers. The content is not monetized in any way. Now, you might think that would result in all kinds of information and misinformation ending up on the site. But in fact, it works remarkably well to gatekeep. The way it does that is through its volunteer editors, through a real commitment to what they call a neutral point of view, which is something you would associate with an encyclopedia, and by transparency.
They really have an ethic of transparency. Anybody, you or I, could go up today and edit an article on the website. Maybe it would be the article about us or about WNYC or whatever. That wouldn't stay up if it was not accurate. Eventually, the volunteer editors would see it. If it's a big, important subject, they'll see it pretty quickly. If it's more obscure, it might take a little while. They will ask for sourcing on it and they will check the sourcing against their standards and they will delete it if it's not correct or not adequately sourced. This editing process is going on all the time.
What I found really remarkable and did not actually realize before I wrote this piece is that there are two tabs at the top of every Wikipedia article. One is Talk and one is History. If you go on Talk, you can see all the discussions between editors about what to delete, what to add, what to correct and why, and all their debates about sourcing, debates about tone.
It's really, really transparent how these articles are made. Then, in the History section, you can see the actual logged history of every revision that's been made to an article. If you're curious about what goes on behind the scenes, it's really not that behind the scenes at all. At Wikipedia, it's very easy to see and to check and evaluate yourself.
Matt Katz: I actually edited my own page to clarify something, but I followed the protocol and did not edit it directly, because you're not supposed to edit your own page. This is all an effort to make it as accurate as possible. I just went into the Talk section, and somehow there are editors who just monitor these things. I don't have a very prominent page by any measure, but they still came across it, and then they fact-checked my suggested changes, and then they implemented them. Really just a remarkable system. Then you wrote that an article's reliability actually increases depending on the page's popularity. Can you explain that?
Margaret Talbot: Occasionally, you'll look up something, some obscure '80s punk band or something, just to name something I might have looked up recently. It will say somewhere in the article, I forget the exact wording, but it's something to the effect of, this does not meet our Wikipedia sourcing standards. It used to say, this article is a stub, but sometimes it'll say this article is incomplete or inadequate for some reason. That's an alert both to you, as the reader, somebody going to look up this information, and also to the editors that it needs improvement.
That's one of the things that editors do: look for these and then try to figure out what the problem is. It might just be that there is a statement made that isn't sourced, or is sourced to a source that Wikipedia editors generally consider unreliable. They'll look and see if there's a better source. Those more obscure articles take a while for people to come around and find. If it's an article on, let's say, Elon Musk or an article on Donald Trump or Kamala Harris, it's going to be looked at and corrected constantly.
Matt Katz: Right. You did write in your piece in The New Yorker that Wikipedia is in almost every aspect the inverse of Trumpism. That's not a statement about its politics, as is probably obvious from all of these verification processes you've described. Can you expand on that a bit? Why do some people refer to the site as the last good place on the Internet, and why is it an inverse to Trumpism?
Margaret Talbot: A lot of it is this transparency I'm talking about, so that it's very easy not only to look at these Talk and History tabs and see how an article was made, but also there are many, many pages on the site describing Wikipedia's policies and practices, and so it's quite easy to find out how they work. It's not purposefully obfuscated at all. Also, there's a real dedication to using reporting from reliable news sources, the ones that Musk and Trump deride as legacy media propaganda.
Then there's also just the fact that this administration has been so focused on cutting off reliable sources of information, whether it be closing down Voice of America, or erasing or deleting information on government websites about vaccines or reproductive health or African American and LGBTQ history, or today, apparently trying to close down the Institute of Museum and Library Services, which is an agency that supplies a lot of the funding for libraries in states across the country, or firing the head of the National Archives. All of these are part of a pattern.
A place like Wikipedia or the libraries and archives that are trying to preserve information, or the news organizations that are trying to report reliably and accurately, are all bulwarks against this assault on information.
Matt Katz: Listeners, have your perceptions of Wikipedia changed along with the political climate? Are you more or less likely to use the site or support the website with your dollars, because they're always looking for donations? We want to hear from you. Call or text us at 212-433-WNYC, 212-433-9692. Are there any Wikipedia editors out there? Give us a call. 212-433-9692. Margaret, Katherine Maher, the CEO of NPR, has been in the congressional hot seat of late. She's accused of having a liberal bias, but Maher also used to be CEO of the organization that oversees Wikipedia. Are the attacks on Wikipedia and NPR and her connected in some ways?
Margaret Talbot: Probably in the sense that there is this general hostility to reporting and the media that Musk and Trump have expressed very pointedly. In that sense, they're linked. I think it's worth mentioning that the Wikimedia Foundation is in many ways pretty separate from Wikipedia. It collects the donations, but volunteers run the site and are not paid. The Wikimedia Foundation maintains the site, and it gives grants for improving technology, outreach to editors, and various other initiatives to improve the running of the site, but it doesn't really have anything to do with the content. I think that's worth pointing out.
Matt Katz: The Heritage Foundation has attempted to expose personal information of the website's users. The Forward broke this story back in January. What was going on there?
Margaret Talbot: There has been this quiet, and getting louder, rumbling on the right about Wikipedia for a while, which Elon Musk has kind of given voice to. I think one place it comes from is that Wikipedia maintains a list, again, accessible on their site, easy to find, called Reliable Sources/Perennial Sources, where they evaluate the accuracy and reliability of sources that they typically use. There has been a complaint on the right that there are more conservative sites included in the category that they consider questionable or unreliable than sites on the left.
That is true to some extent. There may be a few more, but there are also sites on the left. For example, Fox News and Democracy Now! are in the same category on their site, which is not that you can't use it or rely on it, but that you have to approach it with caution. It's probably only reliable for certain kinds of reporting. For example, they have distinguished between The Sun and The Telegraph, which are both conservative British newspapers. The Sun, they say, is not reliable. The Telegraph is.
They're not actually looking at the ideology, I do not believe. They're looking at: do they have good fact-checking? Are they highly partisan? Do they use paid content? Are they reliable reporters? Are they passing on false information on a regular basis? Those are the categories they use. However, there is this sensitivity to it on the right. The ADL, the Anti-Defamation League, was recently put on that list of sources considered generally unreliable, specifically for matters having to do with Israel and Palestine, not in general. That angered some people, including these Heritage Foundation people, and that has been a source of controversy.
Matt Katz: We're going to take a call. Carolyn in Tarrytown. Hi, Carolyn. You use Wikipedia quite often, yes?
Carolyn: Yes, I do. I use it both for research. I am a visiting researcher at the Graduate Center, CUNY, and I was actually taught how to use the talk pages within a degree program there for a certificate in interactive technology and pedagogy. One of the things I teach my students always is to really get to know their way around the talk pages, and it can be a quite reliable source. Recently, I actually donated $10 to the Wikimedia Foundation because I just think it's a worthy cause.
Matt Katz: Does it help you point you in the right direction? Because there are footnotes on the page. You can get information, and then it sends you to primary sources, I imagine, right?
Carolyn: Yes, there are references that lead to other sources, and it's the best way to fact-check the volunteer editors, to see if they're really finding areas of specificity that are accurate in your field, the topic that you're researching. I don't rely on it for the references as much as the narrative and the overview, which I do find quite reliable, and that is different as of the past five to seven years. It used to be quite all over the place before that.
Matt Katz: But you found it's gotten more reliable. Interesting. Thanks so much for calling, Carolyn.
Carolyn: Sure.
Matt Katz: We have a couple of minutes left here, Margaret, and I was hoping you could just tell us about this case you wrote about in the piece with the quote, "much debated, stiff-armed salute by Elon Musk." Fascinating how this all played out. Can you tell us that story?
Margaret Talbot: Musk has had some beef with Wikipedia in the past. I mean, it may be that he would like to buy it like he could buy Twitter, but it's not for sale. He didn't like something in his article that he thought downplayed his early role in Tesla. He said things like, "I will give them $1 billion if they change their name to Dickipedia," which he tweeted in 2023. He's had that grudge for a while. He did not like the fact that the article described the gesture that he made at the inauguration.
Obviously, any article about Musk and particularly Musk's current political role is going to mention that there was a controversy about that. I challenge anyone to look at the description in the Wikipedia article. It physically describes the gesture he made, describes what many people said about it, and then has Elon Musk denying that.
Matt Katz: Right.
Margaret Talbot: I think it's a pretty fair presentation. He apparently did not think so and tweeted to that effect.
Matt Katz: Yet, as you point out, this was down in the article. It wasn't like the lead thing. It was within that context. Now the stiff-armed salute has its own Wikipedia page. Is that right?
Margaret Talbot: Yes, which often happens with Wikipedia, where there's a spin-off to a specific controversy or subcategory.
Matt Katz: Margaret Talbot is a staff writer at The New Yorker. Her latest piece is 'Elon Musk Also Has a Problem with Wikipedia.' Margaret, thank you so much for coming on the show.
Margaret Talbot: Thank you, Matt.
Matt Katz: I'm Matt Katz. This is The Brian Lehrer Show. Thanks for listening, everybody. Stay tuned for All Of It.
Copyright © 2025 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.