BOB GARFIELD: From WNYC in New York, this is On the Media. Brooke Gladstone is away this week. I’m Bob Garfield.
This was the week that Facebook got caught committing – journalism?
MALE CORRESPONDENT: Facebook says its users determine what topics are trending, but new claims suggest that may not be the case all the time.
MALE CORRESPONDENT: Facebook's Trending News section isn't actually determined by a computer algorithm but by Facebook workers.
MALE CORRESPONDENT: What's trending on Facebook? Liberalism.
BOB GARFIELD: The tech site Gizmodo, based on whistleblower testimony, reported that the world's largest social platform has been manipulating its supposedly algorithmically-generated trending story feature with human intervention, either to make up for the tin ear of computer code or to inject political bias, or both.
MICHAEL NUÑEZ: According to one of our sources, conservative news was suppressed regularly.
BOB GARFIELD: Michael Nuñez broke the story for Gizmodo.
MICHAEL NUÑEZ: This particular curator kept a running list for the course of six months because he felt so uncomfortable with how frequently conservative news topics were being blacklisted. Another revelation that we discovered was that these curators are using something called an injection tool and, in those cases, they’re plugging news stories into Facebook's Trending News that aren’t naturally trending at all.
BOB GARFIELD: Facebook this week categorically denied using curators to advance an anti-conservative agenda, but it also denied having a so-called “injection tool.”
And then the Guardian newspaper released leaked Facebook documents instructing curators on such a tool’s use. So, at a minimum, the company has some serious explaining to do. The question is how much and to whom.
Republican Senator John Thune of South Dakota, chair of the Senate Commerce Committee, wrote to Facebook's Mark Zuckerberg, demanding accountability.
SEN. JOHN THUNE: If the news that’s coming through there which is represented to be objective and it's not, then I think the people deserve the right to know that. And so, it's just a question of having them clarify, you know, how they go about doing this, if, in fact, what is their policy and if their policy has been breached, what they’ve done to correct that.
BOB GARFIELD: Due to a quaint little loophole known as the First Amendment, of course, a publisher has no accountability whatsoever to Congress or any other institution of government, if no libel or crime has been committed.
Indeed, back in 2007, a prominent senator wrote, quote, “I know the hair stands up on the back of my neck when I hear government officials offering to regulate the news media.” That senator was John Thune.
The twist is that Facebook loudly claims not to be a news publisher but rather a platform, a vast, ubiquitous platform that routinely has business with legislators and regulators. It’s a conundrum, all right, and, despite the media tempest that broke out this week, it may also be a red herring. As we shall see in this segment, when it comes to social media as distributor of news, the gaming of trending stories lists may be the least of our worries.
Emily Bell, a professor at Columbia Journalism School and previously director of digital content at the Guardian, is watching, with some trepidation, a radical reordering of the media landscape. Emily, welcome.
EMILY BELL: Thank you, Bob.
BOB GARFIELD: This is all about Facebook, Google and Apple as arbiters of what we, the audience, will consume. But you can argue and, in fact, I think you have argued, that it really starts with the social media app favored by millennials called Snapchat.
EMILY BELL: Just over a year ago, this new app called Snapchat, which has a mere 100 million users, launched something called Snapchat Discover, and when you look at it on your mobile phone it’s a channel within the app itself. There are a dozen channels or so, and they’re taken up by brands like Cosmo and Vice and The Daily Mail and various other media brands, and within that you have these stories.
BOB GARFIELD: They’re not linked to the publishers’ own webpages.
EMILY BELL: Exactly.
BOB GARFIELD: They reside within the app.
EMILY BELL: Yes. This is the first time we've seen a social app say, right, we’re going to allow publishers to reach this vast new audience, but it’s much better for the users and for the publishers if the stories live here rather than linking out. And that kicks off a whole chain of events whereby other competing platforms, notably Facebook, really revved up this idea that they would follow suit and get publishers to publish within their apps, as well. If you’re reading news on Facebook and you go to an article and it loads very quickly and it looks very nice, you’ll barely notice it, but if you look carefully, you’ll notice that you haven’t left the Facebook environment. And this is new. This is Facebook Instant Articles.
Then we saw Apple News launch something very similar, and then we saw Google launch something called Accelerated Mobile Pages.
BOB GARFIELD: The uproar over the Facebook trending revelation seems to assume that human intervention is corrupt and algorithmic decisions are neutral and pure. But [LAUGHS] EdgeRank, which is Facebook's algorithm, is itself biased by design, not politically but to serve users’ interests and Facebook's own.
EMILY BELL: If you regularly consume your news just through Facebook, you will see a repetitive nature to the kinds of stories that you get shown. And that’s a way in which the algorithmic sourcing, if you like, sort of gets rid of serendipity; it doesn’t really know quite how accurately to guess at what we’re thinking or what it might be good for us to see next. So we live inside this thing that the entrepreneur and web thinker Eli Pariser coined the “filter bubble.”
BOB GARFIELD: Now, it was sneaky about it and disingenuous, and I guess it mislabeled trending topics as having somehow been algorithmically organic, as opposed to editorially enhanced, let's say. But, if I understand what you're saying correctly, in terms of fear and outrage about what Facebook represents, focusing on the trending box is the right church but wrong pew.
EMILY BELL: You know, I think that it is ironic that Facebook is now at the eye of this media storm. It’s exposing in several ways because it means that, first of all, in our popular imagination we are thinking that trending topics are somehow fair because they are automatic and they’ve been decided by mass. But, as any good statistician will tell you, [LAUGHS] there are biases in statistics and numbers and the algorithms, as well.
But what it has exposed is that people haven't really been cognizant enough of what the overall kind of influence and possibilities are of having a news ecosystem now which is effectively dictated by the behaviors of these very big platform companies. And, in a way, it's an excellent piece of news that we are at least now having that debate, even if we’re having it on, if you like, a slightly faulty footing.
BOB GARFIELD: And certainly, the influence of the Facebook newsfeed is not to be sneezed at. Can you tell me about the 2010 voter turnout experiment?
EMILY BELL: Voters were shown messages to see whether or not, just by reminding people to vote, if you like, there was any impact at all on voter turnout, and there was a detectable uptick. Now, you might say, well, that’s pretty straightforward and uncontroversial.
Now, if Facebook is able to prompt and increase voter turnout but we don't know whether it’s doing that holistically or whether it's doing it partially, then that obviously may well have political consequences. If somebody puts an advert on TV telling us to go and vote, there are contracts between broadcasters and political advertisers that we can scrutinize and see where the money is spent. On Facebook, all of that is obscure. And so much of the brouhaha this week has been about this opacity.
So, in many ways we’re not talking about particularly new problems for gatekeepers, but we are talking about a very new problem, in terms of being able to interrogate, you know, how these decisions are arrived at and if there’s any accountability for the actions that these companies take. You know, all of this language and the ways that these companies have framed this behavior have created this expectation of somehow sort of equality of access, and now it appears that that’s not actually the case. And, of course, if we think about it for two seconds it could never have been the case.
BOB GARFIELD: I have to ask you, considering the influence that Facebook has, presumably on elections and everything else, what scenario for the future in publishers’ relationships with social media for you looks most ominous?
EMILY BELL: What feels most problematic at the moment is just this lack of transparency. So we already have this odd relationship where news companies that ought to be holding the power to account are actually now subsumed in a really extensive new system of power. So we’ve got to kind of solve this conundrum in a way by agreeing that there has to be a greater level of transparency, because I think without that you really do start to lose the essence of what makes democracy function well. And I think, you know, if we don't grasp the moment and maybe make some interventions, then it runs the risk of calcifying into a set of new practices, which are actually even more obscure and potentially more corrupt than what we’ve had in the past.
BOB GARFIELD: Emily, thank you so much.
EMILY BELL: Thank you, Bob.
BOB GARFIELD: Emily Bell is a professor at the Columbia Journalism School.