BROOKE GLADSTONE: Does the Internet cocoon us with like-minded people or do the hours we spend online expose us to more points of view? In his new book The Filter Bubble: What the Internet is Hiding From You, author Eli Pariser contends that the Internet is actually forcing us into echo chambers, that it looks at our Google searches, our emails, our Facebook posts, studies the links that we click on and decides what we want to know.
Even a small search yields different results for different people, says Pariser, as when two people he knows searched “BP” during the oil spill.

ELI PARISER: And one person saw information about the oil spill - what you can do about it, the environmental consequences - and another person saw stock tips - here’s how the BP stock is doing, here’s investment information. You know, this could lead to some pretty bad decisions.
And the thing is that it’s not just Google that is doing this now. You know, it’s happening on Facebook, it’s happening on Yahoo! News, and increasingly it’s happening on the front of major newspapers. The New York Times now has a “Recommended For You” section that employs these same kinds of algorithms.

BROOKE GLADSTONE: Google was supposed to be the ultimate democratizing source of information, elected by the world.

ELI PARISER: Right, and if you look at Larry Page, one of the Google founders, and what he writes about the PageRank algorithm that Google started with, it was explicitly this kind of democratic project. Let's have the Web vote for which page is the right page.
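[Editor's note: The "web as a vote" idea Pariser describes here can be sketched in a few lines of code. This is a simplified power-iteration toy, not Google's production algorithm; the example link graph is made up for illustration.]

```python
# Toy sketch of the PageRank idea: each page "votes" for the pages it
# links to, and rank flows along those links until it stabilizes.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # split this page's "vote" among its links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Three pages voting with their links: both 'b' and 'c' link to 'a',
# so 'a' accumulates the most rank.
ranks = pagerank({"a": ["b"], "b": ["a"], "c": ["a"]})
```

The contrast with personalization is that this score depends only on the structure of the Web itself, not on who is asking.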
And Google’s really moved away from that and they've moved toward this kind of personalized user behavior-driven way of presenting information, which is pretty good if you’re searching for pizza. But when Google is the way that a lot of people experience information and news, you know, I think it’s a bigger problem.

BROOKE GLADSTONE: Now, you’re a former director of MoveOn.org. You’re a liberal. And you've noticed that your conservative friends have sort of been disappeared from your Facebook newsfeed, right? I have two questions. First of all, you have conservative friends? [LAUGHTER] And second of all, they've been disappeared?

ELI PARISER: I actually went out of my way to like meet and befriend and Facebook-friend people who I thought were interesting, who had really different viewpoints.
And then I logged on one day and I noticed that, oh, you know, I haven't seen any of these people in a really long time. And what Facebook was doing was essentially looking at who I was actually clicking on and which links I was actually clicking and they were saying, yeah, you know, we know you say you’re interested in these people but actually we know you’re interested in the more progressive links and, frankly, like the entertaining stuff. We know you want to see videos of cats. [BROOKE LAUGHS] And, so, you know, they -

BROOKE GLADSTONE: And do you?

ELI PARISER: And I do, yeah, and I still click on them, I'll admit.

BROOKE GLADSTONE: [LAUGHS] Now, how in general does Facebook work to keep us on Facebook?

ELI PARISER: For example, they know that if you’re a 30-something woman and you see that your female friends have uploaded pictures of themselves, you’re likely to upload a picture of yourself in the next month. And they know that if you do that, that your male friends are very likely to comment on that picture, and they know that if your male friends comment on that picture, they're likely to stay on Facebook for months to come.
And so, what Facebook does, according to one person I talked to there, is they actually kind of run that in reverse. They say, oh, this guy looks like he’s kind of getting bored of Facebook. Let's find one of his friends, show her pictures of her friends that they've uploaded so that she uploads a photo so that he comments on it so that he stays on Facebook more.

BROOKE GLADSTONE: Diabolical!

ELI PARISER: [LAUGHS]

BROOKE GLADSTONE: Now, none of this personalization would have been possible if we were completely anonymous online. And you note in your book that there is a greater and greater tendency towards less anonymity.

ELI PARISER: There’s this race going on to compile the most complete profile of each person possible. You know, Acxiom, the sort of 800-pound gorilla in that industry, has about 1500 points of data on each person. They had more data on the 9/11 hijackers on September 12th than the FBI did.
And they're mostly just trying to provide companies with data that they can use to do these kinds of targeted mailings, but increasingly the same kind of data is being used to personalize what you see and the content that you see.

BROOKE GLADSTONE: This is really, really bizarre. You recently got a friend request from an attractive young woman who didn't exist.

ELI PARISER: That's right. I was looking at my Facebook friend requests and there was this woman who, you know, was sort of a buxom young woman in the little icon. And I clicked on it, I'll admit, and I realized that there was something weird with her face. It took me a minute to realize that it was actually a computer-generated image that had been generated by a company that was basically trying to use these, essentially, people who were ads, advertars, to get my personal information. As the filter bubble gets stronger and stronger and it’s harder for companies to just blanket and broadcast with their products, you’re gonna see them building little code robots to run around, befriend people, and then try to get them to give data. [LAUGHS] This is the future that we're walking into.

BROOKE GLADSTONE: So Eli, what should we do? What should Internet companies do?

ELI PARISER: Well, I think for the companies, we need to call on them to really get serious about taking responsibility for this enormous amount of power that they have. They like to pretend, you know, that they're just doing our bidding, they're tools. Nothing to see here. That was kind of cute when they were insurgents and Mark Zuckerberg was 22, and now that he’s 26 and Facebook is - [OVERTALK]

BROOKE GLADSTONE: He has to take some responsibility. [LAUGHS]

ELI PARISER: He has to take some responsibility. You know, what that means is building algorithms that don't just show you the things that you click on the most, but that show you the things that we need our media to show us, which is the things that are important.
BROOKE GLADSTONE: You’re asking Facebook to social engineer our view of the world.

ELI PARISER: Facebook already does social engineer our view of the world. It just does it based on one very narrow variable, which is “Likes.” Facebook can make a choice, and they can choose to either kind of give us sort of a likeful, happy, positive Facebook world or they can actually step up and say, we're providing information to, at this point, nearly a billion people, and we need to take that seriously.
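[Editor's note: The single-signal ranking Pariser criticizes here, and the multi-signal alternative he proposes a moment later in the conversation, can be sketched as a toy scoring function. All post data, field names, and weights below are made up for illustration; this is not Facebook's actual algorithm.]

```python
# A feed ranked only on "Likes" versus one that blends in an
# "Important" signal, as Pariser suggests.

def score(post, like_weight=1.0, important_weight=0.0):
    """Rank a post as a weighted mix of engagement and importance votes."""
    return like_weight * post["likes"] + important_weight * post["important_votes"]

posts = [
    {"title": "Cat video",       "likes": 900, "important_votes": 10},
    {"title": "Famine coverage", "likes": 40,  "important_votes": 500},
]

# Likes-only ranking puts the cat video first...
likes_only = sorted(posts, key=score, reverse=True)

# ...while blending in "Important" votes surfaces the news story.
blended = sorted(posts, key=lambda p: score(p, 1.0, 2.0), reverse=True)
```

The point of the sketch is that the ranking is a design choice: changing one weight changes what a billion people see.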
For example, you could have Facebook add an “Important” button, so you'd have “Like” and you'd have “Important.” And what shows up in the newsfeed is some mix of those two things.

BROOKE GLADSTONE: And “Dislike?”

ELI PARISER: And “Dislike.” You know, that would be great. The quote that really got me thinking about this with Mark Zuckerberg was he said, “A squirrel dying in your front yard may have more relevance to your interests right now than people dying in Africa.” And that’s true, but if all you’re showing people is the squirrel, you know, you’re going to have a real problem as a society. [LAUGHS]

BROOKE GLADSTONE: Eli, thank you very much.

ELI PARISER: Thanks so much for having me on.

BROOKE GLADSTONE: Eli Pariser is the author of The Filter Bubble: What the Internet is Hiding From You.