Eli Pariser, the former executive director of moveon.org, is concerned about how Google gives us customized results that may put us in "news silos" and reinforce our world views. His new book, The Filter Bubble, takes up this concern, claiming that Google is no longer an "enormous library" (don't worry, it still is) and that social media in particular is cradling us in our particularist worldviews and keeping us in an ideological cocoon (yes, it is, but people debate far more on Facebook or Google than you ever could on moveon.org).
In an interview on amazon.com, he sums up his problem with Facebook's ideological cocooning:
Mark Zuckerberg perfectly summed up the tension in personalization when he said “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” But he refuses to engage with what that means at a societal level – especially for the people in Africa.
See the problem? While rightfully exposing the filtration problem of Facebook, which we can easily recognize (people choose their friends and colleagues; they tend to stay in a comfort zone of ideological compatibility), Pariser intrudes with his own notion -- that "we must" somehow care for the people of Africa; that these people are some sort of unified mass with some objective set of problems we "must" sympathize with -- that if we care about a squirrel in our front yard and not the people dying in Congo over cell-phone metals, we must be callous white northern Ice People. Well yes, all these associations are indeed underlying a seemingly innocuous phrase like "refuses to engage with what that means at a societal level."
Erm, how would you have him engage? Be fed moveon.org or PCCC tracts every morning in email so he can care more about Africa than a squirrel? But which Africa? By whom? About what? And what will the test of "engagement" really mean, at the end of the day? Coming round to a full-fledged "progressive" set of views like Eli Pariser has? At the end of the day, the people in Africa care more about an animal in their front yards than those dying all around them, too.
So, I suspect he is seized with this filtration concern because he sees it as an obstacle to converting "the masses" to his own "progressive" world view. He says in the amazon.com interview, in a clear instance of Betterworldism:
I’ve always believed the Internet could connect us all together and help create a better, more democratic world. That’s what excited me about MoveOn – here we were, connecting people directly with each other and with political leaders to create change
But that more democratic society has yet to emerge, and I think it’s partly because while the Internet is very good at helping groups of people with like interests band together (like MoveOn), it’s not so hot at introducing people to different people and ideas. Democracy requires discourse and personalization is making that more and more elusive.
And that worries me, because we really need the Internet to live up to that connective promise. We need it to help us solve global problems like climate change, terrorism, or natural resource management which by their nature require massive coordination, and great wisdom and ingenuity. These problems can’t be solved by a person or two – they require whole societies to participate. And that just won’t happen if we’re all isolated in a web of one.
Again, see the inherent problem here with the demand of progressives not to leave media alone, not to leave it free, not to allow the marketplace of ideas to function, but to get it to *live up to that connective promise* by *connecting a certain way*. The Norwegian terrorist Anders Behring Breivik used the Internet to live up to that connective promise too -- only he connected up to the English Defense League and various hate sites, some of which found him too weird even for their extremities. Yet Pariser is confident that the Internet as a machine can be tweaked to produce *good* connections that will enable "us all to work together" on urgent problems like climate change.
While Pariser seems to like the idea of the Internet "introducing people to different ideas," what he wants is a very definite "progressive" result based in hard ideological precepts: they *must* come to a recognition that climate change is a danger; they *must* work together for "natural resource management" -- planned socialist methods and outcomes (to be cloaked, of course, first in saying it is merely about "a conversation" and then about "alternatives" and then after that about "reforms").
His frustration with the cadre-run top-down non-social-media organization moveon.org is palpable (I don't know the story of why he left his position as executive director, but probably because it could never attract more than a few million people or...27,000 answers to polls...and so he decided to look for how he could fry bigger fish.) (I'm a big critic of moveon.org and have been since I was initially a supporter, in participating in the campaign to oppose the impeachment of Bill Clinton in the early days of email.)
In a review of Pariser's new book by Luke Allnutt of Radio Free Europe/Radio Liberty's Tangled Web blog, Allnutt explains how Google "knows" his wife is expecting a baby and serves him up baby stuff -- maybe he and his wife will click and buy. Makes sense. Google is basically a giant ad agency, with a loss-leader called "search," and Wikipedia as its unpaid serf.
But this is creating some kind of "news silo" that keeps us tethered to our world views, says Allnutt, describing Pariser:
The consequences of this social engineering, Pariser argues, is that we interact more with people who think like we do. Rather than fulfilling the early Internet dreams of diversity and freedom of choice, we are living in an echo chamber. As a result, there’s less room for “the chance encounters that bring insight and learning.” Where once we had human news editors who would temper the Britney coverage with a foreign war or two, now algorithms select our news for us based on what we click on and what we share.
Actually, I'm more inclined to think that rather than "keeping us in an echo chamber," the views of the founders of Google -- and the new social media platforms -- are seeping in through their platforms. We now have human editors with bots and algorithms -- and they've set it up so that we get Wikipedia entries on every search.
Allnutt takes the approach that there "isn't any scientific proof" that what Pariser is saying is really the case (nobody wants to think they're a zombie because they get custom search results). He cites a new Pew report, “Social Networking Sites and Our Lives,” that says there is no relationship “between the use of social networking services and the diversity of people’s overall social networks.”
He also chides Pariser (yes, cool-hand Luke is capable of chiding, just not Anonymous) for wearing the rose-coloured glasses of an "Imagined Analogue Past," when "things were better before the Internet." If anything, I find the "progressives," especially gurus like Clay Shirky, making up very customized notions of the past to fit their theories of media. Take Shirky's fond analogy of the TV age to the industrial age, where supposedly everyone was drunk from readily-available gin carts because of the monotony of their machine jobs -- and then eventually (somehow, despite being drunk and risking life and limb on that machinery!) they made their way to the post-war boom of suburbia and roads and cars, only to become zombified by the TV set and acquire "surplus value," which they are now, um, accessing for a "better world" because they can interact on social media.
Luke is only guilty himself of anti-anti bias here because he imagines the past as a time when everyone was solidly in their view-silo, fed by their information-silo, with its limited print form. That strikes me as entirely ahistorical, because he is forgetting (or perhaps never saw, if he was too young) how debates used to work. Alcove 1 and Alcove 2 debaters couldn't endlessly link to factoids on the Internet, in selective and specious ways. They couldn't Fisk, as they were in verbal and not textual warfare. They had to debate face to face with their wits. I wasn't in Alcove 1 or 2, although I would encounter their denizens and political descendants in New York in the 1970s and 1980s. But I remember the book tables in Sid Smith hall at the University of Toronto, my alma mater, which were perhaps a milder version. Perhaps this was among my earliest exposures to how a variety of ideologies could be presented.
There was the Spartacus League; there was an anti-war movement; there were supporters of Cesar Chavez and the lettuce boycott; there was the Newman Club. I remember myself, as a member of a campus prayer group, manning for a time a table of religious literature -- it was a mix of C.S. Lewis Christianity and evangelical tracts. Naturally, people fought fiercely for their particular ideology or cause -- books, posters, flags, heated debates. People handed out flyers and tracts. This was the little free speech zone that probably administrators had supplied to accommodate some angrier student protest of the previous decade.
But what was sacred, underlying this book-table hall, was the notion that there *was* a hall. There was a substrate, a place open to any ideology (although it was too early in 1972 for LGBT to venture out safely and nobody would have permitted the Nazis to set up a table). It was understood that tolerance of the public space, pluralism, was the tabula rasa of this proliferation of causes -- most of which were on the hard left, because those were the ones that bothered to set up book tables. Sitting at my table with the C.S. Lewis and batches of Campus Crusade for Christ tracts, I knew that I was "the enemy" to the Spartacus League guy, but that he wouldn't knock over my table or be able to have me removed. The Save the Whales people may have thought that Spartacus guy was a crank, but they ignored him. And so on. All of us were appealing for the limited attention of hurrying students, mainly in the sciences, who were rushing to class or finishing up last-minute homework assignments or flirting, and were mainly uninterested in having a better-world pamphlet pressed upon them. Even so, the principle was clear: the pamphlet about religion as the "opiate of the people" could be countered by the pamphlet about the saving power of Christ.
Eli Pariser is on Google+, and is flogging his book and his ideas there (as everyone is), and today referenced this article.
My post:
Eli, I am among those people who have noticed this "customizing" of Google for years and have always been annoyed and concerned about it. I've always felt that it would lead to uniformity of political views as the developers of the platforms inevitably bake their views into the tools -- the opposite of what you appear to fear, which is that too many people will retain diverse views of the world instead of "the right" view.
Indeed, I often go on other people's computers to do searches, or chat with friends in Second Life and coordinate searches among different people in different areas to see what comes up. Going on Bing also reveals this problem, as entirely different results come up, particularly on controversial political topics. Example -- type the key words around a Tea Party demonstration into Google and they don't appear at all; type them into Bing, and they do. (I don't cite this as a supporter of the Tea Party, but as an example of how searches get handled due to the dominance of Wikipedia as the fulcrum.) Type my Second Life avatar into Google, and get the hate pages; type it into Bing, and there are no hate pages, even though I haven't done a thing about them in terms of any "SEO". Type "Communist Party" into Google, and on the first view, get a page that sells t-shirts. On Bing, in the same spot is something about Canada's Communist Party. And so on. You could play this game all day.
However, I would like to ask whether the reason this phenomenon troubles you so much is because you don't actually believe in pluralism and a variety of viewpoints and approaches to different issues, like many "progressives". In fact, you may believe that there is some "scientific" or "rational" way of delivering search results so that people will see "the truth". The Google problem bothers you precisely because you cannot "reach the masses" with the truth you wish to deliver, but find them atomized. Really, much depends on whether you believe the truth is absolute but man is imperfect and there may be many different approaches to it, or whether you believe that because the truth is absolute, one approach will be highlighted and now modern Internet technology and our social media "paradise" can lead us to this "way". Or should lead us to the way, if only you can remove this "pluralism algorithm" that gets in the way.
From reading the interviews with you about this, I wonder whether you hope that somehow you can harness these tools to deliver "the facts" more readily to "the people" so that they can see "the truth," and whether you hope to persuade Google to serve this purpose.
What would be your solution to the problem of customization otherwise? It must annoy you that some people read the Wall Street Journal and the National Review and watch Fox News, and other people, despite reading the New Republic and Huffington Post, still can't come around to seeing things as you do on the rigidly ideological moveon.org or the Nation or Amy Goodman's Democracy Now. Can you accept that people have views and gravitate to the publications that continue to feed and reinforce those views, and permit them to go on doing this in a free society? Or do you insist that they give up their publications, along with their guns and religion, and get the "truth package"? Or are you willing to leave them with their little magazines, as long as Google delivers the right returns on Palin's "death panels"? Perhaps you don't need to tinker as much as you believe, as Google already helpfully caters to your belief by putting a New York Times story on the issue, with the headline "False 'Death Panel' Rumor Has Some Familiar Roots". Bing, by contrast, puts in the same spot "NYT Brooks Promotes Eugenics 'Death Panels' Amid Budget Crisis".
If anything, given that Google's devs are closer to your worldview than not, what would you have them do? They can't sell their ads without this customized search return.
Ultimately, a huge variety of searches does suggest that Google doesn't really skew things that much, although it does something that unfortunately helps reinforce the left-of-center worldview you're hoping to bolster -- and that is to turn up Wikipedia entries as the first or second return on most topics, merely because it is the most linked, and then, because it was once the most linked, it becomes ever more linked by people looking for an easy answer. Yet Wikipedia has many biased pages and little recourse, although the tentative reform that will enable readers to rate pages may hold out some hope.
And a reply -- typical and predictable -- from someone named Edouard Rabel who imagines that the problem is "reading comprehension":
I think you completely missed the point. There is no "one" filter. Google and others personalize the search results based on the profile they have of you. Regarding politics and many other subjects: this could lead one to ostracize themselves from the rest of the world, only being served with for example: Tea-Party related content.
Another downside of these filters is that in the above (political) example it reinforces existing biases. The argument that increased availability to information of a political nature doesn't lead to a more educated political opinion unfortunately still stands. The trend in this regard is something to be worried about, as it doesn't improve but gets worse.
Dare imagine what happens when someone is only exposed to alternet or foxnews content.
I don't think this article, or Eli himself has anything to do with "selling truth" or "forcing an opinion down everyone's throat" as you seem to insinuate.
And my reply:
No, I didn't miss the point because I...use Google, duh. As do many people. And the first thing you always see on a Google search is Wikipedia, as biased as it is. And that's one of those self-fulfilling prophecies of the algorithm -- it's there because people click on it because it's there, it's there because people link to it because it's there, etc. I just cited some examples of what happens on two different search engines. Google leads you instantly to a criticism of the "death panels" concept; Bing admits a more nuanced view. Google sells you a t-shirt on the Communist concept; Bing gives you yet another return on a Communist Party. So your notion that I'm being served a customized, worldview-shaping feed that will inevitably lead me to some worldview doesn't quite hold. I'm a big critic of Communism, Google must "know" this, yet it gives me a t-shirt to buy.
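That "self-fulfilling prophecy" is easy to see in a toy model. This is only a sketch of the rich-get-richer feedback loop described above, not Google's actual ranking logic; the site names, click probabilities, and counts are all illustrative assumptions:

```python
# Toy model of click-feedback ranking: results that start on top get
# clicked more, and those clicks push them further up. Everything here
# (names, 70% top-click rate) is a hypothetical assumption for
# illustration, not a real search engine's behavior.
import random

random.seed(42)

# Three hypothetical results of equal intrinsic relevance.
clicks = {"wikipedia.org": 1, "blog-a.example": 1, "blog-b.example": 1}

def rank(results):
    """Order results purely by accumulated clicks."""
    return sorted(results, key=lambda r: clicks[r], reverse=True)

def simulate_user():
    """A user clicks the top result 70% of the time, else a random one."""
    ordered = rank(list(clicks))
    choice = ordered[0] if random.random() < 0.7 else random.choice(ordered)
    clicks[choice] += 1

# Give one site a single early advantage, then let the loop run.
clicks["wikipedia.org"] += 1
for _ in range(1000):
    simulate_user()

print(rank(list(clicks)))  # the early leader tends to stay on top
print(clicks)
```

One early click is enough to keep the leader entrenched for the rest of the run; nothing about its content matters once the loop starts feeding on itself.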
You can argue endlessly about the concept of filtration, and you can endlessly find samples to fit your particular bias, as could I. Perhaps some list of neutral "test terms" could be devised on which a double-blind experiment could be performed (and I will eventually get Pariser's book and see if he does anything like that).
The real question, as I noted, however, is what Pariser intends to be done about this. Fix it so that we all see only the search results that fit his "progressive" and "scientific" worldview, whether we wish to or not? Indeed, Pariser's concern about this exists because he wants to do something to correct it. It infuriates him that people are reinforcing their world views (gasp!), that those are views he does not share, and that they are in the majority. That sets him to thinking about how he could "scientifically" crowbar them away from what he sees as their pablum.