Had the opportunity to listen to Eli Pariser speak at the Shorenstein Center this week, where he gave an updated, small-room version of this talk he presented at the 2010 Personal Democracy Forum on the “Filter Bubble.”  (Pariser is a charismatic speaker who does really nifty PowerPoint slides, but if you don’t have 15 minutes to listen to the whole talk, Ethan Zuckerman has written an excellent summary.)

Pariser is a political activist who rose to prominence when an online petition that he launched, calling for a nonmilitary response to the 9/11 attacks, attracted half a million signatures in less than a month.  As former Executive Director – and now Board President – of MoveOn.org, Pariser knows a thing or two about mobilizing people and (perhaps more importantly) raising money over the Internet.

Beware the Bubble

In a nutshell (in case you have not read Zuckerman’s summary either!), Pariser’s concept of the filter bubble is about how each Internet user is now presented with customized content.  If two individuals use the same search term, Google may give them different results.  Facebook apparently prioritizes newsfeed items from friends who share your views.  (It seems to me that I get all my friends in my Facebook newsfeed, which probably means that I don’t have enough friends for the algorithm to kick in!)

Pariser’s concern is that customization is bad for citizenship, because public conversation would end if each of us sees only data that conforms to what or who we already are.  He admits that filters are needed: the amount of information created from the beginning of human history through to 2003 (five exabytes’ worth, apparently) is now produced every 3 days.  Media professionals have traditionally served as gatekeepers, deciding what the public sees.  While they may have had biases, they were expected to behave ethically.

Pariser shared that engineers at Google had not considered the issue of ethics, because they are convinced that their algorithms are “neutral.”  He argues, however, that code is not neutral but is inherently political.

Code can be neutral

I am with the Google engineers on this one.  For two reasons:

First, filters have always existed.  Our parents decide what to feed us and teach us.  Our friends may decide to withhold bad news.  Our governments have a whole bunch of secrets we will never know about.  (Like who really killed JFK?)  IMHO, one key filter is language.  Simply put, if a message is not in Chinese or Spanish, you have alienated the world’s two largest audiences.  Google’s translation software has in fact done more than most to break down this filter.

[And just to make my point, here is this same line in Chinese and Spanish, courtesy of Google Translate:

谷歌的翻译软件已经做了,实际上比大多数人更要打破这种过滤器。

el software de traducción de Google, de hecho, hecho más que la mayoría de romper este filtro.]

Second, if we define neutrality as the absence of bias, we can probably agree (assuming no deliberate shenanigans at Google and Facebook) that code is neutral.  After all, it points some people one way, and others another way.  The code simply tries to help you prioritize – out of those five exabytes – the pieces of data you may wish to see (first).
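To make that concrete, here is a toy sketch in Python (purely illustrative – this is not Google’s or Facebook’s actual code, and rank_items is a made-up function) of a ranking rule that is “neutral” in the sense that it applies the same logic to everyone, yet returns different results for different people simply because their histories differ:

from collections import Counter

def rank_items(items, user_history, top_n=3):
    # Score each item by how often its topic appears in this user's history,
    # then show the highest-scoring items first.  The rule never changes;
    # only the input (each user's own history) does.
    topic_counts = Counter(user_history)
    ranked = sorted(items, key=lambda item: topic_counts[item["topic"]], reverse=True)
    return ranked[:top_n]

items = [
    {"title": "Jazz festival lineup announced", "topic": "jazz"},
    {"title": "Lady Gaga announces tour dates", "topic": "pop"},
    {"title": "Budget debate heats up", "topic": "politics"},
    {"title": "New saxophone recordings reviewed", "topic": "jazz"},
]

# Same items, same code, two very different "front pages".
print(rank_items(items, user_history=["jazz", "jazz", "politics"]))
print(rank_items(items, user_history=["pop", "pop", "pop"]))

The same function, fed two different histories, surfaces different items first for each person, which is exactly the sense in which I would call the code itself neutral.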

The real issue

Which brings me to what I think is Pariser’s core point – that balanced views are needed for good citizenship.  Pariser himself is trying to seek out people who have different views from his own.  While that is admirable, one could also argue that it stems from his professional interest as a political activist.  I think, however, that there are many, many more of us who would prefer to remain in our comfort zones.

To use an analogy, a record producer would listen to various genres and different acts.  He/she might personally prefer jazz, but would listen to Britney Spears and Lady Gaga because of the need to keep up with the trends in the business.  The individual jazz listener, however, may never have listened to Lady Gaga (and may not wish to).

Ultimately, it is not a bad citizen who does not seek out divergent views; it is a typical citizen.

Pariser suggests that Facebook could add a slider to let us see stuff that’s more (or less) homogeneous.  I think that’s fine.  Facebook has lots of stuff I don’t use anyway.
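The mechanics of such a slider would not be complicated.  Here is a hypothetical sketch (again purely illustrative – blended_feed is an invented function, not an actual Facebook feature): a value of 0 keeps the feed to the topics you already engage with, while 1 mixes in as much from outside them as possible.

import random

def blended_feed(familiar_items, diverse_items, slider, size=5):
    # slider runs from 0.0 (all familiar) to 1.0 (maximum variety).
    n_diverse = round(slider * size)
    n_familiar = size - n_diverse
    feed = (random.sample(familiar_items, min(n_familiar, len(familiar_items))) +
            random.sample(diverse_items, min(n_diverse, len(diverse_items))))
    random.shuffle(feed)
    return feed

# slider=0.0 stays in the comfort zone; slider=1.0 bursts the bubble.
comfortable = blended_feed(["jazz post"] * 10, ["politics post"] * 10, slider=0.0)
eclectic = blended_feed(["jazz post"] * 10, ["politics post"] * 10, slider=1.0)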
