Ever since Clay Shirky (https://www.youtube.com/watch?v=LabqeJEOQyI) proclaimed that there is no such thing as information overload, only filter failure, the role of filters in the networked knowledge economy has taken center stage. With over 3 billion searches per day, Google’s search engine is probably the most used filter in the world. Google’s PageRank algorithm – together with some 250 other, much less publicized criteria – seems to work so well at filtering out of the ocean of information on the web the knowledge that is relevant and reliable for our questions and concerns that we have come to believe Google presents us with a complete and unbiased view of the world. We tend to forget that there is indeed a problem of filter failure and that perhaps no filter, not even Google’s search algorithms, can be a mirror of the world.
Filter problems are not new to media studies. It has long been known that the traditional mass media – newspapers, magazines, radio, and TV – are influenced by the editorial policies of their owners. What makes the filtering done by algorithms different from that done by editors? Nothing! Nevertheless, it seems natural to ascribe motives, more or less secret agendas, and business as well as political strategies to people rather than to machines. Isn’t the search engine merely a tool that we can use as we wish? And if we are not getting objective, complete, and reliable information on a topic, we are the ones to blame for being either unwilling or unable to use the tools properly.
Science and technology studies have shown that tools are not neutral; they have their own programs of action, their own affordances. Many internet services, including Google, track users’ online activities, assemble profiles, and “personalize” the results they present to users. Critics have warned that “filter bubbles” (see Eli Pariser, The Filter Bubble: What the Internet is Hiding from You, New York, 2011) enclose users in an information environment that agrees with their preferences and interests and thus automatically excludes information that runs contrary to their opinions, preferences, and beliefs. Many algorithms tend to filter out everything that runs counter to what one expects and wants to see, whether it be commercial products and services or news. Internet users, especially in online communities and on social media sites, can become isolated in “echo chambers” in which search results and information distribution merely reflect their prejudices. The affordances of such algorithms lead to a situation in which an unbiased confrontation with the facts and impartial public deliberation on political and social issues are at least hindered, if not completely undermined.
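The personalization mechanism the critics describe can be caricatured in a few lines of code: score each item against a profile of the user’s recorded interests and show only what scores high enough, so that unfamiliar or disagreeable material never surfaces. The profile format, topic labels, and scoring rule below are illustrative assumptions for the sake of the argument, not how any real search or recommendation engine works.

```python
# Toy sketch of preference-based filtering (a "filter bubble").
# A user profile maps topics to interest weights (hypothetical format).
profile = {"cycling": 0.9, "climate": 0.8, "opera": 0.1}

items = [
    {"title": "New bike lanes open downtown", "topics": {"cycling"}},
    {"title": "Climate report sparks debate", "topics": {"climate"}},
    {"title": "Opera season preview", "topics": {"opera"}},
    {"title": "Op-ed: why bike lanes fail", "topics": {"contrarian"}},
]

def score(item, profile):
    # Sum the user's interest in each topic the item touches;
    # topics absent from the profile contribute nothing at all.
    return sum(profile.get(t, 0.0) for t in item["topics"])

def personalize(items, profile, threshold=0.5):
    # Keep only items scoring above the threshold -- everything
    # low-interest or unfamiliar is silently excluded.
    return [i for i in items if score(i, profile) > threshold]

for item in personalize(items, profile):
    print(item["title"])
```

Run on this toy data, the filter returns only the two items that match existing interests; the opera preview and the contrarian op-ed are excluded without the user ever learning they existed, which is precisely the echo-chamber effect at issue.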
Apart from the fact that filter bubbles and echo chambers existed before the algorithmic automation of information services, and that there has never been and probably never can be such a thing as completely objective, impartial, and unfiltered access to information, these criticisms should be taken seriously. They tell us that any layer of knowing and acting that a filter opens up is only one among many other possible layers. By its very nature, a filter selects; it includes and excludes. It presents us with a field of knowing and acting that is only one layer of a multilayered reality. The search algorithms of Google and other providers offer us interpretations, not some kind of impossibly value-free and completely objective knowledge. Filters are hermeneutic machines. They give us interpretations and not objective truth. Their complexity comes from the fact that they operate neither as neutral tools nor as masters commanding us to believe and obey. They are actors and mediators in the sense of Actor-Network Theory, that is, quasi-objects emerging from the networked participation of owners, developers, users, infrastructures, competitors, and much more. If one opens up the black box that Google’s search engine seems to be, one finds many different actors associated into a heterogeneous network that can always become other than what it appears to be at any moment. Perhaps the role that we the users should play in this network is to be aware of what we can do to prevent ourselves from being black-boxed, to be actors and mediators and not mere functions. As mediators we can change things, even if we are momentarily quite satisfied and happy with what Google presents us.