
Wednesday, November 18, 2009

Assessing Algorithmic Authority

Clay Shirky has recently written "A Speculative Post on the Idea of Algorithmic Authority," based on a talk at Yale's conference on Journalism & The New Media Ecology. Shirky notes that "people trust new classes of aggregators and filters, whether Google or Twitter or Wikipedia (in its ‘breaking news’ mode)," and calls this trend "algorithmic authority."

Shirky's ideas about authority have many interesting implications for legal scholars. They also lead me to worry that the same types of opacity that infected our financial system may increasingly influence our public sphere.



Shirky characterizes "algorithmic authority" as "the decision to regard as authoritative an unmanaged process of extracting value from diverse, untrustworthy sources, without any human standing beside the result saying 'Trust this because you trust me.'" For Shirky, "authority is a social agreement, not a culturally independent fact." He mentions the poor performance of certain "sources everyone accepts"--for example, "the ratings agencies Moody's, Standard & Poor’s, and Fitch"--and an error in Encyclopedia Britannica. He implicitly contrasts these older, traditional authorities with "Google’s PageRank algorithm, Twitscoop’s zeitgeist measurement, and Wikipedia’s post hoc peer review," which are examples of algorithmic authority.

Both traditional and algorithmic sources face the problem of unreliable inputs. For algorithmic authorities,

[T]he “Garbage In, Garbage Out” problem [is handled] by accepting the garbage as an input, rather than trying to clean the data first; it provides the output to the end user without any human supervisor checking it at the penultimate step; and these processes are eroding the previous institutional monopoly on the kind of authority we are used to in a number of public spheres, including the sphere of news.
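
To make that description concrete, here is a minimal sketch of the simplified PageRank recurrence Brin and Page published in 1998, the best-known example of such a process: it accepts the raw link graph as input, "garbage" included, and computes authority scores with no human check on the output. The toy link graph below is hypothetical, and the real, trade-secret algorithm is of course far more elaborate.

```python
# A toy version of the PageRank recurrence from Brin & Page's 1998 paper.
# Authority is computed from an unvetted link graph; no one cleans the
# input or reviews the output. The link graph itself is made-up data.
links = {
    "blogA": ["paperB", "wikiC"],
    "paperB": ["wikiC"],
    "wikiC": ["blogA"],
    "spamD": ["spamD"],  # a spam page that links only to itself
}

DAMPING = 0.85  # standard damping factor from the 1998 paper
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        for target in outlinks:
            new_rank[target] += DAMPING * rank[page] / len(outlinks)
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Note what happens to the garbage: in this toy run the self-linking spam page ends up hoarding a quarter of the total authority, with no editor at the penultimate step to catch it. That is precisely why the real algorithms are constantly adjusted--and why those adjustments are kept secret.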


Here are some reasons to worry about that process:

a) Shirky notes in the piece that there are several kinds of knowledge out there that are not susceptible to assessments of accuracy. He cleverly calls these "epistemological potholes." My worry is that the potholes are in fact larger than the road itself, and that we should be particularly concerned about the accumulation of algorithmic authority in news. If we merely relied on journalists for facts, perhaps a Wikipedian directive of objectivity and neutrality could permit algorithmic authorities to separate the wheat from the chaff. But the media is more an engine than a camera, the font of ultimate political reality it pretends merely to mirror.

b) Now the question becomes: are these algorithmic authorities any worse than the corporate goliaths they are displacing? I'm not going to argue that they are, because of a deeper problem: at least one of them (Google) uses trade-secret-protected algorithms that aren't open to public inspection (and are likely so dynamic that a snapshot of them would give us little chance of assessing their biases). I can't imagine how a modern-day Herbert Gans could write an account of "Deciding What's Google News" (though I'm deeply impressed by Dawn Nunziato's incisive account of some problems in the service). I've earlier worried that algorithmic sorting could allow prejudices to enter spheres of life where once people had to “launder preferences” by giving some explicit reason for action.

c) Algorithmic authority probably has Hayekian and democratic foundations--an idea that the uncoordinated preferences of the mass can coalesce into the "wisdom of crowds" once old elites step out of the way. A power law distribution of attention on the web, like ever-more-extreme polarization of wealth and poverty, has to be legitimated by markets, democracy, or some combination of the two. Such forms of spontaneous coordination are perceived as fair because they are governed by knowable rules: a majority or plurality of votes wins, as does the highest bidder. Yet our markets, elections, and life online are increasingly mediated by institutions that suffer a serious transparency deficit. Black box voting continues to compromise election results. The Fed asserts extraordinary emergency powers to deflect journalistic inquiries about its balance sheets. Compared to these examples, the obscurity at the heart of our "cultural voting machines" (as I call dominant intermediaries) may seem trivial. But when a private entity grows important enough, its own secret laws deserve at least some scrutiny.
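
For readers who want a feel for how uncoordinated choices produce such extreme concentration, here is a minimal sketch of Herbert Simon's classic 1955 "rich get richer" process, one standard generative story for power laws of attention; all parameters and numbers here are hypothetical, chosen only for illustration.

```python
# Simon's (1955) preferential-attachment process: most new visits go to
# sites in proportion to the visits they already have, and occasionally a
# brand-new site appears. The parameters below are arbitrary.
import random
from collections import Counter

random.seed(0)
P_NEW = 0.02      # chance that a visit goes to a brand-new site
tokens = [0]      # one entry per visit; each entry is a site id
n_sites = 1

for _ in range(100_000):
    if random.random() < P_NEW:
        tokens.append(n_sites)  # a new site gets its first visit
        n_sites += 1
    else:
        # picking a uniformly random *past visit* selects a site with
        # probability proportional to its existing visit count
        tokens.append(random.choice(tokens))

counts = sorted(Counter(tokens).values(), reverse=True)
top = max(1, n_sites // 100)  # the top 1% of sites
print(f"{n_sites} sites; the top {top} capture "
      f"{sum(counts[:top]) / len(tokens):.0%} of all visits")
```

No site in this simulation is better than any other; the extreme concentration follows from the dynamics of attention alone, which is why it cannot simply be read as a verdict of merit.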

I have little faith that such scrutiny will come any time soon. But until it does, we should not forget that the success of algorithmic authorities depends in large part on their owners' ability to convince us of the importance--not merely the accuracy--of their results. A society that obsesses over the top Google News results has made those results important, and we are ill-advised to assume the reverse (that the results are obsessed over because they are important) without some narrative account of why the algorithm is superior to, say, the “news judgment” of editors at traditional media. (Algorithmic authority may simply be a way of rewarding engineers (rather than media personalities) for amusing ourselves to death.)

Moreover, if personalized search ever evolves to the point where someone can type into their Gmail “what job should I look for,” and receive many relevant results, new media literacy demands that the searcher reflect on the fact that his or her very idea of relevance has probably been affected by repeated interactions with the interface and the results themselves. As Nicholas Carr and Jaron Lanier have pointed out (recalling Sherry Turkle and Sven Birkerts), tools aren’t just adapting to better serve us--we are adapting in order to better compete in the environment created by tools. Algorithmic authority can be just as disciplinary as the old forms of cognitive coordination it's displacing. To paraphrase Foucault: "Responding precisely to the revolt of the [netizens], we find a new mode of investment which presents itself no longer in the form of control by repression but that of control by stimulation" . . . and search engine optimization.

X-Posted: Madisonian.