Social sorting is big business. Bosses and bankers crave "predictive analytics": ways of deciding who will be the best worker, borrower, or customer. Our economy is less likely to reward someone who "builds a better mousetrap" than it is to fund a startup that identifies those most likely to buy a mousetrap. The critical resource here is data, the fossil fuel of the digital economy. Privacy advocates are digital environmentalists, worried that the rapid exploitation of data either violates moral principles or sets in motion destructive processes we now only vaguely understand.*
Start-up fever fuels these concerns as new services debut and others grow in importance. For example, a leader at Lenddo, "the first credit scoring service that uses your online social network to assess credit," has called for "thousands of engineers [to work] to assess creditworthiness." His company aims to mine data derived from digital monitoring of relationships. (We all know how well the "quants" have run Wall Street, but maybe this time will be different.) ITWorld headlined the development "How Facebook Can Hurt Your Credit Rating," advising readers that "It's time to ditch those deadbeat friends." It also raised the disturbing prospect of redlined portions of the "social graph."
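To make concrete what mining the social graph for creditworthiness might look like, consider a deliberately crude sketch. Lenddo's actual algorithm is proprietary and secret; the function, the weights, and the blending formula below are all my assumptions, chosen only to show how a "deadbeat friend" can mechanically drag down an otherwise strong applicant's score.

```python
# Hypothetical illustration only: this is NOT Lenddo's method, which is not
# public. The toy scorer blends an applicant's own score with the average
# score of their friends, on a FICO-like 300-850 scale.

def social_credit_score(own_score, friend_scores, friend_weight=0.4):
    """Blend an applicant's own score with the mean of their friends' scores.

    friend_weight is an assumed parameter: the share of the final score
    driven by the applicant's social graph rather than their own history.
    """
    if not friend_scores:
        return own_score
    friends_avg = sum(friend_scores) / len(friend_scores)
    return (1 - friend_weight) * own_score + friend_weight * friends_avg

# The same applicant, before and after befriending someone who defaulted:
before = social_credit_score(720, [700, 710])        # solid social graph
after = social_credit_score(720, [700, 710, 400])    # one "deadbeat friend"
```

Nothing about the applicant's own conduct changes between the two calls, yet the second score is lower — which is precisely why such systems create pressure to "ditch those deadbeat friends," and why redlining entire regions of the social graph becomes technically trivial.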
There's a lot of value in such "news you can use" reporting. However, I think it misses some problematic aspects of a pervasively evaluated and scored digital world. Big data's fans will always counter that, for every person hurt by surveillance, there's someone else who is helped by it. Let's leave aside, for the moment, whether the game of reputation-building is truly zero-sum, and the far more important question of whether these judgments are fair. The data-meisters' analytics deserve scrutiny on other grounds.
Privacy and Power
First, there's the power issue. Note that companies like Lenddo and Klout want access to a complete list of your friends, and their financial profiles, but brag that their own algorithms are "proprietary and secret." If they really believed the Silicon Valley hype about "transparency" and "openness," why not reveal them? Or, if they fear someone will game the algorithms, why not release them after one, two, or five years? And if even that is too demanding, how about establishing third-party entities to audit the process? James B. Rule highlights the unfairness:
This country’s consumer credit reporting industry ascribes to the great majority of adult Americans a three-digit score epitomizing their potential profitability as charge-account customers, credit card users, or mortgage applicants. As in virtually all systems of mass surveillance, credit tracking and scoring enables institutions to make ever-finer distinctions in their treatment of the people they deal with.
But note that American consumers have no remotely comparable monitoring system to help them choose among retailers, products, and services. This is hardly for lack of need. A consumer-friendly tracking system could furnish the same comprehensive, instantaneously available data to buyers that credit reporting provides to lenders and retailers.
All of this would cost money, though consumer savings would likely make up for the public costs. What’s more problematic is that such a system would require manufacturers and sellers to provide crucial data. They will, of course, insist that such information is proprietary—that is, they own it, and they’re not giving it up. The reasons for such resistance are obvious: Better information for consumers spells potential disadvantage for sellers. . . .
The dramatic discrepancies between these two surveillance potentials—one an ultra-sophisticated reality, the other grossly underdeveloped—are by no means imposed by technology. They reflect sponsorship. This country’s lending and retail industries are simply better organized and more resourceful interests than consumers.
When Big Data's cheerleaders rhapsodize about understanding our social world better than ever, remember that they are often talking about enhanced methods of monitoring and manipulating those too politically weak to demand privacy (or recompense for its invasion).
[H]ere's the big problem with [current FTC] privacy audits. When they were first being discussed, consumer groups like the Electronic Privacy Information Center (EPIC) asked that any audits be made public. The response from the Federal Trade Commission was not encouraging. They told the groups the audits would not be published but "the public may have access to the submissions required pursuant to the order" using tools like the Freedom Of Information Act (FOIA).
[FOIA exempts trade secrets in many cases.] So the company may be using innovative strategies to violate consumer privacy and will demand that the FTC hide those methods from the public by deeming them "trade secrets." The joke here is that these companies are systematically violating consumer privacy but are demanding secrecy for the regulatory review of those violations.
As I noted in another context: there is one rule of privacy law for the powerful, and quite another for the powerless. For US courts, trade secrecy remains sacred, even as privacy is eroded at every turn.
Reporters tend to worry that people will change their behavior once the full negative impact of "deadbeat" friends becomes clear. I don't share that worry presently, mainly because monitoring now is so pervasive that it would be a herculean mental feat simply to keep track of all the ways one could misbehave in the eyes of some digital sensor (or censor). Rating tools may also be so opaque that gaming them seems to be a Sisyphean task. But I do worry that we won't adequately appreciate the ways in which these services make the world more congenial for certain personality types and less so for others. For example, a person who automatically cuts off contact with "friends in need" may get cheaper credit and more opportunities if Lenddo becomes very successful. Quantified selves who tend to quickly conform to such a gamified social life will also "score." I've already heard stories of Twitterati churning through followers to maximize "Klout." Both reinforce troubling trends in the US economy's reward structure.
Of course, we all have such tendencies in us; it's not as if there's a certain calculative ideal-type out there ready to take advantage of the new social gamescape of reputation enhancement. Sadly, even that complexity may ultimately be flattened by a world of constant monitoring. Mark Zuckerberg memorably said:
You have one identity…The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly… Having two identities for yourself is an example of a lack of integrity.
In response, Aaron Bady has drawn on the thought of W.E.B. Du Bois to defend "the multitudes of identities we each contain":
Why is it that people want to control their privacy? It isn’t so much that people want to “hav[e] a different image for your work friends or co-workers,” as [Zuck] sort of innocuously puts it; it’s not an issue of choice for people who need to have a different image for their boss than the one they have in real life. The less the people who sign your paycheck know about you, after all, the less they know that you’re not simply a simple worker-drone toiling away in their sugar fields, and that can be an urgent thing in a time where everyone who works for someone else could be replaced at any time.
But even the less dire firewalls we try to build in our lives are fundamentally about asserting our ability to choose; we hide things from our friends and family to the extent we fear they’ll disapprove and make that disapproval meaningful by intervening. We compartmentalize not because we’re split between different notions of ourself, but because the multitudes of identities we each contain bump up against people’s expectations that we each be a particular way.
When individuals resist the pervasive monitoring of services like Klout or Lenddo (or more traditional data brokers and credit bureaus), it's not necessarily because they have something to hide. Rather, it's because they already feel amply manipulated and controlled by existing constellations of knowledge and power. To paraphrase Tarleton Gillespie, "We don’t have a clear sense of how to talk about the politics of th[e] algorithm[s]" now vying to credit or discredit our digital selves as powerfully and profitably as entities like credit bureaus and DHS evaluate our physical selves. Until norms of reciprocal transparency render them as legible as they'd like to make us, it is wise to keep a cautious distance.