I've done a series of posts at the Health Care Blog on the unexpected consequences of data analysis in clinical settings. Too few policymakers have recognized how pervasively predictive analytics can repurpose data gathered in one setting for another, unexpected one. As Scott Peppet argues, once you've "quantified yourself," it may not be easy to opt out of invasive uses of your digital doppelganger. We all too often have "delusions of control" over a technology when it is first introduced. But sooner or later, many key technologies end up disciplining us.
Situating these controversies in a broader analysis of social trends, Ira Basen's article in this weekend's Toronto Globe and Mail is an excellent survey of the many ways we end up "programming our lives away":
Increasingly, algorithms are used to determine whether we can get access to credit, insurance and government services. They are posing a challenge to human decision-making in the arts. They are being used by prospective employers to decide if we should be hired. They can determine whether your online business will succeed or fail, and they have revolutionized the world of high finance.
And yet these algorithms remain a mystery to us, their inner workings protected by various intellectual property and trade-secrecy laws. Critics are beginning to wonder if we are surrendering too much human agency to the all-powerful gods of mathematics.
After discussing troubling uses of algorithms in employment, credit, finance, and other fields, Basen quotes some skeptical experts:
[Jaron Lanier's] book, You Are Not a Gadget: A Manifesto, . . . calls for "a new digital humanism" to counteract the trend toward "cybernetic totalism." Mr. Lanier urges readers not to succumb to an ideology, peddled by the gurus of Silicon Valley, that seeks to devalue human creativity.
He believes that they are asking us to abandon our faith in ourselves and, instead, to put our trust “in the crowd, in the algorithms that remove the risks of creativity in ways too sophisticated for any mere person to understand.” They want us to believe, he concludes, “that the computer is evolving into a life form that can understand people better than people can understand themselves.”