an unanticipated consequence of
Harvard Professors Jim Greiner and Cassandra Pattanayak have posted a remarkable randomized experiment (“What Difference Representation?”) showing that offers of free legal representation from the Harvard Legal Aid Bureau (HLAB) ended up hurting unemployment claimants.
HLAB is a “student-run, faculty-overseen” legal service clinic at Harvard Law School. It is “the oldest student legal services organization in the country.” In the experiment, unemployment benefit claimants (who were pursuing “first-level” appeals) were randomized into one of two groups: a treatment group that was offered free HLAB representation, and a control group that was not offered representation. Prior to randomization, all claimants agreed to participate in a randomized study. (“If the randomization was not to offer, the student-attorney so informed the claimant by telephone and provided her with names and telephone numbers of other legal services providers in the area who might take her case.”)
The claimants who were offered representation were no more (or less) likely to win their administrative appeal – but “the offer caused a delay in the proceeding.” The claimants offered representation had to wait on average 42 percent longer (53.1 vs. 37.3 days) before they received a decision from an Administrative Law Judge.
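The 42 percent figure follows directly from the two reported means; a quick check, using only the numbers reported in the study:

```python
# Average days to an Administrative Law Judge decision, as reported in the study.
offered_days = 53.1      # claimants offered HLAB representation
not_offered_days = 37.3  # control group, not offered representation

# Relative delay caused by the offer.
extra_delay = (offered_days - not_offered_days) / not_offered_days
print(f"{extra_delay:.0%}")  # roughly 42%
```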
The results are particularly striking because not everyone who was offered representation was represented, and because those who were not offered HLAB representation were sometimes represented by alternative organizations.
The study highlights, again, the simple power of randomized control studies. There is a persuasive transparency to randomized control trials. The randomization doesn’t tell us why the offers caused a delay, but we should be fairly confident that those who were lucky enough not to be offered free legal assistance by HLAB had a better shot at cashing unemployment checks sooner. This initial study’s main limitation is that its sample size is only 207. Still, that is sufficient to raise the serious concern that HLAB’s offers of representation are hurting its potential clients. In medicine, iatrogenic effects are adverse side effects caused by medical treatment – this study points to a legal analogue in which well-intentioned legal assistance ends up producing adverse “side effects” for the clients.
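A back-of-the-envelope way to see why randomization supports this kind of inference is a permutation test: under the null hypothesis that the offer has no effect on delay, the treatment and control labels are exchangeable, so reshuffling them shows how extreme the observed gap is. The sketch below uses simulated data – the individual delays and the group split are invented for illustration, with only the group means tuned to resemble the reported 53.1 vs. 37.3 days:

```python
import random

random.seed(0)

# Hypothetical per-claimant delays in days (the study's individual-level
# data are not reproduced here; these are illustrative draws only).
treated = [random.gauss(53.1, 20) for _ in range(104)]   # offered representation
control = [random.gauss(37.3, 20) for _ in range(103)]   # not offered

def mean(xs):
    return sum(xs) / len(xs)

observed_gap = mean(treated) - mean(control)

# Permutation test: repeatedly shuffle the group labels and count how
# often a gap at least as large as the observed one arises by chance.
pooled = treated + control
n_treated = len(treated)
reps = 2000
count = 0
for _ in range(reps):
    random.shuffle(pooled)
    gap = mean(pooled[:n_treated]) - mean(pooled[n_treated:])
    if gap >= observed_gap:
        count += 1

p_value = count / reps
print(f"observed gap: {observed_gap:.1f} days, one-sided p \u2248 {p_value:.3f}")
```

With roughly 100 claimants per arm, a gap of this size is far too large to be a fluke of the shuffle – which is the intuition behind saying a sample of 207 is small but still enough to raise serious concern.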
This study raises deep ethical questions both for HLAB and other legal service providers. Does HLAB have a duty to stop offering representation or to change its modus operandi? Does it at least have an ethical duty to disclose the results of the study to prospective clients? Can other student legal service organizations ethically ignore the results of the study?
Will other organizations submit themselves to the institutional risk that their services will be found lacking? When in doubt, bet on narrow “head in the sand” self-interest. The authors report that another Boston-based provider of similar services “did not limit its opposition to a refusal to participate on its own part. Instead, when it discovered that HLAB was conducting a randomized evaluation, it halted its previous practice of suggesting that clients it could not itself represent call HLAB. And as of the time of this writing, this provider is currently using its power over the intake system of a third organization to prevent this third group from conducting its own randomized evaluation.”
A Last Minute Charitable Gift Suggestion
The HLAB story also motivated me to redirect some of my year-end charitable giving. In the past, I’ve given to causes (such as A Better Chance) which made me feel good but which turned out to have an abysmal record or at least no reputable evidence of success. But this year, I’ve given money to two charities, MIT’s Poverty Action Lab (PAL) and Innovations for Poverty Action (IPA), which are dedicated to using randomized control studies to find out which public policy interventions work to alleviate poverty. (Disclosure: I have number-crunching friends at both charities.) Would microcredit organizations do better using statistical credit scores instead of traditional subjective committee decision-making? Does providing free chlorine dispensers at water sources reduce child diarrhea? Scholars associated with these charities ran randomized experiments (described here and here) to find out. What I love about these charities is that they add to our knowledge – even when they establish that a particular intervention doesn’t work. As a reader of this blog, if you’re inclined to support data-driven decisionmaking, you could do a lot worse than contributing to these non-profits.