In a recent post I raised several doubts about the validity of the new "Scholarly Impact" Ranking. My basic argument is that citation counts are a poor measure of "scholarly impact." Exhibit A in support of my argument came this week in the announcement that Harvard Law Professor Annette Gordon-Reed was awarded a MacArthur Fellowship:
Annette Gordon-Reed has an important new honor to add to her long list of recent accolades: a MacArthur Fellowship, sometimes called the "genius award." In 2010 she was awarded the National Humanities Medal. In 2009, her latest book, The Hemingses of Monticello: An American Family, won the Pulitzer Prize, the National Book Award, and the George Washington Book Award, and she was also awarded a Guggenheim Fellowship.
Compare the above (remarkable, spectacular) achievements with Professor Gordon-Reed's citation count (from 1/1/2005 to present): 25.
re: St. Thomas, "Table 2: Detailed Scholarly Impact Ranking of Law Faculties, 2010": the 7th-ranked law school is UC Berkeley. Among the top 10 Boalt names listed is Professor J. Yoo.
Perhaps the discrepancy arises because the MacArthur Foundation and the scholarly impact rankings are measuring different things. MacArthur might be measuring impact, but it might also be aiming to measure potential, or the quality of the scholar's past work (as opposed to impact). Similarly, while winning the Pulitzer might be one way of measuring impact (you presumably have to have made an impact on the public, or on others in your field, to get the award), it's probably more about the voters' estimation of the quality of your work than about its impact.
Also, I'm not a historian, but I'm sure there's a difference between citation counts in history (few citations to secondary materials, perhaps) and law (way too many citations to secondary materials). This is not to say that I disagree with your basic thesis, but I'm not sure this is the strongest evidence in your favor.
I agree with your comments, which confirm the argument in my original post. It is not just that some academic fields do not share the promiscuous citation practices of law reviews (which inflate meaningless citations); it is also that citations in other fields are not picked up in the law journal database. This means that law professors who publish, and are cited, in history journals or economics journals (or other fields) are under-counted in this ranking.
My broader point is that a scholar can have an important impact in ways that are not reflected in citations at all. Professor Gordon-Reed's work falls in this category. Her first book (over a dozen years ago) had a major impact of its own (also not reflected in citations in law journals), so the MacArthur Award is not just about this recent work.
As a side point, most law reviews these days disfavor citing long-form works (books), preferring instead to work with material appearing in periodicals... and the primary work that Professor Gordon-Reed is known for is book-length, not a law review article one can download from Lexlaw.
Conversely, many (if not most) history faculties discourage reliance on periodicals, and instead prefer to rely upon books — even when a book is merely a reprint of previously published periodical articles.
I agree with everything Brian Tamanaha says about Professor Annette Gordon-Reed, about her scholarly accomplishments, and about how the scholarly impact method we have employed is not well-suited for her type of work. That is, I agree with everything Brian says except his conclusion. He argues that this single episode shows that “[o]bviously there is something wrong with relying upon citation counts to measure ‘scholarly impact.’” But to say that scholarly impact scores do not say everything about legal scholarship and about every legal scholar does not mean that they say nothing important at all about any legal scholarship.
I emphatically agree that this case reminds us that a metric based upon citations in law reviews does not capture scholarly influence in every field of scholarly endeavor within the legal academy. In our full report on scholarly impact score ranking, as well as in regular statements by Brian Leiter, who refined the scholarly impact scores, we have repeatedly acknowledged and emphasized the limitations of this particular measure of scholarly success. As evidenced by Professor Gordon-Reed, legal history is one of those distinct fields in which scholarly influence tends to manifest itself in ways other than citation in the traditional law reviews. Indeed, the leading history journals in which the best legal historians are likely to publish are not included in the law review database on Westlaw.
As but one other example, the discipline of legal writing is coming into its own in scholarship, but here too that scholarly work often reaches an audience differently than the work of the traditional doctrinal or theoretical legal scholar. Legal writing scholars have most effectively used SSRN as a means of distributing work to other legal scholars and drawing attention to the most valuable works on legal writing. Indeed, when a legal writing professor finds a pedagogical work of value, he or she is likely to recommend it as well to students, by providing the link to the paper on SSRN. For these reasons, if I were to conduct a study on scholarly influence of legal writing faculty, I would expect SSRN downloads to be a better measure than law journal citations.
While Brian Tamanaha sees the failure of scholarly impact scores to reflect scholarly prominence in legal history as yet another reason to doubt the validity of any scholarly impact analysis, I suggest instead that it simply demonstrates the limitations that those of us involved with scholarly impact ranking have carefully and repeatedly stated. When evaluating law faculties as a whole, which is our primary purpose in these rankings, the imperfection of the scholarly impact measure in isolated circumstances does not undermine its robustness as a general measure of faculty quality. And Professor Gordon-Reed's law faculty confirms that point. The Harvard law faculty, of which she is a member, evaluated collectively, does just fine – quite excellent as a matter of fact – on scholarly impact scores.
But isn't it also possible that her work isn't really very influential, and that the award is based on political or other criteria that would rightly be discounted in a survey of "scholarly influence"?