Balkinization  

Monday, April 02, 2012

When True Numbers Mislead: 98% Employment "Not Fully Accurate Picture," ASU Dean Says

Brian Tamanaha

Last week I asserted that law schools continue to report dubious employment numbers, and I explained why the ABA transparency reforms will not work. One of the schools I raised questions about was ASU, which claimed that 98.2% of its 2010 graduates obtained employment, up from 89.8% in 2009. This leap in employment is remarkable (and suspicious) given that we remain mired in the most dismal job market for law graduates in decades.

In an announcement of his appointment as permanent dean, and celebrating ASU’s spectacular rise in ranking from 40th to 26th, Dean Sylvester elaborates:

According to the U.S. News formula, [ASU college of law] succeeded in placing more than 98 percent of its graduates in employment within nine months of graduating from the law school.

“However, we do not feel this percentage paints a fully accurate picture of the financial and employment difficulties our graduates face,” Sylvester said. “We are exceedingly proud of the efforts of our Career Strategy office and the work they put in to find employment for our students, but the 98 percent is more a reflection of the formula used by U.S. News than it is reality.”

Did he really say that?

In response to my post last week, I was sharply criticized by a law professor for uncharitably and irresponsibly impugning the integrity of a bunch of law schools without proof. He is right, I confess—I offered no direct evidence to show a problem with the numbers.

But my skeptical questions were not raised in a vacuum. Law schools have been excoriated in the national press for reporting inflated employment numbers, with three Senators writing to the ABA demanding that these practices stop. Nonetheless, a year later (after the implementation of new ABA reporting rules), we are still advertising unbelievably high employment numbers (like 98%!).

The questions I raised in the post were based upon anomalies that call out for explanation. In particular, I applied the “Yale Test,” questioning mid- and lower-ranked law schools that claimed an employment rate higher than or equal to Yale’s 91.8%.

Now I will provide some data to back up my assertions.

I will focus on the following paragraph from the post, which I’m told was especially offensive:
Incredibly, a number of law schools ranked far beneath Yale reported a notably higher employment rate, including George Mason (96.4%), Loyola Marymount (94.1%), Kentucky (94.2%), and UNLV (93.2%). Even a few bottom schools reported employment on a par with Yale: Florida International (90.1%), Baltimore (91.2%), Akron (91.8%), Toledo (90.1%), and Atlanta’s John Marshall (91.6%). It is absurd to think that any of these schools exceeded or matched Yale’s employment rate.
The deans who submitted these numbers will (like Dean Sylvester) no doubt swear (at that damn Tamanaha) that they are absolutely true. Before addressing that assertion, let me step outside the law school bubble for a moment to say what these numbers would look like in the eyes of a 22-year-old thinking about law school.

The natural conclusion a person would draw from these employment numbers is: AT LEAST NINE-OUT-OF-TEN GRADUATES FROM THESE LAW SCHOOLS LANDED A JOB AFTER GRADUATION. Ergo, “If I go there I have a high probability of landing a job—a higher probability of getting a job than, say, at a law school that advertises an 80% or 70% employment rate.” That job, furthermore, would be a lawyer job, or at least a well-paying job (worth the high tuition) that the law degree was helpful in landing. (I am describing how most people would understand these employment percentages. The scambloggers would derisively mock these numbers as complete rubbish.)

It is because the employment percentages carry this meaning that US News accords them substantial weight in the final score that produces the ranking (employment nine months after graduation constitutes 14% of the score; employment at graduation, 4%). Owing to the employment numbers they claimed, ASU, George Mason, Loyola Marymount, Kentucky, and UNLV had an employment score in the ranking that was higher than Yale’s, and higher than those of about 180 other law schools, including many far above them in the overall ranking.

Employment percentages have a major impact because they are taken at face value by the ranking as a valid, directly comparable measure among law schools.

Let’s now dig beneath the surface of these claimed employment percentages to see whether George Mason, etc., merit a raw score higher than or equivalent to Yale’s. I will provide just three crucial numbers: the claimed employment percentage, the percentage of the class that obtained “JD required” jobs (including judicial law clerks), and the percentage of these lawyer jobs that were part time.

Yale: employment 91.8%; percent of class with lawyer jobs 86.9; percent of lawyer jobs part time: zero.

George Mason: employment 96.4%; percent of class with lawyer jobs 59.8; percent of lawyer jobs part time 3.8.

Loyola Marymount: employment 94.1%; percent of class with lawyer jobs 84.3; percent of lawyer jobs part time 28.8.

Kentucky: employment 94.2%; percent of class with lawyer jobs 80; percent of lawyer jobs part time 2.1.

UNLV: employment 93.2%; percent of class with lawyer jobs 79.5; percent of lawyer jobs part time 6.

Florida International: employment 90.1%; provided no information on percentage of lawyer jobs and part-time jobs.

Akron: employment 91.8%; percent of class with lawyer jobs 61.2; percent of lawyer jobs part time 12.2.*

Toledo: employment 90.1%; percent of class with lawyer jobs 55.3; percent of lawyer jobs part time 39.3.*

Atlanta John Marshall: employment 91.6%; percent of class with lawyer jobs 39.1; the reported percentage of part-time lawyer jobs is erroneous.

(* An article in the Ohio Lawyer by Jason Dolin, disclosing statistics he obtained in a public records request, reports that 53% of Akron graduates and 40.3% of Toledo graduates had full time lawyer jobs.)
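The figures above can be combined into a single rough comparison: the percent of the class in full-time lawyer jobs, i.e., the lawyer-job percentage discounted by the part-time share. The sketch below is my own back-of-the-envelope arithmetic using the post's numbers, not an official calculation (Florida International and Atlanta John Marshall are omitted because their part-time figures are missing or erroneous):

```python
# Rough full-time lawyer-job rate implied by the figures above:
# (percent of class in JD-required jobs) x (share of those jobs that are full time).
schools = {
    # name: (percent with lawyer jobs, percent of those jobs part time)
    "Yale": (86.9, 0.0),
    "George Mason": (59.8, 3.8),
    "Loyola Marymount": (84.3, 28.8),
    "Kentucky": (80.0, 2.1),
    "UNLV": (79.5, 6.0),
    "Akron": (61.2, 12.2),
    "Toledo": (55.3, 39.3),
}

def full_time_lawyer_rate(lawyer_pct, part_time_pct):
    """Percent of the class in full-time JD-required jobs."""
    return round(lawyer_pct * (1 - part_time_pct / 100), 1)

for name, (lawyer, part_time) in schools.items():
    print(f"{name}: {full_time_lawyer_rate(lawyer, part_time)}%")
```

On this crude measure Yale sits near 87% while several schools that "beat" it on overall employment fall into the 30s and 50s — which is the gap the claimed percentages paper over.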

Given the wide disparity in underlying results, it is immediately apparent that the claimed overall employment percentages are meaningless. Rather than provide useful information, they obfuscate the employment results of graduates. (Want to guess what Florida International’s real job numbers look like?)

Before saying a few words about what this information shows, I must emphasize a crucial piece of information it doesn’t tell us: what percentage of these lawyer jobs were funded by the law school itself. This is when the school in effect pays the salary of a person working in a firm or some other legal setting. As Bernie Burk shows, many of the top fifty law schools subsidized a significant chunk of their “employed” graduates. At Minnesota, for example, when you subtract the 14.08% of graduates whose jobs were paid by the school, its 91.90% employed drops to 77.82%. At ASU, 12.2% of the 98.2% employment percentage was school funded. (This is why Dean Sylvester admitted that his employment percentage did not provide an accurate picture of the job situation.) It is important to know how many jobs are school funded because they are low-paying temporary positions, often part time, and are not genuine jobs for the students who have them.
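The school-funded adjustment described above is a simple subtraction. A minimal sketch of it, using the Minnesota and ASU figures from the post and assuming (as the Minnesota example implies) that both numbers are percentages of the graduating class:

```python
# Adjusting a claimed employment rate by removing graduates whose jobs
# the school itself paid for (my arithmetic, using the post's figures).
def adjusted_rate(claimed_pct, school_funded_pct):
    """Employment rate excluding school-funded positions."""
    return round(claimed_pct - school_funded_pct, 2)

minnesota = adjusted_rate(91.90, 14.08)  # 77.82, matching Burk's figure
asu = adjusted_rate(98.2, 12.2)

print(minnesota)
print(asu)
```

On that assumption, ASU's celebrated 98.2% would drop to 86% before any further discounting for part-time or non-lawyer jobs.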

What the numbers show is that none of the above schools matched the employment success of Yale, and in fact most were far worse, although they all received a higher or roughly equal score from US News for this factor (14% of the overall score).

Take George Mason. Although it handily beats Yale in employment (96.4% to 91.8%), in terms of lawyer jobs they are miles apart: 86.9% at Yale (none part time) versus 59.8% at George Mason (3.8% part time). Or look at UNLV, which also claims a higher employment rate than Yale, but landed significantly fewer lawyer jobs: 79.5% (6% part time). And let’s not forget Atlanta’s John Marshall, nearly equal to Yale in employment percentage, but with only 39.1% of graduates working as lawyers (we aren’t told the part-time percentage). (It bears mention that Loyola’s 28.8% part-time lawyer jobs number is unusually high, indicating placement weakness.)

Now let me repeat the phrase in my previous post that was identified by a critic as offensive and unfair on my part: “It is absurd to think that any of these schools exceeded or matched Yale’s employment rate.” I invite readers of this post to tell me if my statement was indeed unfair—comments open.

Let me also state why the employment percentages reported by these law schools, although truthful, are nonetheless misleading to people who read them: IT IS NOT IN FACT TRUE THAT A STUDENT WHO ATTENDS THESE SCHOOLS HAS A 9 OUT OF 10 CHANCE OF LANDING A LAWYER JOB (OR SOMETHING EQUIVALENT). That is what schools represent when they list employment rates in the 90% range. That was true at Yale, but not at the other schools I mentioned.

The jobs funded by law schools are not real jobs (low pay and temporary) and no one who attends law school thinks a part time job is a good result. (I am setting aside the “academic,” “business,” and “JD preferred” categories that are counted in the overall employment percentage because they are highly susceptible to abuse by law schools—George Mason claims a combined 25% employment in academic and business; Loyola claims 20%.)

The riposte to my argument takes three tracks: 1) we are telling the truth (so stop calling us liars!); 2) it’s not our fault—we are just following the reporting categories and rules set up by US News and the ABA; 3) our listed employment percentage is not misleading because prospective students can obtain a more detailed picture on our website.

On #1, I agree that legal administrators are not liars (I apologize if I gave that impression), although some make more liberal judgments than others when deciding what counts for “business,” etc. Again, my core point is that truthful information can still mislead, and that is precisely the situation here.

Dean Sylvester in his statement alluded to response #2—“We’re just following the rules set up by US News. Don’t blame us if we get a higher score than Yale despite worse job results.” This response is not quite correct, however. US News indeed decides which categories it will count and how much weight to allocate to each. But law schools are not passive in the matter, with no responsibility for the outcome. They maximize their scores in whatever way they can within the terms (if not spirit) of the rules—and some schools are far more aggressive than others. The employment numbers advertised by a bunch of law schools are in the ninety percent range because law schools take a host of strategic actions to goose their scores, from hiring their own graduates to subsidizing temporary jobs at law firms.

If response #2 is sound, it would produce a ridiculous result: All law schools would do everything they possibly could to boost their numbers (tempered only by financial resources), and we would all "truthfully" report 97% to 100% employment rates. This is exactly what over a hundred law schools were doing two-to-three years ago, which is what got us into trouble. No one buys the “it’s not our fault” line—certainly not the public and not the Senators who expressed their concerns to the ABA.

On #3, I agree that law schools that provide detailed information on their websites are doing better than schools that do not. But I do not agree that this “transparency” absolves law schools of responsibility for the misleading impression produced by their advertised employment percentages.

It is not fair to expect prospective law students to dig into the numbers at each school they are considering in order to decipher the true underlying numbers—and moreover this is not easy to do because the information on many web pages is difficult to apprehend (as Burk confirms). It takes skilled reading, the use of a calculator, and knowledge of which numbers modify others.

Signs of weaknesses beneath advertised employment percentages do not always show straightforwardly. A school might look good with a relatively high percentage of students in lawyer jobs, but this must be discounted if a sizable percentage of these jobs are part time (as with Loyola). And it’s hard to compare schools: Is a 70% lawyer rate, of which 5% are part time, better than an 80% lawyer rate with 20% part time? When discounting, the student would also have to factor in what percentage and what types of jobs are subsidized by the school. All of this requires a lot of savvy on their part and close, skeptical reading of the numbers.
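The hypothetical comparison above (70% lawyer jobs with 5% part time versus 80% with 20% part time) can be worked out, on one simple and debatable assumption: count only full-time lawyer jobs and ignore everything else. This is my own illustration, not a method endorsed by anyone:

```python
# Which school does better on full-time lawyer jobs alone?
def full_time_share(lawyer_pct, part_time_pct):
    """Percent of the class in full-time lawyer jobs."""
    return lawyer_pct * (1 - part_time_pct / 100)

school_a = full_time_share(70, 5)   # 70% lawyer jobs, 5% of them part time
school_b = full_time_share(80, 20)  # 80% lawyer jobs, 20% of them part time

print(school_a > school_b)  # True: the "lower" 70% school comes out ahead
```

School A lands at 66.5% of the class in full-time lawyer jobs versus 64% for School B — exactly the kind of counterintuitive result a prospective student armed only with headline percentages would never see.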

The downside of the transparency push is that law schools are now providing lists of detailed information, the implications of which cannot be fully understood (anyone who scoffs at this should read a few). Adding more numbers to the mix doesn’t necessarily make misleading employment percentages less misleading for consumers, but it does allow law schools to proclaim that they have provided “full disclosure.”

That is why I proposed in the prior post that only one number should be highlighted and should count in the ranking: the percentage of the graduating class that obtained full time “JD required” jobs (not funded by the law school) nine months after graduation. With this standard, every prospective student would be able to compare every law school on the same criterion, with no further searching or discounting or figuring required. Any additional categories will be treated by law schools as opportunities for gaming. Law schools will hate this suggestion because the full time “JD required” percentage will be shockingly low at many places.

I’ll close by returning to the bottom line. The job market for law graduates is horrendous; more than 35% of law graduates in 2010 did not land full time lawyer jobs. It is so bad that even students at top 15 law schools are having difficulty finding jobs. Yet many law schools continue to advertise deceptively high employment rates (85%, 88%, 90%, 93%, 96%).

It perfectly epitomizes our collective failure to solve this problem that the dean of a respected law school would publicly celebrate a noteworthy jump in the ranking, from 40 to 26, achieved on the strength of a stunningly high 98% employment rate, then in the next breath tell us that this number is “not an accurate” reflection of “reality.”

That says it all.


Postscript:

Dean Sylvester crowed in the press release about ASU’s increase in reputation among academics. After this incident, ASU’s reputation has plummeted in my eyes. A law school must stand by the accuracy of its public representations. This is a professed value of academia as well as of the legal profession. (See ABA Model Rule 7.1—“Truthful statements that are misleading are also prohibited.”) If saying this out loud makes me uncharitable to another institution, I do so reluctantly. Our silent acquiescence to actions like this will ensure that they continue, inflicting further damage on the already sullied reputation of law schools generally.

Comments:

The practice of law is adversarial.

Law schools train students to be lawyers.

Ergo, the business of law schools is adversarial.

(That's the best "sillygism" I could come up with this early in the morning. By the way, I have added to my reading list A. Benjamin Spencer's recent "The Law School Critique in Historical Perspective" available at SSRN:

http://ssrn.com/abstract=2017114

I have only read Section "II. From Blackstone to Langdell" that covers from colonial days to early 20th century, for a research project of mine. But when I get the time, I plan to read the entire article. Hopefully, it will be of help in better understanding the serious issues that Brian has been raising.)
 


I think this is an excellent start and that more important work should be done to enhance accountability at law schools. I would tweak your Yale placement standard and make it a top-six placement standard to account for the fact that there are probably a not insubstantial number of Yale grads who eschew practice for other ambitions, and also to add a bit of geographic diversity. I really wish there were more data points along the lines of those you've started collecting.
 

I read Prof. Spencer's article (78 pages in length) and found it quite interesting. He states in the first paragraph: "We thus have what appears to be a perfect storm in legal education: Law school graduates are underemployed, over-indebted, and under prepared for practice." He does not focus on student debt in the course of the article. Nor does he spend any time on the "numbers" provided by law schools, with only casual mention of U.S. News & World Report classifications. Rather, the focus is on the educational failures of law schools, putting this into historical perspective, which I found to be quite interesting. The last footnote, 414 on page 78, does reference Brian Tamanaha's "My 'Dean's Vision' Speech" posted on this Blog on Nov. 16, 2010.

Because of the length of the article, perhaps those following Prof. Tamanaha's posts at this Blog should first check the Table of Contents on the first page in determining whether to read it.
 

I think this is much clearer and more fair than your original post, Brian, so thanks for that.

I still think it's important to make it clear, though, that it is not "law schools" that are posting misleading employment stats. They report ALL of the relevant data to US News--in fact that's how you got it--and US News then chooses to emphasize the "overall employment" above all others. If we need to bring more pressure to bear, it is probably on the folks at US News to be more sophisticated in how they use employment in the rankings.

I know that my institution, for example, doesn't necessarily approve of the US News strategy, and so posts ALL of the stats, in very easily digestible form, on the prospective students section of the webpage. This includes data for the last four years, and includes everything from school funded jobs, to nonprofessional Starbucks type stuff.

Maybe if schools did more to publicize this stuff, and to pressure US News, we could do more to dispel the misleading part of the numbers....
 

Ian,

I'm glad you find this post more acceptable than the last. Having said that, I do not believe there was anything unclear or unfair about my previous post. The only difference is in that one I pointed out a series of anomalies that raised questions, while in this one I provided underlying data to show that those questions were indeed merited. The two posts are consistent.

As for US News, there is no doubt that they can produce a better ranking. But we can't continually point the finger at a rating service when law schools post sparkling 93%, 96%, 98% employment rates during a dismal job market.

We are the ones that look bad in the eyes of the public, not US News. And until we take responsibility for the situation things will not improve.
 

The very fact that law schools spend so much time and money thinking up white lies, finessing, and manipulating their placement statistics tells you all you need to know about whether they think prospective students rely on their manipulated statistics, or whether they actually believe prospective students are rational actors that are able to obtain the real, unbiased truth.

Seems like they want to have their cake and eat it too: produce skewed placement statistics to lure applicants, but disclaim any responsibility because prospective students should all know that they're lying.
 
