Month: August 2017

As Charlotte Closes, a Plea for Data Integrity

The ABA Journal heralds the closure of Charlotte Law School. I have no editorial beyond, well, it was an honestly dishonest student loan funnel, struggling since January, and Betsy DeVos couldn’t save it. If we’re unlucky, it’ll bounce back.

As a tie-back to last week’s post on the decision by the ABA’s Council of the Section of Legal Education and Admissions to the Bar to simplify law-school employment data (a decision it has since walked back), I write to express worries about how the ABA manages data for closed or merged law schools.

As of now, users of the Standard 509 Reports page can merrily explore information on bygone law schools such as Hamline, but anyone interested in the adventures of schools absorbed by mergers, such as Rutgers-Camden, will find no separate information on them. Rutgers-Camden has no 509 reports of its own, it doesn’t appear in the spreadsheets for past years, and in some years the merged “Rutgers” entry contains no information at all.

This poses a problem for researchers because the 509 reports reflect law schools as they exist today, not as they existed in the past. I grant it would take more effort to maintain information on defunct law schools, but presenting past years anachronistically raises the question of why the ABA bothers keeping prior-year reports up at all.

I try to download a set of the 509 information reports annually as a backup (yes, it’s tedious) and because it’s partly how this blog found its footing. I don’t do so for the employment summary reports (because, yes, it’s tedious). I would prefer not to change my habits.

Thus, I ask that the ABA maintain its information reports on law schools consistently for the sake of researchers. Indiana Tech, Charlotte, Whittier, and the schools that have merged may not rise again, but I’m sure someone might want to know more about their existence, even trivial information like application deadlines.


Will Law-School Employment Outcomes Be Politicized Under Trump?

Today’s headline spoofs a February 2, 2017, Atlantic article by Gene Sperling about what might happen when the U.S. government’s statistical agencies produce data that His Emolumence doesn’t like. It may still be a legitimate fear, but before anyone could neuter GDP aggregates, another statistical agency has unexpectedly taken the lead in the race to deprecate data: the American Bar Association’s Council of the Section of Legal Education and Admissions to the Bar. As you’ve likely read elsewhere—I strongly recommend Jerry Organ’s post on TaxProf Blog—the council has chosen to greatly reduce the data it requires law schools to report and present in their respective employment-outcomes tables and spreadsheets.

I won’t recapitulate Organ’s arguments, but I will try to highlight how the changes would affect specific topics I’ve reported on and opinions I’ve developed.

One, in his memo to the council (pdf), University of Virginia School of Law professor Paul Mahoney discusses my data mishap in my post, Class of 2016 Employment Report (Corrected), as evidence of “confusion” caused by the employment data. Unsurprisingly, neither Mahoney nor anyone at the council asked me for a response, so I must start there.

The issue was the ABA’s earlier decision to account separately for school-funded positions in the above-the-line employment “statuses” (“Employed-bar-passage required,” “employed-professional position,” etc.), which collectively add up to the total number of graduates. I had forgotten the ABA made this change, so I subtracted the school-funded positions from the number of bar-passage-required jobs. I don’t relish drawing attention to my mistakes, especially when my rebuttal makes me look sloppier than the accusation: I wasn’t “confused”; I plumb forgot the ABA changed the employment table. I work with these data once a year, so I slipped into the old habit. (In fact, just last week I added a note to myself to avoid that mistake again next year.) My slip, though, is an illogical reason to revert to the previous practice. Subtracting out funded jobs is a tedious process, and I don’t want to start doing that again. For these reasons, eliminating the separate employment-status category for school-funded positions is a bad idea.
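To make the double-subtraction concrete, here is a minimal sketch with made-up numbers (not real ABA figures): under the revised table, school-funded jobs already occupy their own status line, so subtracting them from the bar-passage-required count removes them a second time.

```python
# Hypothetical figures only, for illustration -- not real ABA data.
# Under the revised employment table, school-funded positions are their
# own above-the-line status, so they are no longer counted inside
# "Employed - bar passage required".
bar_passage_required = 150  # already excludes school-funded jobs
school_funded = 10          # reported separately, above the line

# Old habit (pre-change table): subtract funded jobs from the count.
# Applied to the new table, this subtracts them twice and undercounts.
wrong = bar_passage_required - school_funded  # 140 -- double subtraction
correct = bar_passage_required                # 150 under the new table

assert wrong == 140 and correct == 150
```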

On to other aspects of the changes.

Two, eliminating the total number of graduates from the reports makes them more difficult to use. I regularly derive the percentages of graduate employment outcomes by various categories, and now calculating the total number of graduates at a given school, or at all schools, will be a chore. Another benefit of including the number of graduates in the reports is that it gives data users an opportunity to double-check the ABA’s and law schools’ numbers, an especially important process because I do not believe the ABA does so itself. If the numbers in the employment-status section do not add up to the total number of graduates, then I know there’s a problem. It’s something I’ve tracked behind the scenes, but it’s still useful. Now everyone will need to do more math to arrive at what should be a foundational number.
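The double-check described above amounts to a simple sum. A sketch, with hypothetical status labels and figures (the real reports use the ABA’s own column names):

```python
# Hypothetical employment-status counts for one school -- illustrative only.
statuses = {
    "Employed - bar passage required": 150,
    "Employed - JD advantage": 30,
    "Employed - professional position": 10,
    "Employed - other": 5,
    "Unemployed": 15,
    "Status unknown": 2,
}
total_graduates = 212  # total reported by the school

# The status counts should sum to the reported total; a mismatch flags
# a data-entry or aggregation problem in the report.
assert sum(statuses.values()) == total_graduates
```

Dropping the reported total removes the right-hand side of that check, so anyone wanting to verify the data must first reconstruct the total themselves.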

Three, the new “Employed-Other” category is absurd. A law graduate working in a professional, non-law job, say at a hedge fund, is not in the same position as one flipping burgers. If anything, the ABA should put more effort into fairly defining “Employed-JD advantage” and less into consolidating categories that make law schools look bad.

Four, the new “Unemployed or status unknown” category is also absurd. Again, it’s consolidating categories that may make law schools look bad rather than carefully specifying which ones make them look good. Moreover, I detest disjunctive definitions, e.g., “Social Security/Medicare/Medicaid,” as though they’re a unified program. Disjunctive definitions encourage composition fallacies and false dilemmas, something lawyers should avoid.

Five, I have used the broad “employment type” categories (e.g., “solo,” “2-10 lawyers,” etc.) to draw conclusions about how graduates’ jobs change from year to year. For example, the post “Change in Graduate Outcomes Driven by Small Jobs,” which analyzes the class of 2015, discussed how most of that year’s decline in graduate employment was felt in 2-10-lawyer jobs. I also noted how jobs were distributed among law schools in “Law Grad Jobs Unequal Like Income in Corrupt Countries.” The new changes hamper these detailed analyses, and in a way that masks underperforming law schools. The new 10-100-lawyer-practice category, in particular, is overbroad.

Six, I recognize that some school-funded positions are more durable than others, as Mahoney argues, and even that applicants might prefer to attend schools that employ their own grads as an insurance policy against unemployment. Part of the problem is that “long-term” means at least one year, which includes positions lasting only one year; until that term is replaced by a concept law schools cannot manipulate, there’s every reason to see long-term school-funded jobs as dubious attempts at padding employment outcomes. Prospective applicants want to know their opportunities for indefinite employment, not one- or two-year gigs, before applying. (And yes, for this reason I somewhat discount the value of clerkships too.) Moreover, separating law-school-funded positions by income ($40,000) is simply arbitrary. Fixed dollar amounts eventually need to be updated for inflation, they do not necessarily reflect the cost of living in a particular location, and they do not speak to the value of legal education. For comparison, the median bachelor’s-degree holder aged 25-34 earned $46,099 in 2015.

Of Organ’s other criticisms, longitudinal data and consistency with NALP data stand out. I won’t repeat them.

It’s amazing that the council is so easily swayed by the proposal of one law-school professor without much deliberation. It’s an entire level worse than if the council had secretly produced the changes on its own because then at least there may have been some give and take in the process. Now it just looks as though the council is uninterested in its responsibilities and merely beholden to individual professors’ whims. I thought the ABA wanted to improve its image as a more transparent, responsive organization, but by rubber-stamping a professor’s (and only a professor’s) wish list, it further tarnishes its credibility.

As for my opinions on the employment questionnaire—which the council would scrutinize as I’m not a law prof—although there may be reasons to simplify the employment survey, I would very much prefer to let it rest for several years. The fact that it’s adjusted so frequently indicates lack of seriousness about its purpose.

I recommend joining Organ’s petition in his post.