2017-10-02 Site Update

Those interested in Student Debt Data can find the update at the link.

The changes are not substantive, so if that doesn’t excite you, then listen to Sleaford Mods instead.

Fin.


2016: The Middle-School Premium Returns With a Vengeance

In mid-September, the Census Bureau publishes its Income, Poverty, and Health Insurance tables for the previous year. I spent a few hours combing through the latest update to see what they say about young people’s incomes by education level. Going back to 1991, the data tend to validate my position that college education is not raising people’s earnings with human-capital superpowers: more people go to college, yet their aggregate income isn’t rising.

Okay, well, it rose a little bit this year.

Here’s the table comparing income growth by education level for people in the 25-to-34 age bracket. It shows the mean of the annual growth rates of both aggregate earnings and per-capita earnings. We want college grads’ per-capita earnings to grow at least as fast as their aggregate earnings, because that would show that human-capital effects aren’t being swamped by population effects. Alas, they are.
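
For concreteness, here’s a minimal sketch of that comparison in Python, using invented figures rather than the actual Census numbers:

```python
# Minimal sketch of the aggregate-vs-per-capita comparison.
# All figures are invented, not the actual Census numbers.

def growth_rates(series):
    """Year-over-year growth rates for a list of annual values."""
    return [(b - a) / a for a, b in zip(series, series[1:])]

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical aggregate earnings (billions) and population (millions)
# for 25-to-34-year-old college grads over four years.
aggregate = [100.0, 103.0, 106.0, 110.0]
population = [10.0, 10.5, 11.0, 11.6]
per_capita = [agg / pop for agg, pop in zip(aggregate, population)]

# Mean of the annual growth rates, as in the table.
print(f"mean aggregate growth:  {mean(growth_rates(aggregate)):.2%}")
print(f"mean per-capita growth: {mean(growth_rates(per_capita)):.2%}")
# If per-capita growth lags aggregate growth, population growth (more
# grads), not human capital, is driving the aggregate gains.
```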

In most years, high-school graduates’ incomes have risen more per capita than college grads’. Over a prolonged time period, this doesn’t bode well for college graduates.

But this year—whoa! Dig those less-than-9th graders! They received a wage hike of more than 25 percent! When was the last time you got a quarter raise? Long live the middle-school premium!

Yes, this last one is horse-race reporting with erratic data, but until the consensus acknowledges that college is not producing positive outcomes in the aggregate, I’m not apologizing.

**********

Past coverage:

Class of 2016 NALP Data

Happy post-Labor Day. Now back to work, Peasants!

Or, read on.

A few weeks ago, the National Association for Law Placement (NALP) published the national summary report for its Employment Report and Salary Survey (ERSS) (pdf). As in the last two years, I combed the data for information the NALP may not have commented on. Much of the NALP report focuses on year-over-year changes in the percentages of employed graduates, which aren’t very illuminating, especially when those percentages are barely budging. Here’s what they look like.

I’m aware that we now have three consecutive years of data showing graduate employment outcomes ten months after graduation rather than nine, but I really don’t think that makes much of a difference.

It appears that the percentage of graduates not working fell by a whopping 0.8 percentage points. Whoa.

Here’s also the number of graduates by employment status.

We’re seeing a pretty steep fall in total graduates, but the number and proportion of them not working are still higher than in the peak employment year of 2007. A lot of this is due to elevated bar-failure rates, but even so, the JD-advantage category is still elevated. The NALP says 40 percent of grads in these jobs are seeking other work, which tells me these positions aren’t worth much. In fact, much of their growth (not shown) is visible in business-and-industry positions, further suggesting the definition of JD advantage is overbroad. JD-advantage jobs also correlate strongly and negatively with bar-passage-required jobs and positively with grads not working.
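
The correlation claim is just a Pearson calculation across the yearly shares. A sketch with invented numbers (not the real NALP series):

```python
import statistics

# Hypothetical shares of each class in JD-advantage and
# bar-passage-required jobs over five years (not NALP's figures).
jd_advantage = [0.05, 0.06, 0.08, 0.09, 0.10]
bar_required = [0.75, 0.72, 0.68, 0.66, 0.64]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"r = {pearson(jd_advantage, bar_required):+.2f}")  # strongly negative
```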

Here’s the contribution to the percent change in law grads by employment status since 2007, with data going back to 2001. We can see that despite falling total grads, a greater proportion of them are either not working or in JD-advantage positions (which are probably not legal jobs themselves).
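
The arithmetic behind a contribution chart is simple: each status’s change in headcount is divided by the base year’s total graduates, so the contributions sum to the overall percent change. A sketch with hypothetical counts:

```python
# Hypothetical graduate counts by status in the base year (2007-style)
# and a later year; these are not NALP's actual figures.
base = {"bar_required": 30_000, "jd_advantage": 3_000, "not_working": 4_000}
later = {"bar_required": 25_000, "jd_advantage": 5_000, "not_working": 5_500}

base_total = sum(base.values())
for status in base:
    contribution = (later[status] - base[status]) / base_total
    print(f"{status:>13}: {contribution:+.1%}")

# The contributions sum to the overall percent change in graduates.
overall = (sum(later.values()) - base_total) / base_total
print(f"{'overall':>13}: {overall:+.1%}")
```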

Meanwhile, with bar-passage-required jobs contributing -15.7 percentage points to the -14.6 percent change in law-grad outcomes, here’s how private-practice positions have fared (-9.2 percentage points relative to all 2007 grads).

The class of 2016 is the first one to fall wholly below the 2007 line, meaning that even tiny firms aren’t hiring grads as they did in the peak year. The supply of law grads does not create demand for legal services, which strongly indicates that grads who found these jobs in past years worked in them only transiently before leaving the legal labor market.

The NALP’s selected findings (pdf) discuss “tightness” in the job market now or at least compared to the pre-recession market. The large fall in bar-passage-required jobs and private-practice jobs argues otherwise. A tighter market would see more grads working in bigger firms and smaller firms raising wages, something the NALP’s own data don’t depict.

********************

Prior reporting on this topic:

As Charlotte Closes, a Plea for Data Integrity

The ABA Journal heralds the closure of Charlotte Law School. I have no editorial beyond this: it was an honestly dishonest student-loan funnel, it had been struggling since January, and Betsy DeVos couldn’t save it. If we’re unlucky, it’ll bounce back.

As a tie-back to last week’s post on the decision by the ABA’s Council of the Section of Legal Education and Admissions to the Bar to simplify law-school employment data (a decision it has since walked back), I write to express worries about how the ABA manages data for closed or merged law schools.

As of now, users of the Standard 509 Reports page can merrily explore information on bygone law schools such as Hamline, but anyone interested in the adventures of since-merged schools such as Rutgers-Camden will find no separate information on them. Rutgers-Camden has no 509 reports, it doesn’t appear in the spreadsheets for past years, and in some years the combined “Rutgers” entry contains no information at all.

This poses a problem for researchers because the 509 reports reflect law schools as they exist today, not as they existed in the past. I suppose it would take more effort to maintain information on defunct law schools, but presenting them anachronistically raises the question of why the ABA bothers keeping past years’ reports up at all.

I try to download a set of the 509 information reports annually as a backup (yes, it’s tedious) and because it’s partly how this blog found its footing. I don’t do so for the employment summary reports (because, yes, it’s tedious). I would prefer not to change my habits.

Thus, I ask that the ABA maintain its information reports on law schools consistently for the sake of researchers. Indiana Tech, Charlotte, Whittier, and the schools that have merged may not rise again, but I’m sure someone might want to know more about their existences, even trivial information like application deadlines.

Will Law-School Employment Outcomes Be Politicized Under Trump?

Today’s headline spoofs a February 2, 2017, Atlantic article by Gene Sperling about what might happen when the U.S. government’s statistical agencies produce data that His Emolumence doesn’t like. It may still be a legitimate fear, but before anyone could neuter GDP aggregates, another statistical agency has unexpectedly taken the lead in the race to deprecate data: the American Bar Association’s Council of the Section of Legal Education and Admissions to the Bar. As you’ve likely read elsewhere—I strongly recommend Jerry Organ’s post on TaxProf Blog—the council has chosen to greatly reduce the data it requires law schools to report and present in their respective employment-outcomes tables and spreadsheets.

I won’t recapitulate Organ’s arguments, but I will try to highlight how the changes would affect specific topics I’ve reported on and opinions I’ve developed.

One, in his memo to the council (pdf), University of Virginia School of Law professor Paul Mahoney discusses my data mishap in my post, Class of 2016 Employment Report (Corrected), as evidence of “confusion” caused by the employment data. Unsurprisingly, neither Mahoney nor anyone at the council asked me for a response, so I must start there.

The issue was the ABA’s earlier decision to account for school-funded positions separately among the above-the-line employment “statuses” (“employed-bar-passage-required,” “employed-professional-position,” etc.), which collectively add up to the total number of graduates. I had forgotten the ABA made this change, so I subtracted the school-funded positions from the number of bar-passage-required jobs a second time. I don’t relish drawing attention to my mistakes, especially when my rebuttal makes me look sloppier than the accusation: I wasn’t “confused”; I plumb forgot the ABA changed the employment table. I work with these data once a year, so I slipped into an old habit. (In fact, just last week I added a note to myself to avoid that mistake next year.) My one-off slip is an illogical reason to revert to the previous practice. Subtracting out funded jobs is a tedious process, and I don’t want to start doing that again. For these reasons, eliminating the separate employment status for school-funded positions is a bad idea.
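
To illustrate the pitfall with invented numbers (status names paraphrased, not the ABA’s exact labels):

```python
# Under the current table, school-funded jobs are broken out as their
# own status, so "bar passage required" already excludes them.
statuses = {
    "bar_passage_required": 25_000,  # already net of school-funded jobs
    "jd_advantage": 5_000,
    "school_funded": 1_000,
    "not_working": 4_000,
}

# Old habit (wrong under the current table): subtracting again.
wrong = statuses["bar_passage_required"] - statuses["school_funded"]

# Correct under the current table: the category is already net.
right = statuses["bar_passage_required"]

print(wrong, right)  # 24000 25000 -- the old habit undercounts by 1,000
```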

On to other aspects of the changes.

Two, eliminating the total number of graduates from the reports makes them more difficult to use. I regularly derive the percentages of graduate employment outcomes by various categories, and now calculating the total number of graduates at a given school, or at all of the schools, will be a chore. Another benefit of including the number of graduates in the reports is that it gives data users an opportunity to double-check the ABA’s and law schools’ numbers, an especially important process because I do not believe the ABA does so itself. If the numbers of graduates in the employment-status section do not add up to the total, then I know there’s a problem. It’s something I’ve tracked behind the scenes, but it’s still useful. Now, everyone will need to do more math to arrive at what should be a foundational number.
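
Here’s a sketch of that double-check, with a hypothetical school and made-up counts:

```python
def check_school(name, total_grads, statuses):
    """Flag a school whose employment statuses don't sum to its total."""
    reported = sum(statuses.values())
    if reported != total_grads:
        print(f"{name}: statuses sum to {reported}, "
              f"but total grads is {total_grads} -- flag for review")

# Hypothetical report: the statuses sum to 495, not 500.
check_school("Example Law School", 500,
             {"bar_required": 350, "jd_advantage": 60,
              "other": 40, "not_working": 45})
```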

Three, the new “Employed-Other” category is absurd. A law graduate working in a professional non-law job, say at a hedge fund, is not the same as one flipping burgers. If anything, the ABA should put more effort into fairly defining “Employed-JD advantage” and less into consolidating categories that make law schools look bad.

Four, the new “Unemployed or status unknown” category is also absurd. Again, it consolidates categories that may make law schools look bad rather than carefully specifying the ones that make them look good. Moreover, I detest disjunctive definitions, e.g., “Social Security/Medicare/Medicaid,” which treat distinct programs as though they’re a unified one. Disjunctive definitions encourage composition fallacies and false dilemmas, something lawyers should avoid.

Five, I have used the broad “employment type” categories (e.g., “solo,” “2-10 lawyers,” etc.) to draw conclusions about how graduates’ jobs change from year to year. For example, the post “Change in Graduate Outcomes Driven by Small Jobs,” which analyzed the class of 2015, discussed how most of that year’s decline in employed graduates was felt in 2-10-lawyer jobs. I also noted how jobs were distributed among law schools in “Law Grad Jobs Unequal Like Income in Corrupt Countries.” The new changes hamper these detailed analyses, and in a way that masks underperforming law schools. In particular, the new 10-100-lawyer-practice category is overbroad.

Six, I recognize that some school-funded positions are more durable than others, as Mahoney argues, and even that applicants might prefer to attend schools that reemploy their own grads as an insurance policy against unemployment. Part of the problem is that “long-term” means at least one year, which includes positions lasting only one year; until that term is replaced by a concept law schools cannot manipulate, there’s every reason to see long-term school-funded jobs as dubious attempts at padding employment outcomes. Prospective applicants want to know their opportunities for indefinite employment, not one- or two-year gigs, before applying. (And yes, for this reason, I somewhat discount the value of clerkships too.) Moreover, separating law-school-funded positions by an income cutoff ($40,000) is simply arbitrary. Fixed dollar amounts eventually need to be updated for inflation, they do not necessarily reflect the cost of living in a particular location, and they say nothing about the value of legal education. For reference, the median bachelor’s-degree holder in the 25-34 age bracket earned $46,099 in 2015.
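
To see how fast a fixed cutoff erodes, here’s a sketch assuming a steady 2 percent inflation rate (an assumption for illustration, not a forecast):

```python
# Real value of a fixed $40,000 threshold under constant 2% inflation.
threshold = 40_000
inflation = 0.02  # assumed annual rate

for years in (5, 10, 15):
    real_value = threshold / (1 + inflation) ** years
    print(f"after {years:>2} years: ${real_value:,.0f} in today's dollars")
```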

Of Organ’s other criticisms, longitudinal data and consistency with NALP data stand out. I won’t repeat them.

It’s amazing that the council can be swayed so easily by the proposal of one law-school professor without much deliberation. It’s an entire level worse than if the council had produced the changes on its own in secret, because then at least there might have been some give and take in the process. Now it just looks as though the council is uninterested in its responsibilities and merely beholden to individual professors’ whims. I thought the ABA wanted to improve its image as a more transparent, responsive organization, but by rubber-stamping a professor’s (and only a professor’s) wish list, it further tarnishes its credibility.

As for my opinions on the employment questionnaire (which the council would no doubt scrutinize, as I’m not a law prof): although there may be reasons to simplify the employment survey, I would very much prefer to let it rest for several years. The fact that it’s adjusted so frequently indicates a lack of seriousness about its purpose.

I recommend joining Organ’s petition in his post.

LSAC Report Calls Into Question ‘Law-School Tipping Point’

One discovery I’m fond of is the “law-school tipping point,” the moment when people with LSAT scores in hand decided to forgo law school entirely. I hypothesized that one could detect the tipping point by comparing, over time, the ratio of LSAT takers to subsequent applicants. Sure enough, the ratio spiked in the 2010 application cycle, the last applicant peak.

Here’s an updated version of the chart I created for that post:

(Source: LSAC; here for total LSATs and here (pdf) for first-time takers.)

The arrow highlights the pronounced gap between first-time LSAT takers in the 2009 calendar year and applicants for fall 2010. I took this as evidence of what I suspected had happened: many people with LSAT scores in hand chose not to apply to law school the following year, presumably because they realized it was a really bad idea. In the three prior years, the ratio of first-time test takers to subsequent applicants averaged 1.08. In 2010, it jumped to 1.19, accounting for 8,800 test takers who never appeared in the 2010 application cycle. The ratio has since settled at about 1.15.
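
The arithmetic is straightforward. Here’s a sketch with an invented first-time-taker count; the 1.08 baseline and 1.19 observed ratios are from above:

```python
# Estimate how many test takers never showed up in the application
# cycle. The taker count is invented; the ratios are from this post.
first_time_takers = 100_000  # hypothetical 2009 first-timers
baseline_ratio = 1.08        # takers per applicant, prior three years
observed_ratio = 1.19        # 2010 application cycle

expected_applicants = first_time_takers / baseline_ratio
actual_applicants = first_time_takers / observed_ratio
missing = expected_applicants - actual_applicants
print(f"implied no-shows: {missing:,.0f}")  # ~8,600 with these inputs
```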

The LSAC, however, recently published a report titled, “Analysis of LSAT Taker Application Behavior: Testing Years 2009-2010 Through 2015-2016” (pdf). It is an update of a similar report published in 2013, which I don’t recall seeing and cannot find, and it contains a table showing when test takers applied to law school.

Curiously, going by the first three columns of the table, there’s scant evidence of a drop in test takers applying to law school. People who took the LSAT between June 2009 and February 2010 pretty much all applied, save for 1 percent (~1,400 test takers). It doesn’t look like they delayed their applications either, which would have placed them in the rightward columns. (I don’t think the fact that I’m looking at calendar-year LSATs as opposed to June-February LSATs changes the results significantly.)

I have a hard time explaining the diverging results. It would certainly help to see previous years’ data, but my best hunch is that test takers’ application behavior stayed the same while the frequency of their test-taking rose. In other words, perhaps many of these first-time takers simply retook the LSAT. As evidence, the LSAC report provides another figure (not shown) indicating that non-applicants tend to do very poorly on the LSAT, though they overlap with the low end of applicants. These non-applicants may have doubled down, retaken the test in 2010-11, and applied with whatever score they got then. Moreover, the ratio of second-time test takers to subsequent applicants (not shown) remained elevated after 2009-10, at about 0.42. Rather than walking away from the process, potential applicants simply tried harder to beat the pack.

It’s a discouraging thought, but either hypothesis is valid at least to some degree until better information comes along. In the meantime, one thing the LSAC report teaches is that, by and large, people who do poorly on the LSAT are not as unsophisticated as they’re often portrayed: they tend to self-select out of the system. That doesn’t matter much to me, since I care more about applicants, admits, and matriculants than LSAT takers, but whenever folks focus their attention on the smart people not applying to law school, just remember that many people who aren’t so good at standardized tests have been making the right choice all along.

LSAT Tea-Leaf Reading: June 2017 Edition

For those of you who were as curious as I about whether a secular trend in law-school interest was causing the uptick in LSATs: too bad! Count me on the bandwagon attributing it to His Emolumence’s perfidy. I try to caution against reading the minds of potential law-school applicants, but what other explanation is there? In June 2017, 27,606 people took the LSAT, up an amazing 19.8 percent from a year earlier (23,051).

The four-period moving sum rose by 4.2 percent to 113,909, a level not seen since December 2012 (115,348). To put these numbers in context, the last time there were this many June LSAT takers was June 2010 (32,973). You know, back when I first joined the crowd warning people that law school was usually a bad idea. In fact, the year-over-year growth rate is the highest going all the way back to 1988, the second year for which the LSAC reliably publishes LSAT-administration information. The 4.2-percent growth rate for the moving sum is comparable to December 2009 (4.1 percent) and September/October 2009 (6.5 percent). If I had seen the June 2017 LSAT count without knowing anything else, I’d’ve thought the economy was in a recession (or that L.A. Law had come on the air, which is what some claim caused the ’80s surge).
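
For those following along at home, here’s a sketch of the moving-sum arithmetic. Only the two June counts come from this post; the other three administrations are placeholders chosen so the sums match the figures quoted above:

```python
# Trailing four-administration moving sum and its year-over-year growth.
administrations = [
    ("2016-06", 23_051),  # June 2016 (from this post)
    ("2016-09", 34_000),  # placeholder
    ("2016-12", 27_303),  # placeholder
    ("2017-02", 25_000),  # placeholder
    ("2017-06", 27_606),  # June 2017 (from this post)
]
counts = [n for _, n in administrations]

current_sum = sum(counts[-4:])     # the last four administrations
previous_sum = sum(counts[-5:-1])  # the four ending one period earlier
growth = (current_sum - previous_sum) / previous_sum
print(f"moving sum: {current_sum:,}; growth: {growth:.1%}")
# -> moving sum: 113,909; growth: 4.2%
```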

I note three more items. One, the LSAC sure released this information a lot more quickly than last year, when it took until August to tell us about the June LSAT results. If I didn’t know any better, I’d think it was happy to discuss good news…

Two, I acknowledge that the 19.8-percent LSAT jolt was leaked last week. Uh-huh. That’s consistent with the times…

Three, if the political explanation for the renewed enthusiasm is true, then bless these LSAT takers’ idealistic hearts. However, next to nothing has changed in the U.S. or legal economies since Inauguration Day to warrant a more optimistic outlook on the legal profession (unless you’re defending His Emolumence and his family, in which case you’re probably teetering in the character-and-fitness department). Meanwhile, going by my predictions from earlier this year, it appears Congress is going nowhere, so don’t expect much reform of Grad PLUS loans. Instead, maybe Betsy DeVos will whip up yet another income-driven repayment plan. Or maybe she’ll get high on Ben Carson’s glyconutrients stash.

Final word: I can’t imagine the renewed interest in law school lasting as long as our dear leader’s tenure in office, but it may be a while yet.