How to Falsify the Bennett Hypothesis

The Bennett hypothesis asserts that colleges and universities absorb increases in federal student aid and pass them on to students as higher tuition charges. Put differently, the hypothesis states that the quantity of higher education is inelastic (insensitive) to its price, so higher-education institutions collect subsidies to students as rents instead of expanding enrollments. Importantly, the hypothesis does not in itself explain why colleges and universities absorb federal student aid (contrary to what its namesake might have believed); it merely offers the mechanism. I regularly cite the hypothesis as a guide to understanding the relationship between the federal-student-loan program and law school costs and debt.

As stated, the Bennett hypothesis is easy to falsify: Find examples where increases in federal student aid did not correspond to higher tuition charges, or show that those subsequent higher charges were due not to the government’s intervention but to other factors. Time sequence, correlation, and non-spuriousness. There is a contentious literature on this topic.

Via TaxProf, I see that a research paper, “An Empirical Examination of the Bennett Hypothesis in Law School Prices” (“Empirical Examination”), attempts to test the Bennett hypothesis on private law-school tuition costs. The paper’s author, Robert Kelchen of Seton Hall University (not the law school), argues that he finds no empirical evidence of the Bennett hypothesis’ effects. I believe “Empirical Examination” arrives at unsound conclusions because it mischaracterizes the Bennett hypothesis, and it insufficiently addresses the dynamic history of legal-education financing since 2005.

(I note that “Empirical Examination” is published via AccessLex, which is related to Access Group, the erstwhile private student-lending organization that financed many law students’ legal educations before the advent of Grad PLUS loans.)

Mischaracterizing the Bennett Hypothesis

“Empirical Examination” poses two research questions that mischaracterize the Bennett hypothesis:

(1) Did tuition/fees or living expenses for law school students increase at a faster rate following the creation of the Grad PLUS program in 2006 and the expansion of income-driven repayment in 2007?

(2) Did the student debt burden of law school graduates increase at a faster rate following the creation of the Grad PLUS program in 2006 and the expansion of income-driven repayment in 2007?

These questions appear to assume that the Bennett hypothesis is disproven by discovering a lower rate of increase in law-school costs and student borrowing, as though law-school cost growth can go on indefinitely. This is unscientific. “Empirical Examination” cites no formulation of the Bennett hypothesis discussing growth rates in costs and borrowing, I know of none, and I don’t think any would be correct.

Rather, the way to test the hypothesis empirically is to take away Grad PLUS loans from students at some law schools and not others. If costs and borrowing stay the same at the Grad PLUS-less law schools, then that tends to discredit the hypothesis. In fact, “Empirical Examination” cites a study that conducted a similar test of for-profit colleges and found “some support” for the Bennett hypothesis. For law schools, the closest test case is Charlotte Law School, which lost its access to federal loans earlier this year. There is no Charlotte Law School anymore, and while this may or may not be related to federal loans, it is congruent with the Bennett hypothesis.

The History of Law-School Lending Is Consistent With the Bennett Hypothesis

“Empirical Examination” understands correctly that around 2004, law students could borrow money from the federal government and private lenders. However, thanks to a rapid series of changes, law students could soon borrow their entire cost of attendance, living expenses included, from the government, with no need for private lenders (whose loans became mostly nondischargeable).

A theoretical examination of Grad PLUS loans with regard to the Bennett hypothesis would compare these changes to a hypothetical baseline without them. For example, one could try to find similar situations today in which a lender would offer an unsecured consumer loan of around $100,000 at 7 percent interest for three years. I lack the finance background to perform such an estimate, but intuition suggests the answer is not good for skeptics of the Bennett hypothesis.

To illustrate, in 2005, Seton Hall University School of Law charged $32,620 for full tuition to full-time students, and law students could borrow only up to $18,500 in federal loans, leaving them to cover the extra $14,120 (43 percent of the cost). Last year, Seton Hall charged $52,022 to full-time students. Under the pre-Grad PLUS loan system, law students would need to cover $33,522 (64 percent of the cost). I doubt private lenders would be so willing to cover more than $100,000 in unsecured loans after three years of legal education, especially given law students’ repayment prospects.
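The funding-gap arithmetic above is simple enough to check in a few lines of Python. The tuition figures come from this post; `funding_gap` is just an illustrative helper, not anything Seton Hall or the Department of Education publishes:

```python
# Back-of-the-envelope check of the Seton Hall funding-gap figures above.
# The $18,500 figure is the pre-Grad PLUS annual federal loan cap cited in the post.
STAFFORD_CAP = 18_500

def funding_gap(tuition, cap=STAFFORD_CAP):
    """Return the dollar gap above the federal loan cap and its share of tuition."""
    gap = tuition - cap
    return gap, round(100 * gap / tuition)

print(funding_gap(32_620))  # 2005: (14120, 43) -> $14,120, 43% of cost
print(funding_gap(52_022))  # 2016: (33522, 64) -> $33,522, 64% of cost
```

The point of the exercise: holding the old cap fixed, the share of tuition that private lenders would have had to finance grew from 43 to 64 percent.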

Other Weaknesses in ‘Empirical Examination’

There are other problems with how “Empirical Examination” explores the Bennett hypothesis and legal education. For one, its focus on the rate of increase in charging and lending ignores the fact that demand for legal education has plummeted. Last year, Seton Hall received only 1,387 full-time applications, about half as many as in 2007 (2,638). The Bennett hypothesis addresses the supply side of education, assuming demand is constant, but when demand falls on its own, law schools end up teetering financially. Had demand for legal education been constant for the last decade, it’s possible that the rate of increase in cost and debt would have continued at its pre-Grad PLUS pace nonetheless.

“Empirical Examination” also discusses how many law schools funnel their revenue to parent universities. This phenomenon, certainly greatly diminished today, is also consistent with the Bennett hypothesis. If law schools were not absorbing tuition as rents, then these transfers would be unsustainable—again, assuming demand is constant. Similar arguments can be made for zero-sum tuition discounting and the dysfunctions in the law-student transfer market.

*****

To conclude, “Empirical Examination” does not challenge the Bennett hypothesis as it applies to law schools. It mischaracterizes the hypothesis as predicting an increased growth rate in costs and borrowing, which implicitly assumes that growth in those measures is natural when in fact it must reach a limit at some point. Nor does “Empirical Examination” address facts in the history of legal-education finance that tend to show the Bennett hypothesis is correct.


Class of 2016 NALP Data

Happy post-Labor Day. Now back to work, Peasants!

Or, read on.

A few weeks ago, the National Association for Law Placement (NALP) published the national summary report for its Employment Report and Salary Survey (ERSS) (pdf). As with the last two years, I comb the data for information the NALP may not have commented on. Much of the NALP report focuses on year-over-year changes to percentages of employed graduates, which aren’t very illuminating when those percentages are barely budging. Here’s what they look like.

I’m aware that we now have three consecutive years of data showing graduate employment outcomes ten months after graduation rather than nine, but I really don’t think that makes much of a difference.

It appears that the percentage of graduates not working fell a whopping 0.8 percentage points. Whoa.

Here’s also the number of graduates employed by status.

We’re seeing a pretty steep fall in total graduates, but the number and proportion of them not working are still higher than in the peak employment year of 2007. A lot of this is elevated bar-failure rates, but even so, the JD-advantage category is still elevated. The NALP says 40 percent of grads in these jobs are seeking other work, which tells me these positions aren’t worth much. In fact, much of their growth (not shown) is visible in business-and-industry positions, further suggesting the definition of JD-advantage is overbroad. These jobs also correlate strongly and negatively with bar-passage-required jobs and positively with grads not working.

Here’s the contribution to the percent change in law grads by employment status since 2007 and going back to 2001. We can see that despite falling total grads, a greater proportion of them are either not working or in JD-advantage positions (which are probably not legal jobs themselves).
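The “contribution to percent change” decomposition used in these figures can be sketched in a few lines: each status’s change since the base year is divided by the base-year total of all graduates, so the contributions sum exactly to the total percent change. The counts below are invented for illustration, not the NALP’s actual numbers:

```python
# Decomposing the percent change in total graduates by employment status.
# Hypothetical counts for a base year (e.g., 2007) and a later year.
base = {"bar_passage": 30_000, "jd_advantage": 3_000, "not_working": 4_000}
now  = {"bar_passage": 24_000, "jd_advantage": 4_500, "not_working": 4_800}

total_base = sum(base.values())

# Each status's contribution, in percentage points of the base-year total.
contrib = {k: 100 * (now[k] - base[k]) / total_base for k in base}
total_change = 100 * (sum(now.values()) - total_base) / total_base

# The contributions add up to the overall percent change by construction.
assert abs(sum(contrib.values()) - total_change) < 1e-9
print(contrib, total_change)
```

This additivity is what makes statements like “bar-passage-required jobs contributed −15.7 points to the −14.6 percent change” meaningful: one category can contribute more than the total when others partially offset it.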

Meanwhile, with bar-passage-required jobs contributing -15.7 percentage points to the -14.6 percent change in law-grad outcomes, here’s how private-practice positions have fared (-9.2 points relative to all 2007 grads).

The class of 2016 is the first one to be wholly below the 2007 line, meaning that even tiny firms aren’t hiring grads like they did in the peak year. Supply of law grads does not create demand for legal services, strongly indicating that grads in past years who found these jobs only worked in them transiently until they left the legal labor market.

The NALP’s selected findings (pdf) discuss “tightness” in the job market today, at least compared to the pre-recession market. The large fall in bar-passage-required jobs and private-practice jobs argues otherwise. A tighter market would see more grads working in bigger firms and smaller firms raising wages, something the NALP’s own data don’t depict.

********************

As Charlotte Closes, a Plea for Data Integrity

The ABA Journal heralds the closure of Charlotte Law School. I have no editorial beyond, well, it was an honestly dishonest student loan funnel, struggling since January, and Betsy DeVos couldn’t save it. If we’re unlucky, it’ll bounce back.

As a tie-back to last week’s post on the ABA’s Council of the Section of Legal Education and Admissions to the Bar’s decision to simplify law-school employment data, which it’s walked back, I write to express worries about how the ABA manages data for closed or merged law schools.

As of now, users of the Standard 509 Reports page can merrily explore information on bygone law schools such as Hamline, but anyone interested in the adventures of post-merger schools such as Rutgers-Camden will find no separate information on it. It has no 509 reports, it doesn’t appear in the spreadsheets for past years, and in some years the “Rutgers” (merged) entry contains no information at all.

This poses a problem for researchers because the 509 reports reflect law schools as they exist today, not as they existed in the past. I understand it would take more effort to maintain information on defunct law schools, but presenting past years’ data anachronistically raises the question of why the ABA bothers keeping reports for past years up at all.

I try to download a set of the 509 information reports annually as a backup (yes, it’s tedious) and because it’s partly how this blog found its footing. I don’t do so for the employment summary reports (because, yes, it’s tedious). I would prefer not to change my habits.

Thus, I ask that the ABA maintain its information reports on law schools consistently for the sake of researchers. Indiana Tech, Charlotte, Whittier, and the schools that have merged may not rise again, but I’m sure someone might want to know more about their existences, even trivial information like application deadlines.

Will Law-School Employment Outcomes Be Politicized Under Trump?

Today’s headline spoofs a February 2, 2017, Atlantic article by Gene Sperling about what might happen when the U.S. government’s statistical agencies produce data that His Emolumence doesn’t like. It may still be a legitimate fear, but before anyone could neuter GDP aggregates, another statistical agency has unexpectedly taken the lead in the race to deprecate data: the American Bar Association’s Council of the Section of Legal Education and Admissions to the Bar. As you’ve likely read elsewhere—I strongly recommend Jerry Organ’s post on TaxProf Blog—the council has chosen to greatly reduce the data it requires law schools to report and present in their respective employment-outcomes tables and spreadsheets.

I won’t recapitulate Organ’s arguments, but I will try to highlight how the changes would affect specific topics I’ve reported on and opinions I’ve developed.

One, in his memo to the council (pdf), University of Virginia School of Law professor Paul Mahoney discusses my data mishap in my post, Class of 2016 Employment Report (Corrected), as evidence of “confusion” caused by the employment data. Unsurprisingly, neither Mahoney nor anyone at the council asked me for a response, so I must start there.

The issue was the ABA’s earlier decision to separately account for school-funded positions in the above-the-line employment “statuses” (“Employed-bar-passage required,” “employed-professional position,” etc.), which collectively add up to the total number of graduates. I had forgotten the ABA made this change, so I subtracted the school-funded positions from the number of bar-passage-required jobs. I don’t relish drawing attention to my mistakes, especially when my rebuttal makes me look sloppier than the accusation: I wasn’t “confused”; I plumb forgot the ABA changed the employment table. I work with these data once a year, so I slipped into the old habit. (In fact, just last week I added a note to myself to avoid that mistake again next year.) That’s an illogical reason to revert to the previous practice. Subtracting out funded jobs is a tedious process, and I don’t want to start doing that again. For these reasons, eliminating the employment status category for school-funded positions is a bad idea.

On to other aspects of the changes.

Two, eliminating the total number of graduates from the reports makes them more difficult to use. I regularly derive the percentages of graduate employment outcomes by various categories, and now calculating the total number of graduates at a given school or at all of the schools will be a chore. Another benefit of including the number of graduates in the reports is that it gives data users an opportunity to double-check the ABA’s and law schools’ numbers—an especially important process because I do not believe the ABA does so itself. If the numbers of graduates in the employment-status section do not add up to the total, then I know there’s a problem. It’s something I’ve tracked behind the scenes, but it’s still useful. Now, everyone will need to do more math to arrive at what should be a foundational number.
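The consistency check described here is trivial to automate, which is part of the point: publishing the total makes errors easy to catch. The status names and counts below are hypothetical, not any school’s actual data:

```python
# Minimal version of the double-check: do the employment-status counts
# sum to the school's reported total number of graduates?
def check_totals(statuses, total_grads):
    """Return (ok, discrepancy): True if counts sum to the total,
    and the signed difference (sum - total) otherwise."""
    s = sum(statuses.values())
    return s == total_grads, s - total_grads

statuses = {"bar_passage": 150, "jd_advantage": 30,
            "professional": 10, "not_working": 20, "unknown": 5}

print(check_totals(statuses, 215))  # (True, 0)  -> consistent
print(check_totals(statuses, 220))  # (False, -5) -> five graduates unaccounted for
```

Without a published total, the `total_grads` argument has to be reconstructed from the categories themselves, and the check becomes circular.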

Three, the new “Employed-Other” category is absurd. A law graduate working in a professional, non-law job, say at a hedge fund, is not in the same position as one flipping burgers. If anything, the ABA should put more effort into fairly defining “Employed-JD advantage” and less into consolidating categories that make law schools look bad.

Four, the new “Unemployed or status unknown” category is also absurd. Again, it’s consolidating categories that may make law schools look bad rather than carefully specifying which ones make them look good. Moreover, I detest disjunctive definitions, e.g., “Social Security/Medicare/Medicaid,” as though they’re a unified program. Disjunctive definitions encourage composition fallacies and false dilemmas, something lawyers should avoid.

Five, I have used the broad “employment type” categories (e.g., “solo,” “2-10 lawyers,” etc.) to draw conclusions about how graduates’ jobs change from year to year. For example, the post “Change in Graduate Outcomes Driven by Small Jobs,” which analyzes the class of 2015, discussed how most of the decline in graduates that year was felt in 2-10-lawyer jobs. I also noted how jobs were distributed among law schools in “Law Grad Jobs Unequal Like Income in Corrupt Countries.” The new changes hamper these detailed analyses—and in a way that masks underperforming law schools. Thus, the new 10-100-lawyer-practice category is overbroad.

Six, I recognize that some school-funded positions are more durable than others, as Mahoney argues, and even that applicants might prefer to attend schools that reemploy their own grads as an insurance policy against unemployment. Part of the problem is that “long-term” means at least one year, which includes positions lasting only one year, but until that term is replaced by a concept that law schools cannot manipulate, there’s every reason to see long-term school-funded jobs as dubious attempts at padding employment outcomes. Prospective applicants want to know their opportunities for indefinite employment, not one- or two-year gigs, before applying. (And yes, for this reason, I somewhat discount the value of clerkships too.) Moreover, separating law-school-funded positions based on income ($40,000) is simply arbitrary. Fixed dollar amounts eventually need to be updated for inflation, they do not necessarily reflect the cost of living in a particular location, and they do not speak to the value of legal education. For comparison, the median bachelor’s-degree holder in the 25-34 age bracket earned $46,099 in 2015.

Of Organ’s other criticisms, longitudinal data and consistency with NALP data stand out. I won’t repeat them.

It’s amazing that the council is so easily swayed by the proposal of one law-school professor without much deliberation. It’s an entire level worse than if the council had secretly produced the changes on its own because then at least there may have been some give and take in the process. Now it just looks as though the council is uninterested in its responsibilities and merely beholden to individual professors’ whims. I thought the ABA wanted to improve its image as a more transparent, responsive organization, but by rubber-stamping a professor’s (and only a professor’s) wish list, it further tarnishes its credibility.

As for my opinions on the employment questionnaire—which the council would scrutinize as I’m not a law prof—although there may be reasons to simplify the employment survey, I would very much prefer to let it rest for several years. The fact that it’s adjusted so frequently indicates lack of seriousness about its purpose.

I recommend joining Organ’s petition in his post.

Office of Management and Budget: +$725 Billion in Direct Loans by 2027

Every year in July the Office of Management and Budget (OMB) publishes its Mid-Session Review of the federal budget, which normally covers the Federal Direct Loan Program and projects its future. This year, the MSR (pdf) was only 22 pages because Director Mick Mulvaney said there were only “limited budget developments” since the administration released its misopauperous budget on May 23, 2017. So let’s take a look at that instead…

It’s titled, “A New Foundation for American Greatness.” My favorite part of it thus far is the entry, “Invest in Cybersecurity,” which features an unspecified commitment.

Anyway, the budget has the Federal Direct Loan Program information we’re looking for, so back to that. The federal government’s direct loans consist primarily of student loans, though a few other programs are included as well; they do not, however, include private student loans, which are a small percentage of all student loans. Thus, the OMB’s measure is both over- and under-inclusive of all student debt, but it covers most of it.

The OMB classifies direct loan accounts as financial assets net of liabilities totaling $1.227 trillion in 2016. According to the office’s projections, by 2027 this figure will grow to $1.952 trillion—a 59 percent increase.

(Source: Budget of the U.S. Government Fiscal Year 2018 (pdf))

As with previous years, the current (2016) direct loan balance is below the OMB’s past projections, but not by much. For example, in FY2012, it predicted the balance would be $1.486 trillion by 2016, $259 billion (21 percent) higher than what actually occurred. Here are the OMB’s direct loan projections going back to FY2010.
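The two percentages above (the projected 2016-to-2027 growth and the FY2012 projection’s overshoot) follow directly from the dollar figures quoted in this post, as a quick calculation shows:

```python
# Checking the OMB figures quoted above (balances in trillions of dollars).
actual_2016 = 1.227        # net direct-loan balance, 2016
projected_2027 = 1.952     # OMB projection for 2027
fy2012_projection = 1.486  # what the FY2012 budget projected for 2016

growth_pct = 100 * (projected_2027 - actual_2016) / actual_2016
overshoot = fy2012_projection - actual_2016
overshoot_pct = 100 * overshoot / actual_2016

print(round(growth_pct))    # 59    -> 59 percent projected growth by 2027
print(round(overshoot, 3))  # 0.259 -> $259 billion overshoot
print(round(overshoot_pct)) # 21    -> 21 percent above the actual balance
```

Note that the overshoot percentage is computed against the actual 2016 balance; computed against the projection instead, it would be about 17 percent.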

Indeed, the most notable difference between His Emolumence’s OMB and Barack Obama’s is that the current office predicts far less student lending going forward. Total direct loans won’t even exceed $2 trillion. This, I think, is a more realistic assessment of where federal student lending is going. Whether this has something to do with the new administration or is standard practice for the OMB is outside my knowledge base.

The OMB’s measure of direct loans is the net amount owed to the government, and the annual changes to that amount are not the same as the amount lent out each year to students. The Department of Education tracks its lending, which I discuss on the Student Debt Data page.

UK Media: Adam Smith Was a Marxist

Before the UK parliamentary election a few weeks ago, the Internet kept directing me to what I surmise was mostly conservative media reporting on the Labour Party’s proposed *horrors* land-value tax, aka “garden tax.” I’m not sure if “garden” here is the Anglo version of what I would think of as a grassy backyard, but it’s very surprising that collecting location rents for public finance was a significant issue in the days before the election and didn’t result in a loss for the party advocating it. That’s probably the best news I’ll report on all year.

Labour’s platform included a position in favor of “considering” replacing some taxes with a land-value tax. “We will initiate a review into reforming council tax and business rates and consider new options such as a land value tax, to ensure local government has sustainable funding for the long term,” reported the Mirror.

Yet the Conservative opposition treated “consider new options” as though Labour had a bill in hand, and the sputtering from Boris Johnson and Chancellor of the Exchequer Philip Hammond was illuminating. Articles published by The Telegraph and the Daily Mail quoted these men as saying that the tax would:

  • “Bring misery to every single family in Britain”
  • “Wreck the economy”
  • “Devastate farmers”
  • “Increase food costs”
  • “Attack land on Marxist principles”

A good chunk of this is politicized anti-tax paranoia. “They talked about taxes in their manifesto. That must mean they’re coming for you!” It’s interesting in itself, however, that the Tories’ audience must be yeomen landowners as opposed to renters or condo dwellers. Then there’s the Marxism stuff, which just demonstrates their ignorance. If they were familiar with capitalism’s talisman, Adam Smith’s Wealth of Nations, they would have come across book V, chapter 2, in which the author advocates taxes on the ground-rent of houses. So yes, Tories think Adam Smith was a Marxist.

More bizarre is the obsession with farmers. Supposing that a land tax did raise food-production costs, which it would not, why would it be somehow worse than the other taxes farmers pay? Is it because farmers are really connected to the land? Does that mean I work at a floating desk? Why don’t we hear about overtaxed urbanites who are really connected to commerce? Don’t taxes on their incomes raise their labor costs? (Hint: They do.)

Oh, and did I mention that Labour’s not advocating a tax on gardens but on the locational value of the space occupied by gardens?

Anyhow, the articles mistakenly cite 3 percent of the value of people’s property as the tax’s rate. Rather, the figure originates with the Labour Land Campaign, which explicitly recommends beginning with a 0.85 percent rate on owner-occupied real estate—not 3 percent. (More here.) Neither article’s factual errors have been corrected.

The Telegraph article also says, “Opponents of the tax say it would cause house prices to plummet, putting homeowners at risk of negative equity and forcing families to sell off their gardens to developers to lessen their tax burden.” First, it’s not a tax on houses—just where they happen to be. Second, why are housing shortages caused by undeveloped real estate a good thing? Third, why are the opponents against affordable housing? Notwithstanding the possibility of negative consequences to some homeowners, the Telegraph doesn’t find anyone to answer these questions.

And you thought American media was bad.

The good news is that Labour gained ground (and no, I’m not a Corbynite), and the Labour Land Campaign’s thankless work resulted in positive coverage whenever anyone bothered to check the facts. It’s well earned.