Yale, Harvard law schools pulled out. Why now?
Colin Diver has never really liked the U.S. News & World Report’s ranking system for colleges and universities. But at least he’s come to this opinion honestly. As dean of the University of Pennsylvania Law School from 1989 to 1999, he dreaded the rankings dance every year. First off, it was a lot of work. He had to answer hundreds of questions about grades and test scores for incoming applicants, and about their salaries once they got out. And then he had to do something called a peer review, in which he gave his own personal opinion about 190 different law schools. “I had not even heard of many of those law schools,” Diver said. “What we were induced to do by the desire to improve our rankings is put ourselves in the top quintile and put our close competitors in the bottom quintile. At some point, I just gave up filling that form out altogether.”
Eventually, Diver left Penn for Reed College, a school that had shunned college rankings entirely. But he couldn’t get his old colleagues to join him in ditching them. Diver says they told him, “Unilateral disarmament is suicide.”
Earlier this year, Diver published a whole book about the way he says college rankings are distorting higher ed. But he’d come to accept that he was something of a lone voice in the wilderness here. So, you can imagine his surprise when, a couple of weeks back, he got this news: First Yale Law School, then Harvard were pulling out of U.S. News’ ranking game altogether.
On a recent episode of What Next, I spoke with Diver about why some of the country’s top law schools decided to pull out of U.S. News’ influential ranking system. Are colleges next? Our conversation has been edited and condensed for clarity.
Mary Harris: Can you lay out how the idea of ranking universities, colleges, and law schools started in the first place? How did U.S. News become what it is today?
Colin Diver: The idea actually goes back over a century. There were some ad hoc rankings back in the early 1900s that were a reflection of various desires. For example, a group of graduate schools wanted a ranking of undergraduate schools so that they would know whose students they should admit. And they actually asked the U.S. government, the Bureau of Education, which is the predecessor of the Department of Education, to prepare a ranking of undergraduate schools, which then got leaked. And the schools raised holy hell about it. And President Taft decreed that it would no longer be publicly available.
I always wondered if the rankings became as important as they did because elite education was opening up to more people. It used to be that places like Harvard and Yale were finishing schools for people who went to fancy boarding schools. But when they started letting in kids from all over, those kids had to understand the value of the schools they were applying to. Like, in a way, was it well intentioned?
You could say it was. The other thing that happened was that the market for higher education went from a local and regional market to a national and now increasingly international market. It used to be that most of the people who applied to Ivy League schools came from the Northeast, and most of the people who went to Stanford came from California. But as information became more readily available and transportation costs fell, the market for elite schools became national. And arguably it became more important to have a national source of information so that somebody in California could choose among Ivy League schools, or somebody in Massachusetts could choose whether to go to Stanford.
U.S. News stepped into the breach in 1983, initially by simply polling a large group of university presidents and asking them to name the top 10 schools in their field. And they then published that. It was popular. A number of schools that were ranked highly by that method used it in their promotional material. My own college, Amherst College, came in No. 1 among the liberal arts colleges that year, and it sent a copy of the U.S. News magazine to something like 20,000 potential applicants.
So U.S. News said, “Gee, this is really working. It’s really popular. And besides, we don’t have to advertise our services because all these colleges are going to do it for us.”
So this is a symbiotic relationship. It’s good for the magazine; it’s good for the universities. Everybody’s happy initially.
But of course, a lot of the schools that weren’t ranked at the top weren’t happy.
But I’m sure at the beginning that was easy to dismiss as, “Oh, sour grapes.”
Sour grapes, yeah. And initially, U.S. News only ranked the top 25 schools.
Today, U.S. News ranks 1,500 colleges. That’s far too many to rely on word-of-mouth recommendations. Instead, the magazine’s developed a statistical ranking system, incorporating test scores, grades, and class sizes.
They claimed they made it much more scientific, much more objective—that it wasn’t just a beauty contest or an opinion poll.
I get the sense you disagree with that, though.
Well, I don’t like the beauty contest part of it either. As I told you, I think that the college presidents and deans don’t know nearly enough about the hundreds of schools in their category to be able to confidently rank them.
Statistical ranking does something else, too. It gives the project an air of certitude. It makes one magazine’s opinion seem like a fact. But the way the magazine acquires the information that it feeds into its statistical model has always been dubious. Universities self-report. This has led to one controversy after another.
Just last year, a former dean at Temple University was convicted of lying about the number of students who took an entrance exam prior to being admitted to Temple’s online business school. He was sentenced to 14 months in federal prison for the scheme. This spring, top-ranked Columbia University ended up embroiled in one of these false reporting messes, too.
The Columbia controversy sticks out to me because it involves the most visible, prominent, prestigious school of all the schools that have gotten caught. It also stands out, frankly, because there was an internal whistleblower, a professor in their mathematics department, who took the trouble to do the research. He looked at all the internal documents he could get his hands on, compared the data in those documents to the data Columbia submitted to U.S. News, and found huge discrepancies.
You really don’t have to be doing something illegal to game the system. My alma mater, the University of Pennsylvania, did something a little sneakier a few years back. They upped the number of people they were admitting early decision to the undergraduate school. And it made their general admissions numbers look more selective, and they cruised up the rankings. I think it was seen as smart at the time.
There are various kinds of gaming of the rankings. The Temple business school is at one end of the spectrum—that’s outright lying. And I think a lot of people do that, frankly. It’s very hard to catch them. A much, much more common form of gaming is the example you give, which is you change your practices or your policies so as to improve your score on a ranking measure, even if it isn’t something that you would otherwise do. In fact, it may be something that actually undermines your values or your academic quality.
Take that example of the dramatic increase in the number of students admitted through binding early decision. Because those admits have already committed to coming, the school can fill more of its class with fewer total offers, which drives down its acceptance rate. But it favors rich applicants, because poorer applicants, even middle-income applicants, can’t afford to commit to a school until they’ve had a chance to see what other schools will offer them in financial aid.
Part of the logic for the law schools’ dropping out of the rankings involves how the gaming of the system played out over the years. Like, one of the metrics that U.S. News used to judge law schools was the employment of graduates. And the problem was that it incentivized graduates to go to big firms, and some schools started employing graduates themselves, so it looked like their students were getting jobs when they weren’t.
U.S. News tightened up this loophole. But then the law schools said: “Now that makes it look like these people whom we’ve given fellowships to go be public interest attorneys rather than going to big firms are unemployed. It makes us look worse.” You can see how trying to fix the loopholes gets problematic after a while.
That’s a good example. For years, the law school rankings were based heavily on the employment rate of graduates. And, as you say, law schools gamed that system by creating phony jobs or short-term jobs to hire their unemployed graduates. Then, the American Bar Association cracked down on that. But, as you say, then schools that try to foster public service careers through fellowships and so forth got penalized for it.
These law schools dropping out all at once, was it coordinated at all?
The antitrust lawyers are already writing articles about whether there’s an antitrust violation here.
Really?
Yes. And apparently there might have been if these deans actually got together in a smoke-filled room and said, “Let’s agree among ourselves to boycott the U.S. News rankings.” I doubt very much that that happened. But they all have the same view of the rankings. They all would love to get out, but they’re afraid of doing it on their own. So, as soon as one of them breaks the ice dam, then the rest of them follow.
Some would call that groupthink.
It’s a form of groupthink. When I was a college president, we used to talk about the rankings in our professional meetings. And we would almost unanimously say that we didn’t like them, that we thought that they were arbitrary, that they were phony. We complained about the fact that we all felt that if we were honest, we were going to suffer by comparison to the dishonest competitors. It’s like playing in a rigged game. And so I’m sure Harvard and Stanford and Columbia and Berkeley and so forth were delighted when Yale took the step it took.
The funny thing about law schools leading the charge against the U.S. News rankings is that law students have become particularly reliant on this annual list. Many even have a nickname for the very best schools: the “T14,” or “top 14.” Reliably, these are the schools whose students boast the highest scores on the Law School Admission Test, or LSAT, because that score weighs heavily in how U.S. News sorts programs in its statistical model.
So, why did all this happen now?
One theory that’s been expressed, which is plausible, is that the law schools that are intent on maintaining socioeconomic and racial diversity are anticipating that the Supreme Court, in the spring of 2023, is going to declare racial preferences in admission unconstitutional and illegal. When that happens, these schools are going to be concerned about how they can achieve the degree of diversity that they care about. So, perhaps the law schools have been feeling as though, if they’re still subject to the U.S. News ranking formula that privileges high LSAT scores, for example, they will be trapped into essentially admitting lily-white classes, and they don’t want to do that.
Explain how that works. How would the rankings influence admissions and interact with affirmative action?
To the extent that your ranking as a law school depends on having an entering class with very high LSAT scores, the sad reality of education in America is that most of the people with high LSAT scores are white and Asian, and not Black and Hispanic. If you’re feeling the pressure to score well on the rankings and to maintain your rankings, then your admissions office is going to feel the pressure to put the, say, Black or Latino applicant with a low LSAT on the waitlist rather than admit them.
Now the question is what withdrawing from the U.S. News rankings will really mean. Because while law schools can refuse to answer the magazine’s questions, it’s not like they’ll get left off the list entirely. Instead, as it’s done in the past, the magazine is likely to plug in its own data. Usually, that drops a school down a few rungs on the rankings ladder.
So, if these rankings are just going to keep chugging on anyway, is there a way to make them better? I get the utility of them. As a consumer, the idea that you can make a simple list of the “best” colleges is almost a relief.
Yeah: It promises to simplify what is a complicated decision. The decision of where to go to college is one of the most important, consequential, complicated decisions you’re going to make in the first half of your life. And it shouldn’t be simple. It should be complicated, and you should do your homework.
Huh.
There are several rankings that rank schools by social mobility—which is to say, what percentage of your students come from lower-income or lower- and middle-income groups? What percentage of them actually graduate? And what percentage of them end up making postgraduate incomes that are higher than their family income when they attended college?
Who’s No. 1 on that list?
State colleges like the City University of New York and the California state schools. Mostly public schools that have good programs but admit very high percentages of lower-income students. An outfit called Third Way has recently published a social mobility ranking. And it’s fascinating to look at, because the top schools are schools from the City University of New York system and the Cal State system. And the Ivy League schools are ranked around 1,800th.
I saw this statistic that caught my eye this weekend. It looked at kids from New York City and where they end up in high school and college. It compared kids who go to Horace Mann, a really elite private school that costs tens of thousands of dollars a year. About a third of those kids end up in the Ivy League. But if you look at Stuyvesant, also a very elite school with comparable test scores and everything, most of those kids are ending up in the SUNY system, the public system. And it made me realize that elite schools are a little bit back where they started as finishing schools for elite kids and super achievers.
The elite schools want to be elite: Let’s face it. And one of the ways they want to be elite is by having very, very high rankings, as long as rankings exist. Another way they want to be elite is to generate gigantic amounts of money, not only by admitting students who will pay the full tuition and charging very high tuitions, but also by admitting students whose parents will make donations to the school, and students who will graduate and go on to very high-paying, financially rewarding careers and then give back to their alma mater. The top schools are spending well over $100,000 per student per year on what they’re producing. This is not what they’re charging. This is what they’re spending.
Wow.
And these schools don’t want to give up on that. They just don’t. People talk about higher education as an arms race, but the arms race that the U.S. and the Soviet Union engaged in was a race to have more missiles. It wasn’t to spend more per missile. But in higher education, the arms race is not to have more students in your entering class. It’s to spend more per student. Over the past 20 years, most of the elite schools have doubled the amount they spend per student. Imagine a world in which they doubled the number of students they educate instead. But the rankings would punish them for doing that.