In 2013 President Obama hatched a plan. He wanted to call out the colleges where students waste their money.
"We're going to start rating colleges ... on who's offering the best value so that students and taxpayers get a bigger bang for their buck," said Obama.
The plan was to build a rating system that would let prospective students see which colleges were a good deal and which were a bad one. It would draw on data the government was already collecting through its federal loan program — critical information on how much students were actually paying for school, how much they earned after graduation, and how much debt they ended up shouldering.
Last week the Department of Education released something called the College Scorecard, an enormous data set with 1,730 variables on over 7,000 schools. One thing was missing: a rating.
The Obama administration decided not to rate colleges. So we asked three different higher-education experts this question: If you could design a rating, what would it be?
Anthony Carnevale is an economist at Georgetown and the author of a well-known report on earnings after graduation. He says we should focus mostly on income. In his view, education data are murky, and money is the clearest and easiest way to evaluate a college. How well off are graduates? That's the question.
But he concedes this isn't totally fair to all colleges because, he says, the biggest determinant of your income after college isn't the choice of school, but your choice of major. In his ideal world, you would be able to compare wages of English majors across different schools.
Amy Laitinen is the deputy director of the education policy program at the New America Foundation. She thinks a good ranking would focus on mobility: schools that excel at offering students from less well-off backgrounds a shot at a good education.
Laitinen notes, though, that these measures can be flawed. For instance, she says leaning on graduation rates too heavily can unfairly discount schools that serve nontraditional students, many of whom go on to transfer to better programs.
Peter Cappelli is a professor at the University of Pennsylvania's business school. He worries about college as a financial decision. In his view, a good school would get students out the door as fast as possible, and leave them with little debt and good financial opportunities to pay that debt off.
Clearly, the definition of a "good" college depends on what you want the institution to achieve. A ranking focused on money favors schools with more engineering majors, while a ranking focused on mobility favors state universities. It all comes down to your rubric.
These rankings show only the top 50 of 1,829 four-year programs. Most students in the nation aren't attending these schools. And most students narrow their search not by any of these criteria but by geography. We've omitted the bottom of the charts in this post. Those worst performers are arguably more important to spotlight, since they do the most harm relative to their size. Even if it is hard to agree on the best way to rank schools, spotlighting the worst actors by any criteria would be closer to Obama's original intent.
A note about the data and methodology. We filtered for schools where the predominant degree granted is a four-year bachelor's degree. We also filtered out trade-specific schools and religious programs like seminaries and yeshivas. There are a few limitations to the income data the government released: incomes reported in the data set include only students who took out federal loans. Each institution's score is based on a weighted sum of z-scores for each variable mentioned in the list. Ratings cannot be compared across lists.
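For readers curious what a weighted sum of z-scores looks like in practice, here is a minimal sketch. The variable names, weights, and sample values below are illustrative assumptions, not the actual variables or weights used in our rankings or in the College Scorecard data.

```python
def z_scores(values):
    """Standardize a list of numbers to mean 0, standard deviation 1."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / sd for v in values]

def rate_schools(schools, weights):
    """Score each school as a weighted sum of per-variable z-scores.

    schools: list of dicts mapping variable name -> value for one school
    weights: dict mapping variable name -> weight in the composite score
    """
    scores = [0.0] * len(schools)
    for var, weight in weights.items():
        zs = z_scores([school[var] for school in schools])
        scores = [score + weight * z for score, z in zip(scores, zs)]
    return scores

# Hypothetical example: three schools scored on earnings and completion.
schools = [
    {"median_earnings": 45000, "completion_rate": 0.60},
    {"median_earnings": 60000, "completion_rate": 0.85},
    {"median_earnings": 38000, "completion_rate": 0.50},
]
weights = {"median_earnings": 0.5, "completion_rate": 0.5}
print(rate_schools(schools, weights))
```

Because each variable is standardized before weighting, a school is rewarded for being above average relative to its peers, not for the raw size of any one number — which is also why scores built from one list of schools can't be compared with scores from another.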