What US News & World Report’s High School Rankings Missed
There’s a saying that if the only tool you have is a hammer, everything looks like a nail. Another, perhaps more humorous one, is the proverbial story about the drunk looking for his keys under the street lamp.
The meaning of the two sayings is similar: if you have only one tool to identify and solve a problem, you are unlikely to solve the problem you actually face.
Such is the problem with the U.S. News & World Report ranking of the best high schools in America, as identified by education researcher Nat Malkus.
For Malkus, USNWR does a decent job with the tools it has to measure the performance of more than 20,000 U.S. public high schools. The problem, however, is that it relies on that one tool over and over, and the tool doesn't accurately measure how well schools educate students.
Each year, U.S. News teams up with RTI International to run 20,000 public high schools through a four-step process to rank which are the best. In step one, they evaluate schools’ proficiency rates on state math and reading tests against statistical expectations given their student poverty rates. Passing schools move to step two, in which U.S. News assesses whether historically disadvantaged students performed better than the state average. In step three, U.S. News cuts all schools whose graduation rate is below 75 percent (somewhat odd, given that the national average is 83 percent). In step four, schools are ranked on a ‘College Readiness Index,’ which is based entirely on their success in Advanced Placement courses.
What makes a school ‘best’ in the U.S. News rating system? A school’s broader performance on state tests has to be moderately above average to clear the first three steps, but that left more than 29 percent of the schools moving on to step four this year. After that, it all comes down to AP passage rates. … No doubt, AP success is a high bar for high school students, and since the AP tests are the same nationwide, it provides a usable metric for academic excellence. But is it a good enough indicator to decide which high schools are best?
The answer is no. The reason U.S. News leans so heavily on AP is that the data are available. But that is like the proverbial drunk looking for his keys underneath the street lamp. The rankings promote the notion that the best high schools are the ones with the highest outcomes, and because AP success is the only outcome measure they have, they use it, even if the way the top schools generate those outcomes is dubious practice.
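The four-step process described above is, in effect, a filter-then-rank pipeline. The sketch below models it in Python to make the structure explicit; the field names, thresholds other than the stated 75 percent graduation cutoff, and the toy data are all illustrative assumptions, not U.S. News's actual methodology or data.

```python
# Illustrative sketch of the four-step ranking pipeline described above.
# Field names and the toy data are invented for demonstration.

def rank_schools(schools):
    """Two performance screens, a graduation-rate cutoff, then a
    ranking based solely on an AP-driven College Readiness Index."""
    # Step 1: proficiency must meet a statistical expectation
    # derived from the school's student poverty rate.
    survivors = [s for s in schools
                 if s["proficiency"] >= s["expected_proficiency"]]
    # Step 2: historically disadvantaged students must beat
    # the state average.
    survivors = [s for s in survivors
                 if s["disadvantaged_score"] > s["state_avg_disadvantaged"]]
    # Step 3: drop schools graduating fewer than 75 percent.
    survivors = [s for s in survivors if s["grad_rate"] >= 0.75]
    # Step 4: rank the rest entirely by the AP-based index.
    return sorted(survivors,
                  key=lambda s: s["college_readiness_index"],
                  reverse=True)

schools = [
    {"name": "Alpha", "proficiency": 0.82, "expected_proficiency": 0.70,
     "disadvantaged_score": 0.65, "state_avg_disadvantaged": 0.55,
     "grad_rate": 0.95, "college_readiness_index": 92.1},
    {"name": "Beta", "proficiency": 0.75, "expected_proficiency": 0.70,
     "disadvantaged_score": 0.60, "state_avg_disadvantaged": 0.55,
     "grad_rate": 0.70, "college_readiness_index": 88.0},  # fails step 3
]

print([s["name"] for s in rank_schools(schools)])  # ['Alpha']
```

The point the sketch makes visible is that steps one through three only screen schools out; once a school survives them, nothing but the AP-based index determines its rank.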
Several schools that outperformed the average in the USNWR rankings, notably the BASIS charter schools in Arizona, push their students hardest in exactly the area USNWR measures: AP coursework. Naturally, they appear to produce better results than schools that move students forward by other means.
The problem with looking under the street lamp is that the rankings primarily gauge where students end up, not where they start from or how much they learn. The BASIS schools dominating the top ten push advanced academics hard and are transparent about the fact that the workload is not a fit for all students. Other schools in the top ten have GPA requirements for enrollment. It’s good that there are hard-charging schools for advanced students, but it’s irresponsible to ignore how selective they are. In focusing narrowly on AP outcomes, U.S. News leaves the impression that all schools have equivalent starting points when, in reality, it’s nearly impossible for non-selective schools to end up at the top of this list.
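The gap between "where students end up" and "how much they learn" can be made concrete with a toy calculation. In the sketch below, a selective school looks perfect on an end-of-year proficiency measure while an open-enrollment school looks like a failure, even though the latter's students learned far more; the cutoff, cohorts, and scores are invented for illustration.

```python
# Toy contrast between a proficiency measure (end-of-year level)
# and a growth measure (change from the starting point).
# All numbers are hypothetical.

students_selective = [(85, 92), (88, 94)]  # (start, end) test scores
students_open = [(40, 70), (50, 78)]

def proficiency(cohort, cutoff=80):
    """Share of students ending at or above a fixed cutoff."""
    return sum(end >= cutoff for _, end in cohort) / len(cohort)

def avg_growth(cohort):
    """Average score gain from start to end of year."""
    return sum(end - start for start, end in cohort) / len(cohort)

print(proficiency(students_selective), avg_growth(students_selective))
# 1.0 6.5  -> all 'proficient', little growth
print(proficiency(students_open), avg_growth(students_open))
# 0.0 29.0 -> none 'proficient', large growth
```

A ranking built only on the first measure rewards selective starting points; one that includes the second would credit schools for what they actually teach.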
In fairness, U.S. News is arguably doing the best it can with the available data. Data needed to gauge student learning growth are not available in ways that could be applied to all schools. And the rankings do incorporate some measures of student disadvantage, although these only apply weakly in the first two steps. The problem is that their work is branded as ranking which schools are best, but their methods don’t back that up.
What to do about it? According to Malkus, the change has already started. Under the Every Student Succeeds Act, states now have the freedom to adopt their own growth measures, which capture how far students have come, on top of mere proficiency when evaluating schools' performance in educating children. Six of 18 states have plans in place for these measurements, as well as for consequences for schools that don't live up to state standards.
More states need to adopt appropriate evaluations. These new data would give USNWR another tool for determining which schools do the best job of educating students. From there, we could compare how well students are doing across schools with very different challenges and starting points.
Read the full report on the U.S. News & World Report rankings.