Failing schools, or failing to consider multiple indicators?
It’s nearly school report card time in Ohio. One thing to watch for when examining school performance is conflicting ratings. For the 2013-14 school year, schools will receive ratings along up to ten dimensions of performance, though no overall letter grade. For example, a school might receive an “F” on the state’s performance index but, at the same time, an “A” on the state’s value-added rating, or vice versa. How in the world can this happen?
Keep in mind that these two key ratings—a school’s performance index and value-added—are not the same. The performance index is an indicator of raw student achievement, weighted across a continuum of achievement levels. Value-added, on the other hand, is a statistical estimate of a school’s impact on student progress—expressed as learning gains—over time. Although both measures are based on state test scores, they are different creatures: Achievement tells us more about how students perform; value-added provides evidence on how a school performs (i.e., the productivity of the school staff).
Hence, to understand the quality of a school, we really need both measures. Outside observers—parents, taxpayers, and others—should know whether a school’s students, on average, possess literacy and numeracy skills—that’s achievement. And they should know whether a school is contributing to learning over time—that’s progress.
Now back to the question of mixed ratings. How many schools in Ohio have conflicting results, particularly of the low-achievement but high-progress variety? Moreover, how should we think about the overall quality of these schools?
To uncover high-progress but low-achievement schools, I rank all schools statewide by their numerical scores on value-added (progress) and the performance index (achievement). I then look for schools ranked in the top 20 percent on value-added and the bottom 20 percent on the performance index. In terms of A-F ratings, all schools in the top 20 percent on value-added were rated an “A,” while schools in the bottom 20 percent on the performance index received a “C” or lower.
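For readers who want to replicate this kind of screen, here is a minimal sketch of the procedure. The data and column names are hypothetical; the actual analysis uses the Ohio Department of Education’s report-card files.

```python
# Sketch of the two-indicator screen described above.
# Column names ("value_added", "perf_index") and the toy data are
# illustrative, not the real Ohio report-card fields.
import pandas as pd

def flag_high_progress_low_achievement(df, top=0.20, bottom=0.20):
    """Return schools in the top `top` share on value-added (progress)
    and the bottom `bottom` share on the performance index (achievement)."""
    va_rank = df["value_added"].rank(pct=True)  # 1.0 = highest progress
    pi_rank = df["perf_index"].rank(pct=True)   # lowest values = lowest achievement
    return df[(va_rank > 1 - top) & (pi_rank <= bottom)]

# Toy example: five schools with made-up scores.
schools = pd.DataFrame({
    "school": ["A", "B", "C", "D", "E"],
    "value_added": [2.0, -1.5, 9.4, 1.1, 4.0],
    "perf_index": [95.0, 102.0, 70.1, 88.0, 99.0],
})
flagged = flag_high_progress_low_achievement(schools)
```

In this toy data, only school “C” lands in both the top fifth on value-added and the bottom fifth on the performance index, so it alone would appear in a table like Table 1 below.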
Table 1 lists forty-three high-progress, low-achievement schools in Ohio, as defined by the criteria above. The far-right column shows that virtually all of these schools were high poverty, which may partly explain their low achievement ratings. It’s widely understood that achievement is typically associated with the characteristics of a student’s family. Fifteen of these schools were charters. All of these schools are located in urban or inner-ring suburban areas. It also bears noting that these forty-three schools represent a fairly small fraction of very high-poverty schools in Ohio—just 13 percent of schools reporting 90 percent or more ED.
Table 1: High-progress, low-achievement schools in Ohio - 2012-13
Source: Ohio Department of Education. Note: Schools shaded in green are public charter schools. ED denotes the percentage of students flagged as “economically disadvantaged,” a widely used proxy for students’ family backgrounds.
Were someone to consider only the achievement measure, they might have labeled these schools a “failure.” That would have been a mistake. These schools are helping their students make gains over time, even when facing challenging circumstances. (Starting with 2012-13 results, the value-added gains are the average over the three most recent school years, if available.) But at the same time, it is not justified to call these schools outright exemplars or schools deserving of an overall “A” rating, either. Student achievement remains too low.
Think of it this way: We wouldn’t call a two-year college a “failure” just because it doesn’t produce the same number of Rhodes Scholars as Case Western or Ohio State. When we determine the effectiveness of a community college, we consider how well it prepares students for a career or a four-year university. Community colleges that prepare students for whatever comes next are getting their job done. And to some degree, that’s what these schools are doing: giving children, who generally come from less privileged backgrounds, a lift in life.
(On the other hand, there are public schools that could rightly be called “failing.” Last year, 133 schools in Ohio displayed both low achievement and low progress ratings—bottom 20 percent on both indicators. If school ratings remain low along both indicators for years on end, intervention or closure ought to happen.)
In conclusion, the point is this: Achievement ratings alone don’t tell the whole story of a school. We need a balanced look at school quality. As is evident from this analysis, there are schools that do well on progress but poorly on achievement. How should we view such “mixed”-rated schools? They are productive schools and promising ones, too. But at the end of the day, their students still need much help to succeed in the long run.
Other school report-card measures are more closely correlated with one another, such as the performance index and the “indicators met” dimensions, both of which are indicators of achievement.
Arguably, it is more important from a policy perspective to know about low-achieving schools that produce big gains than about high-achieving schools that produce trivial gains on state tests.
Using the A-F ratings would have been somewhat unwieldy, since roughly 40 percent of Ohio schools receive an “A” in value-added. By using the 20 percent threshold, I restrict the analysis to schools with the higher “A” value-added scores. For the PI distribution, I include only schools that received a VAM score in 2012-13 (n = 2,558).