The case for scrapping "indicators met"

Duplication is not always a good thing. Think about it: most of us don’t carry two cell phones. In a world of limited pants-pocket space, a second phone would be senseless, right? Ohio’s school report cards have two essentially identical achievement components, both of which receive an A-F letter grade. It’s time to toss one of them for parsimony’s sake.

The first, the indicators-met component, is determined by whether 75 percent of a school’s test-takers reach proficiency on the state’s twenty-four assessments (85 percent for eleventh grade). The second, the performance-index component (PI), is a composite score weighted by the proportion of test-takers who attain each of the state’s five achievement levels.
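The mechanics of the two components can be sketched in a few lines of code. This is an illustrative sketch only: the achievement-level weights below are hypothetical placeholders, chosen so that the maximum composite works out to 120 (the top score the state uses), not Ohio’s official values.

```python
# Sketch of the two report-card components described above.
# NOTE: the achievement-level weights are illustrative, not Ohio's official ones.

def indicator_met(pct_proficient, threshold=75.0):
    """An indicator is 'met' when the share of a school's test-takers
    reaching proficiency meets the threshold (75 percent for most
    assessments; 85 percent for eleventh grade)."""
    return pct_proficient >= threshold

def performance_index(level_shares, weights=(0.3, 0.6, 0.8, 1.0, 1.2)):
    """Composite score: each achievement level's share of test-takers
    (in percent, lowest level first) times that level's weight.
    With a top weight of 1.2, a school where every test-taker reaches
    the highest level would score 100 * 1.2 = 120, the maximum."""
    return sum(share * w for share, w in zip(level_shares, weights))
```

Under this setup, a school whose test-takers all reach the top level would earn the maximum index of 120, while a school with the same pass rate but fewer high scorers would earn less, which is why the index captures more of the variation in results than a single pass/fail cutoff.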

Though the two components are calculated differently, they produce very similar results for any given school. In other words, if a school gets a low PI letter grade, it is nearly assured of a low indicators-met grade. The reverse is also true: schools with high PI grades will likely get high indicators-met grades. Here’s the evidence.

Table 1 shows the letter grades of Ohio’s 3,089 schools by indicators met and PI. As you can tell, the grades correspond closely. For example, 99 percent of schools that received an A for indicators met received either an A or B on PI. One hundred percent of schools that received a B on indicators met received a B or C on PI. Well over one thousand schools received an A/B grade combination. Relatively few schools received mixed, high-low ratings: 302 schools received an F/C, 15 an A/C, and 48 a D/B.

Table 1: Practically all schools receive similar grades – Number of schools by their indicators-met and performance-index grades, Ohio schools, 2012-13

When we compare indicators met and PI as percentages instead of A-F categories, the correspondence becomes even more evident.[1] Chart 1 shows a strong, positive relationship between the two variables: schools with a lower percentage of indicators met have lower PI percentages, and schools with a higher percentage of indicators met have higher PI percentages. The correlation is 0.91, indicating that the two variables are strongly associated (+1.0 indicates a perfect positive correlation; 0 indicates no correlation).

Chart 1: Strong positive relationship between report-card components – Performance-index percentage versus percentage of indicators met, Ohio schools, 2012-13

Source: Ohio Department of Education
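The correlation statistic cited above is an ordinary Pearson correlation and can be computed directly. The sketch below uses made-up figures for a handful of hypothetical schools, not the actual Ohio data.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical schools: percentage of indicators met vs. PI percentage.
indicators_met_pct = [10, 25, 40, 55, 70, 85, 95]
performance_index_pct = [55, 60, 68, 74, 80, 88, 93]

r = pearson_r(indicators_met_pct, performance_index_pct)
# A value near +1, like the 0.91 reported for the real data,
# signals a strong positive linear relationship.
```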

Ohio’s school report cards are already crammed with information, with more to come in the next few years. That is good, but more information can also render the report cards cluttered and confusing for parents and educators. “Information overload,” some might say. What should the state do? Keep the performance index, which better accounts for the variation in students’ test results, and do away with indicators met as an A-F component. (But still make the proficiency data available for reporting and comparison.)


[1] The percentage of indicators met = (indicators met) / (number of indicators applicable to a given school). The PI percentage = (a school’s PI score) / 120, the highest score possible. A school’s letter grades are based on these percentages.

 
