Ohio has been a national leader in using value-added measures of student academic growth. The current value-added system was piloted in 2007, and in August 2008 value-added was fully integrated into Ohio's academic accountability system. Value-added analysis in the Buckeye State uses complex calculations to report school-wide and district-wide student academic growth in reading and math in grades four through eight. Schools and districts are assigned one of three ratings:
- Above expected growth: indicates that the students in a school or a district made greater progress than expected. These schools and districts are "adding value."
- Met expected growth: indicates that students made the amount of expected academic progress in one school year. Districts and schools in this category are still adding value, but not as much as those schools rated Above expected growth.
- Below expected growth: indicates that students in the school or district made less academic progress than the state expected.
Chart 1 shows the distribution of Ohio's public schools by overall value-added rating for the past three school years. Note the fluctuation in the percentage of schools making Above expected growth and Meeting expected growth during the last three academic years. In 2008-09, almost two-thirds of the schools in Ohio made above expected growth, while in 2010-11 this number dropped to roughly one in four schools. During this same period, the percentage of schools Meeting expected growth more than doubled, from 27 percent to 59 percent. The percentage of schools Below expected growth has remained relatively stable.
Chart 1: Distribution of Ohio Schools by Value-Added Rating (2008-09, 2009-10, 2010-11)
What should one make of these numbers? It is clear that determining value-added gains is as much art as it is science. We documented this in our 2008 primer on the value-added measure, and education researcher Doug Clay wrote about the fluctuation, or "yo-yo effect," for our Ohio Education Gadfly.
Beginning with last school year's data, the Ohio Department of Education has been tweaking its value-added metric to better balance results so that they more accurately reflect what is happening in the state's schools. As a result, while the 2008-09 results resemble a ski slope racing down from Above to Below, the 2010-11 numbers are shaped more like a bell curve. It appears the state has succeeded in adjusting its value-added metric to be a more accurate and balanced representation of growth than in previous years.
And while this adjustment is interesting to policy wonks and education researchers like me, it also matters mightily to schools and districts. In Ohio, a school's state rating can be bumped up or down based on value-added results, and these ratings largely define the quality of a school to its parents and community. Because of the state's efforts to smooth out the curve of value-added results, a number of schools and districts used to getting the bump up didn't see it this year. Seeing their stellar rating seemingly drop a notch may not make teachers and school leaders feel better, but it should help them better understand what's going on in their schools and thus better explain it to their parents and community.
We'll be writing much more about Ohio's value-added results throughout our two-week blog series on Ohio's local school report card data. Stay tuned to Flypaper for more!