The high-quality implementation of the Common Core standards, their aligned exams, and an evaluation framework that measures how effectively teachers teach the new standards ought to be the goal for Ohio’s public schools. This is a heavy lift, however, and there is little doubt that implementing these reforms—all of them intertwined and taken on simultaneously—will pose challenges for state education and school leaders.
One such implementation challenge is the switch in assessments. Starting in 2014-15, Ohio’s schools will implement new math and reading exams that align to the Common Core standards. The Buckeye State is presently one of twenty-two member states of PARCC, a consortium that is working together to develop these new assessments.
Goodbye OGT, hello PARCC
The PARCC exams are expected to differ considerably from Ohio’s old math and reading exams (the OAAs and OGTs), which are being phased out. The differences range from the content and difficulty of the tests, to the “cut score” required to pass them, to the online format in which the tests will be administered.
Ohio’s switch to the PARCC exams is likely to affect teachers’ value-added scores, because those scores are based on students’ present and past test results. (Value-added is a statistical model that estimates a teacher’s contribution to her students’ learning over a school year, and is part of a teacher’s evaluation.) Due to the change in Ohio’s assessments, a teacher who does great, value-added-wise, on the old assessments may do poorly when her students take the PARCC exams. Does this mean the teacher is suddenly less effective? Maybe, maybe not. But at the very least, the switch in tests casts some degree of doubt on the reliability of the teacher’s value-added score.
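For readers unfamiliar with the mechanics, the core idea can be sketched in a few lines of code. This is a deliberately toy version—real value-added models (such as the one Ohio uses) control for many more factors and use far more sophisticated statistics—and all function names and numbers here are hypothetical:

```python
# Toy sketch of a value-added estimate: a teacher's score is the average
# gap between students' actual scores and the scores predicted from their
# prior-year results. Illustrative only; not Ohio's actual model.

def predicted_score(prior_score, avg_gain):
    """Predict this year's score as last year's score plus a typical gain."""
    return prior_score + avg_gain

def value_added(students, avg_gain):
    """Average of (actual - predicted) across a teacher's students."""
    residuals = [actual - predicted_score(prior, avg_gain)
                 for prior, actual in students]
    return sum(residuals) / len(residuals)

# Each tuple: (prior-year score, current-year score)
roster = [(200, 212), (180, 195), (220, 228)]
print(value_added(roster, avg_gain=10))  # positive means above-average growth
```

The point the sketch makes concrete: the estimate hinges on comparing this year’s scores to last year’s. When the two years come from two very different tests, that comparison gets noisy.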
The Carnegie Foundation for the Advancement of Teaching, in a recent research brief, makes this point:
[S]tates should prepare for larger year-to-year changes in value-added in the 2014-15 school year when they switch to tests aligned with the Common Core. They should also be prepared for greater year-to-year variability in value-added for a few years after the change when they will be using both old and new tests. Large year-to-year variability makes value-added hard to interpret (p. 8).
The report comes to no clear conclusion on how states, including Ohio, should calculate teachers’ value-added scores as they transition to Common Core-aligned exams. It does, however, provide a handful of suggestions—not mutually exclusive—for how Ohio can smooth the transition of teachers’ value-added scores from the old exams to PARCC. These include:
- Delay the use of value-added scores in teacher evaluations. The principle is this: Having two or three years of PARCC test-score data yields more reliable estimates than having only one year of PARCC data. (Recall, Ohio’s value-added model takes into account students’ prior-year test scores.) One scenario is that the PARCC exams would be administered for two or three years, without any stakes attached to a teacher’s value-added score. After such a period, value-added could be reintroduced as a component of high-stakes teacher evaluations as a more reliable estimate of teacher effectiveness.
- Reduce the weight that value-added carries as a component in a teacher’s evaluation. This would reduce the risk of misidentifying either a high-performing or low-performing teacher due to an unreliable value-added score. A “noisy” value-added score seems more likely to occur if two exams are involved.
- Continue with value-added scores that mix Ohio’s old test score results with PARCC results, but somehow mathematically adjust for the change in the exams.
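The statistical intuition behind the first suggestion—that more years of data make a teacher’s score more reliable—can be illustrated with a quick simulation. The numbers below are made up for illustration; the point is simply that averaging several noisy yearly estimates shrinks the spread around a teacher’s true effect:

```python
# Hedged illustration: simulate noisy one-year value-added estimates
# around a teacher's "true" effect, and compare their spread to that of
# three-year averages. All figures are hypothetical.
import random
import statistics

random.seed(42)
TRUE_EFFECT = 2.0   # teacher's true contribution (arbitrary units)
NOISE_SD = 5.0      # year-to-year measurement noise

def yearly_estimate():
    return random.gauss(TRUE_EFFECT, NOISE_SD)

one_year = [yearly_estimate() for _ in range(10_000)]
three_year = [statistics.mean(yearly_estimate() for _ in range(3))
              for _ in range(10_000)]

print(statistics.stdev(one_year))    # spread of single-year estimates (larger)
print(statistics.stdev(three_year))  # spread of three-year averages (smaller)
```

Averaging three years cuts the spread by a factor of roughly the square root of three, which is why a moratorium that lets two or three years of PARCC data accumulate would produce steadier scores than judging teachers on the first year alone.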
If Ohio goes ahead with the value-added model as a component of teacher evaluation, then enacting a moratorium for a couple of years seems reasonable—so long as it isn’t a death sentence for any future test-based component of a teacher’s evaluation. And, yes, regardless of whether the delay is enacted, it is sensible to reduce the weight placed on value-added scores—a policy that we’ve pushed for within the past year.
It is true that educational reforms must be pursued speedily and with urgency. Nevertheless, speed must be balanced with high-quality implementation. As for the Common Core, many in Ohio’s education realm believe these standards have staying power (81 percent of Ohio superintendents believe the Common Core will be in place five years hence). Yet implementation is hardly a fait accompli: high-quality implementation of the standards is still a must—and one piece of this is linking teachers’ value-added scores to the PARCC exams. In this case, slowing the reform train to iron out the finer technical details of implementing very new and different standardized exams might just be a suitable strategy.
Presently, Ohio law prescribes that 50 percent of a teacher’s evaluation be based on value-added, though an amendment in the state’s budget bill would give schools the flexibility to reduce that share to 35 percent.