Pulling their punches: How Achieve's “Expectations Gap” report falls short
In 2005, Achieve and the National Governors Association hosted a National Education Summit on High Schools where forty-five governors came together with business leaders to address an ongoing challenge in American education: the gap between what students need to master to earn high school diplomas, and the knowledge and skills they need to be prepared for college and careers. Every year since, Achieve has released its annual “Closing the Expectations Gap” report, aimed at highlighting the progress states have made—and need to make—to better align K-12 and postsecondary education expectations.
The first report, released in 2006, focused primarily on whether high school academic standards and graduation requirements were aligned to “college and workplace expectations.” (In all but two states, they were not, though as many as thirty-five states were working toward it.) This year, the landscape has obviously shifted dramatically: Thanks in part to the Common Core, schools in every state and the District of Columbia are guided by standards that are aligned to College and Career Ready (CCR) expectations.
Of course, that means that the report must shift to match the changing landscape. To that end, this year’s report has, for the first time, begun to track state progress towards implementation of the standards. According to the authors, the report “provides an overview of the progress states are making” and it “draws attention to key issues states should consider as adoption and implementation work continues.”
The challenge is that tracking implementation is tricky. There is no one right way to implement standards from the state level and, while there are promising practices from across the country, no state has gotten standards implementation exactly right. So, while the authors are right that “every state clearly has an important role to play,” it’s not at all clear what that role should be.
Yet, Achieve wades into the muddy waters of state implementation and tracks a small handful of indicators that it thinks will help focus state implementation efforts. Unfortunately, by failing to make judgments about the quality of state implementation plans, the report fails to say much of anything about the actual progress states are making towards CCSS/CCR standards implementation.
For instance, Achieve reports that forty-five states and D.C. are providing “high-quality processes, protocols and exemplars,” such as rubrics and implementation tools that can be used by school and district leaders and teachers. But, absent any information about the quality of these tools, what does that really tell us?
Similarly, thirty-nine states are developing their own curriculum and instructional materials for voluntary use and five are requiring that schools use specific standards-aligned materials. But, again, what does this tell us? Perhaps states are not ideally positioned to develop materials in-house and should focus more attention on helping curriculum directors and teachers vet materials (as sixteen states are doing)? Or perhaps states would do well to stay out of the curriculum business entirely. After all, there is little indication that the time and energy states have spent on statewide textbook adoption has had any impact on student achievement.
What made the Achieve report useful in the past is that it made tough judgment calls and called attention to failures to meaningfully link standards and accountability to student achievement. It called out states whose K-12 standards weren’t rigorous enough to meet CCR expectations, for instance. And it named states that had failed to link accountability to college and career readiness.
Of course, it’s always tough to develop criteria that judge the quality of complicated and nuanced systems and standards. But the report’s failure to discuss the quality of state CCSS/CCR implementation plans, or to contextualize state practices in terms of student achievement, makes it less useful than it could be.