The Education Department has spent the past decade slowly moving toward reviewing states’ mandatory annual IDEA “performance plans” on the basis of student outcomes, not just bureaucratic compliance with sundry procedural and data-reporting requirements.
In feedback to the states a year ago, for example, Melody Musgrove (who directs the Office of Special Education Programs at ED) forewarned state chiefs that ED was redesigning its monitoring system into “a more balanced approach that considers results as well as compliance.”
Yesterday, the department made considerable news by basing its latest round of feedback on criteria that include how a state’s disabled students fare on NAEP and the size of the achievement gaps separating those pupils from “all children on regular statewide assessments.” Further changes are promised for subsequent years, including student-growth data based on statewide assessments. Also promised is a reduction in compliance-style reporting and data burdens.
Based on this analysis, the feds then sort states into three buckets labeled “meets requirements,” “needs assistance,” and “needs intervention.” And the inclusion of outcomes data really does make a difference. Whereas in previous years almost every state and territory (forty-one last year, to be specific) fell into the first bucket, this year just eighteen do. (There’s a fourth bucket, “needs substantial intervention,” but at present no state has been placed there.)
Among the many “sinkers”: Ohio, which went from bucket 1 to bucket 2, and Delaware, which declined from 2 to 3….