One reason we wonks love state-level—and now city-level—data from the National Assessment of Educational Progress is that it helps us identify jurisdictions that are making strong progress compared to their peers. We assume those places are leading the pack because something they are doing is working. (Plus, it makes us feel better about our chosen profession. Policy matters!)
But does it? Matthew DiCarlo of the Shanker Institute, among others, has wondered whether the ups and downs of test score trends are necessarily related to what’s happening in the classroom, much less what’s happening in the statehouse or the school board. NAEP results, DiCarlo wrote, “can’t be used to draw even moderately strong inferences about what works and what doesn’t.”
He may be onto something. Take a look at the two graphs below, which show the relationship between changes in eleven cities’ NAEP results and changes in their median incomes:
Cities farthest to the left (San Diego, Chicago, Charlotte, and Cleveland) saw their median incomes decline most dramatically from 2005 to 2011. Cities toward the right saw their incomes increase (Boston and Houston) or increase dramatically (Washington, D.C.)....