November 05, 2008
The media gleefully reported the news that a big interim Reading First study from the U.S. Department of Education's Institute of Education Sciences (IES) found the program to have no impact on reading comprehension.
And it's hard to blame the media, for three reasons. First, it loves to pile on the increasingly unpopular Bush administration. (Contemplate this AP headline: "Bush administration's reading program hasn't helped.") Second, IES head Russ Whitehurst--who has earned a great deal of respect and credibility for moving the federal research and evaluation function toward a new level of rigor and professionalism--stands firmly behind the report. And third, Secretary of Education Margaret Spellings's press office totally bungled the response, coming up with nothing better than touting the program's "popularity." (This is hardly the first time Spellings dropped the Reading First ball.) See this statement from Amanda Farris, deputy assistant secretary, as printed in the AP story:
Secretary Spellings has traveled to 20 states since January. One of the consistent messages she hears from educators, principals and state administrators is about the effectiveness of the Reading First program in their schools and their disappointment with Congress for slashing Reading First funds.
Here's what Spellings's team should have said:
This study provides important insights into the Reading First program, but readers should be cautioned that it's not nationally representative. Because IES launched the study after the program was up and running, the evaluators had to settle for a very imperfect design. The schools selected for study may have demographics similar to Reading First schools in general, but they differed in important ways.
First, none of the states that won the first Reading First grants could participate in the study because their programs got started in advance of the evaluation. These states were the ones most enthusiastic about the program--and most prepared to implement it well. It's quite likely that Reading First is having a major impact in those states' schools.
Second, the schools selected for study were the ones that just barely won grants under the program, which were compared to schools that just barely missed funding. (Schools are ranked according to various criteria, such as poverty and need. Say there was enough money in a given district to fund 10 schools; the study then compared the 10th-ranked school, which got money under the program, to the 11th-ranked school, which did not.) But here's the rub: the schools where you would expect the greatest impacts from Reading First are the poorest ones, enrolling students who are furthest behind in reading--schools that would have been ranked at the top of the priority list. Simply put, those schools weren't included in the study.
The bottom line is that the evaluators looked for schools that met their study design conditions, not schools that were nationally representative of the program. So we can't say anything definitive about the effectiveness of Reading First--all we can speak to is the effectiveness (or lack thereof) of a handful of Reading First schools.
Yes, explaining this stuff to the media is difficult. But Spellings and her team should have tried.