The CREDO conundrum
June 24, 2009
A perennial question: How does the performance of students in charter schools and students in traditional schools compare? CREDO set out to answer this question in a longitudinal analysis of roughly 2,400 charter schools, operating in 16 states and comprising roughly 70 percent of the US charter school population. Let's break it down. Fundamental study design: Sound. Findings: Mixed. Explanation of analysis: Sloppy. Let's hit them in turn.
The methodology is based on a "virtual twin" approach. Specifically, each charter school student was matched by demographics and test scores with a student from the traditional public school (TPS) he or she had attended before switching schools (i.e., the "feeder school"). Then, gains in math and reading for the two groups of students were evaluated and the student-level results extrapolated to determine whether a charter school was serving its students better, the same, or worse than its matched TPS. Absent a randomized study, this is a reasonable approach to this kind of comparative analysis, and one that helps ameliorate, though not eliminate, selection bias (i.e., inherent differences between charter and traditional public school students).
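The matching logic above can be sketched in a few lines of code. This is a minimal illustration, not CREDO's actual procedure: the field names, the exact-match-on-demographics rule, and the closest-baseline-score tiebreak are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class Student:
    school: str     # current school
    feeder: str     # TPS previously attended (charter students only)
    demo: tuple     # demographic profile, e.g. (grade, gender, ...) -- hypothetical
    baseline: float # prior-year test score
    gain: float     # year-over-year score gain

def match_virtual_twins(charter_students, tps_students):
    """For each charter student, pick an unmatched student from the same
    feeder TPS with the same demographic profile and the closest baseline
    score. Returns a list of (charter student, virtual twin) pairs."""
    available = list(tps_students)
    pairs = []
    for c in charter_students:
        candidates = [t for t in available
                      if t.school == c.feeder and t.demo == c.demo]
        if not candidates:
            continue  # no twin found; student dropped from the comparison
        twin = min(candidates, key=lambda t: abs(t.baseline - c.baseline))
        available.remove(twin)  # each TPS student serves as a twin only once
        pairs.append((c, twin))
    return pairs

def mean_gain_difference(pairs):
    """Average (charter gain - twin gain); positive favors the charter."""
    return sum(c.gain - t.gain for c, t in pairs) / len(pairs)
```

Aggregating these pair-level gain differences up to the school level is what lets the study label each charter as better, the same, or worse than its matched TPS.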
Next, the findings. Overall, 46 percent of charter schools posted math gains that mirrored those of their matched TPS; 17 percent made more progress in math; and 37 percent posted lower math gains than their TPS counterparts. In reading, charter school students did a bit worse than their TPS peers, but the difference is so small (less than 1 percent of a standard deviation) that it's not meaningful. Notably, low-income students and English Language Learners fare better in charter schools than in TPSs, though black and Hispanic students do not.
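To see why a difference of less than 1 percent of a standard deviation is negligible, a quick back-of-the-envelope conversion helps. The 30-point standard deviation here is a hypothetical test scale chosen for illustration; it is not a figure from the report.

```python
# Convert a standardized effect size to raw test-score points.
effect_size = -0.01  # reading difference in standard-deviation units
test_sd = 30.0       # hypothetical test-score standard deviation
raw_points = effect_size * test_sd
print(raw_points)    # roughly a third of a point -- practically invisible
```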
It's surely troubling that over a third of charters do worse than traditional schools in boosting math skills, but we should keep a few things in mind. Most importantly, the negative findings can largely be explained by the fact that over 50 percent of charter school students in the study were brand-new to their schools. Past research (see here for instance) and common sense tell us that kids typically struggle in their first year at a new school, often carrying achievement gaps left over from previous instructional deficiencies. These same students start to show gains in their second, third, and fourth years, and beyond. CREDO's results confirm this: When student performance is disaggregated by length of enrollment, first-year charter students experience negative impacts on learning; second-year students show no difference in learning gains; and third-year charter pupils experience small but significant gains in reading and math. CREDO should have made this more explicit.
Unfortunately, we're never told exactly how many students fall into each of these three buckets--just that "more than half of the records" are in bucket one. For a well-regarded research shop, that's a critical number to gloss over. And it isn't the only startling omission: Sample sizes for schools and for students in both the overall analysis and the state-level analyses are also missing. It's not even clear how many and which years of data were collected. Then there's the mystery of Massachusetts. The study is supposedly a "16 state" analysis--but results for only 15 are presented (see page 9). It's possible that the Bay State was used as a base comparison for other states--in other words, as a "reference category" when creating the dummy variables for the analysis--but that doesn't explain why it was left out in the state-by-state findings (see pages 35-37). Did CREDO forget about it?
Frankly, the entire report needs some analytic housekeeping, including an upfront admission that the charter sample is mostly composed of recent transfer pupils. But it's still an important contribution to the field, and one that has gotten ample media attention. Its final recommendation is particularly on target: The charter movement must remove barriers to entry for high-performing schools and barriers to exit for low-performing ones. You can read the report, including state-level findings, here.