A reverse Sputnik moment?
In December 2010, the latest results from PISA (the Programme for International Student Assessment) revealed that—compared to our OECD peers—American fifteen-year-olds are (at best) in the middle of the pack. Among the thirty-four participating OECD nations, we ranked fourteenth in reading, seventeenth in science, and twenty-fifth in math. This news, coupled with Shanghai’s epic success on the exam (the first time any part of mainland China had taken it), rocked the education-policy community. For those still smarting, the latest results from two other international assessments offer some liniment. TIMSS and PIRLS (the Trends in International Mathematics and Science Study and the Progress in International Reading Literacy Study) are given in more countries—including many that are poorer and less developed than those in the OECD—and are actual appraisals of student learning at two grade levels. (PISA purports to assess skills in a country’s overall fifteen-year-old population and does not claim to be curriculum-based or school-aligned.)
U.S. fourth graders are definitely looking better. From 2007 to 2011, their math performance on TIMSS bumped up twelve points and now trails that of their counterparts in just seven other lands (in East Asia and Northern Ireland). Even more remarkable results come from PIRLS: Of the fifty-three systems participating, only four from abroad bested the U.S. score (Hong Kong, Russia, Finland, and Singapore). Further, Singapore is the only foreign system to surpass us at the “advanced” level, where an impressive 17 percent of American fourth graders can be found.
It’s not all great news. U.S. eighth graders’ math achievement, for example, remained stagnant, as did both fourth and eighth graders’ science scores. Still, let’s take good news where we can. As noted above, and as others have observed, TIMSS and PIRLS are stronger indicators of student achievement (and much more closely aligned with NAEP) than PISA. While PISA takes a “skills-based approach,” prizing students’ ability to manage the process of knowledge acquisition over the knowledge itself, TIMSS and PIRLS—along with NAEP—ask students to understand content and apply that knowledge.
The new PIRLS results also hint at the efficacy of strong, scientifically based reading policies in the early grades—including the much-missed Reading First program. (NAEP’s 2011 results drop the same hint.) Recall that Reading First (a $1 billion-a-year program initiated under No Child Left Behind and foolishly defunded in 2008) provided intensive support to high-poverty schools seeking to teach reading to K-2 students in a scientifically based manner. The first kindergarten cohorts taught via Reading First would have hit fourth grade around 2007 and eighth grade around 2011—just when we see PIRLS scores begin to jump.

Consider, too, Florida’s results. (It and eight other states participated in this round of TIMSS and PIRLS testing as if they were countries.) The Sunshine State has emerged as a bastion of smart, scientifically based reading instruction—and has required a “third-grade reading guarantee” for over a decade. It outstripped the U.S. average on PIRLS by thirteen points! It also ranked second, behind Singapore, in the percentage of “advanced” students. And there may well be more policy implications for analysts to investigate. Although Florida was a first-time participant in 2011 (as was Alabama), six other states—Colorado, Connecticut, Indiana, Massachusetts, Minnesota, and North Carolina—have taken part in one or more previous iterations of TIMSS, allowing for comparisons over time.
“U.S. Math, Science Achievement Exceeds World Average,” Education Week, December 11, 2012.