Standards, Testing & Accountability

A new study from the National Bureau of Economic Research examines how Louisiana’s statewide voucher program affects student achievement. The Pelican State expanded its program statewide in 2012; by 2014, twelve thousand students had applied for more than six thousand slots to attend 126 private schools. Because the program was oversubscribed, vouchers were assigned by lottery, so some applicants received offers and others did not. This study focuses on roughly 1,400 students in grades 3–8 who applied in fall 2012—the first application cohort after the program expanded.

The primary (and surprising) finding is that attending a voucher-eligible private school reduces voucher students’ test scores in math, ELA, science, and social studies (though ELA is not significantly lower). Math scores go down by 0.4 standard deviation one year after the lottery, and for other subjects, the drop is between one-quarter and one-third of a standard deviation. Voucher use also reduces the probability of being promoted to the next grade and shifts students into lower state performance categories. The outcomes are even bleaker for younger children.

In short, this is all very bad news. But remember that these are first-year outcomes, and first-year evaluations of anything ought to be...

I encountered a bit of advice this week that my dear mother would have welcomed during her brief and inglorious career as my pre-Algebra tutor: When it comes to assisting kids with their math assignments, parents can afford to do less.

After struggling to help her first grader with some unfamiliar addition and subtraction formats, the Hechinger Report’s Kathleen Lucadamo sought advice from teachers and parents on how to cope with changing curricular materials and methods. The group recommendation was basically to act as the highway patrol rather than a chauffeur—that is, be on the lookout for breakdowns and give directions when necessary, but don’t pick the route and do the driving yourself. In the words of Jason Zimba, a physicist and the lead writer of the Common Core math standards, “The math instruction on the part of parents should be low. The teacher is there to explain the curriculum.”

This consensus is more than just a remedy for the brain-melting feuds erupting at American kitchen tables over the spiffiest way to factor a polynomial. It also offers a shortcut around one of the least enlightening discourses of modern education politics, which is the squabble over why none of us can...

As everyone knows, the Every Student Succeeds Act (ESSA)—the long-overdue reauthorization of the Elementary and Secondary Education Act—was approved by overwhelming bipartisan majorities in the House and Senate and signed into law by the president in December. The law grants much greater authority to the states over the design of their school accountability systems, especially in contrast to No Child Left Behind.

States now enjoy the opportunity—and face the challenge—of creating school rating systems that can vastly improve upon the model required by NCLB. To help spur creative thinking about how they might do so—and also to inform the Department of Education as it develops its ESSA regulations—the Fordham Institute hereby declares an “accountability design competition.” (We are focused on school ratings, not the interventions that may result from them.) Participants will be tasked with suggesting specific indicators for states to use in grading schools, along with working through the various decisions that states will struggle with as they determine how to calculate ratings. Judges will evaluate the recommendations, and all of us will get to watch and weigh in online.

By January 26, participants will submit their proposals with the following elements included. To keep things from getting too complex,...

  • There’s a reason we don’t bounce our grandkids on our knees and delight them with stories of how Congress muscled through the Deficit Reduction Act of 1984. As the saying goes, there’s nothing pretty about the way the sausage gets made. But for those who were begging for a new federal education law, Politico’s postmortem on the passage of the Every Student Succeeds Act provides an inside look at a splendid, savory knackwurst of statutory goodness. In the year following the 2014 Republican midterm landslides, draft legislation had to overcome anti-testing fervor from teachers’ unions, the remnants of the anti-Common Core crusade, and the sudden resignation of House Speaker John Boehner. Between clearing these obstacles and stitching together the perennial philosophical differences of Left and Right, the ESSA used up seven or eight of its nine lives. Thankfully, it’s now a matter of settled law.
  • Speaking of the backlash against high academic standards: Reporting out of Colorado suggests that we might need to think differently about the opt-out movement and its adherents. Though the bulk of the students who absented themselves from the state’s PARCC test were indeed residents of wealthier, high-performing districts—you know, where the
  • ...

Some say the world will end in fire. Some say in ice. But if you’re pressed for time and want to end all intelligent life quickly, nothing beats a task force.

In New York last week, a task force chosen by Governor Andrew Cuomo issued its report on Common Core. In a model of stunning governmental efficiency, the group managed to “listen” to 2,100 New York students, teachers, parents, and various other stakeholders. They then retreated to their chambers to write, edit, and publish a fifty-one-page report a mere ten weeks after they were impaneled. But clearly that was time enough for these solons to learn and thoughtfully consider what the Empire State needs: to adopt “new, locally driven New York State standards in a transparent and open process.” The report has twenty recommendations on how to bring this about.

It should be noted (speaking of governmental efficiency) that God himself was content with a mere ten modest suggestions to govern all known human activity. Cuomo’s task force has double that number—just for Common Core in a single state. But God acted alone. On a task force, every voice must be heard, every grievance aired. And they were, in all their...

Morgan Polikoff

On Wednesday, I had the pleasure of visiting Success Academy Harlem 1 and hearing from Eva Moskowitz and the SA staff about their model. I’m not going to venture into the thorny stuff about SA here. What I will say is that their results on state tests are clearly impressive, and I doubt that they’re fully (or even largely) explained by the practices that cause controversy. (Luckily, we’ll soon have excellent empirical evidence to answer that question.)

Instead, what I’m going to talk about are the fascinating details I saw and heard about curriculum and instruction in SA schools. Right now, of course, it is impossible to know what’s driving their performance, but these are some of the things that I think are likely to contribute. (I’d initially forgotten that Charles Sahm wrote many of these same things in a post this summer. His is more detailed and based on more visits than mine. Read it!)

Here’s what I saw in my tour of about a half-dozen classrooms at SA 1:

  • The first thing I observed in each classroom was the intense focus on student discourse and explanation. Students are constantly pressed to explain their reasoning, and other students respond constructively.
  • ...

The Ohio Coalition for Quality Education (OCQE) has hit the airwaves in an effort to change the state’s accountability policies. The group claims that Ohio doesn’t take into account differences in student demographics across schools—and is thus unfair to schools educating at-risk pupils. Along with the Electronic Classroom of Tomorrow (ECOT), they are promoting the adoption of a new accountability measure that they believe will solve the problem.

The trouble with their argument is that Ohio policymakers have already implemented a robust measure—value added—that takes into account student demographics. Given what these groups are lobbying for, it is important to review the basics of student achievement, demographics, and school accountability, including value-added measures.

Let’s first keep in mind that the concerns about student demographics and educational outcomes are hardly new. For decades, analysts have recognized the link between demographics and achievement. The famous “Coleman report” from 1966 was among the first studies to show empirically the massive achievement gap between minority and white students. Gaps by race or income status remain clearly evident in today’s NAEP and state-level test data.

These stark results, of course, call into question the use of pure achievement measures (e.g.,...
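The distinction the paragraphs above draw—between raw achievement and value added—is easiest to see with a toy calculation. The sketch below is a deliberately simplified illustration of the general idea, not Ohio’s actual model (the state’s system is far more sophisticated): each student’s current score is predicted from a prior score, and a school’s “value added” is the average amount by which its students beat or miss that prediction. The student records and school names here are made up for illustration.

```python
# Toy value-added sketch: a school serving lower-scoring students can still
# post positive value added if its students outperform their predicted scores.
from statistics import mean

# Hypothetical (prior_score, current_score, school) records
students = [
    (200, 230, "A"), (210, 245, "A"), (190, 228, "A"),
    (260, 270, "B"), (250, 255, "B"), (270, 282, "B"),
]

# Simple least-squares fit across all students: current = a + b * prior
xs = [s[0] for s in students]
ys = [s[1] for s in students]
x_bar, y_bar = mean(xs), mean(ys)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
a = y_bar - b * x_bar

# A school's value added = mean residual (actual minus predicted)
value_added = {}
for school in {"A", "B"}:
    resids = [y - (a + b * x) for x, y, sch in students if sch == school]
    value_added[school] = mean(resids)

print(value_added)
```

Because the prediction conditions on where each student started, a measure like this is already partially insulated from demographic differences across schools—which is the gist of the counterargument to OCQE’s claim.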

Our friend and colleague Mike Petrilli is right about many things, but he’s wrong to dismiss solid interstate comparisons of academic performance as a “nice to have,” not a “must-have.” He acknowledges that the Common Core standards have largely failed to usher in an era of timely, valid, and informative comparisons, but then he says, in effect, never mind, we still have NAEP, PISA, and other measures by which to know how one state is doing academically versus another and in comparison with the country as a whole.

It is indeed a good thing that we have those other measures because it’s true that the Common Core era has failed to deliver on what many of us saw as one of its most valuable and important features: a platinum meter stick to be used to measure, monitor, and compare student achievement, not just between states but also among districts, individual schools, even individual classrooms and children. That’s how the superintendent in Springfield, Illinois, could determine how his schools—even just his fifth-graders—compare with their counterparts in Springfield, Oregon, Springfield, Ohio, and Springfield, Massachusetts, both in absolute achievement and in academic growth trajectories in math and English. That’s how a principal...

Nancy Brynelson, Corley Dennison, Daniel Doerger, Jacqueline E. King, William Moore, and Faith Muirhead

As states have implemented college and career readiness standards, it has sometimes been assumed that most of the work and attention has occurred in the elementary grades. In truth, many states have been working for some time to ensure that grade twelve prepares all students for post-secondary success. Programs like AP, IB, and dual enrollment are the most touted offerings for well-prepared students. But there has also been a great effort to create courses for students who are not yet college-ready and who can use senior year to close academic gaps and avoid the remedial instruction that so often acts as a drain on the time, finances, and morale of ascending college students. Just last month, the Fordham Institute held an event called “Pre-medial Education” that discussed ways to bring high school-based college readiness programs to scale.

For colleges and universities, “fixing” remediation is a major priority. According to Complete College America, three out of five students entering community colleges and one out of five students entering four-year institutions require remediation. The vast majority of these students (78 percent at community colleges and 63 percent at four-year institutions) do not go on to successfully complete gateway credit-bearing courses....

This study examines whether information supplied about a student’s ability helps inform that student’s decision to enroll in Advanced Placement classes. Specifically, the information “signal” is the “AP Potential” message on the student’s PSAT Results Report, as written by the College Board. Students who score at a certain cut point on the PSAT get a message that says, “Congratulations! Your score shows you have potential for success in at least one AP course!” or else a message that says, “Be sure to talk to your counselor about how to increase your preparedness.”

The sample comprised roughly five hundred sophomores at Oakland Technical High School who took the PSAT in 2013. The intervention was as follows: immediately before and after receiving their PSAT results, which included one of the AP Potential messages above, students were given a survey that asked them (1) how they perceived their academic abilities and their college plans; (2) how many AP courses they planned to take; (3) whether they would take the SAT; (4) the probability that they would pass the exit exam; and (5) the probability that they would graduate from high school.

Analysts found that the AP signal...
