Standards, Testing & Accountability

The education components of Governor Kasich’s proposed budget—and the House's subsequent revisions—made a big splash in Ohio's news outlets. Much of the attention has been devoted to the House’s (unwise) moves to eliminate PARCC funding and its rewrite of Kasich’s funding formula changes. Amidst all this noise, however, a few other education issues in the House’s revisions have slipped by largely unnoticed. Let’s examine them.

Nationally normed vs. criterion-referenced tests

As part of its attempt to get rid of PARCC, the House added text dictating that state assessments “shall be nationally normed, standardized assessments.” This is worrisome, as there is a big difference between norm-referenced and criterion-referenced tests.

A norm-referenced test determines scores by comparing a student’s performance to that of the entire pool of test takers: each student’s score is translated into a percentile ranking within the distribution. Examples of norm-referenced tests include the Iowa Tests of Basic Skills and the Stanford 10 exams. A criterion-referenced test, on the other hand, is scored on an absolute scale. Instead of being compared to other students, students are measured against a fixed standard of achievement (i.e.,...
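
To make the distinction concrete, here is a minimal sketch in Python. The pool of scores and the cut score are hypothetical, invented purely for illustration; no actual exam is being modeled.

    from bisect import bisect_left

    # Hypothetical raw scores for a pool of test takers (illustrative only).
    pool = sorted([12, 18, 21, 25, 27, 30, 33, 35, 38, 41])

    def percentile_rank(score, pool):
        """Norm-referenced view: where does this score fall relative to peers?"""
        below = bisect_left(pool, score)
        return 100 * below / len(pool)

    def meets_criterion(score, cut_score=30):
        """Criterion-referenced view: does the score clear a fixed standard?"""
        return score >= cut_score

    student_score = 33
    print(f"Percentile rank: {percentile_rank(student_score, pool):.0f}")   # 60
    print(f"Meets standard (cut = 30): {meets_criterion(student_score)}")   # True

The percentile rank shifts whenever the composition of the test-taking pool changes; the criterion verdict depends only on the fixed cut score.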

We released a new report today, School Closures and Student Achievement: An Analysis of Ohio’s Urban District and Charter Schools, which could change the way we think about school closure. The study reveals that children displaced by closure make significant academic gains on state math and reading exams after their school closes.

The study examined 198 school closures that occurred between 2006 and 2012 in the Ohio ‘Big Eight’ urban areas (Akron, Canton, Cincinnati, Cleveland, Columbus, Dayton, Toledo, and Youngstown). The research included 120 closed district-run schools and 78 closed charter schools. Taken together, these closures directly affected 22,722 students—disproportionately low-income, low-achieving, and minority students—who were in grades 3-8 at the point of closure.

Three years after closure, the research found that displaced students made the following cumulative gains:

  • Students who had attended a closed district school gained forty-nine additional days of learning in reading and thirty-four additional days in math; and
  • Students who had attended a closed charter school gained forty-six additional days in math.

Further, the study reveals that students who attended a higher-quality school after closure made even greater progress. Three years after closure, displaced students who transferred...

A vast amount of contemporary education policy attention and education reform energy has been lavished on the task of defining and gauging “college readiness” and then taking steps to align K–12 outcomes more closely with it. The ultimate goal is for many more young people to complete high school having been properly prepared for “college-level” work.

The entire Common Core edifice—and the assessments, cut scores, and accountability arrangements built atop it—presupposes that “college-ready” has the same definition that it has long enjoyed: students prepared to succeed, upon arrival at the ivied gates, in credit-bearing college courses that they go right into without needing first to subject themselves to “remediation” (now sometimes euphemized as “developmental education”).

But this goes way beyond Common Core. Advanced Placement courses also rest on the understanding that an “introductory college-level course” in a given subject has a certain set meaning and fixed standards. The people at ACT, the College Board, and NAGB have sweated bullets developing metrics that gauge what a twelfth grader must know and be able to do in order to be truly college-ready—again, in the sense of having solid prospects of succeeding in credit-bearing college courses in one subject or another.

Lying beneath...

Everyone knows that impenetrable jargon is to the education community what sputtering indignation is to Twitter: both irritating and contagious. When teachers and administrators hold forth on the importance of psychometrics and normed modality processing, it emboldens the rest of us to test our comfort with stackable credentials and mastery-based learning. And in the midst of this morass of deliberate obscurantism, a term like “career-ready” should seem like a godsend. But as this new brief from ACT, Inc. reminds us, there are important nuances to even the most outwardly simple concepts.

Nearly ten years ago, the organization released Ready for College and Ready for Work: Same or Different?, a similar publication that made the case for equivalently rigorous education for all high school graduates, regardless of whether they matriculate into colleges or head directly for the workplace. As the authors of Unpacking “Career Readiness” note, the earlier brief “described college and career readiness in terms of benchmarks focusing solely on academic assessments and the level of education…required for success in postsecondary education or targeted workforce training.” They concede, though, that subsequent research “has clearly established the value of additional areas of competency that are important for both college...

The testing “opt-out” movement is testing education reform’s humility.

The number of students not participating in state assessments is large and growing. In one New York district, 70 percent of students opted out; in one New Jersey district, it was 40 percent.

Some reporting makes the case that this phenomenon is part of a larger anti-accountability, anti-Common Core story. Some reformers, it seems to me, believe opting out is the result of ignorance or worse.

Participants are routinely cast as uninformed or irrational. Amanda Ripley implied that opting out of testing is like opting out of vaccines and lice checks. New York Board of Regents Chancellor Merryl Tisch argued, “We don’t refuse to go to the doctor for an annual check-up…we should not refuse to take the test.” A column in the Orlando Sentinel argued we’d “lost our minds” and that the “opt-out movement has officially jumped...

The University of Kentucky may have lost the NCAA tournament, but Kentuckians can still take heart in their K–12 schools’ promising non-athletic gains. According to this new report, the Bluegrass State’s ACT scores have shot up since it began to implement the Common Core in 2011–12.

Using data from the Kentucky Department of Education, the study compared ACT scores for three cohorts of students who entered eighth grade between the 2007–08 and 2009–10 school years. The first group took the ACT—a state requirement for all eleventh graders—in 2010–11, immediately prior to CCSS implementation; they were therefore not formally exposed to instruction under the new standards. Cohorts two and three took the ACT in 2011–12 and 2012–13, after the introduction of CCSS-aligned curricula, and earned composite scores that were 0.18 and 0.25 points higher, respectively, than the first cohort’s. The study authors report this gain as roughly equivalent to three months of additional learning.

The report rightly cautions against reading too much into these early findings. The short interval between Common Core implementation and these cohorts’ ACT testing limits how much effect the standards could have had on student achievement. The authors also note that it is not clear whether the scoring gains...

The process of reforming charter school law in Ohio took another big step forward last week with the introduction of S.B. 148 in the Ohio Senate. Jointly sponsored by Senator Peggy Lehner (R-Kettering) and Senator Tom Sawyer (D-Akron), the bill is the result of workgroup sessions over the last nine months to craft the best legislation possible to improve charter school oversight and accountability.

The new Senate bill follows on the heels of House Bill 2, a strong charter school reform measure passed by the House last month. The Senate proposal maintains many of the critical provisions included in the House bill and adds several more. Specifically, the Senate bill:

  • Strengthens House language around sponsor hopping
  • Increases transparency around expenditures by operators
  • Requires all sponsors to have a contract with the Ohio Department of Education
  • Incorporates much of Governor Kasich’s proposal related to charter school sponsor oversight
  • Prohibits sponsors from spending charter funds outside of their statutory responsibilities
  • Assists high-performing charter schools with facilities by encouraging co-location and providing some facility funding

We published a full roundup of press coverage of the rollout in a special edition of Gadfly Bites on April 16. Important highlights can be...

This post has been updated with the full text of "Wanna opt out of tests? Try this instead"

There’s a bracing moment early in the 1991 movie Grand Canyon. A tow truck driver played by Danny Glover miraculously appears to rescue a stranded motorist played by Kevin Kline, who is being terrorized by thugs on a deserted Los Angeles street. Glover’s character calmly hooks up the disabled car to his rig and appeals to the gun-toting gang leader to let him and Kline go on their way.

“I'm gonna grant you that favor, but tell me this,” the gang leader says after a tense standoff, reminding the tow truck operator that he’s calling the shots. “Are you asking me as a sign of respect? Or are you asking because I've got the gun?”

“You ain't got the gun,” Glover replies, “we ain't having this conversation.”

I think of this scene every time I read a story about the “opt-out movement”—parents and others protesting the distorting effects of standardized testing in schools by refusing to let their children take the tests. Opt-out parents believe they have a gun pointed at testing. They might be right. But the opt-out movement...

In a previous post, I referred to New York’s fierce political battle over teacher evaluations. Since then, New York lawmakers have passed the education portion of the budget—and moved Governor Cuomo’s controversial teacher evaluation proposal forward. State teachers’ unions responded by calling for parents to opt out of standardized tests, hoping that a lack of data would sabotage the system. In response, the Brookings Institution’s Matthew Chingos has published an analysis of whether opting out will actually affect teacher evaluations. The short answer is “no,” and here’s why:

To conduct his analysis, Chingos examined statewide data from North Carolina—specifically, the math achievement of fourth and fifth graders during the 2009–10 school year. He ran two simulations: one in which a random group of students opted out of state exams, and another in which the highest-performing students opted out. Both simulations found that the effect of opt-outs on a teacher’s evaluation score is small unless a large number of her students choose to opt out.
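
The logic behind that finding can be illustrated with a toy simulation in Python. This is not Chingos’s model or his North Carolina data: the class size, the score-growth distribution, and the simple class-average stand-in for an evaluation score are all invented for illustration.

    import random
    import statistics

    random.seed(0)

    # Invented setup: one teacher, 25 students, each with a "score growth"
    # value standing in for a value-added-style measure.
    growth = [random.gauss(0, 10) for _ in range(25)]
    full_mean = statistics.mean(growth)

    def mean_after_optout(growth, optouts, top_performers=False):
        """Class-average growth after `optouts` students are excluded."""
        if top_performers:
            remaining = sorted(growth)[:len(growth) - optouts]        # drop the highest scorers
        else:
            remaining = random.sample(growth, len(growth) - optouts)  # drop at random
        return statistics.mean(remaining)

    for n in (2, 5, 15):
        rand_shift = mean_after_optout(growth, n) - full_mean
        top_shift = mean_after_optout(growth, n, top_performers=True) - full_mean
        print(f"{n:2d} opt-outs: random shift {rand_shift:+.2f}, "
              f"top-performer shift {top_shift:+.2f}")

It is a single-class caricature of a far more careful analysis, but it shows the mechanism: excluding a handful of students barely moves a class-average measure, while excluding many students, especially the highest performers, moves it substantially more.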

So what happens if a large number of students in New York opt out?[1] As the number of students opting out increases, so...

Part II of the latest Brown Center report is called “Measuring Effects of the Common Core.” Loveless creates two indexes of Common Core State Standards implementation by using data from two surveys of state education agencies. The 2011 index is based on a survey from that year, which reports how many activities—such as conducting professional development or adopting new instructional materials—states had undertaken while implementing the CCSS. “Strong” states are those that pursued at least three implementation strategies. The 2013 index uses survey data asking state officials when they plan to complete CCSS implementation. In this case, “strong” indicates full implementation by 2012–2013.

Analyzing the relationship between survey results and fourth-grade NAEP data for reading, Loveless finds little difference between “strong” states and the four states that never adopted Common Core. According to the 2011 index, strong implementers outscored the four non-adopting states by a little more than a scale point between 2009 and 2013 (though the small comparison group makes these findings less reliable). Strong states fared a bit better under the 2013 index, but still outdid non-implementers by less than two NAEP points.

More interesting than these preliminary correlation studies, however, is...
