Standards, Testing & Accountability

Ohio’s expanding attendance-data scandal has the potential to match, if not exceed, the scale of recent test-cheating scandals in big cities like Atlanta; Philadelphia; Washington, D.C.; New York; and Los Angeles. And the longer it drags on, the more innocent schools and educators suffer.

Ohio’s “attendancegate” began in June, when the Columbus Dispatch reported that Columbus Public Schools staff had erased more than 2.8 million student-absence days from the district’s attendance system dating back to the 2006-07 school year, marking those students instead as having withdrawn from, then reenrolled in, the district. According to the Dispatch, key central-office administrators were each responsible for tens of thousands of deletions. The changes would not only improve attendance records (one performance indicator on state report cards) but could also improve proficiency-test scores: only the scores of students who are continuously enrolled in a school from October until state tests are administered in the spring are included in the school’s overall test scores and report-card rating. For example, if a child moves among multiple schools during the year, his performance only "counts" at the state level, and does not apply...

Robert Pondiscio, a vice president at the Core Knowledge Foundation and editor of its blog, posed an interesting question on Twitter this week:

I’ve seen bad schools with good test scores before. Any good schools with bad test scores?

It’s a timely and important question that gets to the heart of the emerging debate over whether standardized tests can fairly and accurately measure student learning, and whether accountability systems based on their results are too often mislabeling successful teachers and schools as “failures.”

Obviously, no accountability system is perfect, but we can all agree that one that gets it wrong as often as it gets it right is in need of serious reform. But is there any proof that this is happening?


Enter Kristina Rizga, a Berkeley-educated muckraking journalist who recently took the reins as the education reporter at Mother Jones after stints at Wiretap Magazine and AlterNet. In preparation for her new article, “Everything You’ve Heard About Failing Schools Is Wrong,” Rizga spent a year “embedded” in Mission High...

Eek. Vouchers + creationism = liberal horror, teacher-union field-day, and at least a small risk to the school-choice movement. Politically and strategically, it would be so much simpler if those “voucher schools” would just behave themselves!

If only Michelangelo had taken on voucher accountability too.
Photo by ideacreamanuelaPps

But how upset should one really be about the AP report from Louisiana that some of the private schools participating in the Pelican State’s new voucher program “teach creationism and reject evolution”?

State Superintendent of Education John White offered the correct policy response: All voucher students must participate in the state assessments, which include science. “If students are failing the test, we’re going to intervene, and the test measures [their understanding of] evolution.” In other words, the schools can do what they like, but if their voucher-bearing students don’t learn enough to pass the state tests, the state will do something about it—ultimately (under Louisiana regulations) eliminating those schools from eligibility to participate in the...

Fifty-four: That’s the percentage of students who took the ACT math exam last spring and failed to meet its benchmark score. (That benchmark signifies that a student attaining it has a 50 percent chance of earning a B, or a 75 percent chance of earning a C, in a credit-bearing college course.) The statistics are a bit better in English (33 percent fell below the ACT benchmark) and reading (48 percent). In science, however, they’re worse: Fully 69 percent of 2012 test-takers failed to meet the ACT benchmark score in this subject. A mere quarter of those who sat for the ACT met the organization’s benchmarks in all four subjects (as has been the case in years prior). Keep in mind, too, that for the most part only kids who plan to attend college even bother with the ACT. Unlike NAEP, in other words, it’s a selective sample of the high school population (and even NAEP omits dropouts). These bleak results—and the uphill battle they portend—are important to keep in mind as the U.S. embarks upon implementation of the Common Core academic standards, which are reportedly...

Last week, Ohio’s State Board of Education voted unanimously to delay the release of annual school performance report cards as state officials investigate allegations of data-tampering. It came to light this summer that some Ohio school districts (Auditor of State Dave Yost is working to determine just how many) retroactively un-enrolled and re-enrolled truant or low-performing students in order to break the students’ records of continuous enrollment with the district. Those students’ test scores and attendance records would then not count toward the district’s overall report-card rating, because the students hadn’t been continuously enrolled from October to spring testing. (To be clear, there is no evidence yet that data-tampering was taking place in all, or even most, of the state’s 600+ districts, and there is conflicting opinion about whether the data changes were actually on the up and up.)

The state board’s decision was the right one: it simply cannot make public extensive data about school performance unless it has faith in the accuracy of that information. However, the decision has widespread ramifications for Ohio’s districts, schools, and students. There are a number of policy provisions triggered by the annual report cards and the test data they...

Harder tests are coming to the Buckeye State.

Starting in the 2014-15 school year, Ohio will replace its current K-12 academic standards in math and English language arts, along with the standardized tests aligned to them, with the Common Core academic standards and their aligned assessments. In Ohio, those assessments will be the PARCC exams.

The Common Core standards will differ significantly from Ohio’s current academic standards in content, emphases, and cognitive demand.[1] These standards promise greater rigor in what students are expected to learn and how their learning is applied; therefore, we can also expect that the Common Core’s aligned assessments—again, the PARCC exams—will be more difficult.

How much harder should we expect the PARCC exams to be? Take a look for yourself.

Figure 1 shows two sample questions from Ohio’s current seventh-grade math exam. (The Ohio Department of Education provides practice tests, which are accessible via the source link below the figure.) The questions are relatively simple: the first question tests whether a student understands ratios; the second question tests whether a student understands a basic algebraic equation. Although I wouldn’t suggest that the questions are necessarily “easy” (it took me a few minutes to calculate the answers), they are straightforward—and...

If you have a high schooler at home, are a high school student yourself, or graduated from high school, you know these acronyms: SAT and ACT. These are, of course, the standardized tests juniors and seniors take in order to apply to college. In Ohio, over 92,000 college-seeking students took the ACT during the 2011-12 school year. Recently, ACT, Inc., the Iowa-based organization that administers the exam, reported national and state-by-state results for the ACT.

Ohio’s 2012 results, which can be found here, show that Buckeye State high-school students slightly outperformed their national peers in all tested subjects (English, reading, math, and science). The percentage of Ohio students reaching the ACT benchmarks outpaced the national figure by three (science) to six (reading) percentage points. Ohio’s ACT results therefore correspond well with its results on NAEP, another nationally administered exam, which also indicate that Ohio students do slightly better than the national average.

While Ohio’s above-average performance on the ACT may trigger small celebrations, a closer examination of the data should cause concern. The more rigorous Common Core academic standards in English language arts and math, and their aligned assessment (for Ohio, the PARCC exam), will arrive...


Flying squirrels!

After a week’s hiatus, Mike and Rick catch up on the Romney-Ryan merger, creationism in voucher schools, and the ethics of school discipline. Daniela explains teachers’ views on merit pay.

Amber's Research Minute

Trending Toward Reform: Teachers Speak on Unions and the Future of the Profession by Sarah Rosenberg and Elena Silva with the FDR Group - Download the PDF

A version of this post originally appeared on the Shanker Institute blog.

Until now, the Common Core (CCSS) English language arts (ELA) standards were considered path-breaking mostly because of their reach: this wasn’t the first time a group attempted to write “common” standards, but it was the first time such standards gained real traction. But the Common Core ELA standards are revolutionary for another, less-talked-about reason: they define rigor in reading and literature classrooms more clearly and explicitly than nearly any of the state ELA standards they are replacing. Now, as the full impact of these expectations starts to take hold, the decision to define rigor—and the way it is defined—is fanning the flames of a debate that threatens to open a whole new front in America’s long-running “Reading Wars.”

Game of Risk
A new front opens in a war worth waging.
Photo by Ben Stephenson.

The first and most divisive front in that conflict was...
