Standards, Testing & Accountability

Last week, Ohio’s State Board of Education voted unanimously to delay the release of annual school performance report cards as state officials investigate allegations of data-tampering. It came to light this summer that some Ohio school districts (Auditor of State Dave Yost is working to determine just how many) retroactively un-enrolled and re-enrolled truant or low-performing students in order to break the students’ records of continuous enrollment with the district. Those students’ test scores and attendance records would then not count toward the district’s overall report card rating because the students hadn’t been continuously enrolled from October to spring testing. (To be clear, there is no evidence yet that data-tampering was taking place in all, or even most, of the state’s 600+ districts, and there are conflicting opinions about whether the data changes were actually on the up and up.)

The state board’s decision was the right one: it simply cannot make extensive data about school performance public unless it has faith in the accuracy of that information. However, the decision has widespread ramifications for Ohio’s districts, schools, and students. There are a number of policy provisions triggered by the annual report cards and the test data they...

Harder tests are coming to the Buckeye State.

Starting in the 2014-15 school year, Ohio will replace its current K-12 academic standards in math and English language arts, along with the aligned standardized tests, with the Common Core academic standards and their aligned tests. In Ohio, these exams will be the PARCC exams.

The Common Core standards will differ significantly from Ohio’s current academic standards in content, emphases, and cognitive demand.[1] These standards promise greater rigor in what students are expected to learn and how their learning is applied; therefore, we can also expect that the Common Core’s aligned assessments—again, the PARCC exams—will be more difficult.

How much harder should we expect the PARCC exams to be? Take a look for yourself.

Figure 1 shows two sample questions from Ohio’s current seventh-grade math exam. (The Ohio Department of Education provides practice tests, which are accessible via the source link below the figure.) The questions are relatively simple: the first question tests whether a student understands ratios; the second question tests whether a student understands a basic algebraic equation. Although I wouldn’t suggest that the questions are necessarily “easy” (it took me a few minutes to calculate the answers), they are straightforward—and...

If you have a high schooler at home, are a high school student yourself, or graduated from high school, you know these acronyms: SAT and ACT. These are, of course, the standardized tests juniors and seniors take in order to apply to college. In Ohio, over 92,000 college-bound students took the ACT during the 2011-12 school year. Recently, ACT, Inc., the Iowa-based company that administers the exam, reported national and state-by-state results.

Ohio’s 2012 results, which can be found here, show that Buckeye State high-school students slightly outperformed their national peers in all tested subjects (English, reading, math, and science). The percentage of Ohio students reaching the ACT benchmarks outpaced the national percentage by three (science) to six (reading) percentage points. Ohio’s ACT results, therefore, seem to correspond well with its results on the NAEP, another nationally administered exam, which also indicate that Ohio students do slightly better than the national average.

While Ohio’s above-average performance on the ACT may trigger small celebrations, a closer examination of the data should cause concern. The more rigorous Common Core academic standards in English language arts and math, and their aligned assessment (in Ohio, the PARCC exam), will arrive...

Flying squirrels!

After a week’s hiatus, Mike and Rick catch up on the Romney-Ryan merger, creationism in voucher schools, and the ethics of school discipline. Daniela explains teachers’ views on merit pay.

Amber's Research Minute

Trending Toward Reform: Teachers Speak on Unions and the Future of the Profession by Sarah Rosenberg and Elena Silva with the FDR Group - Download the PDF

A version of this post originally appeared on the Shanker Institute blog.

Up until now, the Common Core (CCSS) English language arts (ELA) standards were considered path-breaking mostly because of their reach: This isn’t the first time a group has attempted to write “common” standards, but it is the first time such an effort has gained real traction. But the Common Core ELA standards are revolutionary for another, less talked about, reason: They define rigor in reading and literature classrooms more clearly and explicitly than nearly any of the state ELA standards that they are replacing. Now, as the full impact of these expectations starts to take hold, the decision to define rigor—and the way it is defined—is fanning the flames of a debate that threatens to open up a whole new front in America’s long-running “Reading Wars.”

Game of Risk: a new front opens in a war worth waging. (Photo by Ben Stephenson.)

The first and most divisive front in that conflict was...

“Children Lose Out” was the title of an editorial penned by The Salt Lake Tribune in response to last week’s State Board of Education decision to withdraw from the Smarter Balanced Assessment Consortium (SBAC). Nationally, Common Core (CCSS) advocates worry that this move will not only hurt Utah’s kids but also signal a weakening of support for the new expectations, and that it could fuel even more anti-CCSS fire across the country.

Perhaps.

On the other hand, if Utah’s education leaders seize this moment as an opportunity to prove that the CCSS is truly a state-led initiative, to show how a state can take the reins to ensure that the aligned assessments are clear and rigorous, and to give teachers the implementation tools they need, this move could do more to garner support for CCSS implementation than either consortium has done to date.

The reality is that, more than two years after the release of the final version of the CCSS, SBAC and the other assessment consortium, PARCC, have released scant information about what their assessments will look like—and how (if at all) they’ll differ from the mediocre tests we have now. Nor...

Rigorous academic standards and high-stakes accountability for schools and educators alike are important for school improvement efforts. The states where students have made the most significant academic gains over the last decade (for example, Massachusetts and Florida) have had high academic standards, assessments aligned to those standards (complete with high cut scores), and transparent systems for sharing school and student results through district and school “report cards.” The fact is, standardized testing has proven to be the best, most objective tool for measuring both student and teacher success.

This is important to remember as Ohio deals with a widening scandal around allegations of “data fudging” and “manipulation of attendance records” to improve test scores and school report cards. Some Buckeye State educators and lawmakers have suggested that the underlying problem here is accountability, or that the state’s report card has taken on “way too much importance.” Accountability, however, is not the problem. The Columbus Dispatch editorial board got it exactly right when writing:

It’s true that the report card is short of perfect; it is an attempt to tell an extremely complex story – how effective a school district is, allowing for all of its advantages...

Teacher talent is squarely at the frontier of education reform. Last week, The New Teacher Project issued a report that scrutinized teacher-retention practices, finding that many top-shelf teachers—especially those in poorer schools, where the need for effective teachers is greatest—leave to teach in better schools or leave the profession altogether.

In 2010, McKinsey & Company, a global consulting firm, published a blistering report on America’s teaching profession. McKinsey found that, in comparison to countries with high-flying education systems, America has a woeful teacher workforce: too many American teachers come from the bottom of their college graduating classes, while too few top-performing college students consider teaching—much less enter the profession.

With these teacher-quality issues in mind, I wanted to see how prospective education graduate students fared on the GRE, the graduate-school admissions exam. Educational Testing Service (ETS) administers the GRE, and in its summary statistics report, ETS breaks down test results by the test-taker’s intended major—with education as a possible selection.

How did America’s future educators fare? Consider Figure 1, which compares average GRE scores by intended graduate-school major across two exam sections: quantitative (math) and verbal. On the left, education majors rank dead...

A proposal recently submitted to Louisiana’s Board of Elementary and Secondary Education calls for both school choice and quality control in that state’s voucher program, a pairing we have not seen enough of here in Ohio. Specifically, the plan calls for a practical accountability system for the state’s voucher program. Louisiana’s K-12 scholarship program gives students who meet residency and income requirements and who attend a low-performing school a scholarship to attend a private school of their choice. Currently, approximately 5,000 Louisiana students are using a public voucher.

The plan, which would be the first of its kind in the nation, would introduce an accountability system based on a “sliding scale” (i.e., schools enrolling more voucher students would be held to a higher level of accountability, an idea Fordham proposed three years ago). Under the new system, schools enrolling an average of more than ten students per grade, or forty or more students in tested grades, would have their test scores reported. Schools would then be given points based on their performance, similar to those given to public schools. Schools that receive low scores in the second year...
