Standards, Testing, & Accountability

Standards-based reform in education is imperfect. The ways that states and districts assess kids, design tests, and attempt to hold teachers and schools accountable are bound to be flawed, lead to unintended consequences, and create many enemies along the way. But I wish the opponents of standards-based reform in Ohio would at least get a little more creative.

You may recall from a few months ago that Karl Wheatley, a Cleveland State University ed professor, said the best way to improve education would be to "stop focusing on student achievement." I outlined why I thought that was a bad idea here. The gist of his argument, believe it or not, was that because standardized testing creates "collateral damage," perverse incentives, etc., the best thing to do is to stop trying to raise student achievement.

Yesterday's op-ed in the Columbus Dispatch from another education professor, Thomas Stephens of Ohio State, comes from the same predictable script (aka "we don't like the focus on standards/testing/accountability, so let's call for its demise, or at least replace it with a nebulous emphasis on problem solving and innovative thinking"). In "Standards obstruct education," Stephens argues that Ohio's decision to revise academic standards is a waste of time and money because, among other things, it "doesn't consider the needs of... children." The commentary uses the same creepy factory language intended to pit "standards-teach-and-test fanatics" against reasonable, warm-hearted education professors, e.g., "assembly-line-atmospheres" and the metaphor of children as widgets....


A week ago, I posted this in response to Secretary Duncan's speech about education schools at Teachers College. Over the course of several days, there were 11 comments posted that, when printed out, clocked in at 20 pages (single spaced, mind you). What was all the ruckus, you ask?

It was a vigorous give-and-take between two loyal Flypaper readers, Ze'ev Wurman and Karl Wheatley. Ze'ev once served as Senior Adviser in the U.S. Department of Education and helped shape California's math standards; Karl is Associate Professor of Early Childhood Education at Cleveland State University. Their long-winded debate started when Karl took umbrage at my accusation that education schools often don't deliver what all teaching candidates need, namely a thorough understanding of the content they'll be teaching. By mentioning E.D. Hirsch's work, I thought Duncan highlighted the need for content-prepared teachers and content-rich curriculum.

Karl insisted that education professors (after all, he is one) ARE listening on this front, but that Duncan's proposals have "shown a weak grasp of the issues and what works in education." Eschewing "teacher-dominated" instruction, Karl goes on to say that "educational approaches with integrated, interest-based, real-life curriculum, substantial student choice, local control, and authentic assessment simply work better in the long run." Further, he insists that "pretending a teacher who has content knowledge is 'highly qualified' is like pretending a plumber who owns a wrench is a good plumber."

Then Ze'ev picks up the gauntlet and reminds Karl of the...


As Amy indicates, the latest findings from the just-released National Center for Education Statistics (NCES) report contain few surprises, especially since we're well-versed in the differences between states' definitions of proficiency and proficiency as mapped onto a common scale (see here and here).

This NCES report maps state proficiency standards onto NAEP scales and concludes that:

All NAEP scale equivalents of states' reading standards were below NAEP's Proficient range; and in mathematics, only two states' NAEP scale equivalents were in the NAEP Proficient range (Massachusetts in grades 4 and 8, and South Carolina in grade 8). In many cases, the NAEP scale equivalent for a state's standards, especially in grade 4 reading, mapped below the NAEP achievement level for Basic performance.

Yikes. Dig into the report and you'll find several tables that show which states have lowered and raised their proficiency standards between 2005 and 2007. The data are rightly separated into those states that have results that can be compared (e.g., because they have the same tests in place) and those that cannot (e.g., they changed their standards/tests/testing policies). Of those states with comparable data, we see that New Jersey's NAEP scale equivalent in grade 4 reading has increased roughly 11 points over the last two years, while South Carolina's has dipped roughly 6 points. Interestingly, South Carolina has some of the highest proficiency standards in the nation in both reading and math--why hit the brakes now? Did...

Amy Fagan

The National Center for Education Statistics (NCES) is out with a new report today that looks at state achievement levels using the common yardstick of the National Assessment of Educational Progress (NAEP). Not great news. According to the AP story:

It found that many states deemed children to be proficient or on grade level when they would rate below basic or lacking even partial mastery of reading and math under the NAEP standards.

From the Ed Week story:

Their results suggest that 26 states, between 2005 and 2007, made their standards less rigorous in one or more grade levels or subjects.

Our Amber Winkler shared her thoughts on the matter with both Education Week and the Christian Science Monitor.

Might I just point out that the Fordham Institute actually did a very similar report back in 2007 - the Proficiency Illusion. That report used a Northwest Evaluation Association test as a common yardstick. It too found that "proficiency" varied wildly from state to state, with "passing scores" ranging from the 6th percentile to the 77th.

Fordham went even further earlier this year, in the Accountability Illusion. That report examined how state accountability (AYP) rules under the No Child Left Behind Act varied from state to state as well.

Check them both out!...


Not long ago we presented a graphic illustrating the gross discrepancy between Ohio's achievement test scores and those from NAEP. Such "grade inflation" is common in the post-NCLB era, in which many states appear to select standards and assessments based not solely on academic rigor, but in order to ensure that more students are deemed "proficient" and to bump up their state scores.

Unsurprisingly, the recent release of math results from the National Assessment of Educational Progress confirms this trend, with 78 percent of Ohio fourth graders passing the state's math test, compared to only 45 percent who passed NAEP. In eighth grade, the gap is even wider, with 71 percent of students passing the Ohio math exam but just 36 percent passing NAEP.

ODE vs. NAEP Math Proficiency Averages, 2008-2009

Sources: Ohio Department of Education; National Assessment of Educational Progress in Math, 2009

A spokesperson at the Ohio Department of Education said that "they are different tests with different functions" and that it is understandable for Ohioans to be confused by the dramatic difference in scores. Today's Columbus Dispatch article makes a more compelling case when it says that "it could be that state standards aren't as stringent as those measured by the national exam," and "when a state has lower passage rates on the NAEP, you 'can't really escape a conclusion that low performance on NAEP is a signal that there...
Eric Ulas

Ohio's school district rating system has been getting criticism lately, and for good reason: the category of Continuous Improvement (a "C" rating) is so broad that it is nearly meaningless.

The graph below illustrates how many indicators were met by 79 school districts receiving a "C" on the 2008-09 state report card. Districts can meet up to 30 indicators, which are based on achievement test scores, graduation rates, and attendance rates.

Number of performance indicators met by Ohio districts with a "C" rating, 2008-2009

Source: Ohio Department of Education

While nearly a fifth of "C" districts met barely any indicators (0-8), another four percent met nearly all of their performance indicators yet still received the same letter grade.

Confused? So are a lot of people.

Kettering City Schools met 29 out of 30 performance indicators while Marion City Schools met zero indicators, yet both received a "C" from the state. The reason for Kettering's low grade, as Emmy describes, is that it failed to make AYP with English language learners and special education students. Without this AYP provision in Ohio's rating system, Kettering would have ranked four categories higher, or Excellent with Distinction ("A+").

Conversely, districts such as Columbus, Akron, and Cincinnati get a bump up in their ratings because they did make AYP, despite meeting only six indicators.

In a recent letter to the Dayton Daily News, Terry cautions...

Eric Ulas

Video is now available from our recent event, World-Class Academic Standards for Ohio, which was held October 5 in Columbus, Ohio.

What do state and national experts make of the "Common Core" standards effort? How can states go about crafting top-flight standards? How will the Buckeye State respond to the Common Core effort and a recent legislative mandate to upgrade its standards? Click on the links below to find out.

Opening Speaker:

Why World-Class Standards?

David Driscoll, former Massachusetts Commissioner of Education

Panel Sessions:

Current Efforts to Create National ("Common") Standards

Michael Cohen, Achieve Inc.

Gene Wilhoit, Council of Chief State School Officers

Chester E. Finn, Jr., Thomas B. Fordham Institute, moderator

Highlighting the Efforts of Top-Performing States

David Driscoll, former Massachusetts Commissioner of Education

Stan Jones, former Indiana Commissioner for Higher Education

Sue Pimentel, StandardsWork

Bruno Manno, Annie E. Casey Foundation, moderator

Moving Forward in Ohio

Deborah Delisle, Ohio Department of Education



In the Fordham Institute's latest report--Stars By Which to Navigate? Scanning National and International Education Standards in 2009--expert reviewers appraised the Common Core drafts, which outline college and career readiness standards in reading, writing, speaking and listening, and in math. These draft standards were made public on September 21 by the National Governors Association and the Council of Chief State School Officers. This report goes further, however: Fordham's reviewers also evaluate the reading/writing and math frameworks that undergird the National Assessment of Educational Progress (NAEP), the Trends in International Mathematics and Science Study (TIMSS), and the Programme for International Student Assessment (PISA). How strong are these well-known models? This report presents their findings.

Eric Ulas

The Thomas B. Fordham Foundation is a charter-school authorizer in our home state of Ohio, and we currently oversee six schools in Cincinnati, Columbus, Dayton, and Springfield. In the Buckeye State, academic performance of schools is gauged by both student proficiency rates and progress (using a "value-added" measure). Schools are expected to help students make one year or more of academic progress annually and are given a value-added rating of "below," "met," or "above" corresponding with how much growth their students made. We're proud of the academic progress our schools made last year compared to their district and charter peers. The following chart shows the percent of students in schools by "value-added" rating for Fordham-authorized schools, the home districts in which they are located, and charter schools in the state's eight major urban areas.

Percent of Students in Fordham-authorized Schools, Home Districts, and "Big 8" Charter Schools by Value-Added Rating, 2008-09

Source: Ohio Department of Education interactive Local Report Card

The Education Gadfly

The Fordham Institute's newest report--Stars By Which to Navigate? Scanning National and International Education Standards in 2009--reviews the "Common Core" draft standards in math and reading/writing/communications (these drafts were made public on September 21). Our subject-content experts confer "B" grades on these drafts; the effort is off to a good start! Are there things to improve? You betcha. As for other influential barometers and benchmarks of educational performance, our reviews also examine the reading/writing and math frameworks behind NAEP, TIMSS, and PISA. Check out the report to find out which ones shine brightly and which ones are dull.