Ohio Policy

We know that teacher quality is the most important in-school factor impacting student performance—and that the variation in teacher quality can be enormous, even within the same school. We also know that most teachers are paid according to step-and-lane salary schedules that exclusively reward years on the job and degrees earned. These systems pay no attention to instructional effectiveness, attendance, leadership and collaboration within one’s school, or any other attributes relevant to being a good worker.

When I entered the classroom at age twenty-two, I looked at my contract and realized I wouldn’t reach my desired salary until I was in my mid-to-late forties. I would reach that level regardless of whether I took one or fifteen sick days every year; whether I put in the bare minimum or a herculean effort (as many educators do in fact do); or whether I clocked out at 3:01 or stayed with my students to offer extra help. No matter the outcomes my kids achieved, my salary would steadily tick upward based only on time accrued. Predictable, yes. But given the urgent task at hand—to keep excellent educators at the instructional helm, address the challenges of burnout and attrition, and professionalize teaching—it’s woefully insufficient. 

That’s why the breakdown of the Cleveland Metropolitan School District’s (CMSD) innovative teacher pay system is so disappointing. Developed in partnership with the Cleveland Teachers Union as part of a comprehensive package of reforms to improve the city’s schools, CMSD’s new teacher pay system was codified in 2012 by HB 525. The law earned rare bipartisan support and teacher buy-in and was the first of its kind in Ohio to base annual salary increases on factors beyond years of experience and degrees—which should matter to some extent, just not singularly.

The multifaceted system went beyond the typical forms of “merit pay” largely disliked by teachers. The law required that all of the following be considered: the level of license a teacher holds; whether the teacher is highly qualified; ratings received on performance evaluations; and any “specialized training and experience in the assigned position.” Further, it allowed (but did not require) the district to compensate teachers for additional factors: working in a high-needs school (those receiving Title I funds); working in a challenged school (those in school improvement status); or teaching in a grade level or subject with a staff shortage, a hard-to-staff school, or a school with an extended day or year—all of which are worthy of reward.

The system informed by the law and agreed upon in the 2013 teachers contract retained a fifteen-step pay system, but allowed for teachers’ placements within that system to be determined by how many “achievement credits” they earned, rather than by years of service and degrees. (Teachers were to earn credits through strong evaluation ratings as well as the ways described above.) Depending on their credit total, newer teachers could be placed further along in the new step system than in the previous model, while more experienced teachers wouldn’t automatically go to a higher rung (though no existing teacher would see her pay cut as a result of the new plan and all teachers received a one-time $1,500 bonus during the transition to the new pay scale).

But Cleveland’s promising compensation strategy has fallen apart just three years in, illustrating how even a well-crafted plan can die in the hands of bureaucrats and interest groups. There’s been mounting frustration that teacher raises were tied too heavily to annual evaluations rather than a combination of factors as allowed (but not required) by the original law. Despite “hours of meetings over the last four years,” the district and teachers union couldn’t come to basic agreements about how to define or reward performance. And even though 65 percent of teachers earned significant salary increases during the plan’s first two years, the union complained that some stipends were one-time allowances rather than permanent salary bumps. Meanwhile, the district never granted extra compensation for hard-to-fill jobs (saying there were none), nor would it pay extra for teachers working in corrective action schools. Last month, Cleveland teachers moved to strike, forcing all parties back to the negotiating table to reach a deal before the start of the school year.

That deal erases nearly all of the reforms enacted three years ago. It still grants teacher raises according to annual ratings but flattens those raises out and ensures that nearly everyone except the gravely incompetent earns them. Any teacher receiving the top three ratings—accomplished, skilled, or developing—will get the same raise. Much like the traditional step-and-lane structure, it treats nearly all teachers in equal fashion. The key difference is that ineffective teachers (just one percent of CMSD’s teaching force in 2014–15) will have their pay frozen, and teachers earning the top ratings will get a one-time $4,000 bonus. This will benefit CMSD’s best teachers and is perhaps the only detail deserving of praise.

Cleveland’s capitulation is discouraging, especially given the plan’s potential and the manner in which it fell apart. As the fact-finding report depressingly noted, “Neither time nor resources have been expended to build out the system. As a consequence, the District lost the opportunity to lead the country with respect to innovation where compensation systems are concerned.”

Cleveland’s is a cautionary tale about the importance of what happens in the weeds after a law passes. One conclusion to draw is that the details of CMSD’s teacher pay plan should have been better prescribed in law, leaving no room for gridlock or shirking of responsibility by either party. It might also point to the obvious fact that it’s very difficult to achieve change with so many parties at the table—and that’s why policymakers feel they have to resort to “top-down” policy changes like the Youngstown Plan. I’d venture to guess that most policymakers and leaders would like to achieve local buy-in and cooperation—at the very least, few would eschew it on principle. Ohio lawmakers left some details open for CMSD and the union to sort out for themselves—respecting local autonomy and not wanting to over-prescribe policy details. Yet this is where the plan dissipated.

Perhaps the most daunting takeaway is that sustainable change is often resisted, stalled, or derailed due to cognitive inertia—a psychology term that describes what happens when long-held beliefs endure even in the face of counterevidence. When it comes to teacher pay, there seem to be deeply ingrained beliefs that the best way to pay teachers is the old, industrial-era manner in which we’ve always done it (despite evidence to the contrary). Excellent teachers stand to benefit the most from differentiated pay systems. Developing and effective teachers do, too—by receiving meaningful professional development and seeing improvements over time. Only the least effective educators stand to lose anything. Yet teachers aren’t coming out in droves to demand better pay systems that develop and reward them—not in Cleveland and not in most of Ohio.

Bipartisan backing and early teacher buy-in in Cleveland clearly were not enough to prevent the model’s breakdown and the district’s default to a system that largely treats all teachers equally. It may be tempting to conclude from the collapse of CMSD’s promising teacher-pay plan that the law should have been more specific, or that top-down reforms may work better to overcome local gridlock. An equally plausible observation—and one that education reformers may do well to consider—is that until prevailing opinion changes within the profession itself, improvements to teacher pay models will be difficult to sustain. Even heavily prescribed plans can be reversed later. Meanwhile, teachers themselves should consider how moving away from a factory model of compensation to one differentiated for performance and skills is one key step toward reaching a long-held goal: professionalizing teaching. 

Politicians are wise to pay attention to public opinion data, but they are also responsible for crafting sound policies based on research and evidence. So what are they supposed to do when these two goods conflict?  

Anya Kamenetz at NPR was the first to highlight the contradiction between newly released poll results from PDK International and a variety of research related to school closures (“Americans Oppose School Closures, But Research Suggests They're Not A Bad Idea”). The PDK survey revealed that 84 percent of Americans believe that failing schools should be kept open and improved rather than closed. Sixty-two percent said that if a failing public school is kept open, the best approach to improvement is to replace its faculty and administration instead of increasing spending on the same team. In other words, the majority of Americans are firmly committed to their community schools—just not the people working in them.

These findings shouldn’t come as a huge surprise (as my colleague Robert Pondiscio pointed out here). No one wants to see a school closed, no matter how persistently underperforming. For many communities, schools offer not just an education, but a place to gather that’s akin to communal houses from the past. Enrichment and after-school programs—which are profoundly important for low-income youth—often benefit from the use of school buildings. Buildings can also house wrap-around services like health centers, adult education centers, or day care centers.

In addition to their community-wide implications, school closures have also been called “psychologically damaging” for students. A 2009 University of Chicago report examining closure effects on displaced students in Chicago Public Schools (CPS) found that “the largest negative impact of school closings on students’ reading and math achievement occurred in the year before the schools were closed,” leading researchers to believe that closure announcements caused “significant angst” for students, parents, and teachers that may have affected student learning.

The report also indicates that one year after students left their closed schools, their reading and math achievement was not significantly different on average from what researchers “would have expected had their schools not been closed.” This is possibly due to the fact that most displaced students re-enrolled in academically weak schools—which, though disappointing, isn’t a huge surprise either. Even those who see value in school closures will point out that if there aren’t enough high-quality seats elsewhere for displaced students, the action simply reshuffles students from one bad school to another.

So if closing schools is bad for communities and students and the American public hates it, then why is it happening? As Kamenetz points out in her NPR piece, research shows that closing schools isn’t always bad—and therein lies the contradiction. The same University of Chicago study that points to “significant angst” and flat achievement also acknowledges that “displaced students who enrolled in new schools with higher average achievement had larger gains in both reading and math than students who enrolled in receiving schools with lower average achievement.” Translation? Displaced kids who end up in better schools do better. Fordham’s 2015 study on school closures and student achievement found similar results: Three years after closure, students who attended a higher-quality school made greater progress than those who didn’t. In a recent study of closures in New York City, researchers found that “post-closure students generally enrolled in higher-performing high schools than they would have otherwise attended” and that “closures produced positive and statistically significant impacts on several key outcomes for displaced students.”      

School turnarounds, on the other hand, have almost always been found to disappoint.

In The Turnaround Fallacy, Andy Smarick offers three compelling arguments for giving up on “fixing” failing schools. First, data shows that very few past turnaround efforts have succeeded. (California is a prime example: After three years of interventions in the lowest-performing 20 percent of schools, only one of 394 middle and high schools managed to reach the mark of “exemplary progress.” Elementary schools fared better—11 percent met the goal—but the results were still disheartening.) Second, there isn’t any clear evidence for how to make turnaround efforts more successful in the future. Even the Institute of Education Sciences seems unable to find turnaround strategies that are backed by strong evidence. And finally, although the long list of turnaround failures in education makes it reasonable for advocates to look outside the sector for successful models to import, there aren’t many. A review of the “two most common approaches to organizational reform in the private sector” found that both approaches “failed to generate the desired results two-thirds of the time or more.”

Let’s review. Many American schools are consistently failing to properly educate students. The public doesn’t like the idea of closing these schools, but many research studies indicate that students who re-enroll in higher-performing schools perform better than they would have if they’d stayed in their previous schools. What the American public wants is to improve failing schools instead of closing them. Unfortunately, research shows that school turnarounds haven’t worked in the past—and no one has any idea how to make them work in the future. Overall, the phrase “damned if you do, damned if you don’t” seems particularly apropos.

So what’s a policy maker to do when schools are failing to properly educate students? Expanding the number of high-quality seats is a good place to start. We might not have a clue about how to turn around bad schools, but we do know of school models that work, especially in the charter sector. Policy makers who want to give kids immediate access to a great education should invest in expanding and replicating schools and networks that are already doing an excellent job.

Boosting the supply of excellent schools will lead to two important changes. First, many families and children will get immediate relief—and life-changing opportunities in new, better schools. And second, over time, failing schools will see their enrollment plummet, creating a fiscally unsustainable situation. At that point, officials can shut them down—not because they are failing academically, but because they are failing financially. And to my knowledge, no public opinion poll has shown Americans averse to closing half-empty, exorbitantly expensive schools. At least not yet.

College may not be for all, but it is the chosen path of nearly fifty thousand Ohio high school grads. Unfortunately, almost one-third of Ohio’s college goers are unprepared for the academic rigor of post-secondary coursework. To better ensure that all incoming students are equipped with the knowledge and skills needed to succeed in university courses, all Ohio public colleges and universities require their least prepared students to enroll in remedial, non-credit-bearing classes (primarily in math and English).

Remediation is a burden on college students and taxpayers who pay twice. First they shell out to the K–12 system. Then they pay additional taxes toward the state’s higher education system, this time for the cost of coursework that should have been completed prior to entering college (and for which students earn no college credit). The remediation costs further emphasize the importance of every student arriving on campus prepared.

Perhaps the bigger problem with remedial education is that it doesn’t work very well. In Ohio, just 51 percent of freshmen requiring remediation at a flagship university—and 38 percent of those in remedial classes at a non-flagship school—go on to complete entry-level college courses within two academic years. It’s even worse at community colleges: Just 22 percent of students go on to take a college course that is not remedial.  

While far too many college-bound students in Ohio aren’t ready for college upon matriculating, the Buckeye State has made some progress in recent years. Back in 2012, 40 percent of entering college students required remedial coursework, raising concerns about a remediation crisis in Ohio’s colleges. But the three most recent years of data show Ohio’s remediation rate decreasing: to 37 percent in 2013, and now to 32 percent for the high school graduating class of 2014. According to the Ohio Department of Higher Education’s most recent report, more students required math remediation (28 percent) than English (13 percent), and 10 percent of first-time students enrolled in both remedial math and English courses.

Table 1. Remediation by subject area

Source: Ohio Department of Higher Education, “2015 Ohio Remediation Report”

In the absence of rigorous research, we can only speculate about what’s behind this drop in remediation rates. One possible explanation is that fewer students who need remedial education are going straight to college. If this were true, we might expect to see college-going rates declining commensurately with the decrease in remediation rates. But college-going rates, while falling between 2009 and 2013, jumped by 5.6 percent from 2013 to 2014. Though we can’t rule it out entirely, this suggests that college-going trends are probably not a leading explanation for the recent fall in remediation.

Another possibility is that the population of students going to college in 2014 was actually better prepared than in previous years. Thirty-two percent of first-time college students in 2014 required remediation upon entry, compared to 41 percent of first-time students in 2009. Between 2009 and 2014, Ohio implemented higher K–12 educational standards; it is possible that we’re starting to see the fruit of those efforts. (In 2012, Ohio began implementing the Common Core academic standards in math and English language arts, along with new learning standards in science and social studies.) At the very least, it doesn’t appear that rising academic standards are having an adverse impact on college readiness. Despite all the travails, the new learning standards might be giving Ohio’s young people a modest boost when it comes to readiness. Not bad!

Or maybe the credit goes to the implementation of Ohio’s “remediation-free” standards in 2013. Ohio’s standards (for public colleges and universities) detail the competencies and ACT/SAT scores each student must achieve in order to enroll in credit-bearing courses. Now students can predict from their ACT subject scores whether they’ll be able to directly enroll in credit-bearing courses. Many states and colleges opt to enroll all students in credit-bearing coursework with increased support instead of offering remedial courses. But Ohio’s standards fail to address how remedial students must be served and whether their remedial status bars them from acquiring credit even with increased support. However, these statewide standards are also being used to hold high schools accountable for college preparedness; remediation-free status is now incorporated in the Prepared for Success measure on the state’s school report cards. Maybe this policy is working as intended—encouraging students to improve their reading and math skills before they reach campus.

Further, it is worth considering whether Ohio’s remediation rate decline is being driven by the incentives its colleges and universities face. Public funding for higher education in Ohio is not linked to the remediation rate, but 50 percent of funding for two-year and four-year institutions is determined by the percentage of degree completions (the graduation rate), which also heavily impacts college rankings. To increase graduation rates and rankings, many universities may seek to decrease the number of students they accept who fall below the remediation-free threshold. Still, this preference does not change the number of students in need of remediation as determined by their ACT score.

Ohio’s declining need for remedial education is good news, though there’s still a ways to go before all students matriculating to college are truly ready for it. It’s not entirely clear what’s driving this trend—whether it’s enrollment patterns, policy implementation, a bit of both, or other explanations that we didn’t consider. Certainly more research and analysis on this topic is needed to determine causation. In the meantime, we’ll need to monitor how the remediation trend unfolds in the years to come. The falling remediation rates at least indicate that the state is moving in the right direction. If Ohio can stay the course and maintain high academic standards and a focus on college preparedness, the gap between college aspirations and college readiness will hopefully close even further.

Ohio’s report card release showed a slight narrowing of the “honesty gap”—the difference between the state’s own proficiency rate and proficiency rates as defined by the National Assessment of Educational Progress (NAEP). The NAEP proficiency standard has long been considered stringent—and one that can be tied to college and career readiness. When states report inflated proficiency rates relative to NAEP, they may label their students “proficient,” but they overstate to the public the number of students who are meeting high academic standards.

The chart below displays Ohio’s three-year trend in proficiency on fourth and eighth grade math and reading exams, compared to the fraction of Buckeye students who met proficiency on the latest round of NAEP. The red arrows show the disparity between NAEP proficiency and the 2015-16 state proficiency rates.

Chart 1: Ohio’s proficiency rates 2013-14 to 2015-16 versus Ohio’s 2015 NAEP proficiency

As you can see, Ohio narrowed its honesty gap by lifting its proficiency standard significantly in 2014-15, when it replaced the Ohio Achievement Assessments with PARCC. (The higher PARCC standards meant lower proficiency rates.) Although Ohio did not continue with the PARCC assessments, the chart above indicates that Ohio continued to raise its proficiency benchmarks on its new reading exams (developed by AIR and ODE). Math proficiency, however, remained virtually unchanged in these grades from 2014-15 to 2015-16.

Despite the frustration that some schools are expressing, Ohio policy makers should be commended for continuing to raise standards in 2015-16. Parents and citizens are now getting a much clearer picture of where students stand relative to rigorous academic goals. 

Twenty-five years into the American charter school movement, there remains little research on the impact of charter authorizers, yet these entities are responsible for key decisions in the lives of charter schools.

A new policy brief from Tulane University’s Education Research Alliance seeks to shed some light on authorizer impact in post-Katrina New Orleans. Specifically, does the process by which applications are reviewed help to produce effective charter schools? And after those schools have been initially authorized, does that process also shed light on which types of charter schools get renewed?

It merits repeating that the authorizing environment in New Orleans was unlike anywhere else in the country: Louisiana had given control of almost all New Orleans public schools to the Board of Elementary and Secondary Education (BESE) and the Recovery School District (RSD). Independent review of charter applications was mandated in state law, and tons of organizations applied to open new charters.

To facilitate the application process, BESE hired the National Association of Charter School Authorizers (NACSA). NACSA reviewed and rated applications, and in most cases BESE followed those recommendations. As the authors point out, NACSA is the largest evaluator of charter applications in the country and the extent of its work in New Orleans provides some insights regarding the potential impact of authorizer decisions.

First, NACSA examined much more than the charter application alone, including information gleaned via interviews and site visits. The authors found that the only factor that predicted both charter approval and renewal was a school’s rating from NACSA. Interestingly, the authors also found a number of application factors that had no effect on application approval or renewal, including: number of board members with backgrounds in education; whether partners (vendors providing services such as curricular materials, tutoring, college advising, social services, etc.) were for-profit or non-profit; whether a national charter management organization (CMO) was involved; whether a principal had been identified at the time of the application; and the amount of instructional time and professional development proposed.

Second, there does not appear to be a link between these application factors and future school performance. However, it did appear that applicants with non-profit partners showed lower state performance scores, lower overall enrollment, and lower enrollment growth than those without such partners.

In terms of charter renewal, it appears that School Performance Scores (the SPS includes indicators of assessment, readiness, graduation, diploma strength and progress) and value added (growth) are strong predictors of charter renewal (in addition to the initial NACSA rating). And while charter schools with higher enrollment growth were more likely to be renewed, enrollment levels themselves were not a factor in renewal decisions.

The takeaway for authorizers is that past performance is the best predictor of future success, and that some of the criteria we typically include in applications (e.g., board members, partners, school leader) really aren’t predictive of anything. Looking at a paper application simply isn’t enough. Authorizers must also examine qualitative data (such as interviewing school leaders and references, and making detailed site visits).

The study acknowledges a few of its limitations: lack of a clear scientific basis for determining which application factors to measure; the ability to only observe the future performance of schools with the strongest applications (the worst applications didn’t make the cut); and, importantly, the fact that many authorizers (nationally, not just in Louisiana) simply have not had enough applications and renewals to make a comprehensive study possible.

But fear not: Those of us at Fordham have our own study in the works to look at this question in four states. Stay tuned!

SOURCE: Whitney Bross and Douglas N. Harris, "The Ultimate Choice: How Charter Authorizers Approve and Renew Schools in Post-Katrina New Orleans," Education Research Alliance (September 2016).

The annual release of state report card data in Ohio evokes a flurry of reactions, and this year is no different. The third set of tests in three years, new components added to the report cards, and a precipitous decline in proficiency rates are just some of the topics making headlines. News, analysis, and opinion on the health of our schools and districts – along with criticism of the measurement tools – come from all corners of the state.

Fordham Ohio is your one-stop shop to stay on top of the coverage:

  • Our Ohio Gadfly Daily blog has already featured our own quick look at the proficiency rates reported in Ohio’s schools as compared to the National Assessment of Educational Progress (NAEP). More targeted analysis will come in the days ahead. You can check out the Ohio Gadfly Daily here.
  • Our official Twitter feed (@OhioGadfly) and the Twitter feed of our Ohio Research Director Aaron Churchill (@a_churchill22) have featured graphs and interesting snapshots of the statewide data with more to come.
  • Gadfly Bites, our thrice-weekly compilation of statewide education news clips and editorials, has already featured coverage of state report cards from the Columbus Dispatch, the Dayton Daily News, and the Cleveland Plain Dealer. You can have Ohio education news sent directly to your inbox by subscribing to Gadfly Bites.

And most importantly, look for Fordham’s own annual analysis of Ohio report card data in the coming days. We look in-depth at schools, districts, and charter schools in the state’s Big 8 urban areas. You can see previous years’ reports here and here.