Ohio Education Gadfly

A compilation of viewpoints on vital education issues in Ohio this spring 

NOTES: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.

This piece was originally published in the Dayton Daily News.

When you ask most people, “What should a high school diploma represent?” they’ll tell you, “It means a student has a 12th grade education.” If only that were true. Unfortunately, in Ohio, it’s not.

This year’s diploma recipients will have completed 15 required high school courses and at least five elective courses. The required courses include four years of English, four years of math, three years of science, and three years of social studies. In addition, students will have scored proficient on the five sections (reading, writing, mathematics, science, and social studies) of the Ohio Graduation Test. The dirty little secret, though, is that the Ohio Graduation Test is a test of eighth grade knowledge. Do most students graduate with more than an eighth grade education? Of course. But an eighth grade education is the minimum.

Back in 2010, Ohio made a decision: an eighth grade education isn’t enough. It isn’t enough for our students to succeed in what comes after high school – life, further education, and career. It’s not enough to ensure the continued economic success of our communities and our state.

Researchers who study the changing nature of the U.S. workforce say loudly and clearly that today, and into the future, more and more people will need some education past high school to get a job that pays a living wage and leads to a life-sustaining career. That doesn’t mean a student has to go to college – it could mean a certificate program, an apprenticeship, or an employer training program. One thing is clear: you cannot make a living or build a career working a minimum wage job! And yet, for too many young Ohioans, a minimum wage job is where they’re headed with the equivalent of an eighth grade education.

In 2010, Ohio set in motion higher standards and the expectation that students should demonstrate at least a 10th grade level of learning to earn a diploma. Our districts and schools have known this for six years, and the expectation is that they have been ramping up and providing the educational experiences necessary so that this year’s junior class can be the first class to meet this higher standard. In addition to taking the required courses, students will need to take seven tests over the four years of high school (instead of the five tests required previously). These are tests of freshman and sophomore level English, freshman and sophomore mathematics (Algebra and Geometry), American History, American Government, and Biology.

If you think about it, it doesn’t really seem so daunting. During the four years a student is in high school, they’ll take at least four years of English. You’d think that educating a student sufficiently to pass tests of freshman and sophomore level English should be pretty easy. Similarly, students will have to take four years of math courses. Is it unreasonable to expect students to acquire the knowledge and skills needed to pass Algebra I and Geometry tests by the end of four years? The remaining tests – American Government, American History, and Biology – cover subjects that have been required courses for years. Students have multiple opportunities to take each test. And students don’t even have to score at a proficient level on every test. Let me tell you what the cut scores are for these tests.

  • Algebra 1: Basic 27%, Proficient 38%
  • Geometry: Basic 22%, Proficient 35%
  • Integrated Math 1: Basic 27%, Proficient 40%
  • Integrated Math 2: Basic 26%, Proficient 39%
  • English Language Arts 1: Basic 40%, Proficient 52%
  • English Language Arts 2: Basic 35%, Proficient 48%
  • Biology: Basic 26%, Proficient 30%
  • American History: Basic 31%, Proficient 42%
  • American Government: Basic 27%, Proficient 39%

Only one of those cut scores is greater than 50 percent. Each and every one of them would be an F in any classroom in this state.

How are some school districts reacting to this reality? The districts most committed to students are saying, “We agree with the standards and the assessments. We’re taking this issue seriously, and we’re putting in place what’s needed to help our students meet these higher standards. We’re going to do whatever it takes to support students reaching the requirement.” Districts in this category represent the best kind of no-excuses, get-it-done attitude and commitment that reflects what makes Ohio great. It reflects an understanding that our students are certainly up to the task, and our teachers, schools and administrators are up to the challenge too. Some of these districts make the case that Ohio may need a longer transition to meet the standard. The need for a little more time might be something worth exploring.

At the other end of the spectrum are districts that say, “The sky is falling! 40% of our students won’t graduate. We’ll have a lot more students without diplomas – and you can’t even get a Walmart job without a diploma.” They’ll go on to tell you that the tests are too hard, and they simply don’t know what to do to help students reach these higher levels. They might even suggest that students simply can’t reach this higher bar. Not everybody needs Algebra, right? Who really uses Geometry, or Biology?

This perspective misses the whole point. It places more importance on the symbolism of the diploma than on the learning it should stand for. That view adopts the worst kind of defeatist attitude – one that undervalues the capability of our students and, frankly, undervalues the capabilities of the teachers and professionals who commit themselves every day to providing the best educational opportunity possible. It closes doors for students at a time when we should be making every possible future pursuit a viable one.

Perhaps what is most encouraging is that Ohio’s been in this kind of situation before. When the Ohio Graduation Tests were first implemented, we had some schools and districts sound the alarm that many students wouldn’t graduate. But we got it done – because educators, state government, communities, and partners all worked together to identify the strategies and actions that needed to be taken to get there.

What happens if we decide it’s just too hard?  Businesses will continue to struggle to find workers with the knowledge and skills to do the increasingly complex work that represents the new normal. They’ll go elsewhere to places that can meet their workforce needs. Colleges will continue to enroll students who can make it through the front gate, but don’t have what it takes to cross the finish line. The patterns we see today of high levels of students dropping out of college with high debt and no hope of having the means to repay will continue.

Our students deserve better. Yes, it will be hard work. Yes, it will push all of us outside our comfort zones. We know that the conditions aren’t always ideal for change to occur. We’ll find strength in working together and supporting each other, and knowing that the work we do will create hope – hope for our students, hope for our communities, and hope for the future of our state. Let’s commit ourselves once again – educators, state government, communities, and partners of all varieties – to do what we know can be done. Our children and future generations will thank us.

Tom Gunlock is a Centerville businessman who served six years on the State Board of Education of Ohio, including two as president. He left the panel earlier this year.


“Government by the people” is one of the most powerful ideas in American government. It represents the belief that, in a democracy, the people hold sovereignty over government and not the reverse.  

I bring this up as a way of considering how far we can deviate from this ideal. Take a look at Ohio’s newly formed assessment committee, which is charged with the important task of reviewing assessment policies. While this is an “advisory” committee with no formal policymaking authority, one expects its recommendations to make headlines and capture legislator attention. The state superintendent recently appointed the twenty-three committee members, and the panel consists almost entirely of government (i.e., public school) employees. As you can tell from the table below, public school administrators, principals, and teachers hold eighteen of the twenty-three seats—a large majority.

[Table: committee seats by member affiliation]

* One member is considered tentative

Administrators and teachers should definitely be part of this conversation. But it’s not right to stack this committee with public employees whose own interests are also at stake. For instance, it’s no secret that many school officials want to weaken Ohio’s assessment and accountability policies. One reason: Their lives get easier when the state waters down assessments (and accountability based on them). Under an “A’s for all” approach, they’ll surely face fewer nosy questions from their boards, parents, and community members about how well their schools are preparing kids for college and career. It may even increase the likelihood that levies pass and lead to pay raises for them and their staff.

It’s a shame that this committee couldn’t have been more representative of families, taxpayers, and employers. They too have a stake in how Ohio assesses and reports student learning. Parents have an interest in their own kids’ state test scores, as well as in how their school handles state and local assessments (including test prep). Taxpayers, more broadly, also have an interest in assessment and accountability policies. Given the amount we spend on K-12 education—roughly $20 billion per year in local, state, and federal dollars—they deserve honest gauges of how students and schools are doing based on objective achievement data. Finally, Ohio’s employers should be an engaged partner in this conversation as well. They rely on a strong K-12 school system to meet their workforce needs in a competitive, global marketplace. Understanding this, the U.S. Chamber of Commerce has been one of the staunchest advocates of higher standards and strong accountability policies.

Assessments and the accountability policies built on them affect Ohioans everywhere. Yet the recently formed review committee doesn’t reflect a wide spectrum of voices. It consists almost entirely of public employees with their own narrow interests. Whatever the committee recommends this summer should be taken with a hefty grain of salt.


The manner in which Ohio funds charter schools is controversial and is a serious contributing factor to the antipathy felt toward them. Traditional public school districts argue that Ohio is “taking money away,” even going so far as to invoice the state department of education for the money they feel they’ve “lost” to charter schools. This is one way of increasing publicity around Ohio’s imperfect funding system, but it also fuels misperceptions about how charter funding works and increases hostility between the sectors. It also ignores the principle that the state funds children, not buildings or staff positions.

In a recent Fordham paper produced in conjunction with Bellwether Education Partners, “A Formula That Works: Five ways to strengthen school funding in Ohio,” we recommend doing away with Ohio’s current method of indirect funding. This approach has state dollars for charter schools “pass through” districts—thus appearing to be a subtraction from their bottom line. The reality is far more complicated and has been explored in previous Ohio Gadfly posts, like “‘That’s not how this works!’ – correcting the rhetoric around public charter schools” and “Straightening the record on charters and local tax revenue.”

Take a look at our animated briefing, “Ohio Charter School Funding: Confusing and Controversial.” It explains how funding works for both districts and charter schools and why there is so much confusion around how charters are funded. It also shows why some can be led to believe that charters receive more per-pupil funding (they don’t) or “steal” local funding from districts (again, they don’t). Charter schools receive about one third less in total—considering state, local, and federal revenue—than their traditional counterparts, despite serving students who are predominantly low-income and/or students of color.

Ohio’s current method of funding charter schools—and schools of choice more broadly, for that matter—is confusing, inefficient, and creates controversy rather than collaboration. For these reasons and more, it’s time that Ohio lawmakers consider direct funding.




School funding policies continue to be a subject of intense debate across the nation. Places as diverse as Alabama, Connecticut, Illinois, Kansas, Maryland, and Washington are actively debating how best to pay for their public schools. According to the Education Commission of the States, school finance has been among the top education issues discussed in governors’ State of the State addresses this year. 

States have vastly different budget conditions and a wide variety of policy priorities. No one-size-fits-all solution exists to settle all school funding debates. But there is a common idea that every state can follow: Implement a well-designed school funding formula, based on student needs and where they’re educated. Then stick to it.

A recent study commissioned by Fordham and researched by Bellwether Education Partners looks under the hood of Ohio’s school funding formula. Our home state’s formula is designed to drive more aid to districts with greater needs, including those with less capacity to generate funds locally, increasing student enrollments, or more children with special needs. In large part, Ohio’s formula does a respectable job allocating more state aid to the neediest districts. According to Bellwether’s analysis, the formula drives 9 percent more funding to high-poverty districts. This mirrors findings from the Education Trust, which also found that Ohio’s funding system allocates more taxpayer aid to higher poverty districts.

Still, the Buckeye State has much room for improvement in its funding policies. And it’s worth highlighting three lessons from the study, as they illustrate challenges other states might face when designing a sound funding formula.

First, states should allow their formula to work—and not create special exceptions and carve-outs. Our study found that the majority of Ohio districts have their formula aid either capped or guaranteed, meaning allotments are not ultimately determined by the formula. Instead, caps place an arbitrary ceiling on districts’ revenue growth, even if they are experiencing increasing student enrollment. Conversely, guarantees ensure that districts don’t receive less money than in a prior year—they “hold harmless” districts even if enrollment declines. While caps and guarantees may be necessary during a major policy shift, allowing them to exist in perpetuity, as Ohio does, undermines the state’s own formula. Ideally, all districts would receive state aid according to a well-designed formula. They shouldn’t receive more or fewer dollars through carve-outs such as funding caps and guarantees.
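To make the cap-and-guarantee mechanics concrete, here is a minimal sketch. The function name, the 7.5 percent cap, and the dollar figures are all hypothetical illustrations, not Ohio’s actual parameters:

```python
def funded_amount(formula_aid, prior_year_aid, cap_growth=0.075, guarantee=True):
    """Apply a hypothetical growth cap and hold-harmless guarantee to a
    formula-computed state aid allotment (illustrative, not actual Ohio rules)."""
    # Cap: aid may grow no more than cap_growth over last year's amount,
    # even if the formula (e.g., rising enrollment) calls for more.
    capped = min(formula_aid, prior_year_aid * (1 + cap_growth))
    # Guarantee: the district never receives less than last year
    # ("hold harmless"), even if enrollment declines.
    return max(capped, prior_year_aid) if guarantee else capped

# A growing district: the formula calls for $12M, but the cap holds aid to ~$10.75M.
growing = funded_amount(12_000_000, 10_000_000)

# A shrinking district: the formula calls for $8M, but the guarantee keeps aid at $10M.
shrinking = funded_amount(8_000_000, 10_000_000)
```

In both cases the district’s actual allotment is decided by the exception, not the formula—which is the study’s point about caps and guarantees undermining the formula itself.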

Second, policymakers in choice-rich states need to make clear that funds go to the school that educates a student—and not necessarily her district of residence. Ohio has a wide variety of choices, including more than 350 charter schools, several voucher programs, and an inter-district open enrollment option. Yet the state takes a circuitous approach to funding these options, creating unnecessary controversy and confusion. The state first counts choice students in their home districts’ formula and then passes the funds through to their school of choice. This method creates the unfortunate perception that choice pupils are “taking” money from their home districts, when in fact the state is simply transferring funds to the school educating the child. (For more on Ohio’s convoluted method to fund schools of choice, check out our short video.) To improve the transparency of the funding system in Ohio, we recommend a shift to “direct funding.” Under such an approach, the state would simply pay the school of choice without state dollars passing through districts.
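The pass-through accounting described above can be sketched in a few lines. The pupil counts and per-pupil amount here are made up for illustration; they are not Ohio’s actual figures:

```python
def district_books(resident_pupils, choice_pupils, per_pupil_aid):
    """Pass-through (indirect) funding, sketched with hypothetical figures:
    the district's gross aid counts every resident pupil -- including those
    attending schools of choice -- and the choice pupils' share is then
    deducted and transferred to the schools actually educating them."""
    gross = (resident_pupils + choice_pupils) * per_pupil_aid
    deduction = choice_pupils * per_pupil_aid  # the visible "loss" on the ledger
    return {
        "gross_state_aid": gross,
        "deducted_for_choice": deduction,
        "net_state_aid": gross - deduction,
    }

books = district_books(resident_pupils=9_000, choice_pupils=1_000,
                       per_pupil_aid=6_000)

# Net aid equals what direct funding would pay for the 9,000 pupils the
# district actually educates; only the ledger's deduction line differs.
assert books["net_state_aid"] == 9_000 * 6_000
```

The district ends up with the same dollars either way, but under pass-through the deduction appears as a line item on its books—feeding the perception that money was “taken away.”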

Third, states should ensure the parameters inside the formula are as accurate as possible. Ohio, for example, faces a problem when assessing the revenue-generating capacity of school districts. A longstanding state law generally prohibits districts from capturing additional revenue when property values rise due to inflation, unless voters approve a change in tax rates. But this “tax reduction factor” is not accounted for in the formula, leading to an underestimation of the state’s funding obligations. Solid gauges of property and income wealth, along with sound measures of enrollment and pupil characteristics, are essential ingredients to a well-designed formula.

The realm of school finance is vast, encompassing a seemingly endless number of challenges. We don’t cover it all in this one report. But state policymakers would be wise to focus on the design and implementation of the school funding formula. It’s a key policy lever in efforts to create a fairer and more equitable funding arrangement for all students, regardless of their zip code or school of choice. Creating a solid formula—and ensuring its use—is hard work, but it might be our best bet for settling the debates over school funding.  


Research on individualized, in-school tutoring such as Match Corps has demonstrated impressive results. A report from the Ohio Education Research Center examines a tutoring intervention developed by Youngstown City Schools and Youngstown State University to help more students meet the test-based promotion requirements of Ohio’s Third Grade Reading Guarantee.

Called Project PASS, the initiative enlisted almost 300 undergraduate students who weekly tutored second and third graders outside of regular instructional time. Each undergrad committed thirty hours per semester and received course credit and a small monetary award in return. The tutors received training and used a variety of reading strategies. The evaluation includes about 300 students who participated in one or more semesters of PASS from spring 2015 (second grade) to spring 2016 (third grade). The evaluation was not experimental, and the self-selection of students into PASS limits the ability to draw causal inferences, as the authors note. Nevertheless, the researchers were able to match participants and non-participants based on demographic and prior achievement data (using a second grade diagnostic test given before program launch) to compare test score outcomes.

The results indicate that the tutoring increased participants’ state test scores in third grade reading. PASS participants scored significantly higher than non-participants on the reading portion of the spring 2016 third grade ELA exam. The higher scores translated into a 29-percentage-point increase in the likelihood of meeting the promotion requirements of the Third Grade Reading Guarantee. Interestingly, the positive results were largely driven by participants who had also received tutoring in prior semesters (spring and fall 2015). However, for “new” PASS participants—those in the program only that spring—the gains were smaller and not significant.

The analysts conclude, “It may take students time to acclimate to PASS tutoring before they reap rewards in the subsequent semester.” That sounds right. Students who stick with the program—and receive higher dosages of tutoring—stand to benefit the most. Hopefully, the university and school district can sustain what sounds like a promising partnership, while communities without such a program take a look at what Youngstown is doing.

Source: Adam Voight and Tamara Coats, Evaluation of Grades 2 and 3 Reading Tutoring Intervention in Youngstown, Ohio Education Research Center (2016).


Getty Images/Rauluminate

Michael J. Petrilli

Secretary DeVos can be understood and forgiven—especially in these wee early days of her tenure—for bringing many of her public statements back to the theme of school choice. After all, school choice was President Trump’s one big education idea on the campaign trail, and it is the public-policy cause to which DeVos has dedicated her life.

And yet.

There’s a saying that when all you have is a hammer, every problem looks like a nail. Team DeVos loves the hammer called school choice. (I do, too!) But they could use a few more tools in their toolbox, lest they cause some serious damage to the construction project we call school reform. Secretary DeVos: May I offer you a screwdriver?

The trouble started with her White House comments to presidents of Historically Black Colleges and Universities. These postsecondary institutions were “pioneers for school choice,” she said in prepared remarks. “Tone deaf,” Representative Barbara Lee (D-CA) tweeted. “HBCUs weren’t ‘more options’ for black students. For many years they were the only options.” The DeVos team soon walked it back.

But, it didn’t stop there. A week later, when speaking to the Military Child Education Coalition, she praised America’s servicemen and women “for joining the military with the express purpose of having access to the Department of Defense’s excellent schools—a patriotic form of school choice if I’ve ever seen one.” (The 74’s Matt Barnum was rightly skeptical, pointing out that most recruits enlist at age 18, long before becoming parents to school-age children.) She complicated matters by asking “why we should restrict the DOD schools only to military personnel” and called on Defense Secretary Jim Mattis to “tear down this wall” that keeps non-military children out of these “excellent educational institutions.”

Diplomatic relations were not improved when she addressed a group of education ministers from the member nations of the Organisation for Economic Co-operation and Development (OECD) a few days later. “There’s little doubt,” she argued, “that the many migrants moving around the world are a classic example of pro-school choice parents voting with their feet.”

DeVos didn’t stay for questions and the Trump administration refused to comment. Moreover, she wasn’t seen or heard from for nine days.

DeVos re-emerged this Wednesday for a keynote address at the Association of Title IX Administrators. “Why do we insist on perpetuating the myth,” she asked the audience, “that the growth in women’s athletics was due to a federal regulation? Long before Title IX there was school choice, in the form of all-girls schools, and those girls, let me tell you, knew how to play a mean game of field hockey!” The crowd, initially silent for what seemed like ages, eventually erupted into heckles and boos. Her Secret Service detail rushed her off stage and into a black SUV, wherein she made an escape.

To be sure, advocating for school choice is an important, and legitimate, part of Secretary DeVos’s job. Particularly if President Trump’s $20 billion school choice proposal is to get traction in Congress, it’s going to take consistent persuasion and leadership from the top. But if DeVos is truly going to break some glass, and make real change, she needs to give her hammer a little bit of rest.


We look at Ohio’s standardized testing regimen, a crackdown on e-schools, teacher evaluations, and more

After much criticism, state superintendent Paolo DeMaria decided to delay Ohio’s submission of its ESSA plan until September. One of the chief complaints was that the plan did not propose any cutbacks on the number of state assessments students take, and a committee is now forming to examine whether any could be culled.

The committee will find that most state assessments must be given to comply with federal law. ESSA, like No Child Left Behind before it, requires annual exams in grades 3-8 in math and English language arts (ELA); science exams once in grades 3-5 and once in grades 6-8; and one high school exam each in math, ELA, and science. This leaves just seven of twenty-four state exams on the table for discussion: four social studies assessments, two high school end-of-course exams, and the fall third-grade ELA exam. Ohio students spend less than 2 percent of their time in school taking these state tests.

While eliminating any of these assessments would slightly reduce time on testing, doing so also comes at a steep price. Let’s take a closer look.

Social Studies Exams

Ohio currently administers exams in grades 4 and 6 social studies and end-of-course assessments in US history and US government. The Buckeye State has a relatively long history of exams in social studies (previously called “citizenship”). Ohio’s old ninth grade citizenship tests go back to 1990, and tests in grades four and six were added in the mid-1990s. In 2009, the state suspended social studies testing due to budget cuts in grades four and six, but they resumed in 2014-15. The state uses the results from social studies exams in its school accountability system.

One of the central missions of education is to mold young people into the knowledgeable citizens needed for informed participation in democratic life. Over time, though, American schools have crowded out social studies. Based on studies of instructional time, Harvard’s Martin West writes, “Test-based accountability can result in a narrowing of the curriculum to focus on tested subjects at the expense of those for which schools are not held accountable.” Abandoning Ohio’s social studies tests could encourage further narrowing. Meanwhile, as several analysts (including Fordham’s Robert Pondiscio) have argued, in today’s raucous political environment, students need solid civics instruction now more than ever.

Of course, testing alone can’t cure all that ails social studies instruction. For instance, a mere 18 percent of American eighth graders reached proficiency on NAEP’s 2014 US history exam; in civics, the proficiency rate was just 23 percent. But ensuring its place among Ohio’s assessments counterbalances the incentive for schools to concentrate on ELA and math at the expense of social studies. It would also signal a clear commitment that social studies, American history, and US government are an integral part of students’ education.

End of Course Exams (EOC)

Ohio recently implemented two sets of EOCs in math and ELA at the high school level and could, under federal law, drop one in each subject.[1] The exams are Algebra I and Geometry (or Integrated Math I and II) along with ELA I and II. Starting with the class of 2018, the EOCs replace the Ohio Graduation Tests (OGTs) as the exams taken in high school. The OGTs were widely considered to be low-level exams assessing eighth grade content. The EOCs raise the bar for students, as they test content from their current high school courses—not material they were supposed to have learned years ago. EOC implementation is also part of Ohio’s move toward college and career ready standards, including test alignment to the state’s new learning standards in high school math and ELA. Like Ohio, many states—also shifting to higher standards themselves—have decided to move toward EOCs in high school.

A commitment to higher expectations and more challenging high school assessments is needed in Ohio. Post-secondary data show that too many Buckeye students are not prepared for college. For example, roughly one-third of Ohio’s college freshmen need remedial English or math. Too few young people make it to college completion: based on ODE’s post-secondary statistics, just one in three members of the class of 2009 obtained an associate degree or higher six years after post-secondary matriculation. Employers have repeatedly indicated that many young people are not ready for the demands of today’s workplaces. Testing twice in high school in math and ELA should keep high schools—and their students—focused on the goal of readiness for college or career.

Dropping a set of EOCs could also place at risk an important advancement in Ohio’s accountability system. With EOC implementation, the state recently began to calculate value added (or growth) for high schools. This has been a step forward, as high schools had previously been judged on the basis of graduation rates and simple test scores—poor measures due to their close link with demographics and prior achievement. While it may be possible to calculate value added based on just one set of EOCs, a second assessment yields results in which we can have more confidence. It increases the sample size, which in turn allows for more precise statistical estimates of student growth. Additionally, since a second EOC covers a larger number of students attending a particular high school, the results better portray overall school performance. The results from just one grade may not reflect the performance of a school with four grade levels.
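The sample-size point above is the familiar statistical fact that estimates tighten as the number of tested students grows. A minimal sketch, with a hypothetical standard deviation of 20 scale points and made-up cohort sizes:

```python
import math

def standard_error(sd, n):
    """Standard error of an estimated mean: uncertainty shrinks with sqrt(n)."""
    return sd / math.sqrt(n)

# Hypothetical figures: growth scores with a standard deviation of 20 scale points.
one_eoc = standard_error(20, 100)    # one tested cohort of 100 students
two_eocs = standard_error(20, 200)   # a second EOC doubles the tested population

# Doubling the sample cuts the uncertainty by a factor of sqrt(2) -- roughly
# 29 percent tighter estimates of school-level growth.
assert math.isclose(one_eoc / two_eocs, math.sqrt(2))
```

This is why a second assessment yields value-added results in which we can place more confidence, quite apart from its broader coverage of the school’s grade levels.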

Third Grade ELA (Fall Administration)

This is the first of two ELA state tests that third-graders take, the other being the normal spring exam. It may benefit school leaders and educators to have the early results, especially with the Third Grade Reading Guarantee’s retention provisions in effect. For instance, they may want to know which students are most in need of immediate attention before taking the spring exams.


Calls to ditch state exams are sure to be loud as the superintendent’s committee starts its work. Its members—and the legislature, which would ultimately make decisions about state testing—should think carefully about the consequences of abandoning any of these exams.

[1] Another option might be to replace both sets of math and ELA EOCs with the ACT or SAT. Though not addressed here, pursuing this alternative would also entail tradeoffs needing careful consideration.


E-schools, a.k.a. virtual charter schools, have been so thoroughly mired in controversy that they’ve become radioactive in most education discussions. Or in most discussions, period. The current dispute in Ohio is largely technical and centers on the extent to which e-schools provide learning opportunities to students rather than merely offering them. This is much more than semantics; how to track attendance and student log-ins for funding purposes is at the heart of a year-long lawsuit against the Ohio Department of Education (ODE) by one of the state’s largest and most politically influential e-schools. Hundreds of millions of public dollars are at stake.

There have also been broad concerns about e-schools’ lagging performance in Ohio as well as nationally. Last year, a trio of education groups, including long-time charter advocacy organizations, began to share their concerns more publicly, offering policy recommendations to base funding on performance and consider creating enrollment criteria for students. These bold suggestions were embraced shortly thereafter by Ohio’s Auditor of State, Dave Yost, who recently ordered a statewide examination of how online charters collect learning and log-in data.

So it’s no surprise that Senator Joe Schiavoni, a long-time advocate for charter accountability, is back at it with another bill. His latest proposal (SB 39) would add new rules for Ohio’s virtual schools, which collectively serve 38,000 students—well, some of them. E-schools overseen by school districts would get a pass. If Schiavoni hopes to be a serious champion for quality, he needs to drop the legislation’s double standards, carve-outs, and special exemptions, which are precisely what Ohio’s latest charter reforms were meant to eliminate. Now is not the time for new loopholes or favoritism.   

Quick glance at SB 39

Let’s set aside whether the bill’s main provision—requiring e-schools to keep record of the number of hours students spend engaged in learning opportunities—is necessary. (In our view, Ohio’s charter reform law, HB 2, put adequate provisions in place to allow ODE to require log-in and attendance data.) What stands out most is the bill’s brazen partiality for district-sponsored e-schools and an attempt to hold them to a lower standard despite the fact that they presumably face the same issues with attendance tracking.

Take a look at some of SB 39’s provisions below, which would only apply to e-schools not sponsored by school districts. (This is not meant to be a comprehensive analysis of SB 39.)

  • Each e-school must keep track of the number of hours each student is “actively participating in learning opportunities”
  • Parents or guardians must be notified when a student fails to participate in learning opportunities for ten days in a row
  • The school must calculate full-time equivalency (FTE)—which is the basis for how e-schools are paid—according to the amount of time a student was engaged in learning opportunities
  • E-schools must provide additional report card information, including mobility data
  • Test scores of students enrolled in an e-school who transfer back to their resident school district will be included in the e-school’s accountability report if the student attended the e-school for more than 90 days
  • Schools must include their A-F report card grades on their advertising, recruiting, or promotional materials
  • Public meetings of an e-school must be made available to the public through live streaming
  • If an e-student’s performance declines, her parents, teachers, and principal must “confer to evaluate” whether the student should continue in that school
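The FTE provision above can be made concrete with a quick sketch. This is a simplified illustration, not the statutory calculation: the 920-hour figure reflects Ohio’s annual learning-opportunity minimum for e-schools, and the cap at one full FTE is an assumption about how the calculation would be bounded.

```python
# Illustrative sketch of an engagement-based FTE calculation like the one
# SB 39 contemplates. The 920-hour annual minimum and the 1.0 cap are
# simplifying assumptions, not the bill's actual language.
REQUIRED_ANNUAL_HOURS = 920

def fte(engaged_hours: float) -> float:
    """Full-time equivalency as a fraction of required annual hours,
    capped at 1.0 (a student cannot generate more than one FTE)."""
    return min(engaged_hours / REQUIRED_ANNUAL_HOURS, 1.0)

half_time = fte(460)    # a student engaged for 460 hours generates 0.5 FTE
full_time = fte(1000)   # extra hours beyond 920 do not generate extra funding
```

Because e-schools are paid per FTE, tying this fraction to engaged hours rather than enrollment is precisely what makes the attendance-tracking question worth hundreds of millions of dollars.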

The nine e-schools sponsored by school districts (out of 23 total in the state) would be unaffected by these changes. This raises all sorts of questions. If Schiavoni believes detailed log-in records are necessary to adequately gauge learning time in a virtual setting, why wouldn’t the requirement also apply to district e-students? If a student misses 12 school days, wouldn’t all parents of e-students appreciate notification? If transparency in governing board meetings (which are already public) is so important, why not require live streaming for all e-schools? Why not all publicly funded schools, for that matter? Why should the state change its funding calculation, but only for some e-schools? If test scores from e-students shouldn’t mar school districts in the event that students transfer back, is it fair (or even legal) to count scores in some instances but not others? What would happen if one of Ohio’s large e-schools switched to a district sponsor; would these provisions no longer apply to it?

Wanted: consistency

Although most aspects of SB 39 are unnecessary, overbroad, or both, the bill’s intention to improve e-school accountability is reasonable given e-schools’ performance history and the amount of funding at stake. Still, the carve-outs for district-affiliated schools have no place in an accountability bill. SB 39 also shines a light on hypocrisy among some members of the General Assembly, who may be motivated by antipathy for Ohio’s big virtual charter networks, allegiance to traditional public schools, or both. Plain Dealer reporter Patrick O’Donnell covered this inconsistency in December with a particularly pointed headline: “A few online schools want special treatment to avoid paying money back to state.” Several district-sponsored e-schools serving at-risk students—faced with the threat of having to pay back public funds—made “emotional pleas” to the state legislature. They won the hearts of some Democrats—several of whom are typically unabashed in calls for more accountability for e-schools and charters broadly.

Perhaps those same lawmakers don’t realize that 40 percent of Ohio’s e-schools are sponsored by school districts. Many are low-performing and post similar, and sometimes lower, scores compared to other e-schools. If SB 39 is sound policy, it should be applied across the board. On the flip side, if SB 39 would create undue compliance burdens, jeopardize the education of hard-to-serve youth, or impose unreasonable standards, Schiavoni and his fellow Democrats are entitled to be concerned. The problem lies in caring about only some schools and students, while thousands of others would be disparately impacted for the sole reason that they opted out of the traditional public school system.


When the Ohio Teacher Evaluation System (OTES) went into effect in 2011, it was the culmination of a process that began back in 2009 with House Bill 1. This bill was a key part of Ohio’s efforts to win the second round of Race to the Top funding, which, among other things, required states to explain how they would improve teacher effectiveness.

Beyond bringing home the bacon, Ohio’s evaluation system aimed to accomplish two goals: First, to identify low-performing teachers for accountability purposes, and second, to help teachers improve their practice. Unfortunately, as we hurtle toward the end of the fourth year of OTES implementation, it’s become painfully clear that the current system hasn’t achieved either goal.

To be fair, there have been some extenuating circumstances that have crippled the system. Thanks to its ever-changing assessments, Ohio has been in safe harbor since the 2014-15 school year, which means that the legislature prohibited test scores from being used to calculate teacher evaluation ratings. As a result, the full OTES framework hasn’t been used as intended since its first year of implementation in 2013-14. But even back then, OTES didn’t offer much evidence of differentiation—approximately 90 percent of Ohio teachers were rated either accomplished or skilled (the two highest ratings) during the first year, and only 1 percent were deemed ineffective.

Despite the fact that most teachers earn the same ratings, their experience with the system can vary wildly depending on the grade and subject taught. To understand why, it’s important to understand how the current system works: In Ohio, there are two teacher evaluation frameworks that districts choose between. The original framework assigns teachers a summative rating based on teacher performance (classroom observations) and student academic growth (student growth measures), with both components weighted equally at 50 percent. The alternative framework also assigns a summative rating based on teacher performance and student academic growth, but changes the weighting and adds an additional component: 50 percent on teacher performance, 35 percent on student growth, and 15 percent based on alternative components, such as student surveys.

Under both frameworks, there are three ways to measure student growth: value added data (based on state tests and used for math and reading teachers in grades 4-8), approved vendor assessments (used for grade levels and subjects for which value added cannot be used), and local measures (reserved for subjects that are not measured by traditional assessments, such as art or music). Local measures include shared attribution, which evaluates non-core teachers based on test scores from the core subjects of reading and math, and Student Learning Objectives (SLOs), which are long-term academic growth targets set by teachers and measured by teacher-chosen formative and summative assessments.
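The two weighting schemes amount to simple weighted averages, which a minimal sketch can illustrate. The 1–4 score scale (1 = ineffective, 4 = accomplished) matches OTES rating levels, but the straight linear combination here is a simplifying assumption; ODE’s actual conversion tables are more involved.

```python
# Hypothetical sketch of how OTES combines component scores under the two
# frameworks described above. Scores run 1 (ineffective) to 4 (accomplished);
# the linear averaging is an illustrative assumption, not ODE's lookup tables.

def summative_score(performance, growth, alternative=None, framework="original"):
    if framework == "original":
        # Original framework: 50% observations, 50% student growth
        return 0.5 * performance + 0.5 * growth
    # Alternative framework: 50% observations, 35% growth, 15% other
    # components (e.g., student surveys)
    return 0.5 * performance + 0.35 * growth + 0.15 * alternative

# A teacher rated 4 on observations but 2 on growth lands differently
# depending on which framework her district chose:
original = summative_score(4, 2)                                            # 3.0
alternative = summative_score(4, 2, alternative=4, framework="alternative") # 3.3
```

The spread between those two results hints at why teachers in otherwise similar circumstances can walk away with different ratings depending purely on their district’s framework choice.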

Results from these frameworks have left many teachers feeling that the system—and the student growth component in particular—is unfair. They’re not wrong. As our colleague Aaron Churchill wrote back in 2015, Ohio teachers with student growth evaluated based on value added measures[1] were less likely to earn a top rating than teachers using other methods. A 2015 report from the Ohio Educational Research Center (OERC) found that 31 percent of Ohio teachers used shared attribution to determine their student growth rating—meaning nearly a third of teachers’ ratings were dependent on another teacher’s performance rather than their own. SLOs, meanwhile, are extremely difficult to implement consistently and rigorously; they often fail to effectively differentiate teacher performance; and they’re a time-suck: A 2015 report on testing in Ohio found that SLOs contribute as much as 26 percent of total student test-taking time in a single year. In essence, OTES doesn’t just fail to differentiate teacher performance—it fails to evaluate teachers fairly, period.

As far as professional development goes, the results probably haven’t been much better. A quick glance at the ODE template for a professional growth plan, which is used by all teachers except those who are rated ineffective or have below-average student growth, offers a clue as to why practice may not be improving: It’s a one-page, fill-in-the-blank sheet. Furthermore, the performance evaluation rubric by which teachers’ observation ratings are determined doesn’t clearly differentiate between performance levels, offer examples of what each level looks like in practice, or outline possible sources of evidence for each indicator. In fact, in terms of providing teachers with actionable feedback, Ohio’s rubric looks downright insufficient compared to other frameworks like Charlotte Danielson’s Framework for Teaching.

In short, OTES has been unfair and unsuccessful in fulfilling both of its intended purposes. Luckily, there’s a light at the end of the tunnel: ESSA has removed federal requirements for states related to teacher evaluations. This makes the time ripe for Ohio to improve its teacher evaluation system. We believe that the best way to do this is to transform OTES into a system with one specific purpose—to give quality feedback to teachers to help them improve their craft.

A series of new recommendations from Ohio’s Educator Standards Board (ESB) contains some promising proposals that could accomplish this, including a recommendation to end Ohio’s various frameworks and weighting percentages by embedding student growth measures directly into a revised observational rubric.[2] Ohio teachers would then have their summative rating calculated based only on a revised observation rubric rather than a combination of classroom observations and student growth components. Specifically, ESB recommends that five of OTES’ ten rubric domains incorporate student growth and achievement as evidence of a teacher mastering that domain. These domains include knowledge of students, differentiation, assessment of student learning, assessment data, and professional responsibility.

Not only would teachers be required to “use available high-quality data[3] illustrating student growth and achievement as evidence for specific indicators in the OTES rubric,” they would also be required to use these data “reflectively in instructional planning and in other applicable areas of the revised OTES rubric.” This will go a long way toward convincing teachers that assessment data can help improve their practice rather than just unfairly “punish” them. Most importantly, though, it reflects a solid understanding of how good teachers use assessments and data already.

The only problem with this idea is that ESB recommends including value added measures based on state tests as part of the new system. State tests were neither designed nor intended to measure teacher effectiveness. So rather than carrying these assessments into a revised system, we propose that the role of state tests in teacher evaluations cease completely. Removing state tests from consideration and letting districts select formative and summative assessments with real classroom purposes is a far better way to fulfill the ESB’s call to “promote the use of meaningful data by teachers and districts that reflects local needs and contexts.”

As with many policy proposals, there are some implementation issues that could undermine the potential of this recommendation. The revision of the rubric—and how assessments are incorporated into it—will be hugely important. If the use of student achievement and growth becomes just one of many evidence boxes to check off rather than a deciding factor for both performance ratings and which professional development opportunities to explore, then the revised rubric won’t yield honest ratings or lead to effective professional development.

To be clear, this suggestion isn’t an attempt to roll back teacher accountability. Rather, it’s an acknowledgement that Ohio’s current system—before and during safe harbor—doesn’t actually hold anyone accountable. Ohio doesn’t have a statewide law that permits the dismissal of teachers based solely on teacher evaluation ratings, so even for the small number of teachers who are identified as ineffective there aren’t meaningful consequences. Moreover, the testing framework built specifically for OTES has created its own bureaucracy and helped feed the anti-testing backlash.

Data show that well-designed evaluation systems based solely on rigorous observations can impact the quality of the teacher workforce. By transforming OTES into a system that focuses on teacher development, we don’t just get improvement for teachers and better learning experiences for kids; we could also end up effectively differentiating teachers without high stakes testing. What’s not to like about that?

[1] According to our previous calculations, approximately 34 percent of Ohio teachers are evaluated based on value added.

[2] A separate ESB recommendation not explored in this piece advises that the OTES rubric be updated in collaboration with a national expert in rubric design and the assessment of teaching. This revision process is likely how student growth measures would be embedded into the rubric.

[3] The ESB notes that “ODE will establish high-quality criteria which all growth and achievement data must meet.”



A recent report from Education Northwest extends previous research by the same lead researcher, drilling down into the same dataset in order to fine-tune the original findings. That earlier study (June 2016) intended to test whether incoming University of Alaska freshmen were incorrectly placed in remedial courses when they were actually able to complete credit-bearing courses. It found that high school GPA was a stronger predictor of success in credit-bearing college courses in English language arts and math than college admissions test scores. The follow-up study deepens this examination by breaking down the results for students from urban versus rural high schools, and for students who delay entry into college.

In general, the latest study’s findings were the same. Except for the students who delayed college entry, GPA was generally found to be a better predictor of success in college coursework than were standardized test scores. It stands to reason that admissions test scores would better represent the current abilities of students who delayed entry into college (call it the final “summer slide” of one’s high school career), and indeed the previous study showed that students who delayed entry were several times more likely to be placed into developmental courses than were students who entered college directly after high school graduation. But does this mean that colleges err when they use such test scores to place incoming students? The Education Northwest researchers believe so, arguing that colleges should use high school GPAs in combination with test scores, with the former weighted more highly since GPAs can more effectively measure non-cognitive skills they deem more relevant to college success.

But it is worth noting that both of their studies are limited by a few factors: First, there are only about 128,000 K–12 students in all of Alaska, and its largest city, Anchorage, is about the same size as Cincinnati. A larger, more diverse sample (Baltimore, New York, Atlanta, or even Education Northwest’s hometown of Portland, Oregon) could yield different results. Second, there is no indication that the University of Alaska students were admitted or placed solely on the basis of admissions test scores. Sure, they’re important, but not every school puts Ivy League emphasis on test scores to weed out applicants. Third, the “college success” measured here is only a student’s first credit-bearing class in ELA and math. That seems like a limited definition of success for many students; depending on one’s major, math 102 is harder than math 101. Fourth, “success” in these studies merely means passing the class, not getting an A. If a student’s high school GPA of 2.5 was better at predicting his final grade in the college class (a D) than was his SAT score (in the 50th percentile), only Education Northwest’s statisticians should be happy about that. A more interesting and useful analysis would look at the difference in success rates between students with high versus low GPAs, students with high versus low test scores, or students who earned As versus Ds in the college courses.

Previous studies have shown a correlation between high GPAs and high ACT scores. There’s lots of talk that test scores are (but shouldn’t be) the most important factor when it comes to college admissions decisions, and the “who needs testing?” backlash at the K–12 level appears to have reached upward to colleges. This study is not the silver bullet that’s going to slay the admissions testing beast, but more care must be taken at the college level to avoid incorrect and money-wasting developmental placements. It is to be hoped that at least part of the answer is already in development at the high school level (high standards, quality curricula, well aligned tests, remediation/mastery) and that colleges will be able to jump aboard and calibrate their admissions criteria to maximize high levels of performance, persistence, and ultimately degree attainment.

SOURCE: Michelle Hodara and Karyn Lewis, “How well does high school grade point average predict college performance by student urbanicity and timing of college entry?” Institute of Education Sciences, U.S. Department of Education (February 2017).


It’s that time of year when many of us are searching desperately for a local Girl Scout troop in order to buy some cookies. (Helpful hint: It’s super easy to find a cookie booth near you.) But the Girl Scouts aren’t just the bearers of thin mint goodness—the organization also has a research arm, which recently published The State of Girls 2017, an examination of national and state-level trends related to the health and well-being of American girls.

The report analyzes several indicators including demographic shifts, economic health, physical and emotional health, education, and participation in extracurricular/out-of-school activities. Data were pulled from a variety of national and governmental sources, including the U.S. Census Bureau and the U.S. Centers for Disease Control and Prevention. Trends were analyzed from 2007 through 2016.

American girls are growing more racially and ethnically diverse along with the rest of the country’s population. The report notes that the percentage of white school-age girls (ages five to seventeen) decreased from 57 percent in 2007 to 51 percent in 2016. Meanwhile, the percentage of Hispanic/Latina girls increased from 20 to 25 percent, while the percentage of Black girls decreased from 15 to 14 percent. Approximately 26 percent of all school-age girls are first- or second-generation immigrants, up from 23 percent in 2007. Thirty-four percent of girls live in single-parent homes, and 41 percent live in low-income families. Both percentages are slightly higher than they were in 2007.

For girls’ physical and emotional health, there’s both good news and bad. Most risky behaviors—such as smoking cigarettes and alcohol use—have declined. Fewer girls report being bullied, though there has been a slight increase in the number of girls who report being victims of cyberbullying. But the data surrounding emotional health are worrisome: In 2015, 23 percent of high school girls reported seriously considering suicide, compared to 19 percent in 2007. The rate was highest among ninth-graders (27 percent). In addition, approximately 13 percent of low socioeconomic-status girls reported being depressed compared to 9 percent of more affluent girls. The report’s authors concluded that these data demonstrate the need for “better mental health assessments and interventions for youth in schools and communities.”

Speaking of school, the data related to high school completion and reading and math proficiency should already be familiar to those in the education world. The high school dropout rate has decreased for girls, but it’s significantly higher among low-income girls than among their higher income peers—6 percent compared to 2 percent, respectively. Using NAEP as its basis, the report also notes that although reading and math proficiency has generally improved for girls, achievement gaps based on race and income persist.

Perhaps the most interesting aspect of this report is the data on extracurricular and out-of-school activities. It’s a widely accepted fact that enrichment and extracurricular opportunities matter. Unfortunately, consistent school athletic participation is significantly lower for low-income girls: 17 percent participated regularly, compared to 31 percent of higher income girls. And it’s not just sports, either. Low-income girls also have lower levels of extracurricular participation in areas like community affairs or volunteer work and student council/government.

These statistics on America’s girls serve as a solid reminder that schools and nonprofit groups have a big role to play in ensuring that all young women have the opportunity to succeed.

SOURCE: “The State of Girls 2017: Emerging Truths and Troubling Trends,” The Girl Scout Research Institute (2017).  


We look at Ohio’s new requirement that high school juniors take the SAT or ACT, dig into the state’s school funding formula, continue our look at Ohio’s ESSA plan, and more.

Back in 2014, the passage of House Bill 487 ushered in major changes to Ohio education policy, including new high school graduation requirements for 2018 and beyond. Among the new provisions was a requirement that all juniors take a college-admissions exam. Previously, only those students and families considering going to college forked over the money to take a test designed to measure college readiness. Starting this spring, however, Ohio joins several other states in requiring 11th graders to take either the ACT or SAT (it’s up to districts to choose which one to administer). To offset the mandate’s expense, the state will pick up the tab on testing costs.

Despite recent calls for the Ohio Department of Education (ODE) to reduce state testing, there’s been little pushback about requiring 11th graders to take a college admission exam, probably because the results won’t be a significant part of the state accountability system. It could also be because folks have bigger fish to fry when it comes to fighting the new graduation requirements. Regardless, statewide administration, which is already underway for some students, is good education policy. Here are a few reasons why having juniors take the ACT or SAT is a good idea.

  1. Opening doors to postsecondary options. According to ACT, admittedly a self-interested source, many students who were not considering college have gone on to attend after earning an encouraging score as part of a statewide administration. Many of these students were from traditionally underrepresented groups—minority and low-income students. Data out of Kentucky corroborate ACT’s findings and show that college-going rates have improved since the state made the ACT mandatory in 2008. Illinois has also experienced a similar increase in overall college enrollment.
  2. Providing useful and easily comparable information. Both the ACT and the SAT offer national and state-specific annual reports about student results. But until now, these data were limited because they included only students who chose to take the assessments. Although Ohio’s performance compared to other states will probably drop because all students will be taking the test, communities across the state and education policy leaders will have a far more wide-ranging picture of students’ achievement. Comprehensive ACT/SAT results could also help identify achievement gaps, offer more details about growth (or lack thereof) over time, and serve as a comparison point for Ohio’s new end-of-course exams and national exams like NAEP. Schools and teachers could also use these data to intervene with students in need of remediation before they graduate—which could save students and their families both time and money.
  3. Helping improve alignment to the state accountability system. Although the state-funded administration of the ACT and SAT won’t be included as part of the state’s proficiency and growth components, there is a state report card indicator that takes these tests into account: the Prepared for Success component. For this component, districts are graded on an A-F scale based on a point system made up of a variety of measures, including points for each student who earns a remediation-free score on either the ACT or SAT. Prior to the statewide administration of these tests, districts were limited by the number of students who chose to take them. With the statewide administration, it’s possible that more students will achieve a remediation-free score, giving a more accurate view of how well students are being prepared for life after high school.

It’s important to note that although the state will pay for only one administration for each student, low-income students can still go through their high school counselor to access fee waivers to take both tests. This is good news—it means that like their more affluent peers, low-income students will still have the opportunity to take college admissions tests multiple times. Furthermore, although waivers have long been available to low-income students, the process of obtaining one, or simply being unaware such waivers even existed, could have prevented many low-income students from signing up to take the test. Statewide administration ensures that everyone will have the opportunity to take the exam at least once.

As with any policy, there’s always room for improvement. Right now, the state-funded administration doesn’t include the writing component for either test. This is worrisome not just because writing is perhaps the most important skill needed for college success, but also because some schools include the writing portion as part of their application requirements. This means that students who otherwise would have taken the test only once must take it again in order to complete the writing portion. That seems incredibly wasteful in an era when we are debating over-testing. The benefits of statewide administration—greater awareness for students and more and better data—are lessened by the fact that the state doesn’t fund the writing component.

It’s imperative that policy makers move quickly to fund the full ACT and SAT assessments, not just parts of them. But in the meantime, Ohio has finally joined the ranks of 24 other states that are implementing this important policy. Statewide administration opens doors for all students, including traditionally underrepresented groups and those who may doubt their potential—and that’s definitely worth celebrating.


School funding debates are as predictable as the seasons, and right on cue, the release of Governor John Kasich’s biennial budget has precipitated hand-wringing from various corners of Ohio. Why? Like those of many other states, Ohio’s budget is tightening, and the governor’s plan would reduce the amount of state aid for dozens of districts that have been consistently losing student enrollment.

No public entity anywhere has ever been happy about receiving less money than the year before; every elected leader worth their salt is going to fight for more resources for their own constituents. The challenge ahead for thoughtful policy makers is to distinguish the typical bellyaching from legitimate and serious problems in Ohio’s school funding policies.

To help, we are pleased to present this analysis of Ohio’s school finance policies. It gets under the hood of the Buckeye State’s education funding formula and tax policies and seeks to understand how well they promote two essential values: fairness and efficiency. Why these two? Consider:

  • Ohio must lift student achievement to meet the demands of colleges and employers—an especially urgent imperative for children from low-income backgrounds. According to last year’s state test results, proficiency rates for economically disadvantaged students fell a staggering 30 percentage points below those of their peers. Funding structures should ensure that public funds are being fairly distributed to the districts and schools whose pupils have the greatest educational needs.[1]
  • Like many states, Ohio is experiencing increasing demand for school choice, including inter-district open enrollment, charter schools, private school vouchers, independent STEM schools, and college dual enrollment. Funding structures should be designed in ways that recognize the fact that more students are availing themselves of educational opportunities that do not follow the traditional organizational patterns by which K-12 education has long been funded.
  • According to the National Association of State Budget Officers’ December 2016 report, The Fiscal Survey of States, states are reporting tightening budget conditions in 2017. In recent comments, state budget director Tim Keen has indicated that Ohio will face budgetary constraints in the coming biennium. Challenging fiscal conditions only reinforce the need for efficient allocation structures that make certain that every dollar is being used to educate students.

To offer an independent, critical review of Ohio’s funding policies in light of these concerns, we turned to Andy Smarick, formerly at Bellwether Education Partners and now at the American Enterprise Institute.

In 2014, we teamed up with Andy in a successful review of Ohio charter-school policies; we were exceptionally pleased when he accepted the challenge of analyzing our home state’s school-funding system. He enlisted Bellwether’s Jennifer O’Neal Schiess, who spent a decade working with the Texas legislature on school finance and education policy, to lead the research effort along with her colleagues Max Marchitello and Juliet Squire.

As readers will see, Ohio’s present approach has several strengths, including its ability to drive more state aid to more disadvantaged districts—via the State Share Index—and the added dollars for students with greater educational needs (e.g., pupils with disabilities or English language learners). Yet Bellwether also explains several elements of the present system that subvert its fairness and efficiency.

Three issues are particularly worrisome:

  • Caps and guarantees. More than half of Ohio districts were affected by funding caps or guarantees as recently as fiscal year 2016. A funding cap withholds state dollars that a district should receive under the formula, while a guarantee provides districts with state funds they should not receive under the formula. Caps and guarantees fail to meet standards of fairness and efficiency by undercutting the state’s own formula and the core principle that Ohio provides funding to districts based on the students whom they are responsible for educating. For example, the guarantee holds harmless certain districts with declining enrollment, effectively delivering state aid to educate “phantom students” who are no longer enrolled in that district. To ensure that all districts are funded according to the formula, legislators should eliminate the cap and guarantee.
  • Pass-through funding. Students exercising choice—e.g., charters, inter-district open enrollment, or independent STEM schools—are included in their home district’s funding formula. State funds are then deducted from their district and transferred to their school of choice. But more Ohio students are choosing non-home-district options every year, making this “pass-through” structure increasingly problematic. It creates the illusion that pupils exercising choice are “taking” money from their home district, when in fact state dollars go to the school that educates the child—as indeed they should. In addition, the inclusion of choice students in a district’s formula makes it look needier than it actually is (i.e., the district appears to have more kids to educate relative to its local tax base). This in turn muddles the calculations that ultimately determine the state’s funding obligation to that district. To create a cleaner and more efficient funding formula, legislators should eliminate the pass-through and instead fund schools of choice directly from the state.
  • Phantom property tax revenue. Since the mid-1970s, state law has prohibited districts from capturing additional tax revenue when property values rise due to inflation. While this law—referred to as “tax reduction factors”—protects homeowners from abrupt tax hikes, it also denies districts a certain amount of local revenue. Think of it this way: You have a home that was worth $100,000 but is now assessed at $150,000 because housing prices are booming. With few exceptions, your district does not generate revenue on that extra $50,000 absent a tax rate election. That’s a plus for the property owner, of course, and some would argue that voters should weigh in on increases in local revenue for schools. But the state’s formula automatically and incorrectly assumes the district earns tax revenue on that additional value—sometimes called “phantom revenue.” This in turn causes a miscalculation of the state’s funding obligation under the formula. To ensure fair funding calculations, legislators should discount the value of property that is impacted by tax reduction factors in the state funding formula. This recommendation would not affect a property owner’s tax burden, but would likely increase the state’s obligation to districts that, as a result of state law, are denied revenue tied to increasing property values.
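The mismatch described in the last bullet can be illustrated with a small sketch. The millage rate and the one-line “formula” below are simplified assumptions for illustration only, not Ohio’s actual funding statute:

```python
# Hypothetical illustration of "phantom revenue." The 2.0% millage rate and
# these one-line formulas are simplified assumptions, not Ohio's funding law.

def formula_assumed_revenue(assessed_value, millage=0.020):
    """The state formula assumes the district collects on full assessed value."""
    return assessed_value * millage

def actual_collected_revenue(pre_inflation_value, millage=0.020):
    """Tax reduction factors hold collections to the pre-inflation value
    (absent a new tax rate election)."""
    return pre_inflation_value * millage

original_value, current_value = 100_000, 150_000
assumed = formula_assumed_revenue(current_value)    # about $3,000 assumed
actual = actual_collected_revenue(original_value)   # about $2,000 collected
phantom = assumed - actual                          # about $1,000 never received
```

Because the formula credits the district with the larger `assumed` figure, the state understates its own obligation by the `phantom` amount.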

These recommendations, along with a couple of others discussed in the paper, would greatly improve Ohio’s school finance system and drive limited state dollars to where they’re most needed. We urge that this be done.

Much work remains to be accomplished if Ohio is to craft a transparent, modern school-funding structure. We realize that the profound complexities and political realities of school funding policy make this a daunting task. In our view, the best course forward is to take one manageable step at a time. If state leaders make these essential repairs, Ohio will take its next step in the long journey toward a school funding system that supports an excellent education for all.

You can download the full report, “A Formula that Works: Five Ways to Strengthen School Funding in Ohio,” here.

[1] This paper doesn’t touch on policies and practices that can promote the productive use of school funds at a local level. For Fordham policy briefs on this issue, see for example, Stretching the School Dollar and Getting Out of the Way.


Ohio’s Gap Closing report card component reports how students in certain subgroups perform on state tests, and how their schools’ graduation rates stack up, against the collective performance of all students in the state. The subgroups include racial/ethnic groups, students with disabilities, and economically disadvantaged pupils. Gap Closing is one of six major report card components and makes up 15 percent of a school district’s rating in Ohio’s current summative grading formula, set to officially begin in 2017-18.

Currently, Gap Closing compares subgroup proficiency on state assessments and graduation rates to a set, statewide standard—also known as an Annual Measurable Objective (AMO). These objectives rise gradually over time, heightening expectations for subgroup performance. When a school’s subgroup meets the AMO, the school receives the full allotment of points (“full credit”). When the subgroup fails to meet the objective, the school receives no credit—unless it makes improvements relative to the prior year. In such cases, the state awards partial credit. Those points are tallied across subgroups and divided by the points possible to compute a component grade reported on an A-F scale. In certain circumstances, a school’s Gap Closing letter grade could be demoted (e.g., an A drops to a B).
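The full/partial/no-credit tally works roughly as follows. This is a simplified sketch: the point values, subgroup data, and AMO below are illustrative assumptions, not ODE’s actual figures.

```python
# Simplified sketch of the Gap Closing tally. Point values, subgroup numbers,
# and the AMO are hypothetical, not ODE's actual rules.

def subgroup_points(score, prior_score, amo, full=1.0, partial=0.5):
    if score >= amo:
        return full      # subgroup met the objective: full credit
    if score > prior_score:
        return partial   # missed it but improved on last year: partial credit
    return 0.0           # missed it with no improvement: no credit

amo = 77.8               # hypothetical statewide objective
subgroups = {            # (current, prior-year) proficiency rates, hypothetical
    "students with disabilities": (62.0, 58.0),   # improved: partial credit
    "economically disadvantaged": (71.0, 72.0),   # declined: no credit
    "Black students": (80.5, 78.0),               # met the AMO: full credit
}
earned = sum(subgroup_points(cur, prior, amo) for cur, prior in subgroups.values())
share = earned / len(subgroups)   # earned over possible, then mapped onto A-F
```

In this toy example the school earns 1.5 of 3 possible points, and that 50 percent share is what gets converted to a letter grade.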

Without a doubt, Gap Closing is a complicated report card element—it assesses the performance of ten distinct subgroups based on several different measures. In fact, one of us has suggested scrapping it altogether and starting over. Meanwhile, the Ohio Department of Education’s (ODE) ESSA feedback process yielded suggestions to rework the component—it was panned for not providing enough credit for the progress of students falling short of proficiency—and many Ohioans deem Gap Closing to be the “least useful” report card measure. In response to the feedback—and to some new federal requirements—Ohio’s draft ESSA plan proposes some important changes to the component. Let’s take a look. 

First, ODE would gauge subgroup achievement using the performance index instead of raw proficiency rates. Most readers are probably familiar with the performance index—it looks at achievement at multiple performance levels, such as proficient and advanced—as it has been used in overall school accountability for many years (just not to gauge subgroup achievement). Using the performance index instead of proficiency rates for subgroups is a good idea since it encourages schools to pay attention to students at all parts of the achievement spectrum, not just those at or around the proficiency bar.

Second, ODE plans to meet a new ESSA requirement—tracking the progress of English language learners (ELLs)—by creating a new indicator within the Gap Closing component. Instead of using Ohio’s general state assessments, this measure of ELL progress will use an alternative assessment: the Ohio English Language Proficiency Assessment (OELPA). The ELL portion will take into account English learners who attain proficiency on OELPA and those who make improvements but have not yet met the proficiency standard.

Third, Ohio is proposing to reduce its minimum “n-size” used for the Gap Closing component from thirty to fifteen students. For example, under the current n-size rules, a school with twenty students with disabilities would not be held accountable for their achievement as a separate subgroup. But it would under ODE’s ESSA plan. The upside of this proposal is that more schools will be held accountable for the separate performance of more subgroups, yielding greater transparency around results. The tradeoff is that it could force otherwise high-performing schools into one of Ohio’s intervention categories—a "focus" or "watch" school—based on the achievement of a smaller number of pupils than in previous years.

Two major concerns persist on Gap Closing.

First, ODE will continue to use annual changes in subgroup achievement as a measure of improvement. Quite frankly, they should stop doing this. This calculation doesn’t account for changes in the composition of a school’s subgroups from year to year. As a result, it is not necessarily correct to imply that a school with a higher performance index score for, say, black students than in previous years “closed the achievement gap.” The gain might instead be due to an influx of students with stronger academic backgrounds.

Perhaps the simplest approach on Gap Closing is to just say it’s a status measure and call it a day. In other words, document subgroup achievement gaps (if they exist), but don’t try to evaluate whether a school is narrowing them from one year to the next. As Matt DiCarlo of the Shanker Institute writes, “[changes in achievement gaps] are poor gauges of school performance and shouldn’t be the basis for high-stakes rewards and punishments in any accountability system.” For more on the problems of “gap closing” in accountability settings, see DiCarlo’s great posts here and here.

Second, ODE should reconsider how it premises sanctions on Gap Closing grades. Its ESSA proposal says that schools earning a D or F on Gap Closing for two consecutive years will land in “school improvement” status. As our colleague Jamie Davies O’Leary discusses, an overwhelming number of schools and districts currently receive poor Gap Closing ratings. ODE should make sure that it is not going to sanction hundreds, if not thousands, of Ohio schools based on the results from a single report-card component. While they’re at it, policymakers should also reduce the weight on Gap Closing in Ohio’s summative grading system and instead put student growth closer to the center of accountability.

Ohio’s Gap Closing component puts the Buckeye State into compliance with several key federal requirements. While compliance is important, policy makers should consider a less complicated and less punitive approach to subgroup accountability in Ohio.


NOTE: The Joint Education Oversight Committee of the Ohio General Assembly is hearing testimony this week on Ohio's proposed ESSA accountability plan. Below is the written testimony that Chad Aldis gave before the committee today.

Thank you Chairman Cupp, and members of the Joint Education Oversight Committee, for giving me the opportunity to provide testimony today on the Ohio Department of Education’s proposed ESSA plan.

My name is Chad Aldis, and I am the Vice President for Ohio Policy and Advocacy at the Thomas B. Fordham Institute. The Fordham Institute is an education-focused nonprofit that conducts research, analysis, and policy advocacy with offices in Columbus, Dayton, and Washington, D.C. Our Dayton office, through the affiliated Thomas B. Fordham Foundation, is also a charter school sponsor.

I’d like to first applaud the department for its hard work on this plan. ODE staff worked tirelessly to gather a massive amount of stakeholder feedback, and many of the recommendations they heard throughout the state are either reflected in this plan or identified as areas meriting further study. I know you’ve listened to testimony from a number of people who felt that their voices weren’t heard. As legislators, you know as well as anyone that it’s extremely difficult to incorporate feedback that, while important and strongly valued, is diverse and many times contradictory.

The ESSA plan created by ODE is a thoughtful approach that strikes an important balance between meeting the federal requirements and protecting Ohio’s autonomy. While the impact and role of the new federal education law has generated much discussion, the most important thing that ESSA does is return more authority over education to the state and local school districts—where it belongs.

Before I comment on the content of the plan itself, let me offer a suggestion regarding process. This plan should be as limited in scope as possible. That’s because, once it is approved by the U.S. Department of Education, it is locked into place for many years to come. Revising it will be a hassle and require cooperation from officials in Washington. Thus we should resist the urge to put everything but the kitchen sink into the plan. We should stick to the plan requirements, and leave other important policy to be decided at the state level as necessary.

It’s worth noting that many of the changes being suggested have been lobbied for in front of this body in the past. Some of the most notable examples include the role of teacher evaluations, the quantity of tests administered, and the state’s school grading system. A fair amount of the criticism is coming from people and entities that didn’t like the decision reached by the General Assembly the first time around and have seen the ESSA engagement requirement as an opportunity to have a second bite at the apple. That’s fine and to be expected. However, if we hijack the legislative process by creating policy through our ESSA proposal, Ohio could find itself right back in an NCLB environment where we were forced to carry out a plan that failed to take into account local contexts, needs, and solutions.

Shifting to the contents of the plan itself, here are some things that Ohio’s ESSA plan should be commended for:

  1. Improves the state’s current school accountability system. Over the years, Ohio has developed a robust, data-rich state report card system that has drawn critical acclaim. The amount of information available for policy makers, communities, and, most importantly, parents is comprehensive. High-quality information on school and district academic achievement is more important than ever with the growing amount of school choice that parents have available.
  2. Keeps A-F school ratings. The A to F rating system is an intuitive, easily understood framework that represents a significant improvement from the previous labels Ohio used like “continuous improvement.”
  3. Recommends review of tests that aren’t required by ESSA. Acknowledging the public comments regarding the need to reduce testing, the plan calls for re-examining any tests that aren’t required by federal law. While we are open to eliminating any of the non-required assessments, this review is important and should look at any potential impact of removing the assessment and consider its initially intended purpose. A review of these assessments should also include a comprehensive review of the testing impact of the Ohio teacher evaluation system including the use of vendor assessments and student learning objectives.
  4. Retains the performance index measure. This is important because performance index incentivizes schools to improve student performance at every level of academic achievement rather than focusing only on students who score near the proficiency threshold.
  5. Preserves the prepared for success measure. In an era where there’s plenty of rhetoric surrounding college and career readiness, Ohio has created a straightforward measure that examines whether students have demonstrated some of the academic and career-related indicators likely leading to success after high school.
  6. Adopts a smaller n-size for calculating the achievement gap measure. The state’s adoption of an n-size of 15 (decreased from 30) is an important change. It means that information on many more student subgroups will be available and will make it easier to see whether all students are getting the attention necessary to be successful.
  7. Increases focus on supporting excellent educators. The plan recommends utilizing the 3 percent Title II set-aside to support principal and teacher leadership development. There are several programs that could accomplish this goal, including pilots that fund, implement, and evaluate teacher coaching, the creation of hybrid teacher roles, and an online network of open educational resources that Ohio teachers can access anytime, anywhere. It’s also important that the state take advantage of this opportunity to re-evaluate the efficiency and effectiveness of the existing Ohio Teacher Evaluation System in accordance with recommendations from the Educator Standards Board.
  8. Subsidizes fees paid by low-income students participating in Advanced Placement and International Baccalaureate courses. All students deserve access to these advanced courses, and this will remove some of the barriers. That being said, it’s likely that many students in rural and urban schools will still not have sufficient access to high-level course work.
  9. Improves ODE support for struggling schools. Some positive aspects of Ohio’s school improvement plans include: the creation of an online evidence-based clearinghouse to provide resources to schools and districts as they go about selecting improvement plans; the department’s plans to “build its research capacity” and conduct performance monitoring in addition to compliance monitoring; the creation of a peer-to-peer network for districts to engage directly with one another; and incentives for districts to participate in random control trials and other research.

Of course, no plan is perfect and this one is no exception. The provisions below should either be removed from the plan or, where appropriate, the legislature should consider amending current law (these recommendations are underscored) to address the underlying issue.

  1. Change the summative grade rating calculation. The current summative grade calculation, which the ESSA plan doesn’t appear to recommend changing, is likely to result in the overwhelming majority of high poverty schools and districts—regardless of their effectiveness—getting a D or an F. This is happening because current law weighs grade components that tend to correlate with poverty at about 80 percent of the overall grade. Growth, a factor that doesn’t correlate with poverty, accounts for only 20 percent of a school district’s grade. Serious consideration should be given to increasing the impact of student growth on the overall school grade.
  2. Explore calculating the graduation cohort in a different way. While graduation rate is a standard, generally accepted measure, Ohio should explore calculating the cohort in another manner. Because students who transfer out are removed from a high school’s ninth grade cohort, even if the transfer occurs in twelfth grade, the graduation rate for traditional high schools is overstated at the expense of dropout recovery and some online schools. The net result is that school districts, whether they utilize it or not, have a direct incentive to encourage credit-deficient upperclassmen to transfer to other high schools.
  3. Eliminate the category of “watch” schools. The definition of what places a school in the watch category is ambiguous—the plan mentions schools that struggle to meet the needs of one or more student subgroups, but the details about what this means and how it specifically fits into the new school identification system are fuzzy. The state should remove the “watch” category from its proposal, and instead focus on identifying and supporting the federally mandated categories of priority and focus schools.
  4. Alter the district continuum of support so that it’s not as broad and inclusive. Ohio’s overall report card calculation relies disproportionately on factors that correlate with demographics and relatively little on student growth. Because of this calculation, it’s likely that most high-poverty districts will find themselves in “intensive support status” under the state’s proposed continuum of support. Although some of these districts will undoubtedly deserve to be there, there is also the possibility that undeserving districts will end up there too. Adding districts with at least one watch school to the moderate support status category will exacerbate the problem. In summary, there’s a very real chance that a large number of school districts will be forced to adopt improvement plans intended only for the lowest-performing schools. The sheer amount of time and manpower it will take the department just to monitor the data of that many districts—let alone actually support them—is staggering. It will dilute energy and resources, all without yielding any benefits for most schools. New compliance processes and burdensome paperwork will also create extra work for everyone involved. We recommend altering the continuum of support to include only the lowest-performing districts and schools. This can be accomplished by removing “watch” status from the list of things that can place a district into intensive or moderate support status and ensuring the newly recommended methodology for calculating “gap closing” doesn’t over-identify schools and districts. The current measure results in the overwhelming majority of districts receiving a D or F in gap closing.
  5. Use great caution when spending education dollars on school turnaround efforts. Studies from across the nation suggest that funding school turnarounds hasn’t gone well. In Ohio, hundreds of millions in school improvement grant dollars were spent with little to show for it. We are concerned that Ohio’s ESSA plan contains many of the same elements as these school improvement plans of the past. While the turnaround strategies listed in the plan aren’t bad in and of themselves, they will fail to result in systemic, long-term positive change if they are applied at random. It would be wise for the state to invest heavily in districts and schools that choose strategies proven to work, and it should consider including high-quality tutoring and public school choice options.
  6. Protect the autonomy of CTE programs. The plan identifies the need to ensure alignment of CTE standards with Ohio’s learning standards more broadly. While generally supportive, care should be taken to ensure that it doesn't result in the loss of program level autonomy/independence that's likely to be important in the long-term success of CTE programs. In other words, a light touch should be used in pushing for standards alignment.

While I agree with many who have testified and suggested that Ohio’s ESSA plan can be improved, I disagree with those suggesting that Ohio delay its application until September. If you believe that, as a matter of sound public policy, Ohio should promise to do only that which federal law requires, thereby preserving its autonomy in other areas, then the best course of action is to submit our state plan for federal approval as soon as possible. This plan largely does that.

Moreover, this plan takes effect in the 2017-18 school year, and local school districts deserve a degree of certainty when a school year begins. Waiting until September to submit this application could force districts to operate for months without knowing for sure what the rules of the game are—especially if the federal government pushes back on any of the elements submitted in our plan template. This should be avoided.

Thank you again for the opportunity to speak with you today. I am happy to answer any questions that you may have.


With a $20 billion federal educational choice program now a real possibility under the Trump Administration and Republican-led Congress, the media spotlight has turned to the voucher research. The discussion often revolves around the question of participant effects—whether students are better off when they use a voucher to transfer to a private school. In recent days, voucher naysayers have pointed to the negative participant findings from recent studies in Louisiana and Ohio in order to attack the idea. (I oversaw the latter study as the Thomas B. Fordham Institute’s Ohio research director.)

These cursory analyses are misleading for a number of reasons. The Ohio study, led by respected Northwestern University professor David Figlio, came with a number of caveats that are often glossed over. Figlio was only able to credibly examine a small sample of voucher participants. To do an apples-to-apples comparison using a “regression discontinuity” approach, he had to focus on voucher students who came from marginally higher performing public schools (akin to a “D” rated school). As a result, voucher participants who left the most troubled public schools in the state—the “Fs”—were not studied. It’s possible that these students benefited from the program (or perhaps not), but there was no trustworthy way to find out.

In addition, the Ohio analysis uses state test scores, which are “high stakes” for public schools but not for private ones. Thus, public school students might have been encouraged to try harder on these tests than their voucher counterparts. Had evaluators been able to use a more neutral test, like the SAT-9, it’s possible that voucher student performance would have looked more impressive. Earlier studies, which found significant positive effects for voucher participants, used such neutral tests.

Meanwhile in Louisiana, State Superintendent John White notes that the implementation of his state’s voucher program is still in its infancy—just a few years in. White goes on to explain how private schools needed time to adjust to the new program: adapting instruction to different expectations, ensuring academic supports for new pupils, and securing the talent needed to staff an excellent school. Though still in negative territory, voucher students’ test scores were on the upswing in year two, with new data on the horizon. At the very least, it’s premature to render a clear verdict in Louisiana based on just a couple years of early results.

Skeptics, however, make a more serious error when they omit the competitive effects of school choice. This piece of the research puzzle examines whether the introduction of vouchers leads to higher outcomes for pupils remaining in public schools. Stiffer competition, so the theory goes, should nudge improvements in district-run schools, which traditionally enjoy monopolies over the delivery of K–12 education.

In Ohio, the findings were positive: The introduction of voucher competition modestly improved the outcomes of students who remained in their public schools—in the range of one-eighth of the magnitude of the black-white test-score gap. In Louisiana, Anna Egalite of North Carolina State found similar results. Though some of her estimates were null, she found positive test score effects for students attending public schools facing the strongest voucher competition. 

It’s hardly surprising to see anti-voucher—and often pro-union—pundits skip the research on competitive effects. It undermines one of their major charges against vouchers: that they harm public-school pupils “left behind” because of a loss of funds. But as the Ohio and Louisiana studies indicate, the research lends little credence to this line of thought. Public school students aren’t harmed; in fact, the evidence suggests they reap academic benefits from competition.

In heated debates like those over vouchers, solid empirical research remains an important guide. As opponents assert, the participant results from Louisiana and Ohio—caveats and all—are troubling and point to the need for improvements to existing choice programs. But they are wrong to keep studies on voucher competition out of public view simply because the findings don’t match their policy agenda. As state and federal policy makers consider private-school choice programs, they should heed research on both participant and competitive effects.

Josh Dwyer and Carolyn E. Welch, J.D.

A recent High Flyer post made a strong case for how acceleration can benefit high-ability students and help administrators and teachers more effectively address the individual needs of their unique learners. It echoes findings in dozens of previous studies that show that acceleration works.

Despite mountains of evidence demonstrating its benefits, most decisions about acceleration policies are made locally. According to a recent report by the Jack Kent Cooke Foundation, forty-one states either do not have acceleration policies or permit school districts to decide whether to institute them.

Using Illinois as a case study, the Illinois Association for Gifted Children and the Untapped Potential Project recently published a report that sought to determine whether districts step up to the plate in terms of establishing acceleration policies to support their high achievers in the absence of a state requirement. Unfortunately, the report’s findings are disappointing. Among Illinois school districts, large percentages lack policies that permit students to do the following:

  • Enter kindergarten early: 56 percent
  • Enter first grade early: 55 percent
  • Take classes above grade-level: 46 percent
  • Skip a grade: 90 percent
  • Graduate early: 41 percent

These troubling statistics are compounded by the fact that 33 percent of Illinois students already meet or exceed grade-level proficiency on the state exam, with 36 percent proficient or higher in English language arts and 31 percent proficient or higher in math. When a state does not provide for high-ability students in education policy, attention and resources can get directed largely to students below the proficiency bar, resulting in the dismantling of enrichment and gifted programming. Since No Child Left Behind and the end of state funding for gifted programs in 2003, the share of Illinois districts providing gifted programming has plummeted from over 80 percent to just 27 percent in 2016.

While more affluent families may be able to switch districts or provide supplemental enrichment outside of school in the absence of gifted programming and appropriate opportunities for acceleration, parents of high-ability low-income students often lack those options. They depend on public schools to identify and cultivate their children's talent, and this should be a priority of our education system as well.

Soon, members of the Illinois Senate Education Committee will have an opportunity to decide whether students throughout the state have access to proven acceleration practices. They will be considering Senate Bill 1223—the Accelerated Placement Act—which would establish a statewide acceleration policy grounded in best practices in Illinois.

It mirrors Ohio’s law, which requires each district to have an acceleration policy, form an acceleration committee to ensure that one gatekeeper cannot prevent students from being accelerated, and use a peer-reviewed assessment mechanism to determine whether a student should be accelerated.

Gifted education advocates take note. If you live in one of the twenty-two states without acceleration policies or one of the nineteen states that, like Illinois, allow the existence of these policies to be determined at the local level, it is likely that many students in your school district are not getting the education they deserve.

Consider pushing for a statewide acceleration policy. Acceleration is a well-researched and cost-effective way for schools to provide students with the level of challenge needed to reach their potential, and it is the least a state can do for its high-ability students whose educational needs are so often overlooked.

Josh Dwyer is the Policy Director for the Untapped Potential Project. Carolyn E. Welch, J.D., is an education attorney, Officer and Trustee of the Midwest Center for the Gifted, Board member of pilotED schools, and a member of the Parent Editorial Content Advisory Board of the National Association for Gifted Children and the State Initiatives Committee of the Illinois Association for Gifted Children. 

The views expressed herein represent the opinions of the author and not necessarily the Thomas B. Fordham Institute.


We look at the proper role of student growth in school ratings, Ohio’s proposed ESSA plan, teacher externships, and more

Under federal and state law, Ohio policy makers are responsible for gauging and reporting on the performance of the state’s 3,000 public schools and 600 districts. To do this, Ohio has a report card system that assigns A-F grades based on a variety of performance indicators. While Ohio does not currently roll up these disparate component grades into a final “summative” rating, in 2017-18, the Buckeye State will join thirty-nine other states that do just that.

Why summative grades? They are intended to accomplish a number of purposes, including improving the transparency of complicated rating systems, helping families decide where to send their child to school, and guiding local decision making on which schools need the most help and which deserve recognition. With the importance placed upon these overall ratings, it is critical to examine the grading formula that Ohio policy makers will use to calculate schools’ final letter grades—specifically the weights assigned to each element of the school report card.

Current weights

Ohio law requires the State Board of Education to create the summative school rating formula within two key parameters: 1) it must include all six main components of the state report card; and 2) it must equally weight the Progress and Achievement dimensions. In administrative code, the state board has set forth weights for each component, as displayed in the table below. When a component is absent for a school—e.g., K-3 Literacy wouldn’t apply to a high school—the existing weights are adjusted in a way that maintains the proportional relationships between the components.

Table 1: Ohio’s overall school-rating weights
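The proportional adjustment for schools missing a component can be sketched as follows. The weights in this sketch are placeholders for illustration, not the state board’s actual values (Table 1 holds those).

```python
# Sketch of the proportional re-weighting when a report-card component doesn't
# apply to a school. The weights below are hypothetical placeholders, not the
# state board's actual figures.

def renormalize(weights, missing):
    """Drop absent components and rescale the remainder to sum to 1,
    preserving the proportional relationships among them."""
    kept = {name: w for name, w in weights.items() if name not in missing}
    total = sum(kept.values())
    return {name: w / total for name, w in kept.items()}

weights = {  # hypothetical component weights summing to 1
    "Achievement": 0.20, "Progress": 0.20, "Gap Closing": 0.15,
    "Graduation Rate": 0.15, "K-3 Literacy": 0.15, "Prepared for Success": 0.15,
}
high_school = renormalize(weights, missing={"K-3 Literacy"})
# Progress, for instance, becomes 0.20 / 0.85 (roughly 0.235), and every
# component keeps the same ratio to the others as before.
```

The design choice here is simply that removing a component should not change how the remaining components compare to one another, which is what dividing by the new total accomplishes.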

Are these weights fair?

The central problem with Ohio’s current system of weights is that five out of six components are highly correlated with student demographics or prior achievement. This creates an uneven accountability system that strongly favors high-income schools while disadvantaging low-income ones. In effect, Ohio’s state report card punishes schools simply for serving needy children while giving accolades to those with less disadvantaged pupils.

Using Ohio’s component A-F grades from 2015-16, the tables below demonstrate the extent of the problem.

Table 2: Distribution of A-F school ratings on Ohio’s report card components, 2015-16

Consider the following observations:

  • Over 90 percent of high-poverty schools received Ds or Fs on four report-card components: Achievement (94%), Gap Closing (99%), Prepared for Success (95%), and K-3 Literacy (92%). The comparable D or F rates for low-poverty schools are much lower, ranging from 21 percent on Prepared for Success to 73 percent on Gap Closing. These patterns are predictable, given the achievement gaps between low-income students and their peers on state and college admissions exams.
  • Graduation rates for low-income students have historically lagged those posted by high-income students. This means that the Graduation Rate component also correlates with demographics: 68 percent of high-poverty schools received a D or F, while just 2 percent of low-poverty schools were assigned such grades.

Meanwhile, the Progress component—value added or student growth—is the one measure in which high-poverty schools can and do perform well. Consider Table 3, which shows that 32 percent of Ohio’s high-poverty schools received an A or B, while 48 percent received a D or F—a less skewed distribution of grades relative to the other components. Ohio’s growth measure is designed to be more poverty-neutral, leading to the different distribution of grades.

Table 3: Distribution of A-F grades on Ohio’s Progress component, 2015-16

Taken together, the five demographically correlated components will constitute 80 percent of a school district’s grade and anywhere from 70 to 80 percent of a school’s grade depending upon the grades served. Meanwhile, just 20 to 30 percent will be based on Progress—or student growth—a measure independent of demographics or prior achievement.[1]      

As a result, the weighting system developed by the State Board will assign almost all high-poverty schools a D or F grade. This is not only inaccurate—there are excellent high-poverty schools—but could also have unintended consequences such as: a) misleading school-shopping families about the quality of their options; b) subjecting high-performing, high-poverty schools to intrusive interventions; c) failing to recognize or reward high-performers; d) discouraging educators (and students) in high-poverty schools who start to believe that the grading system—no matter what—results in them being deemed failing.

A weighting proposal

Ohio policy makers should reweight the overall school-grading formula in a way that places more emphasis on growth. Table 4 displays my proposal for reweighting the various components.[2]

Table 4: Proposed summative grading formula

This proposal would make three key changes in Ohio’s weighting system. It would:

  • Increase the weight on Progress, the component that houses Ohio’s student growth measures. Several other states already place disproportionate weight on their growth measures, and according to surveys of Ohio parents, growth is viewed as a very useful gauge of school quality.
  • Reduce the weight on Gap Closing, Graduation (in district accountability), and K-3 Literacy. These measures have challenges apart from their correlation with demographics. For example, graduation rates can improperly assign blame or credit for graduation (or non-graduation) in certain cases, and they are open to “gaming” through shoddy credit recovery programs. K-3 Literacy has challenges, too. For instance, it relies on district-selected assessments, and the rigor of those exams could vary, giving some districts a slight advantage if they administer a less demanding test.
  • Differentiate more clearly the grading formula for K-8 and high schools. Growth should account for somewhat less weight in high school accountability, primarily because students are drawing close to taking their next step in life. Hence, indicators of readiness for post-high school success become increasingly critical.

* * *

There is no one scientifically correct way to determine school grading weights. It will ultimately come down to judgments on issues of what we prioritize and value, how technically sound an indicator is (all measures have their challenges including, yes, value added), how we think certain measures will affect behavior, and how we think about fairness to schools and to students. Yet we must acknowledge that not all report card measures are of equal or similar importance—which appears to be one of the assumptions behind Ohio’s current weighting approach. Buckeye policy makers should revisit these summative weights and work toward prioritizing student growth in the accountability system.

[1] A K-8 school would not have the high-school-specific components (Graduation Rate and Prepared for Success); its weights would be as follows: 72.5% on Achievement, Gap Closing, and K-3 Literacy combined and 27.5% on Progress. A grade 9-12 high school would have a 77% combined weight on Achievement, Gap Closing, Prepared for Success, and Graduation Rate and a 23% weight on Progress. See p. 3 of this ODE document for the various permutations when one or more components are absent.

[2] To implement changes such as this, Ohio legislators would need to repeal the statutory clause mandating equal weight on Achievement and Progress [ORC 3302.03(C)(3)(f)]. Then, the State Board of Education would need to adopt new rules for the new weighting system.


Ohio’s draft plan for implementing the Every Student Succeeds Act (ESSA) came out earlier this month, and we at Fordham continue to analyze it and offer our thoughts. In a previous article, I argued that Ohio’s plans for improving low-performing schools were underwhelming. But there is an even more worrisome set of details worth pointing out and rectifying—namely that Ohio’s proposal will likely result in a vast number of schools and districts being labeled as failing and routed into a burdensome and ineffective corrective action process.

For starters, Ohio’s ESSA plan moves beyond what’s required by law when it comes to identifying “low-performing” schools. Federal law requires states to have at least two buckets for school improvement—comprehensive support and targeted support (or the equivalent of what Ohio is naming “priority” and “focus” schools, respectively). The law is direct in spelling out how states should place schools in either category (see Table 1).

Table 1: ESSA requirements

Now take a look at Ohio’s proposed criteria below.

Table 2: Ohio’s proposed implementation of ESSA’s requirements

There are several problems with this approach. First and most glaringly, Ohio opted to add a third category to the mix: “watch” schools, nebulously defined in ODE’s plan as “schools that struggle to meet the needs of one or more student subgroup(s) as outlined in state law.” Ohio should clarify what this means or consider scrapping the category altogether, as it’s not necessary under ESSA and the state has already cast a wide net in terms of identifying priority and focus schools. Specifically, the inclusion of the gap-closing measure as a way to determine eligibility for targeted support does not inspire confidence based on current grades. Ninety-three percent of Ohio districts earned a D or an F on gap-closing in 2015-16, with the vast majority of those (87 percent) getting an F. Only two districts in the entire state earned an A.

To be fair, it appears that Ohio is changing its gap closing calculation—to fall more in line with recommendations that my colleague Aaron explored here last year. Still, until the new metric is fully rolled out, Ohio may want to tread carefully in tying sanctions to the gap-closing grade. It could also lengthen the timeframe to three consecutive years or make some of these qualifiers “and” (instead of “or”).

Ohio’s plan for identifying low-performers becomes even more worrisome when considering the implications for districts. Take a look at the table below—copied directly from Ohio’s ESSA plan. ESSA requires that states provide “technical assistance” to districts that serve a significant number of schools identified for support, which includes review by the state, quarterly or biannual improvement plans, submission of student outcome data, and more.

Given that the overall report card calculation relies disproportionately on factors that correlate with demographics and relatively little on student growth, it’s likely that most low-income districts will find themselves in intensive support. By my rough estimate, about twenty districts—primarily those located in Ohio’s cities or inner-ring suburbs—would fall into intensive support status. That’s about three percent of Ohio districts.

But if future scores are anything like current scores, the inclusion of the gap closing measure will place a vast majority of schools into intensive or moderate support status. And any district with just one identified school whatsoever—even an ambiguously identified “watch” school—will be placed on the support spectrum.

Rather than going deep and requiring intensive, dramatic change among its very lowest-performing schools, Ohio appears to have taken the opposite approach to school improvement—wide and shallow. By adding a new, non-required “watch” status for schools and creating multiple (very likely) scenarios for districts to fall on the support continuum, there’s a very real chance that just about everyone will have to succumb to improvement plans. This dilutes energy and resources and is not a reasonable ask, nor will it yield benefits for most of the schools asked to undergo it. New compliance processes and burdensome paperwork create extra work for everyone involved—both the state and the schools and districts under monitoring. The extra workload is all the more unbearable given Ohio’s currently proposed scattershot strategy for improving schools.

There are chronically low-performing schools in urgent need of attention, where kids are losing their one opportunity to get an excellent education. These are the schools (and districts) that ODE should focus its efforts on—the lowest five percent or perhaps even one percent of schools—in a concentrated strategy to raise student outcomes. 


It’s budget season in Ohio, and that means plenty of debates about school funding and other education policy issues. Buried deep in the legislative language is a short provision about teacher licensure that’s garnering a whole lot of pushback—as it should. Here’s the legislative language: “Beginning September 1, 2018, the state board of education’s rules for the renewal of educator licenses shall require each applicant for renewal of a license to complete an on-site work experience with a local business or chamber of commerce as a condition of renewal.”

In Ohio, teacher licenses are renewed every five years. Although the requirements vary depending on the license, renewal typically involves six semester hours of coursework related to classroom teaching or the area of a teacher’s licensure and 18 continuing education units. If this proposal becomes law, completing an externship at a local business will become part of the process.

The intentions behind this requirement are good: Governor Kasich is trying to act on a recommendation made by his executive workforce board, which wants to “help business connect with schools, and to help teachers connect with strategies to prepare their students for careers.” This is a worthy goal to be sure—all of us, no matter our profession, benefit from wider perspectives—but requiring all teachers to complete an externship won’t ensure that they’re able to advise students on their myriad career opportunities. Here are a few reasons why this provision shouldn’t become law.

  1. Externships are too infrequent. Ohio teachers renew their licenses every five years, which means that if this provision becomes law, teachers will only complete an externship once every five years. When talking to the Plain Dealer about the requirement, Kasich’s Office of Workforce Transformation Director Ryan Burgess specifically mentioned how quickly the workforce and in-demand jobs change. If this is true, then it raises the question of how a single externship every five years could possibly keep teachers up to date on the newest pathways available to students—especially in areas like technology and health care, where innovation is the name of the game. To be clear, I’m not arguing that teachers should be required to complete an externship every year. I’m just pointing out that if we want to end the disconnect between businesses, educators, and schools, an externship once every five years isn’t going to do it.
  2. Externships won’t help teachers talk to all their students about careers. I taught high school English, and I had plenty of conversations with students about careers. Some of these conversations were based on my personal experiences and knowledge, and some of them involved working with students to research options. Very few of these conversations would have improved had I been required to complete an externship. This is because my students had such a wide range of interests: An externship at a local steel plant might have helped me talk to my kids about careers in the steel industry, but if none of my students were interested in working in the steel industry, my time would have been wasted and their questions wouldn’t have been answered.  
  3. Elementary and some middle school teachers won’t benefit the way high school teachers may. The conversations I had with my students about careers were most likely not the same conversations that elementary and middle school teachers have with their students, and rightfully so—students need different things at different ages. Teacher licensure requirements should reflect what teachers need to know to teach specific subjects at specific grade levels. A one-size-fits-all provision that requires all teachers to complete an externship won’t benefit the majority of teachers, or more importantly, students.
  4. It will probably become a check-box compliance item. Teachers already bear important responsibilities, such as preparing lessons, grading students’ work, communicating with parents, leading extracurricular activities, handling sensitive disciplinary matters, keeping current with research on effective pedagogical practices, and mentoring teachers new to the profession. Given these priorities, it’s a bit naive to think that classroom teachers will immerse themselves in another profession. Instead, it’ll probably become another go-through-the-motions compliance item that teachers do every five years.

This provision is a response to a very real disconnect between schools and the workplace. But there are ways to address that disconnect other than adding to the already full plates of educators. For instance, the provision could be amended to make externships an option—but not a requirement—for teachers to earn continuing education credits. The state could also invest in bringing businesses into schools rather than sending teachers out—a sort of career day on steroids. This would give students and teachers a chance to interact with business leaders and entrepreneurs firsthand and would showcase a broader menu of potential career pathways than teacher externships ever could.

State lawmakers should definitely consider ways to connect schools and businesses, but requiring externships for all of Ohio’s teachers is over the top. A lot of positive things are already happening across Ohio to help young people understand their career opportunities, and policy makers would do well to build on those types of initiatives instead.  


Since the 1980s, there has been a significant increase in the average age at which women in industrialized nations have their first child. Advanced maternal age, medically defined as ages 35 and up, has been negatively associated in a number of studies with infant health and, potentially, with development later in life. However, data from three separate birth cohorts in the United Kingdom (1958, 1970, and 2001) indicated a marked increase in the cognitive ability of first-born children over time. At face value, this appears to be a disconnect: Shouldn’t the trend towards later child-bearing correlate with lower cognitive abilities among first-borns? A trio of researchers explored what was behind the unexpected results and recently published their findings in the International Journal of Epidemiology.

The three birth cohorts were studied separately for different longitudinal research projects, and each included more than 16,000 randomly sampled children born in specific windows of time. Cognitive ability of the children was assessed at the ages of 10 or 11 using different tests of verbal cognition depending on the cohort. The researchers in the present study combined the data and standardized the three different test results to make the diverse data sources as comparable as possible. The most common age range in which women were giving birth in both 1958 and 2001 was 25-29, so that range was chosen as the comparison for the advanced age cohort. The researchers zeroed in on first-borns in both age ranges. The study included adjustments for socio-demographic characteristics (married or single, income, education at time of birth, etc.) and health behaviors before and after pregnancy.
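One common way to put differently scaled tests on a common footing, and a plausible reading of what the researchers did, is within-cohort z-standardization. The sketch below illustrates the idea; it is not the study’s actual procedure.

```python
# Sketch: z-standardize raw test scores within a cohort so that scores
# from differently scaled tests become comparable (mean 0, SD 1).
# This illustrates the general technique, not the authors' exact method.

def standardize(scores):
    """Convert a cohort's raw scores to z-scores."""
    n = len(scores)
    mean = sum(scores) / n
    sd = (sum((s - mean) ** 2 for s in scores) / n) ** 0.5
    return [(s - mean) / sd for s in scores]

# Each cohort is standardized against its own mean and spread,
# so a z-score of +1 means "one SD above that cohort's average"
# regardless of which verbal test the cohort took.
cohort_1958 = standardize([88, 95, 102, 110, 120])  # hypothetical raw scores
```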

The results: In both the 1958 and 1970 cohorts, first children born to younger mothers outperformed peers born to older mothers; in the 2001 cohort, that outcome completely reversed.

How can it be that advanced maternal age went from being a negative influence on kids’ cognitive ability to a positive influence in less than 30 years? Although not addressed directly in the study, one part of the answer is likely to be the benefits of living in the 21st century—better medicine, hygiene, and reproductive science included. But the researchers posit that since first-born children typically have access to more maternal resources—both material support and things like attention—the trend toward older first births puts that positive variable ahead of the adverse variable of “advanced maternal age” and leads to the reversal. Additionally, older mothers are more likely to be established in their own education, career, and life, whatever level they have achieved. In short, later first births generally mean smaller ultimate family size, resulting in more resources for children and less competition for those resources down the line. These additional resources are enough to overcome the remaining potential negatives of advanced maternal age.

The researchers caution that their study looked at only a few cohorts in one nation and that more study is needed to better understand the link between timing of child bearing and its impact on later development. If the results are replicated with additional research, however, these findings could help bolster policymakers’ efforts to push adult stability as a key to child academic ability. Perhaps a push like the one proposed in the success sequence could nudge this phenomenon even further.

SOURCE: Alice Goisis, Daniel C. Schneider, and Mikko Myrskylä, “The reversing association between advanced maternal age and child cognitive ability: evidence from three UK birth cohorts,” International Journal of Epidemiology (February 2017).



Ohio’s current approach to school funding (K-12) has several strengths, including its ability to drive more state aid to disadvantaged districts and to add dollars for students with greater educational needs. But in a time when Ohio’s budget – like that of many other states – is stretched thin, policy makers need to ensure that every dollar is being well spent. As state lawmakers debate Ohio’s biennial budget, thoughtful analysis is more important than ever.

We invite you to attend the release event for Fordham’s latest research report, A Formula That Works. Conducted by national education policy experts at Bellwether Education Partners, this analysis is a deep dive into Ohio’s education funding policies and includes several recommendations for improvement. The study touches on questions such as: How well does Ohio’s funding system promote fairness and efficiency for all schools and districts? How can policy makers better ensure that all students have the resources needed to reach their goals? And what are the most critical policy issues that legislators should concentrate on as the budget debate proceeds this spring?

Join us in Columbus as we hear policy recommendations from our new study and discuss ways to improve K-12 funding policy in the Buckeye State.

Thursday, March 9, 2017

8:30 – 10:00 am

The Athletic Club of Columbus
136 E Broad St
Columbus, OH 43215

Doors open at 8:00 am and a light breakfast will be served


Jennifer O'Neal Schiess - Associate Partner, Bellwether Education Partners (report co-author)

MODERATOR AND PANELISTS - To be announced soon 



We beg for the end of a tired school funding process, extol the benefits of lowering chronic absenteeism, and more.

The recent unveiling of Governor Kasich’s budget plan for the 2018-19 fiscal years has kicked off Ohio’s biennial ritual of debating school funding. Caps and guarantees have long been a central part of that discussion, and it’ll be no different this spring. As I’ve argued before, state leaders should get rid of these pernicious policies.

To allocate state dollars to school districts, Ohio lawmakers have crafted an intricate funding formula designed to drive more aid to districts that need it most (e.g., those with more students to educate, more pupils with special needs, or less capacity to fund education locally). They’ve done a pretty decent job of it, too. Don’t just take our word for it: EdTrust has said Ohio is one of the best in the nation at it. Both caps and guarantees throw a wrench into this system.

Caps limit the increase in a district’s state formula aid from year to year. Conversely, a guarantee ensures that a district won’t receive less funding than it received in a previous year. Caps are generally associated with districts experiencing enrollment growth, while guarantees typically apply to districts with declining enrollment. Changes in district property values and resident incomes can also play a role.
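In their simplest form, the two mechanisms amount to a ceiling and a floor on each district’s formula aid. The sketch below assumes a flat gain cap and a full (100 percent) guarantee; the actual formula, and the proposal’s reduced guarantees for districts with steep enrollment declines, are more involved, and the dollar figures are purely illustrative.

```python
# Sketch: how a gain cap and a guarantee modify a district's formula aid.
# Assumes a simple percentage cap and a full (100%) guarantee; real state
# policy has more moving parts. All dollar amounts are illustrative.

def state_aid(formula_aid: float, prior_aid: float, cap_rate: float = 0.05) -> float:
    capped = min(formula_aid, prior_aid * (1 + cap_rate))  # cap: ceiling on growth
    return max(capped, prior_aid)                          # guarantee: floor at prior year

# Growing district: formula calls for $110M, but a 5% cap holds it near $105M.
growing = state_aid(110e6, 100e6)

# Shrinking district: formula calls for $90M, but the guarantee keeps it at $100M.
shrinking = state_aid(90e6, 100e6)
```

The cap binds only when formula aid rises faster than the allowed rate, and the guarantee binds only when formula aid falls below the prior year, so a district is on one mechanism or the other, not both.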

The Kasich Administration has repeatedly urged a phase-out of caps and guarantees to no avail. Each year, the legislature inevitably feels the heat from local districts—particularly from those facing flat or reduced funding—and walks back the governor’s attempt. Caps are also tough to get rid of, as doing so would cost the state millions; moreover, because they affect some (but by no means all) wealthier districts, it could create the appearance that the state is spending more on less needy pupils. The administration, perhaps learning its lesson, structured this year’s budget proposal in a manner that retains the caps and guarantees but modifies them in a few ways.

The governor’s budget imposes a gain cap of 5 percent, meaning that a district cannot receive more than a 5 percent boost in state formula aid relative to a prior fiscal year. According to state budget projections, the governor’s proposal would cap 130 districts in FY18 and deny them $466 million in state funding. Compared to the current year’s cap—set at 7.5 percent—the proposal actually places a tighter cap on funding increases, somewhat at odds with the Administration’s desire to phase out caps. That being said, eliminating (or lifting) the cap would increase costs to the state which itself faces a budget crunch.

To illustrate the effect of the cap on three selected districts, the table below displays estimated capped amounts under the governor’s proposal. Licking Heights, a district with considerable increases in enrollment, loses $9.2 million in state aid that it would otherwise receive under the FY18 formula.

Source: Ohio Office of Budget and Management. Note: The average change in enrollment across the 130 districts on the cap was +0.4%; the median was -0.4%.

The governor’s proposal reduces the amount of guarantee aid for districts with declining enrollment. Generally speaking, if a district covered by the guarantee experienced enrollment declines of more than 5 percent from 2010-11 to 2015-16, it would receive only part of the “full” guarantee amount under the proposal.[1] It’s important to note that such declines are not due to students leaving for charters or inter-district open enrollment, but to factors such as families moving out of the district and/or slowing birth rates.[2] Projections for FY18 indicate that 315 districts would receive guarantee aid at a cost to the state of $181 million.

For illustration, the table below displays the guarantee calculations for selected districts under the governor’s proposal. East Cleveland, a district with large enrollment declines, would lose $8.5 million under the formula in FY18 relative to the prior year, but is shielded from most of that loss via $6.93 million in guarantee aid.

Source: Ohio Office of Budget and Management. Note: The average change in enrollment across the 315 districts on the guarantee was -6.6%; the median was -5.9%.

If the governor’s plan were to pass today, a whopping 445 out of 610 districts in Ohio would be affected by either the cap or guarantee in FY18—and thus not funded according to the formula.[3] Despite the political pressure they’ll face, the legislature should eliminate these policies. Here’s why:

  1. Both caps and guarantees subvert the state’s own funding formula, which is designed specifically to ensure that more aid goes to districts with the greatest needs. All of us should embrace this type of approach to funding schools rather than one driven by politics—be it partisan, provincial, or special interests. Ohio has made strides in developing a sound formula, but in the words of state budget director Tim Keen, caps and guarantees continue to “short circuit the formula.” Instead of working around it, lawmakers should focus on driving dollars through the formula—and if necessary make refinements to better ensure that all districts receive the proper amount of aid.
  2. Caps deny scores of districts the funding increases they deserve according to the state’s own formula. Some of these capped districts have experienced increases in enrollment, requiring additional resources to make sure they’re meeting the needs of more and more students. Perhaps contrary to perceptions, capped districts are not necessarily wealthy (though some certainly are). Several high-poverty districts would have millions withheld in FY18: Canton takes a $4.2 million hit; Dayton loses $4.0 million; and Columbus—which has seen relatively large enrollment gains—is capped at a staggering $92.6 million. Such districts are educating some of Ohio’s most disadvantaged youngsters, and it’s shameful that the state is denying them dollars they should receive under its own funding formula.
  3. Guarantees squander state taxpayer dollars on pupils who are no longer attending a school district. As discussed above, districts with fewer students are often sheltered from funding reductions through the guarantee; this in turn means that Ohio funds a certain number of “phantom students.”
  4. Guarantees also pay districts based on a previous year’s funding amounts. That is not how the rest of the world works, where budgets are based on current or projected conditions. No one thinks a company should get paid based on the customers it served years ago, but that’s essentially what the guarantee does.
  5. The common argument in favor of guarantees—districts need time to adjust—is a red herring. That might be fine if guarantee aid were offered temporarily, like Ohio’s catastrophic cost reimbursement program, a short-term funding pool for schools facing sudden cost increases associated with special needs. Instead, guarantees allow districts in long-term decline to avoid making difficult decisions about how they do business. This could include restructuring unwieldy labor agreements, moving to shared services, shifting to a more flexible cost structure, and in some cases reducing staff and facility expenditures when enrollment declines are significant.

Caps and guarantees will be among the most urgent matters before the state legislature this spring. True, these policies won’t get droves of citizens to protest at the Statehouse and the political pressure to resort to the status quo will be intense. But if we as a state truly want to fund K-12 education based on the children our schools actually educate, Ohio lawmakers should finally drive a stake through the heart of caps and guarantees.

[1] In FY16, districts were guaranteed 100 percent of their FY15 state aid; LSC Greenbook (p. 12)

[2] The proposal uses a district’s change in Total ADM, which is a measure of public school students residing in the district (but not necessarily attending a district-run school); for the definition, see ODE District Profile Reports.

[3] The state budget office projects 424 districts on either the cap or guarantee in FY19.



On February 2, the Ohio Department of Education (ODE) released the first draft of its state plan to comply with the Every Student Succeeds Act. ESSA, the year-old federal education law, is the successor to No Child Left Behind (NCLB). While many of ESSA’s accountability provisions are similar to those found in NCLB, a new requirement is for states to have an indicator of “school quality or student success” that goes beyond state standardized test scores or graduation rates.

Ohio’s plan proposes two measures that meet this requirement. The first measure, Prepared for Success, is a carryover from the state’s current report card. It uses multiple indicators to determine the number of students ready for college or career at the end of high school, and is used exclusively for districts and high schools. The second measure, on the other hand, will be used by all schools and districts: student engagement as measured by chronic absenteeism.

Although the threshold for being considered chronically absent depends on the state, the idea behind the term is the same—chronic absentees are students who miss too much school. In Ohio, these students are known as “habitual truants.” They earn this designation by being absent without “legitimate excuse” for “thirty or more consecutive hours, forty-two or more hours in one school month, or seventy-two or more hours in a school year.” Serving these students well has been a struggle for districts and schools in the Buckeye State for years, so it makes sense that the state would use ESSA as an opportunity to address the problem. Putting chronic absenteeism under the umbrella of student engagement makes sense too: If a student misses too much school, they’re not fully engaged in their education—and probably not learning much either.

But chronic absenteeism is a smart addition to Ohio’s state report card for a number of other reasons as well. First, it’s consistent with and supportive of a policy direction already identified by Ohio leaders. The Buckeye State recently revised its truancy laws in House Bill 410. This legislation updates the state’s definition of truancy and prohibits schools from suspending, expelling, or removing students from school solely on the basis of attendance. Instead, the bill outlines an intervention structure for districts and schools to follow that should “vary based on the needs of each individual student.” While unintentional, this structure aligns well with ESSA’s emphasis on locally driven interventions.

Second, Ohio’s new truancy law also revises what schools and districts must report to ODE regarding chronic absenteeism, based on the law’s new definition and intervention structure. Aligning the new measure with data that the state was already planning to collect is smart and efficient. It’s also a measure that can be easily disaggregated by subgroup, school, and district, making it potentially more useful.

Finally, and most importantly, reducing chronic absenteeism can increase achievement. In elementary school, truancy can contribute to weaker math and reading skills that persist into later grades. In high school, where chronic absenteeism rates are higher, chronically absent students often go on to experience problems with employment, including lower-status occupations, less stable career patterns, higher unemployment rates, and lower earnings. Ohio could raise student achievement by lowering its chronic absenteeism rate, and making absenteeism part of the state’s accountability system signals that districts and schools must start paying more attention to attendance numbers.

ODE has proposed a statewide long-term goal for chronic absenteeism of 5 percent or less. Recognizing that some groups of students were starting off with much higher absenteeism rates than others, ODE also assigned goals to each subgroup using the 2015-16 school year as a baseline. Furthermore, it devised a transparent equation that results in “consistent annual increases” in expectations. Here’s a look at the goals for the state as a whole and for various subgroups of students.


To meet the student engagement indicator, districts and schools will have to either hit the benchmark of 5 percent or less or meet an improvement standard determined by ODE. (For example, the department lists “reducing the percent of chronically absent students by at least 3 percentage points from one year to the next.”) Accomplishing either is enough to be deemed as meeting the indicator. It’s important to note that this indicator is scored on a “meets” or “does not meet” basis, not with an A-F grade. ODE plans to incorporate this student engagement/chronic absenteeism measure into the Indicators Met portion of the state report card’s Achievement Component. It will be one of many subcomponents within the measure and will likely play a very small role in the overall calculation of school grades.

Nevertheless, this measure could unintentionally create an incentive for districts or schools to expel truant students in order to improve their attendance numbers. Recognizing this, ODE proposes that data on expulsions be used as a “check.” The plan notes: “To ensure that districts do not expel truant students as a way to reduce their chronic absenteeism rate, the calculation will include a review of each school’s or district’s expulsion data. Districts or schools that otherwise would meet the indicator but show a significant increase in their expulsion rate with the discipline reason listed as ‘truancy’ will have their ‘met’ demoted to ‘not met’ for this indicator.”
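
Taken together, the proposed determination (the 5 percent benchmark, an improvement standard such as the 3-point example, and the expulsion check) reduces to a short decision rule. The sketch below is a hedged illustration only; the function, its inputs, and the hard-coded numbers mirror the description above rather than any actual ODE calculation:

```python
def engagement_indicator_met(current_rate, prior_rate, truancy_expulsion_spike):
    """Illustrative sketch of the proposed met/not-met determination.

    current_rate / prior_rate: chronic absenteeism rates as fractions
    (e.g., 0.07 for 7 percent). truancy_expulsion_spike: True if the
    school shows a significant increase in truancy-coded expulsions.
    """
    BENCHMARK = 0.05    # 5 percent or less meets the indicator outright
    IMPROVEMENT = 0.03  # example standard: drop of 3 percentage points
    met = (current_rate <= BENCHMARK
           or (prior_rate - current_rate) >= IMPROVEMENT)
    # The expulsion "check": an otherwise 'met' result is demoted to
    # 'not met' if attendance was improved by expelling truant students.
    return met and not truancy_expulsion_spike
```

The expulsion check matters only when a school would otherwise meet the indicator, which matches the plan’s language about demoting a “met” to “not met.”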

Overall, ODE’s plan for incorporating chronic absenteeism into the state’s accountability system is both thoughtful and nuanced. The state plan reinforces the newly revised state law while also following ESSA guidelines—and shedding light on a problem that, if solved, could improve student achievement in the Buckeye State.

Ohio just released its draft ESSA plan. While there’s much to applaud, the state’s proposals for improving the most chronically underperforming schools are underwhelming—serving to further remind us that sixteen years after the federal government began pushing states to turn around failing schools, our ideas for doing so are still scattershot.

Compared to past federal requirements for school improvement, ESSA is turnaround-lite—intentionally backing away from prescriptive solutions regarding school turnarounds embedded in NCLB and the School Improvement Grant program (SIG). Schools failing to make Adequate Yearly Progress (AYP) under NCLB faced a series of consequences including replacement of school staff, new curriculum, decreased authority of school administration, help from outside turnaround specialists, or restructuring of the school. Restructuring (similar to the more rigorous options that SIG put in place) included alternative school governance, reopening the school as a public charter school, replacement of most or all of school staff and leadership, takeover by an outside entity, or state takeover.

In Ohio, hundreds of millions in SIG dollars were spent with little to show for it. Low-performing schools were allowed to choose from a slate of turnaround options in exchange for funds; unsurprisingly, the majority of Ohio schools selected the least disruptive school improvement option—a professional development plan, an extra hour of learning time, and other supports that tinkered at the edges of change.

ESSA doesn’t require even these minimal efforts at turnaround; it merely mandates that states make districts do something, anything to address their worst schools—and step in if they fail to do so.

The nine pages of Ohio’s draft ESSA plan dedicated to describing its plans to improve low-performing schools (identified as described below) are unremarkable. That’s because Ohio’s ESSA draft contains many of the same elements as school improvement plans of the past that didn’t work, often reading like a SIG application: districts will “build capacity of school principals,” provide “targeted professional development,” and “work collaboratively with their community and stakeholders to determine… specific, evidence-based strategies.” It’s not that any of these concepts are bad; it’s that, chosen and applied at random, they don’t result in systemic, long-term positive change. Under ESSA, low-performing districts and schools are often the arbiters of their own improvement plans. There’s a certain madness in hoping that low-performing schools and districts will wake up one day and figure out how to fix themselves. Rather than push back against this premise, Ohio’s plan mostly appears to focus on simply complying with the (limited) federal requirements.

And there are few high-stakes repercussions for chronic failure, at least at the school level. Schools that languish in priority or focus status (see Table 1 for how these are to be categorized) will be subject to “additional Department [ODE] oversight on federal expenditures,” along with more reviews, more paperwork, and more improvement plans (unless of course they are charter schools—then they are likely to close).

Table 1: Quick overview of Priority, Focus, and Watch Status

Ohio should also be cautious before creating a plan that defers too much to the “community” or ignores that school improvement is largely about changing what happens within a school in the way of teaching and learning. The inclusion of mental health services in Ohio’s plan seems like a good idea (it acknowledges the role that trauma and mental health play in truancy and academic performance), and there may indeed be a need for “a more coherent focus on addressing the needs of students, families and communities in struggling schools.” But the plan’s lack of emphasis on changing what actually occurs within the four walls of a school on a given day is disconcerting. In fact, the section describing how Ohio will support low-performing schools in their quest to improve includes more references to community groups and organizations (e.g., “Community groups… want more of a voice in developing those local plans”) than it does to teaching and learning. This seems problematic.

There are some positive aspects of Ohio’s school improvement plans: the creation of an online evidence-based clearinghouse to provide resources to schools and districts as they go about selecting improvement plans; the department’s plans to “build its research capacity” and conduct performance monitoring in addition to compliance monitoring; the creation of a peer-to-peer network for districts to engage directly with one another; and incentives for districts to participate in randomized controlled trials and other research (important for building the evidence base referenced frequently in ESSA).

So what can Ohio do to strengthen its school improvement plans, given that much of their blasé nature stems from an intentionally open-ended federal law? The state could consider two vestiges from the NCLB era that preserve parental agency. Under NCLB, parents whose children attended low-performing schools had more power than they do under current law. Children in a languishing school were given the option to transfer to a better-performing school within their district and were also eligible for supplemental educational services such as tutoring. Despite historically low uptake rates for these options, they provided a safety valve for families.

Ohio’s draft plan lists “direct student services” as one possible intervention that might be required in instances where schools fail to make significant progress. The plan outlines expansion of Advanced Placement, “transitional coursework,” and early literacy initiatives among direct services. (ESSA also allows for high-quality academic tutoring—which Ohio should include in its list.) Ohio’s current wording for direct services is too wishy-washy: the state should commit to providing these services as a clear and consistent option for families when their students attend chronically failing schools. Ohio should also consider reinstating the student transfer option, perhaps even looking into incentives for higher performing districts that take on students from outside their borders.

As Ohio collects public comment over the next month, it should consider strengthening parental choice by guaranteeing the provision of direct services for students in chronically failing schools and reinstating the “transfer out” option. Improving chronically low-performing schools is a monumentally difficult task requiring immense leadership and innovation. While Ohio districts take a crack at it themselves, the state should at least guarantee stronger options for parents and students who lack the ability to exercise choice by moving elsewhere.

Stay tuned for another look at how Ohio can improve its ESSA school accountability plan—specifically by walking back key portions that appear to go beyond what federal law requires.  

Do incentives nudge students to exert more effort in their schoolwork? A recent study by University of Chicago analysts suggests they do, though the structure of the incentive is also important.

The researchers conducted field experiments from 2009 to 2011 in three high-poverty areas, including Chicago Public Schools and two nearby districts, with nearly 6,000 student participants in grades two through ten. Based on improved performance relative to a baseline score on a low-stakes reading or math assessment (not the state exam), various incentives were offered to different groups of pupils, such as a $10 or $20 reward, or a trophy worth about $3 along with public recognition of their accomplishment. The analysts offered no reward to students in a control group. To test whether pupils responded differently to immediate versus delayed incentives, some of the students received their reward right after the test—results were computed on the spot—while others knew the reward would be withheld for one month.

Several interesting findings emerged. First, the larger cash reward ($20) led to positive effects on test performance, while the smaller reward ($10) had no impact. This suggests that, if offering a monetary reward, larger payouts will likely elicit more student effort. Second, the $3 trophy and public recognition also had a positive impact on achievement, though the effect was smaller than that of the $20 incentive. In addition to being cost-effective, this finding is important because, in practice, non-cash incentives may be more acceptable in school environments. As the study authors note, “Schools tend to be more comfortable rewarding students with trophies, certificates, and prizes.” Third, incentives that were withheld from students for a month after the test did not improve performance. This suggests that prompt disbursement is an important feature of an effective incentive structure.

It is possible that external incentives could “crowd out” intrinsic motivation—students may be less likely to work hard once an incentive is removed. The authors find no evidence of this when examining treated students’ low-stakes test scores after the incentives ceased. Instead, they conclude that incentives, when structured well, can help motivate students to put in just a little more effort.

Source: Steven D. Levitt, John A. List, Susanne Neckermann, and Sally Sadoff, “The Behavioralist Goes to School: Leveraging Behavioral Economics to Improve Educational Performance,” American Economic Journal: Economic Policy (November 2016).


Citizens Leadership Academy (CLA) is preparing Cleveland middle schoolers for success in high school, college, and life—and not just academically. CLA, whose population is 79 percent economically disadvantaged and made up almost entirely of students of color, is second among all public schools in the city on student growth. The school’s eighth graders reach and surpass proficiency at a rate that is more than three times that of their peers across the city. Reading and math proficiency rates at CLA are more than double those of the Cleveland Metropolitan School District.

No matter how you slice the data, CLA is providing academic preparation that would likely be unavailable to its students if the school—and its broader high-performing charter network (Breakthrough Schools)—did not exist. And yet its academic prowess is just the tip of the iceberg.

The school’s model—as captured in its name, Citizens Leadership Academy—prioritizes and cultivates the broader attributes and mindsets necessary for long-term success. As you’ll read in this profile of one student, Keith Lazare Jr., CLA asks students to consider what it means to be active, engaged citizens and community members. Students are asked not only to grapple with tough math problems and reading passages—strengthening the stick-with-it-ness known in education circles as “grit”—but also to develop a sense of responsibility, ownership, and persistence in all aspects of their character. And this is not done in top-down fashion, either. Instead, CLA’s leaders and staff have created an environment where students advocate for themselves. These are skills that will no doubt serve them well in high school, college, professional and personal relationships, job interviews, boardrooms, and beyond. CLA cares as much about empowering students and helping them hone their voices as it does about high test scores. This is a testament to the school’s commitment to its students’ lifelong success as well as its deep understanding of what it takes to lift up students in poverty and propel them toward the success they so deserve.

We invite you to read Keith’s story and see for yourself how good charters like CLA are good choices for students in Cleveland and across Ohio.


NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.

In the last Ohio Gadfly, I described the many similarities between Washington State’s lengthy debate about high school graduation requirements during the years that I worked there and the debate underway in Ohio now. 

As has also been Washington’s habit on everything from funding to accountability, the Ohio State Board of Education has kicked the issue to a study panel for the time being. At its meeting on December 13, the Board, after first rejecting proposals to delay or reduce the college- and work-ready requirements adopted in 2014, directed the State Superintendent to appoint a work group to “review the graduation requirements and consider alternative approaches.” The up-to-twenty-five-member work group, with broad representation from the education community, is to make a recommendation to Superintendent DeMaria by the Board’s April 2017 meeting.

Following is some immodest advice to the work group from someone who may be new to Ohio but is not new to work groups, task forces, and their brethren.

Set a specific end date for any transition to the new graduation requirements and make equivocating on it hard. As the nonprofit League of Education Voters put it when Washington was addressing the same task, “A transition period is understandable. A transition period with no end date or specific plan does not serve our students’ best interests, nor does it display any urgency for closing our state’s growing achievement and opportunity gaps.” Adopting minimum cumulative scores on the end-of-course (EOC) tests, along with effective dates, in rule makes it more difficult to back away from the requirements as they draw near, because a rule, once adopted, must be formally undone.

Don’t attempt to make your recommendation through a consensus process. The work group is made up of an unwieldy number of members (currently twenty-three) representing interests across the education spectrum, with little time before it’s asked to report in April. The wide representation ensures the Board will have the diverse input it wants and needs. But it also means that the people around the table are not likely to agree on a direction for moving forward. An insistence on consensus is more likely to lead to paralysis than to an actionable recommendation. While seeking as much common ground as possible, the work group should be prepared to vote at its last meeting and to submit a minority report to the Board if needed. Otherwise nothing is accomplished but delaying a decision that is better made sooner rather than later.

Don’t presume that the “right” answer can be found through data. The work group’s recommendation should be informed by data, not driven by it. The first consideration in any decision to revise present graduation requirements should be what best prepares Ohio students to be successful in life beyond high school and what it takes to get them there, not whether graduation rate projections are politically acceptable. I’ve staffed and followed more than one study committee that operated on the assumption that the answer would become clear if there were just enough data. It won’t.

Consider including an evaluation component. Any new graduation requirements are intended to alter behavior, or they would not be worth the effort. The fact is, we cannot know with any precision how districts, schools, and students will respond to higher standards for achievement on assessments, or how they will use the options made available to obtain a diploma. The work group might recommend an independent evaluation once there’s been enough experience with the graduation requirements, such as the one California formerly conducted on its high school exit examination. Such an evaluation might examine, for example, impacts on four-year and five-year graduation rates by student subgroup; the strategies used by districts to improve performance on the EOCs; the use of alternatives such as industry certifications and career readiness assessments, and post-secondary outcomes from those alternatives; and the effectiveness of local and state interventions.

Finally, keep front and center the central question of what a high school diploma is and what it should represent. The Legislature made clear in passing HB 487 in 2014 that Ohio could no longer persist in granting high school diplomas that provide no assurance to students, parents, colleges, and employers that a student receiving one is ready to move on to post-secondary education or a career. A diploma can no longer be the rite of passage it was when most of us were in high school, awarded when we’d passed enough courses to “walk” on graduation day. It should be a demonstration of accountability for both the student and the school that the student has attained the skills and learning needed to take the next step on whatever path they choose. A work group recommendation that does not meet that mark—and not just for some generation ahead but for students in the system now—will not meet the expectations of the Board, the legislature, or the public.

Jack Archer served as director of basic education oversight at the Washington State Board of Education until his retirement in 2016.  He lives in Fairview Park.


We take a look at a great charter school in Columbus, ask a serious question about school closures, and more

Parents make choices about their child’s schooling based on a variety of factors: location, safety, convenience, academics, extracurriculars, support services, and more. Many families choose their school by moving to the neighborhood of their preference, thus exercising “choice” when making homeownership decisions. It’s important to recognize that not all families have the same luxury. In fact, many don’t. For the most part, parents living in poverty can’t just up and move themselves to a neighborhood with higher-performing, better-programmed, safer schools. Yet their children deserve high-quality educational opportunities, too, in schools that work for them based on their unique learning styles, interests, and needs.

If we believe that parents of all income levels and backgrounds deserve the same choices we exercise for ourselves and our own children, then Ohio’s high-performing charter schools deserve our unwavering support. The 21,000+ events held across the nation last week for National School Choice Week demonstrate the pressing need for—and support of—quality school options. Columbus Collegiate Academy (Dana Avenue campus), one of the city’s highest-performing middle schools, helps its eighth graders achieve math and science proficiency at a rate that’s more than double what the district achieves. Meanwhile, its eighth-grade reading proficiency rate is thirty-seven points higher than the district’s. It achieves this despite nearly all of its students being economically disadvantaged.

Impressive data tell only part of the story. Hearing directly from parents and students sheds much-needed light on how life-changing a good charter school option can be. Consider the story of just one student, Farah, who in the video below shares how she felt unsupported and was “made fun of” at a past school. (Given that one in five students reports being bullied, we know Farah is not alone.) Switching to Columbus Collegiate Academy gave Farah a sense of safety—arguably a prerequisite for learning, given that we know bullying increases a child’s risk of poor sleep, anxiety, depression, and other hardships.


But Farah, like many students from similar backgrounds, was far behind academically. She admits she was “an all Fs kid.” Columbus Collegiate’s culture of high expectations, hard work, and relentless commitment to the idea that all children can and will learn helps Farah find a sense of self-efficacy and visualize a path to success.

Take a look at Farah’s inspiring story about the difference a good charter has made in her life.

For more student perspectives, check out Shyanne’s story, as well as profiles on several other students attending high-performing charter schools. 


“Winners never quit and quitters never win.” There’s a lot of truth in that cliché, but it doesn’t seem to apply to education. When it comes to chronically low-performing schools, in many cases the better—and more courageous—course is to “quit” and close a school that is simply beyond repair.

In recent years, attempts to turn around failing schools have been most closely linked to the Obama Administration’s supercharged School Improvement Grant (SIG) program. Between 2010 and 2015, the federal government spent $7 billion on efforts to turn around low-performing schools. In exchange for these funds, grantee schools pledged to implement prescribed interventions, such as replacing personnel or changing instructional practices.

The returns? Not much—or at least nothing clear—according to a massive study by Mathematica and the American Institutes for Research (AIR). The study examined schools in the 2010 SIG cohort and tracked pupil outcomes through three years of implementation. Using data from twenty-two states, the analysis found that SIG had no significant impact on students’ state math or reading test scores. Nor did it find any evidence that SIG increased pupils’ likelihood of high school graduation or college enrollment. Further, the analysts didn’t even uncover an effect of SIG on the practices of participating schools.

Even with large sums of new money tied to well-meaning mandates, SIG turnaround schools could not appreciably move the achievement needle.

Now, what to do? One option is to keep trying to fix bad schools. We’ve been down that road with precious little to show for the time, effort and dollars spent.

Another approach is to close low-performing schools, while also launching excellent new schools to replace them. Andy Smarick, arguably the staunchest proponent of this view, writes:

We’ve tried to fix these deeply troubled schools for eons to no avail. … The wiser course of action is to make persistently underperforming institutions go away and then start new institutions in their place.

His position has been criticized as a “crusade” against SIG, and perhaps it is. Interestingly, though, in SIG’s own program design, one possible “intervention” was to close the school and have students enroll in another one. Predictably, almost no SIG schools took up the offer to voluntarily go out of business. But what if SIG had somehow induced the closure of more low-performing schools, and instead diverted billions of dollars to new school formation? Would kids’ outcomes have been any different?

We certainly don’t know the answer to this counterfactual. But a growing body of research (unrelated to SIG) suggests that students might have benefitted had their low-performing schools closed. In The 74, Matt Barnum highlights recent research from New Orleans that finds students made academic gains when low-performing schools closed. This mirrors Fordham’s own research, conducted by Deven Carlson and Stéphane Lavertu, which found displaced students from Ohio’s urban areas made significant gains on state exams, post-closure. Another study, this one from New York City, revealed that closing low-performing high schools increased the likelihood of students graduating from high school. Though closures may be politically difficult, studies now indicate that students benefit when a low-performing school closes and they relocate to a better one.

The Obama Administration’s big bet on intervention mandates and school turnarounds yielded little payoff for the many thousands of children attending low-performing SIG schools. That demands some different thinking about how to lift outcomes in America’s neediest communities. Of course, it would be naïve to think that we can simply close our way to success. But done judiciously, shutting the lowest-performing schools while focusing resources on promising startups might be our surest bet.

This piece originally appeared on the Real Clear Education blog.


One of the hallmarks of school accountability is the identification of, and intervention in, persistently low-performing schools. Under No Child Left Behind (NCLB), schools were required to make adequate yearly progress (AYP); if they fell short, they were subject to a set of escalating consequences. Much of the backlash against NCLB was a result of these consequences being imposed from afar with little flexibility. So when Congress geared up for reauthorization, it wasn’t surprising that the new law, the Every Student Succeeds Act (ESSA), shifted the responsibility for identification and intervention to the states.

Last week, the Ohio Department of Education (ODE) released an overview of its proposed ESSA state plan. This isn’t the entire plan—the full draft will be released for public comment in early February. In future posts, we’ll do some deep dive analyses of the key areas and potential impacts of the full draft. But in the meantime, there’s plenty in the overview to explore—including how the Buckeye State plans to identify its lowest-performing schools.

ESSA requires states to identify at least two categories of schools: comprehensive support schools (which include the lowest-performing schools in the state) and targeted support schools (which include schools struggling with certain subgroups). Although ESSA requires only two categories, ODE’s plan proposes to carry over the three categories it currently uses that were a part of its federal waiver under NCLB/ESEA: priority schools, focus schools, and watch schools. Identification of these schools will begin at the end of the 2017-18 school year and, per ESSA requirements, the list of identified schools will be updated every three years. Let’s take a closer look at Ohio’s proposal for each of these categories.

Priority Schools

Priority is the name ODE plans to give to schools that fall under ESSA’s category of comprehensive support. There are three ways to fall onto this list:

  • Schools that receive an overall report card grade of F. Although Ohio hasn’t assigned a summative rating to schools in recent years, state law currently requires overall grades starting in 2018 (federal law no longer does). ESSA requires that schools identified in this category include the lowest-performing 5 percent of Title I schools. Ohio’s plan notes that if fewer than 5 percent of schools receive an F, the next lowest-performing schools, as determined by the overall report card grade, will be added to meet the ESSA requirement.
  • Schools with a four-year graduation rate of less than 67 percent. It is an ESSA requirement that Ohio label such schools as comprehensive support schools (or in the case of Ohio, priority schools).
  • Schools with one or more subgroups performing at a level similar to the lowest 5 percent of schools.[1] According to ESSA, these types of schools start out under the targeted support label. If, however, a school fails to meet state-determined exit criteria within a certain number of years, it must transition into the priority category. Ohio already disaggregates some of its report card results based on certain subgroups (e.g., English language learners or race/ethnicity), but ESSA ups the ante by adding homeless students, foster care students, and children of active duty military personnel to the list of required subgroups for accountability. ODE’s plan also proposes to adjust its N-size for subgroups from 30 students to 15.

In order to move off of the priority schools list, schools must accomplish each of the following:

  1. Based on the overall report card grade, achieve school performance higher than the lowest 5 percent of schools for two consecutive years
  2. Earn a four-year graduation rate of more than 67 percent for two consecutive school years (if applicable)
  3. Have no student subgroups performing at a level similar to the lowest 5 percent of schools
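
Note that these exit criteria are conjunctive: a school must clear every applicable bar, not just one. As a purely illustrative sketch (the function and its inputs are our own, not ODE’s), the logic looks like this:

```python
def exits_priority_status(above_bottom5_two_years,
                          grad_rates_last_two_years,
                          any_subgroup_like_bottom5):
    """Illustrative check of the three exit criteria for priority status.

    above_bottom5_two_years: overall report card performance above the
        lowest 5 percent of schools for two consecutive years.
    grad_rates_last_two_years: list of four-year graduation rates for the
        last two years, or None where the criterion does not apply.
    any_subgroup_like_bottom5: True if any subgroup still performs at a
        level similar to the lowest 5 percent of schools.
    """
    grad_ok = (grad_rates_last_two_years is None
               or all(rate > 0.67 for rate in grad_rates_last_two_years))
    return (above_bottom5_two_years
            and grad_ok
            and not any_subgroup_like_bottom5)
```

A school that improves its overall grade but still has one lagging subgroup would remain on the priority list under this reading.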

Focus Schools

Focus is the name ODE proposes to give schools that fall under ESSA’s category of targeted support. There are three types of schools that will be labeled as focus schools:

  • Schools that earn a grade of D or F for the Gap Closing report card component for two consecutive years
  • Schools that have one or more student subgroups that fail to meet specific locally determined improvement goals for three consecutive years
  • Schools that do not meet multiple student subgroup performance benchmarks

In order to move off of the focus list, schools must earn an overall report card grade of C or better, earn a C or better on the Gap Closing component, and meet subgroup performance goals outlined by the state.

Watch Schools

Ohio’s additional category, watch schools, consists of schools that “struggle to meet the needs of one or more student subgroups.” ODE’s overview includes little detail about these schools, but the forthcoming plan is sure to offer more.

Although school identification is typically associated with low-performing schools, it’s worth noting that Ohio already identifies high performers. The state plans to continue these efforts under ESSA—even though it’s not required—in order to “honor and celebrate school districts that grow and achieve.” These recognition categories include schools that accomplish sustained achievement and substantial progress while serving a significant number of economically disadvantaged students and schools that exceed expectations in student growth for the year.

ODE’s plan for identifying low-performing schools shouldn’t be big news—other than changing the names of the categories and opting to have three categories instead of two, Ohio follows ESSA’s identification provisions pretty closely. The real drama is going to come with the news of which schools have been identified and how districts will select and implement improvement plans that actually work.

[1]This is based on individual subgroup performance.


On the college football field, Ohio and Michigan are bitter rivals. But in the charter school world they share something in common: Both states’ charter sectors have been saddled with the unflattering label of the “wild west.” Recently, this characterization—generally meant to describe a state without proper accountability policies—has been used in critiques of Michigan native and charter supporter Betsy DeVos, President-elect Trump’s appointee for secretary of education.

What’s clear is that this label and accompanying narrative are hard to shed, even though both states have significantly strengthened their charter laws. On these Gadfly pages, Daniel Quisenberry has described how Michigan is improving its charter sector. In a Fordham report released today, we show how Ohio’s era of stagecoaches and saloons is starting to give way to a more modernized charter sector.

In On the Right Track, we examine the early implementation of recently enacted charter reforms in our home state of Ohio. Bottom line: The Buckeye State’s reforms are being implemented with rigor and fidelity, bringing promising changes to one of the nation’s oldest, largest, and most notorious charter sectors.

In autumn 2015, Governor John Kasich and Ohio legislators passed a landmark, bipartisan charter reform bill (House Bill 2). This legislation sought to strengthen accountability and transparency, align incentives to ensure quality schools, and rid the sector of conflicts of interest and loopholes that had threatened public trust. House Bill 2 was legislation that we at Fordham strongly supported and were pleased to see enacted into state law.

Among its myriad provisions, the legislation:

  • Ratchets up state oversight of Ohio’s numerous charter authorizers (more than sixty as of last year). Among the key accountability tools is Ohio’s sharpened authorizer evaluation system, which now includes revocation for a poor rating.
  • Eliminates “authorizer hopping.” While Ohio’s plethora of authorizer options allowed schools to find one that fits their needs, it also allowed low-performing schools to escape accountability by switching authorizers. Ohio’s charter reforms now prohibit this, with few exceptions.
  • Empowers charter governing boards to exercise independent control over their schools—and puts safeguards in place to reduce the likelihood they are being controlled by a management company.

But as studies and vast amounts of experience have taught us, whether these legislative reforms bear fruit or wither on the vine hinges largely on implementation. Now that a year has passed since Governor Kasich signed the legislation, we thought it was time to take a first close look. How are these reforms being implemented—with vigor and care, or with neglect? Are there any early indications that the reforms are improving sector performance? Alternatively, are any unintended consequences becoming clear?

To analyze these questions, we looked at several key data points, including trends in Ohio’s charter school closures and startups. We also reviewed each House Bill 2 provision, searching for evidence of implementation or enforcement by state authorities. Three key findings emerge:

  • Ohio’s charter sector is becoming more quality focused. In 2016, twenty-one charters closed across the state, among the highest numbers of school closings on record in Ohio. The schools had received low ratings on state report cards, suggesting that Ohio’s tougher accountability policies are—as they should—decreasing the likelihood that underperforming schools will just go on forever. Additionally, a very small number of new charter schools opened in fall 2015 and 2016—just eight new startups in both years—the lowest numbers of new school openings in Ohio’s charter history. This indicates that authorizers are vetting new schools more diligently as the pressure rises to open schools that promise quality. However, this also raises the troubling possibility that reforms are impeding charter growth, perhaps even deterring potentially excellent schools from entering the sector.
  • Ohio’s rigorous authorizer evaluation system has teeth. In October 2016, the Ohio Department of Education released its first round of high-stakes authorizer ratings under a revamped evaluation system. (Initial evaluation legislation passed in 2012, but that iteration had not been thoroughly implemented.) Twenty-one of sixty-five total authorizers received an overall Poor rating—the lowest possible—while another thirty-nine were rated Ineffective, the second-lowest rating. Authorizers rated Poor had their authorizing rights revoked, pending appeal, while Ineffective authorizers are now subject to a quality improvement plan overseen by the state and are prohibited from opening new schools. Poor-rated authorizers represent only a small portion of the overall sector—responsible for just 8 percent of Buckeye charter schools—while Ineffective entities authorize the majority of charters (62 percent).
  • State authorities are implementing forty-nine of the fifty House Bill 2 provisions in a verifiable way. Many of the legislative provisions require state agencies—e.g., the Ohio Department of Education or State Auditor—to enforce or verify adherence to the new charter law. To their credit, these executive agencies are taking their responsibilities seriously and carrying out the new law.

The hard work of implementation is, of course, far from done in Ohio. Policy makers still need to make some important adjustments to the state’s authorizer evaluation system, and they must find a way to balance the tighter accountability environment with the need to grow new schools that give families and students the quality options they deserve. Ohio’s charter sector, for instance, would greatly benefit from more generous startup investment dollars—not to mention more equitable operational and facilities funding—to help quality schools replicate or launch promising startups from scratch. Lastly, empirical research will be required to help us grasp whether Ohio’s sector performance improves post-reform compared to prior studies that uncovered disappointing results.

In the end, we offer some good news: The implementation of major charter reform in Ohio is off to a strong start. Yes, we know that bad reputations are hard to shake. But before making broad generalizations, come and take a closer look at the changes—for the better—happening right here in America’s heartland.


The American Federation for Children (AFC) recently released its third annual poll on school choice. The national poll surveyed just over 1,000 likely November 2018 voters by phone in early January.

To determine general support and opposition, AFC posed the following question: “Generally speaking, would you say you favor or oppose the concept of school choice? School choice gives parents the right to use the tax dollars associated with their child’s education to send their child to the public or private school which better serves their needs.” By and large, the findings indicate broad support for school choice—68 percent of those surveyed support school choice compared to 28 percent who oppose it. These numbers are similar to AFC’s results from previous years: 69 percent and 70 percent of likely voters expressed support for school choice in 2015 and 2016, respectively.

In addition to overall percentages, AFC broke out the survey numbers by specific demographic groups. Seventy-five percent of Latinos and 72 percent of African Americans support school choice compared to 65 percent of Whites. In terms of political affiliation, 84 percent of Republicans support school choice (up slightly from 80 percent in 2016), compared to 55 percent of Democrats (down from 65 percent in 2016); 67 percent of Independents voiced support for choice. Of the four generations surveyed, Millennials had the highest level of support for choice with 75 percent. 

The AFC survey also finds that seven types of choice gain majority support: special needs scholarships (83 percent), public charter schools (74 percent), scholarship tax credit programs (73 percent), education savings accounts (69 percent), virtual learning (59 percent), opportunity scholarships (58 percent), and school vouchers (51 percent).

Pollsters also questioned respondents on their support for “two potential school choice proposals that may be introduced in Congress.” Seventy-two percent of likely voters expressed support for a federal scholarship tax credit, and 51 percent supported Trump’s proposal of a $20 billion school choice program.

It’s worth noting that the way pollsters ask questions matters. While the report does provide wording for a few questions, the majority of question texts aren’t included. Nevertheless, other national polls find similar levels of support for school choice.

SOURCE: “2017 National School Choice Poll,” American Federation for Children, (January 2017). 


Ohio charter schools have long reported struggling in their efforts to secure school facilities. A soon-to-be-released report, “An Analysis of the Charter School Facility Landscape in Ohio”—produced by the Ohio Alliance for Public Charter Schools, the National Charter School Resource Center, the Charter School Facilities Initiative (managed by the Colorado League of Charter Schools), and the National Alliance for Public Charter Schools—surveys school principals to get the most detailed look to date at Ohio charter school facilities. The survey, which includes data from 81 percent of Ohio’s brick-and-mortar charter schools, examines multiple aspects of charter facilities, including the size, uses, and cost per student of each.

Please join Fordham and the Callender Group to hear the report’s authors share the data and Ohio charter school and network leaders discuss what the report means on the ground.

Thursday, February 2, 2017
8:30 - 10:00 am

Chase Tower - Sixth floor conference room B
100 East Broad Street
Columbus, OH 43215

Kevin Hesla, National Alliance for Public Charter Schools and report co-author
Jessica M. Johnson, Esq., Colorado League of Charter Schools and report co-author

Tiffany Adamski, Regional Director Midwest at iLEAD
Andrew Boy, Founder and Chief Executive Officer, United Schools Network
Lyman Millard, consultant at Breakthrough Schools

Mark Real

Doors open at 8:00 am and a light breakfast will be served.



We look at Ohio’s Quality Counts ranking, district-charter collaboration options, and more

Education Week just issued its twenty-first “Quality Counts” report card for states. Ohio’s grades are so-so—and nearly identical to last year’s. Yet with a “C” overall and ranking twenty-second nationally, the Buckeye State’s standing relative to other states has fallen dramatically since 2010 when it stood proud at number five.

Ohio’s slide in EdWeek’s Quality Counts ranking has become easy fodder for those wishing to criticize the state’s education policies. Those on the receiving end of blame for Ohio’s fall have included: Governor Kasich (and the lawmakers who upended former Governor Strickland’s “evidence-based” school funding system), Ohio’s charter schools (never mind that nothing whatsoever in the EdWeek score cards takes them into consideration!), and even President Obama (specifically for his 2009 Race to the Top program). I’ve lost track of the number of times I’ve heard or read that Ohio’s plummeting ranking is incontrovertible evidence of things gone awry.

An almost-twenty slot drop in rankings sounds terrible, but my guess is that many people who lament it don’t know what the ratings comprise or that EdWeek’s indicators have changed over time. Let’s take a look at the overall rankings, and then take a deeper dive into changes to Ed Week’s report card to help explain Ohio’s decline.

Table 1 shows Ohio’s performance on the Quality Counts assessment from 2010 to 2017. Ohio’s national ranking and points earned have slipped since 2010 (used here as a starting point as it represents the high-water mark for the state on this particular report card), even while its grade continues to hover in the B-minus to C range.

Table 1: Ohio scores on Education Week’s Quality Counts report card

The 2017 report card is based on three sub-categories.

  • Chance for Success—a measure that includes educational inputs and outputs across the life span, such as family income, parent educational levels, preschool and kindergarten enrollment, fourth- and eighth-grade NAEP scores, and adult educational attainment.
  • K-12 Achievement—looks at student performance on the National Assessment of Educational Progress (NAEP), graduation rates, and percent of students scoring 3 or above on AP exams, as well as gaps in NAEP proficiency between poor and non-poor students.
  • School Finance—a measure that includes state funding systems’ reliance on local property wealth as well as other measures of equity, per pupil expenditures, and share of taxable resources spent on K-12 education.

Graph 1: Ohio’s Quality Counts sub-scores, 2010-2017 (three categories)

As Graph 1 depicts, Ohio’s sub-scores are largely flat. Chance for Success dropped by one point since 2010; K-12 Achievement fell by less than two points; and School Finance by three and a half points. So why has Ohio’s overall rank suffered so greatly?

There are at least two reasons behind Ohio’s fall from fifth that have nothing to do with its actual performance: the first is the change in the composition of the report cards, and the second is the national context.

Prior to 2014, Education Week graded states in six categories instead of three. Two of these—“Standards, Assessments, and Accountability” and “Transitions and Alignment”—were among Ohio’s top-rated categories, with Standards the only category in which Ohio ever received an A. Table 2 below makes clear that the shift to just three categories in 2015 essentially knocked out Ohio’s highest-rated components, causing an overall decline in its score. Indeed, the largest single-year drop occurred between 2014 and 2015 when Quality Counts was downsized.

Table 2: Ohio’s sub-scores on Education Week’s Quality Counts report card (0-100)

Graph 2 includes all of Ohio’s sub-scores over time, including the three that were removed from the report card (shown in red and orange below).

Graph 2: Ohio’s Quality Counts sub-scores, 2010-2017 (six categories)

The change in EdWeek’s report card at least partially explains Ohio’s decline in score as well as rank over the decade. Still, readers might wonder why Ohio dropped so suddenly from twelfth to twenty-sixth in 2014—a year before Standards, Assessments, and Accountability (and the other metrics) were removed. That brings us to the second reason for Ohio’s change in relative rankings, which likely had less to do with what Ohio did or didn’t do, and more to do with what happened in other states. The Buckeye State was an early adopter of Common Core state standards and aligned assessments, voting to adopt them in 2010. And as a Race to the Top winner the same year, it was ahead of the pack in adopting new statewide teacher evaluation systems (a component of the old “Teaching Profession” category). It is unsurprising that Ohio as an early adopter of reform earned high marks on these measures.

Ohio’s fourteen-slot fall between 2013 and 2014 very likely resulted from the changes rapidly occurring in other states that earned them extra points Ohio already had under its belt. Other states adopted and implemented the Common Core, tougher assessments, and teacher evaluation systems—all of which boosted their scores. Ohio’s actual score only dropped by a point the year its ranking plummeted. Conversely, its score fell three times as much the following year (2015) while its ranking rose by eight slots. The lesson here is that relative rankings are just that—relative.

Table 3:  Ohio’s dramatic ranking shifts from 2013-2015

Resorting to broad generalizations about Ohio education based solely on the national Quality Counts report card is misinformed at best and intellectually dishonest at worst. Even so, much of the Quality Counts metrics are valuable and can help inform policy priorities. Ohio’s poor showing for NAEP achievement gaps by income; preschool and kindergarten enrollment; funding equity; and adult educational attainment should be especially concerning to us. Ohio leaders wanting the state to compete as a thriving place of opportunity for families should keep close tabs on these metrics, always pushing for ways to move the needle. What they shouldn’t do is pay attention to hyperbolic claims about the demise of our state—at least, not based on this particular report card. 


Peter Cunningham recently called district-charter collaboration the “great unfilled promise” of school choice. He explains the possibilities by pointing to a host of cities that are already benefiting from collaboration: In New York City, districts and charters are partnering to improve parent engagement. In Rhode Island, charters are sharing with district schools their wealth of knowledge on how to personalize learning effectively. Boston has district, charter, and Catholic schools working together on issues like transportation and professional development and has successfully lowered costs for each sector. The SKY Partnership in Houston is expanding choice and opportunities for students. The common enrollment system in New Orleans has solved a few long-standing problems for parents (like issues with transparency), and partnerships in Denver have set the stage for even more innovation. Though the type and extent of collaboration differs in each of these places, the bottom line is the same: Kids benefit.

Here in the Buckeye State, there are thousands of kids in need of those benefits. Our most recent analysis of state report card data shows that within Ohio’s large urban districts (commonly known as “the Big Eight”), proficiency rates were far below the state average in fourth- and eighth-grade math and ELA. Each of these districts has a significant population of disadvantaged students similar to those that the district-charter partnerships in other cities are serving well. It stands to reason that district-charter collaborative models would greatly improve both the opportunities for and the outcomes of Ohio students.

But things in the Buckeye State are a bit more complicated. For starters, Ohio charters don’t have access to the same resources that charters in other states do, and that’s led to some serious squabbling about who’s stealing from whom. A 2016 survey of Ohio charter principals found that the biggest barriers to growth included lack of funding, trouble securing facilities, and a lack of district cooperation on issues like transportation and student records. Districts, meanwhile, claim that they’re the ones who are being shortchanged. Working to reform funding and access to facilities could go a long way toward making the sectors more amenable to collaboration.

But even with meaningful reform, the tensions between districts and charters in Ohio could remain. After years of being pitted against one another, trust is in short supply. The truth is that there are few tangible incentives for charters and districts to stop finger-pointing and start working together. But if the goal of both sectors is to do what’s best and right for kids, then collaboration can’t remain a pipe dream. When districts and charters continue to operate in silos, kids pay the price.

Fortunately, Cleveland’s existing district-charter partnership is a positive sign of what could be in the Buckeye State. Plans were initiated back in 2012 with the Cleveland Plan, and by 2014, the Cleveland Metropolitan School District (CMSD) began to engage with local charters in earnest. The city became a "Gates Compact City" in 2014—charters and the district signed a pledge to “improve collaboration and work toward shared goals” and were provided with a planning grant to support their work. Progress has continued since, and the Cleveland Education Compact now boasts subcommittees in which district and charter school leaders work jointly on issues like professional development, special education, policy/advocacy/funding, and enrollment and record sharing. No partnership is perfect, and CMSD and its charter partners will certainly have their ups and downs as they determine how best to collaborate moving forward. But the willingness from both groups to work together is worth acknowledging, and their collaborative model is worth considering in other Ohio cities.

Many folks in Ohio didn’t think charter reform could ever become a reality, but in 2015, the Ohio General Assembly passed House Bill 2, a comprehensive reform bill with provisions designed to incentivize higher performing authorizers, charter school boards, and management companies. Despite the short time frame, we’ve already started to notice the charter sector changing for the better. With these comprehensive reforms under our belt and $71 million in CSP funds waiting in the wings, 2017 could be a good year for Ohio charter schools. A good year would certainly be welcome. But a great year would be even better—and a great year becomes far more possible if Ohio can solve the problem of limited collaboration between districts and charter schools. 


NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.

Last fall I retired to Northeast Ohio, where my wife and I have family, from Washington state, where I’d been staff to the State Board of Education and the state legislature. In perusing the Plain Dealer one morning, I felt that I could as well have been back in Olympia. 

The story described new state high school graduation requirements linked to higher standards defining readiness for college and career that had been set by Ohio’s State Board of Education and the fierce backlash ensuing from superintendents and others. The State Department of Education calculated that nearly 30 percent of high school juniors were likely to fall short of graduating next year if the new requirements were applied to them. Superintendents organized a protest rally—dubbed by one State Board member a “march for mediocrity”—on the statehouse steps. In light of the concerns voiced, the Board created a task force to make a recommendation on whether the requirements should be changed or phased in in some manner.

That the present controversy resonates with my experience in Washington is not such a surprise. I could have moved to Ohio from many other states grappling with the same question: Can you have higher standards for granting a high school diploma geared to measures of career and college readiness without adversely affecting graduation rates, at least for a while? How do we best resolve the inherent tension between the twin goals of higher standards and higher graduation rates?

While the parallels between Washington and Ohio are striking, there are significant differences, too. I relate the Washington story not for its intrinsic interest, but in the hope that there may be things to be learned from the Evergreen State experience.  

While Ohio requires students to earn a cumulative passing score on a series of end-of-course tests to graduate, Washington has set passing scores on statewide assessments developed by the Smarter Balanced Assessment Consortium (SBAC) linked to Common Core State Standards as a bar for earning a state-approved diploma. Legislation enacted in 2013 required the State Board to establish the scores students must reach to obtain a state diploma. “The scores established by the state board of education for the purpose of earning a certificate of academic achievement and graduation from high school may be different from the scores used for the purpose of determining a student’s career and college readiness,” the act said.

In rule, the Board adopted aspirational language declaring that, “The state’s graduation requirements should ultimately be aligned to the performance levels associated with career and college readiness.” But there is no law requiring that the two be so aligned, and pressure from “the field” will continue to be strong that they not be aligned lest graduation rates slip. 

In a November 2014 presentation on alternative assessments, a consultant to the Board posed the big, unpleasant but unavoidable question, “How can we increase the rigor of a high school diploma and the number of students graduating at the same time?” It is the same question confronting the Ohio State Board now. 

In January 2015, the Washington Board formally stated an intent to adopt an “equal impact” approach to setting the minimum, or “cut,” scores for graduation. Under this approach, the cut scores for the Smarter Balanced assessments would result in the same projected percentage of students earning a diploma as would have been the case had the state still been using the old, superseded state assessments aligned with earlier, less rigorous state standards.

In August, the Board adopted scores implementing that policy. Superintendents supported the action as fair to students while their districts continued to transition to the new standards and assessments. Business-backed education reformers criticized it as a retreat from standards. “The Washington State Board of Education today fell short of setting a clear path for our state toward all students graduating high school prepared for their next step in life,” said the Seattle-based League of Education Voters.

A former school board member and community college instructor asked, “Isn’t it time that we became serious about education? Setting cut scores based on how many this will allow to pass just further depreciates a high school diploma. We know by community college remediation rates that the current high school diploma is not acceptable. Why continue this myth?”

The Board stated that the initial score-setting begins the process of moving toward more rigorous college- and career-ready standards, and “was not done to compromise or confuse our goal.” Its good intent is not to be doubted. To reiterate, however, there is no law requiring that scores established for graduation move toward and ultimately equal those for career and college readiness, and none is on the horizon. I will watch from my new home to see whether the Board’s intent is realized in a political environment not friendly to it.

As I keep an eye on events in Washington, having been so involved for so long, I’ll also be following the deliberations of the task force created by the Ohio State Board of Education with great interest.

I find much to credit in Ohio’s graduation requirements. Two features are worthy of examination, and perhaps emulation, by other states: having students achieve a minimum cumulative number of points on state end-of-course tests rather than meet a standard on each consortium-designed test, and the alternative pathways to graduation offered, including an industry credential and an established workforce readiness assessment.

Change is hard, we all know. It is prudent for the Board to step back and, with the help of the task force, consider what may be the best way to implement the new requirements while doing the least harm. At the same time, the Board should resist the inevitable pressure to back away from higher standards. It is critical that board members not lose their focus on what will best prepare students—meaning this generation of students and not some future one—for success in life after high school in a very different and more challenging world from when my generation was in school. The stakes are very high.

Jack Archer served as director of basic education oversight at the Washington State Board of Education until his retirement in 2016.  He lives in Fairview Park.


Much prior research indicates that youngsters from single-parent families face a greater risk of poor schooling outcomes compared to their peers from two-parent households. A recent study from the Institute for Family Studies at the University of Virginia adds to this evidence using data from Ohio.

Authors Nicholas Zill and W. Bradford Wilcox examine parent survey data from the National Survey of Children’s Health. This dataset contains information on 1,340 Ohio youngsters—a small but representative sample. The outcomes Zill and Wilcox examine are threefold: 1) whether the parent had been contacted at least once by their child’s school for behavioral or academic problems; 2) whether the child has had to repeat a grade; and 3) a parent’s perception of their child’s engagement in schoolwork.

The upshot: Buckeye children from married, two-parent households fare better on schooling outcomes, even after controlling for race/ethnicity, parental education, and income. Compared to youngsters from non-intact families, children with married parents were about half as likely to have been contacted by their school or to have repeated a grade. They were also more likely to be engaged in their schoolwork, though that result was not statistically significant.

An estimated 895,000 children in Ohio live in a single-parent household, according to the Annie E. Casey Foundation. Each of them may feel the same love and affection as their peers from married families, but the stark reality, as indicated by study after study, is that on average, they face disadvantages manifested in lower schooling outcomes. The challenge for schools is to help all of their students—including ones from single-parent families—to beat the odds.

SOURCE: Nicholas Zill and W. Bradford Wilcox, Strong Families, Successful Students (Institute for Family Studies).


More than sixty years after Brown v. Board, traditional district schools are more often than not still havens of homogeneity. Static land use guidelines, assignment zones, feeder patterns, and transportation monopolies reinforce boundaries that functionally segregate schools and give rise to the adage that ZIP code means destiny for K-12 students. Asserting that student diversity is an object of increasing parental demand, at least among a certain subset of parents of school-age kids, the National Charter School Resource Center has issued a toolkit for charter school leaders looking to leverage their schools’ unique attributes and flexibilities to build diverse student communities not found in nearby district schools. The report cites a number of studies showing academic benefits of desegregated schools, especially for low-income and minority students. It is unlikely that the mere existence of documentable diversity is at the root of those benefits. More likely, it is a complicated alchemy of choice, quality, culture, and expectations that drives any observable academic boosts. Garden-variety school quality is a strong selling point for any type of school, but this toolkit sets aside that discussion to focus on deliberately building a multi-cultural student body for its own sake. Bear that in mind as we go forward.

Building diversity is not easy, even in a flexibly run and technically borderless charter school. The toolkit provides “context about research and the legal and regulatory guidance” in four main areas that must be addressed: defining, measuring, and sharing school diversity goals; planning school features to attract diverse families; designing recruitment and enrollment processes; and creating and maintaining a supportive school culture.

Goal-setting and recruitment are thorny from the start, the report warns, as using racial or cultural characteristics to even set an enrollment target is riddled with concerns around quotas and discrimination. In states where charter location is less regulated, the calculus may be how to attract families of color to a school in a predominantly white neighborhood. In states like Ohio, where charter location is limited to low-performing school districts, the calculus may be reversed. Either way, the toolkit provides valuable guidance for negotiating these potential pitfalls. Also addressed – although not solved by any means – are the severe constraints charters in many states face in terms of facilities and transportation. The mechanics and legalities of diversification can easily overwhelm the best of plans before the desired diverse student body even gets in the door.

My colleagues here at Fordham have written extensively about the challenges of teaching kids who are at vastly different levels of achievement, which is more likely in a diverse school. The new toolkit has a section on school staffing, training, and professional development, but the resources highlighted there are more about cultural awareness and discipline practices than actually teaching the students being recruited. For more on that instructional challenge, I highly recommend Mike Petrilli’s 2012 book The Diverse Schools Dilemma.

The tips and guidance in this toolkit are helpful and may give charter school leaders insight into areas where well-intentioned plans to build a diverse student body could unexpectedly founder. But with no discussion of academic quality as a key means of attracting students, we are left with a roadmap to somewhere we might want to go. But not until after we’ve visited the bigger and better-known stops along the way.

SOURCE: “Intentionally Diverse Charter Schools: A Toolkit for Charter School Leaders,” National Charter School Resource Center (January 2017)


NOTE: The State Board of Education of Ohio on December 13, 2016, debated whether to change graduation requirements for the Class of 2018 and beyond. Below are the written remarks of John Morris, given before the board.

Members of the Board,

Thank you for giving me a moment to offer testimony on behalf of the construction industry. Members of the industry sent me here to thank you for setting a new, higher bar with the Class of 2018 graduation requirements. We are excited that this board has supported maintaining high standards for graduating and earning a diploma in the State of Ohio. Members of the construction industry were very pleased when the phase-out of the Ohio Graduation Test was announced in favor of multiple end-of-course exams and the opportunity for an industry credential to help a student graduate. We expect this new system to be an improvement over the current one, which graduates many without the skills to succeed in college and continuously fails to introduce others to the hundreds of thousands of pathways to employment via industry credentials.

For many decades, industries such as construction and manufacturing enjoyed a steady stream of individuals coming directly from "vocational" schools with earned industry credentials and experience. The construction industry regularly took graduates and gave them advanced placement into level three of apprenticeship, only two years from full journeyman status. That luxury and system of education is now all but dead and gone. The increased emphasis on a pre-college curriculum for all has severely diminished the ability of high school students to learn a skilled trade while in school, and the elimination of middle school "shop" classes has reduced the number of students who learn at a young age of their God-given talents in the trades. You see, when I was in sixth grade, I learned in shop class that I was not going to be a carpenter when I struggled to build the birdhouse. I knew I would not be a welder or plumber when I could not properly seal two metal objects together, but I was one who could wire a lamp and do the math needed to calculate wattage. When I struggled in high school, I chose to become an electrician. I owned my own company by the age of 28. I paid my way through college debt-free thanks to a trade. I now hold multiple master’s degrees and taught economics at the University of Cincinnati. An industry credential and a skilled trade opened the door to opportunity for me, and they can for many others.

The current graduation requirements that offer alternative pathways through industry credentials are perfectly designed to fix a broken system that mistakenly tells every child who gets a diploma that they are college-ready. We all know this is not the case; yet we’ve heard that many are already talking about making the tests easier and/or reducing the number of points required to earn a diploma, rather than pressing schools to work with industries like construction to help steer students away from choices that lead to certain college failure and the accumulation of unforgivable debt. We applaud this board and the Ohio Department of Education for creating a system that includes pathways to graduation via industry credentials, and we sincerely hope that you hold onto the idea of higher standards for graduation. We all know that not everyone should go to college, and industry credentials offer an alternative pathway to success in life—one without failure and debt. The only way school districts will pursue these pathways is to maintain the system as it is. Do not make their jobs easy. Keep the standards as they are written and make the superintendents do what they should be doing anyway—working to find a path to success for all students, not just those on a college path. Construction professionals stand ready to help school districts build pathways to credentials in our industry and are pleased to offer Ohio students a debt-free pass to a lifetime of earnings through a skilled trade.

Mr. Morris is president of the Ohio Valley Construction Education Foundation, based in Springboro, Ohio.


We ring out the old Ohio education news from 2016 and ring in the new

At the end of November, we asked you—our loyal Ohio Gadfly readers—to tell us what you thought were the top education stories for 2016. The choices were numerous and we appreciate all of the responses. In the spirit of “ringing out the old,” we give you the Top 5:

  1. House Bill 2 (HB 2): It is difficult to overstate the importance of this wide-ranging reform of Ohio’s charter school policies, which went into effect in February of this year. Almost immediately, we observed “HB2 effects” rippling throughout the sector, particularly in terms of sponsor decision-making around school closures. Additionally, “sponsor hopping” (in which schools seek out the sponsor of least resistance when anticipating a contract non-renewal) disappeared virtually overnight. The new, rigorous sponsor evaluations strengthened by HB 2 were completed in October (more on these later). Befitting the top placement for this story in 2016, there is much more to say. Stay tuned to the Ohio Education Gadfly for our detailed analysis of the early implementation of HB 2, expected in the New Year.
  2. ECOT vs. ODE: Ohio’s largest online charter school was embroiled in a lawsuit with the Ohio Department of Education for much of 2016. In simplest terms, the dispute centers on the manner in which e-schools track student attendance and on scaled-up requirements from the state to more precisely document not just student “learning opportunities” but actual participation in class as measured by hours logged in. The department’s attendance audit of ECOT found that just 6,312 pupils could be documented as full-time students, even as the school received funding for over 15,000 students. ECOT may have to return $60 million in state funding. In response, the school has fought the audit findings in court and in the court of public opinion. Expect to hear more about this issue in 2017.
  3. School report cards: Ohio released not one but two school report cards in 2016. Due to the delays in receiving the 2014-15 PARCC scores, the state released that year’s data in late February, roughly six months in arrears. In September, Ohio published school report cards again—this time on its usual schedule—for the 2015-16 school year. The upshot from both years’ data: Ohio’s more challenging standards meant that fewer students were deemed “proficient” on state exams. For some, this bitter pill has not been easy to swallow, but it has been a necessary shift. For far too long Ohio created the false impression that the overwhelming majority of students were proficient on state tests—“doing just fine”—even as one in three college freshmen fell into remedial education. A more honest appraisal of student achievement is now coming into view, and we offer our kudos to policy makers for holding the line on higher proficiency standards.
  4. Charter sponsor evaluations: HB 2 puts sponsors (aka authorizers) at the front and center of Ohio’s charter reform efforts. As many know, sponsors are the entities tasked with overseeing the financial, academic, and operational performance of charter schools and holding them accountable when necessary. Sponsors underwent a new and rigorous evaluation, the results of which came out in October. Five sponsors were rated Effective; thirty-nine were Ineffective; and twenty-one were rated Poor, the very bottom category. No sponsors received the state’s top rating, Exemplary. Sponsors rated Poor are no longer allowed to sponsor schools, while Ineffective sponsors face consequences as well, albeit less severe. The sponsor evaluation system is not perfect and needs some tweaks—among them, making compliance less burdensome and ensuring that student growth comprises a larger portion of the academic grade. But given Ohio’s historic problems with loose vetting of schools, sponsor hopping, and an overall poor track record, the new evaluations for sponsors—and sanctions tied to ratings—are a necessary step on Ohio’s road to charter improvement.
  5. High school graduation debate: Ohio is phasing in tougher graduation standards in the form of more challenging end of course exams starting with the Class of 2018. (Goodbye OGT!) An analysis by the Ohio Department of Education indicates that as many as one third of current high school juniors are not on track to meet those requirements and may not receive a diploma at the end of their senior year. This realization triggered an outcry among school officials clamoring for a lowering of the new requirements. It fell to the State Board of Education in its final meeting of the year to listen to testimony, discuss, and decide whether any changes would be made. In the end, the board decided to authorize a workgroup to study the issue and report back in four months’ time. Speaking of testimony, we have an idea that could ease some of the tensions: Check it out here.

If there is a theme to this year’s top education stories, it has to be increased expectations and accountability among all actors in Ohio’s K-12 education sector. We believe this is good news for Buckeye students and parents. State policymakers must stay the course and keep student success—genuine success that signals true college or career readiness—as the goal, even though holding firm will inevitably cause short-term angst. If high expectations remain, 2016 could well be the year that future analysts remember as the start of Ohio’s education renaissance.


In late 2016, we at the Ohio Gadfly asked for your predictions on the most important education issues of 2017. Here were your prognostications, along with—as you might expect from us at the Gadfly—commentary on how we hope these debates will unfold in the year to come.

Number 5: School accountability

It’s no surprise to see school accountability on our readers’ list of big issues for 2017. In the coming year, Ohio will submit a revised plan for accountability under the new Every Student Succeeds Act. Fortunately, the law doesn’t require the Buckeye State to undertake a major overhaul of its accountability policies. Ohio can and should stay the course on its key policies (with minor adjustments; see number 2 below). For instance, policymakers should maintain the use of a performance index and student growth measures or value added; they should also preserve a transparent A-F grading system. As Ohio’s ESSA plan is reviewed and debated, policymakers must ensure that accountability policies uphold high expectations for all pupils and offer clear information on school quality.  

Number 4: E-schools

With over 30,000 Ohio pupils attending virtual charter schools, the Buckeye State has one of the largest e-school sectors in the nation. Online learning can be a valuable alternative for many students: Some may need the flexibility and safety that e-schools afford, while others might find their own home to be a more productive learning environment than a crowded or disrupted classroom. Nevertheless, full-time e-schooling appears to be falling short of its full potential. Rigorous research has found that the average student loses academic ground when he enrolls in a virtual school, without regard to the academic level at which he started. Meanwhile, Ohio’s largest e-school has been embroiled in a lawsuit with the state over how it verifies student attendance. So what’s next for e-schools? It’s hard to say for certain, but even their harshest critics are likely to admit that online learning is here to stay. This means that policy makers need to focus on creating the conditions that can help all of Ohio’s online learners thrive. (See here and here for some policy ideas.) Moving forward, it will be important to heed the advice of Michael Horn who writes in Forbes: “Harnessing their [e-schools’] benefits while reigning in their downsides is critical.”

Number 3: The new federal administration

On January 20th, Donald Trump will be inaugurated as the forty-fifth president of the United States. As has been widely reported in the media, Trump has nominated Betsy DeVos to be U.S. Secretary of Education. What is clear is that DeVos is a staunch supporter of school choice, including vouchers and charter schools, and that Trump voiced his own support for choice on the campaign trail. But how their leadership will affect Ohio’s education policies is much less clear. The Buckeye State is already home to several robust choice options (including both charters and vouchers), and recently received a large federal grant to support the replication of high-quality charters. It also remains far from clear how a federal private-school choice program would work, or whether it would even be desirable. As analyst Joy Pullman of the conservative web magazine The Federalist recently told Time, “If DeVos and Trump love school choice, and the children it benefits, they will keep the federal government far, far away from them.” For those interested in what the new administration might do in the area of choice, tune into Fordham’s January 18th event, “A New Federal Push on Private School Choice? Three Options to Consider.” 

Number 2: Every Student Succeeds Act (ESSA)

As noted above, Ohio has a strong school accountability framework, and the bulk of it will likely remain in place under ESSA. Still, policymakers should use the ESSA state plan to make an important adjustment to accountability. They need to place more weight on student growth measures or value added, a more poverty-neutral measure, in the overall school-grading formula. (Overall A-F grades will come online in 2017-18, though they are already used for sponsor evaluations.) Under the current framework, the state will label almost all high-poverty schools as failures—Ds and Fs—due to the overemphasis on measures correlated with demographics, such as proficiency. Regrettably, this type of system fails to distinguish high-performing, high-poverty schools from true failures. This is unhelpful—even misleading—to local leaders and families who rely on report card grades to make decisions. It’s also unfair to the terrific educators at Ohio’s finest high-poverty schools who, in all likelihood, will also find themselves stuck with low ratings. A few other states such as Arkansas, Colorado, Idaho, and Oregon have implemented summative ratings that place at least half the weight on student growth measures. Ohio should follow their lead, and the ESSA plan would be the perfect opportunity to recalibrate report cards and ensure a proper weighting on growth.

Number 1: Charters and choice

According to Ohio Gadfly readers, charters and choice will be the hottest topic in 2017. As these matters take center stage—if they aren’t already—it will be essential to focus on quality and results. If public charter and private schools of choice are to flourish over the long run, they need to offer students and families a higher-quality experience than the conventional options. And to win the confidence and trust of policymakers, taxpayers and the general public alike, they must consistently demonstrate positive pupil outcomes. Some Buckeye schools of choice have proven themselves, but others are mediocre or worse. As Ohio makes its long overdue turn to quality, policymakers will need to uphold accountability, including the House Bill 2 charter reforms, while also supporting the growth and replication of high-performing schools. If they can achieve these goals, we should start seeing the promise of choice begin to be fulfilled in 2017.

* * *

Like our erudite readers, we anticipate that these five issues will dominate Ohio’s policy discussion in 2017. But if last year taught us anything, it’s to prepare for the unexpected! We look forward to keeping you posted on these issues and more in the New Year.


NOTE: The State Board of Education of Ohio is today debating whether to change graduation requirements for the Class of 2018 and beyond. Below are the written remarks that Chad Aldis gave before the board today.

Thank you, President Gunlock and state board members, for allowing me to offer public comment today.

My name is Chad Aldis. I am the vice president for Ohio policy and advocacy at the Thomas B. Fordham Institute, an education-oriented nonprofit focused on research, analysis, and policy advocacy with offices in Columbus, Dayton, and Washington, D.C.

High school diplomas are supposed to signal whether a young person possesses a certain set of knowledge and skills. To its credit, Ohio is phasing in new graduation standards that will do that by better matching the expectations of post-secondary institutions, employers, and our armed forces. The new standards ask our young people to demonstrate readiness by passing end of course exams (EOCs), achieving a remediation-free ACT or SAT score, or earning an industry credential.

After years of low graduation standards, Ohio’s new requirements are a major step in the right direction. We need to set the expectations high for the young men and women who will become our business and civic leaders; scientists and engineers; teachers and law enforcement officers; and leaders in many other professions.

I’m not here to say that the decision on graduation standards is an easy one. No one should withhold a diploma from a deserving student. But at the same time, the state shouldn’t award meaningless credentials either. As you debate these issues, I would like to offer two suggestions.

First, I urge the board to exercise patience and not make a rash decision to adjust graduation standards. Over the past few years, we as a state have done much work to signal to families, taxpayers, and students that Ohio is raising expectations for all students. Backtracking prematurely on graduation standards would send the exact opposite message.

Let’s also keep in mind that many students are stepping up to meet these higher expectations. According to last month’s presentation by the Ohio Department of Education, 65 percent of the class of 2018 is on track to meet the EOC requirements. This doesn’t even include students who will graduate via the career and technical pathway or will achieve the necessary EOC exam points after retakes. At the very least, we should see what the data look like after this school year before deciding whether changes are absolutely needed.

A bigger question is whether lowering the number of EOC exam points necessary to graduate is the right way to address the problem. While it’s one of the few public policy solutions available to you as the State Board, it’s a rather blunt instrument that could reduce the incentive to get every student ready for success after high school. Moreover, once lowered, the bar will be incredibly difficult to raise.

Over the longer term, it might make more sense for state leaders to work towards a multi-tiered approach to awarding diplomas. This would help to ensure that all hard-working students receive the credential they need to take their next step in life, while also creating an incentive structure that encourages our young men and women to aim for higher goals. A tiered approach would build on the honors diploma that Ohio already has in place, and could work like this.  

At a base level, Ohio could create a standard-issue diploma—offered by either the state or local school districts—signifying that pupils have met their coursework requirements but demonstrated only basic-level skills on exams. One step up could be a diploma indicating college and career readiness. To earn this credential, students would need to meet rigorous benchmarks on end of course exams (probably 18 to 21 points) or college admissions exams, or complete a demanding industry certification. This diploma would line up closely with Ohio’s new graduation requirements—and perhaps even be a little more challenging. At a third level, the state could award a prestigious diploma geared to the expectations of our most selective colleges and universities. To enhance their value and offer incentives to students, the state could even tie merit scholarships to the top two diplomas. Through continued transparency, school districts would retain an incentive to get as many students as possible to the college-and-career-ready and honors diplomas.

A tiered approach to awarding diplomas recognizes that different students leave high school with different sets of knowledge and skills. But one might ask whether it would lower expectations for certain pupils. In my view, it would not: Every student in Ohio would have the opportunity—and hopefully the incentive—to aim for a diploma that is aligned with her long-term goals. On the other hand, lowering the number of points required on EOC exams would lower expectations for all Ohio students.

For some young people, attaining the standard Ohio diploma will be something they and their families celebrate and cheer. It will also open doors to employment and post-secondary education. For others, meeting the demands of the prestigious third-tier diploma will be a source of tremendous pride, and put them on the pathway to leadership in our highest-need career fields.

Ohio is taking an important step forward in raising its graduation standards. I encourage the board to exercise patience and not rush into a decision. At the same time, let’s explore a different approach to awarding diplomas—one that can incentivize students from across the entire academic spectrum to reach their full potential.  

Thank you for the opportunity to offer public comment.


Back in 2012, the National Association of Charter School Authorizers (NACSA) began evaluating and ranking state charter laws based on eight policies they consider “cornerstones of charter school excellence.” These policies—quite reasonable in our view—are based on the principles of access, autonomy, and accountability and require each state to:

  1. Have at least two high quality authorizers, one of which is an alternative to the local district
  2. Endorse national professional authorizer standards
  3. Evaluate authorizers regularly or as needed
  4. Sanction authorizers that do not meet professional standards or that oversee persistently failing schools
  5. Require authorizers to report annually and publicly on the academic performance of each school they oversee
  6. Require authorizers to maintain charter contracts with academic performance expectations, and encourage high-performers to replicate
  7. Maintain strong renewal standards that permit authorizers to hold schools accountable for failing to meet academic expectations
  8. Close charter schools that perform below an established threshold of academic performance

In its latest annual state policy analysis, NACSA notes a few key changes from last year. Michigan, for instance, gets props for establishing default closure for schools that perform beneath a minimum threshold. Missouri smoothed the way for high-performing charters to replicate, and now mandates annual charter school performance reports. Kudos also go out to Washington for the advocacy campaign that restored the state’s charter law, and to New York’s courts for upholding authorizers’ right to implement “a strong standard of renewal.”

As for rankings, the results are largely similar to the previous year’s report. The forty-four states with charter laws were evaluated based on the eight cornerstone policies, and could earn up to thirty-three points on NACSA’s rubric. The top three states—Indiana, Nevada, and Washington—tie for first place with perfect scores; Ohio and Alabama follow with thirty-two and thirty-one points, respectively.

The report notes that while some states, like Nevada, have risen in the rankings after adopting NACSA’s recommended policies in response to concerns over sector quality, other states, like Alabama, have newer charter laws and thus don’t yet have evidence of implementation outcomes. And it’s fair to point out that some states with great charter school outcomes—like New York—get only middling scores from NACSA—proof that not all that matters can be easily measured.

The lowest-ranking states—Oregon, Wyoming, Alaska, Maryland, Virginia, and Kansas, in that order—earn between zero and five points, with the last-place Sunflower State the only one to receive a glaring goose egg.

By far the most informative part of the report, though, is the set of individual state profiles, with their detailed breakdown of each state’s status. These profiles don’t just show points earned, overall score, and ranking, but also include a description of the state’s charter landscape, a comparison between last year’s and this year’s scores, and specific recommendations to improve the quality of charter school oversight in that particular state.

Overall, NACSA’s latest report is a useful analysis that both identifies problems and offers solutions. Despite their focus on improving charter law, however, the authors are careful to admit that good policy is only part of the equation—committed advocates and rigorous implementation are also vital.

SOURCE: “On the Road to Great Charter Schools: State Policy Analysis 2016,” National Association of Charter School Authorizers (December 2016).


The Data Quality Campaign (DQC) recently released its analysis of state report cards—the annual summations of student and school achievement data that states are required to make available to the public under the Every Student Succeeds Act and, previously, No Child Left Behind—to determine whether they’re easy for parents and other community members to find and understand. The authors examined report cards in all fifty states and D.C. for content, presentation, accessibility, and clarity, using more than sixty data points altogether.

Unsurprisingly, they found that most were too difficult to find and interpret. Nineteen states maintain labyrinthine department of education websites that require three or more clicks to arrive at the report card after a simple Google search. Once found, the report cards often feature confusing layouts, organization, and jargon that make the information difficult to interpret. For example, across the fifty-one jurisdictions, the authors found more than five different terms referring to students from low-income families.

Over a dozen were also out of date. Only four state report cards contained all the student performance data that was first required fifteen years ago under No Child Left Behind, and ten states’ latest assessment scores were from the 2012–13 or 2013–14 school year.

More specifically, twenty-three state reports failed to include school quality measures other than test scores, such as attendance or graduation rates. Thirty-eight omitted student growth data, either because they don’t track them or don’t report them. And not a single one provided readers with school funding data.

On the bright side, some report cards, such as Ohio's or Washington, D.C.'s, were deemed to be of high quality and ought to serve as examples for other states. They provide valuable information with simple layouts and minimal text. And others, including Minnesota’s and Wisconsin’s, even contain interactive pages that allow users to compare data points and graphs.

Overall, however, the DQC asserts that the information provided by most state report cards is insufficient and suffers from a lack of transparency. It encourages states to seize the opportunity provided by the Every Student Succeeds Act to design more accessible and useful annual report cards that provide community members with the information needed to make important decisions and improvements. As we at Fordham have argued, easily interpretable data are crucial for those working to enact reform, ensure accountability, and provide all students with the education they need to succeed.

SOURCE: “Show Me the Data: State Report Cards Must Answer Questions and Inform Action,” The Data Quality Campaign (December 2016).


Following the lead of our D.C. colleagues, we totted up the most-read articles posted on Ohio Gadfly Daily in 2016.

The Top Five editorial posts are a microcosm of the issues we address regularly in an effort to advance educational excellence in a very real way here in the Buckeye State:

1. House Bill 420: Opting out of accountability by Jamie Davies O’Leary (published January 25)

At the height of the pushback against Common Core-aligned testing in Ohio, HB 420 was born. It would have allowed schools and districts to exempt from certain accountability measures those students whose parents opted them out of taking standardized tests. We cautioned against the inadvertent deterrent effect on testing participation and the erosion of the state’s accountability system.

2. How will ESSA change Ohio’s school report cards? by Jessica Poiner (published June 13)

Ohio’s accountability and report card system was reasonably robust before the advent of the Every Student Succeeds Act (ESSA), but as we discussed in detail back in June, the myriad new reporting requirements would engender a number of changes for the Buckeye State to be in compliance with ESSA. Our point-by-point analysis is a must-read for anyone who wants to know what the state’s report card system will look like in the ESSA era.

3. ‘Elected’ school boards and the dangerous illusion of democracy by Aaron Churchill (published March 3)

Spurred by the suggestion that elected school boards are de facto “better” than appointed charter school boards, we dug into the data around that little slice of democracy known as the school board race and were not impressed.

4. The problem with graduation rate statistics by Aaron Churchill (published June 13)

The assertion that one school (or school type) should be judged more harshly because of its low graduation rate is missing the point, we argued in this piece. In fact, the way graduation rates are calculated in Ohio can mask important details about student mobility and its effects on both the “sending” and “receiving” schools’ graduation rates.

5. Where are Ohio’s teachers when we need them? by Jamie Davies O’Leary and Elaine Laux (published August 8)

Data released mid-year from the Civil Rights Data Collection (CRDC) indicated a huge problem with teacher absenteeism across the country. Ohio was no exception, and this dig into the data from our largest cities pointed out troublesome truths.

* * *

We hope you will visit Ohio Gadfly Daily throughout 2017 as these issues and more take center stage again. 


We look at the pros and cons of Ohio's charter operator report cards, try to ease the growing pains of College Credit Plus, review a holiday grab-bag of education policy news stories, and more

Most Ohio Gadfly readers know that we typically offer in-depth commentary one topic at a time. This tendency assumes (pardon the holiday metaphor) that one huge present is preferred—like the Lexus tied up in a bow. We recognize that other folks might prefer a bundle of gifts. So, for those yearning for a little more diversity in their inbox, this one is for you. (No white elephants, we promise.)

A win on ESSA accountability

In late November, the U.S. Department of Education released its revised and final regulations on school accountability under the federal Every Student Succeeds Act (ESSA). In a victory for high achievers, the feds made it crystal clear that states are permitted to use a performance index—as Ohio has long done—as an indicator of student achievement. Regrettably (see here and here for why), the previous draft regulations would have likely forbidden performance indices and forced states to use proficiency rates instead. Now it’s full steam ahead on the performance index as Ohio drafts its ESSA state plan.

Information in the palm of your hand

Kudos to state leaders who are making Ohio’s report card data useful and accessible to policy wonks and the general public alike. A recent Data Quality Campaign (DQC) publication spotlights Ohio’s school report cards as exemplars for providing “data that is valuable to my community” and displaying clear information. The report also notes that Ohio’s database houses the large majority of the data elements DQC deems important for public review (fifteen out of twenty-three). Meanwhile, the Ohio Department of Education last week launched a smartphone app that lets users receive updates and check out school report cards anywhere, anytime. Is your local school board member—or maybe real estate agent—waxing poetic on how lovely the schools are? Now you can get the lowdown on the data and see for yourself. As President Reagan once said, “Trust, but verify.”

No thanks on the similar students measure

In the recent charter reform legislation, state lawmakers ordered the Ohio Department of Education to “conduct a study to evaluate the validity and usefulness of using the ‘similar students measure.’” In a report issued in late November, the Department concluded after said study that the measure was “neither valid nor useful” for use in Ohio’s accountability system. The measure, pushed by a charter advocacy group and ECOT, adjusts a school’s achievement rate depending on its demographics (for more, see here). One of the central problems, however, is that the measure would set lower proficiency expectations for disadvantaged children. As Chris Woolard, the Department’s accountability chief, told the Columbus Dispatch: “Our system right now has high expectations for all students. This [measure] violates that basic principle that we want all students to be able to succeed.”

Auditor Yost on inter-district open enrollment

Ohio Auditor of State Dave Yost recently released a report on the fiscal impact of inter-district open enrollment. The main takeaway: Districts should weigh the costs and benefits of accepting additional pupils via open enrollment. Under state law, districts are not obligated to accept open enrollees—though state funding follows students, offering districts a financial incentive to do so (no local dollars transfer, however). According to his cost-benefit calculations for four Northeast Ohio districts, one posted a net loss of $1,282 per incoming open enrollee, while another gained a whopping $4,563 per open enrollee. The fiscal impact, as the Auditor explains, depends in large part on capacity. When a district has “empty seats,” the cost of educating an open enrollee is minimal—teachers wouldn’t need to be hired, for example—but it would gain the funding tied to the student. The reverse might be true when a district is at or near capacity: Marginal costs could exceed the benefit. The Auditor understands the finances of open enrollment, but this analyst at least wonders whether economic concerns could be used as an excuse by public schools to not accept all comers. (“Sorry kid, we just don’t have the capacity.”) This raises a couple of questions: a) just how many districts in Ohio are at full capacity, including suburban ones that prohibit open enrollment altogether; and b) for districts without excess capacity—but facing increasing demand—should the state support expansions, so they are not turning away students?

AP scores of 2 = proficient?

In an amendment to Senate Bill 3, a deregulation bill that passed last week, state legislators added language that would deem an Advanced Placement (AP) score of 2 equivalent to proficiency on certain state end-of-course exams (EOCs). Ohio high schoolers may substitute AP test results for EOCs in the following content areas: US History, US Government, and science (substitutions are not allowed in math or English). This raises an eyebrow, because an AP score of 2 is typically considered mediocre—the second lowest on AP’s 1-5 scoring scale. It’s a score that colleges and universities won’t accept for course credit—a minimum 3 or 4 is required. In addition, an AP score of 3 is needed for schools to earn credit on Ohio’s Prepared for Success report card component. It may be true that an AP score of 2 is technically a closer equivalent to EOC proficiency than a 3 (AP tests are likely more difficult), but it does seem peculiar to call an unsatisfactory AP score “proficient.” Did the student demonstrate proficiency in the AP course? According to the test results, it’s not clear she did. Or maybe this predicament calls into question the notion that different standardized tests are so easily substitutable. 

We hope you enjoyed this package of ed news gifts. Stay tuned in the New Year as we continue to track these stories and much more!


One of the big Ohio education stories of 2016 was the growing popularity of College Credit Plus (CCP), a program that provides students three ways to earn college credit from public or participating private colleges: by taking a course on a university campus; at the student’s high school where it’s taught by a credentialed teacher; or online. Many students and families have found that the program saves them time and money and provides valuable experience. For families with gifted or advanced students, it is a chance for acceleration even as early as seventh grade; for students in high-poverty rural and urban areas, it may be the only way to take high-level courses in basic subjects, let alone electives.

Before registering, students in grades 7-12 must be admitted to the college based on their readiness in each subject they plan to take a class in—a decision made by each higher education institution and determined by GPA, end-of-course (EOC) exam scores, and other available data. Once admitted, students can register for any course the school offers, except for those that are considered remedial or religious. (The latter restriction is presumably intended to keep church and state separate while a child is enrolled in a public school.)

Most of the media coverage of the growth of College Credit Plus has focused on its cost, but in October, the state released an overview of preliminary information gathered during the first year (and part of the second year) of the program. Here are a few of the most interesting data points:  

  • During the 2015-16 school year, over 52,000 students took classes from 23 community colleges, 13 universities, and 35 private institutions of higher education in Ohio.
  • Participation varies by student race, with African-American and Hispanic pupils underrepresented when compared with their share of the grade 7-12 population.
  • Participation also varies by income level, though the data aren’t clear enough to draw conclusions (the economic status of 45 percent of CCP students is listed as “unknown”).
  • Unsurprisingly, most students took courses in the five main core content areas: English (24 percent), social sciences (18 percent), math (13 percent), science (13 percent), and arts and humanities (11 percent).
  • Just over 90 percent of courses taken by CCP students resulted in credits earned; 3 percent resulted in a failing grade, 2 percent in a withdrawal, and 4 percent had no grade reported.
  • The overwhelming majority of CCP courses were taken on high school campuses, and most were taught by a high school teacher. Student GPAs did not vary significantly based on location.

The preliminary data suggest a few areas that need attention as Ohio works to ensure that CCP is functioning as intended. 

Pay close attention to passage rates

Eyebrows should rise at the news that over 90 percent of courses taken by CCP students resulted in credits earned. Other data points—such as state test scores and ACT scores—show a troubling lack of proficiency that one might expect would translate into a smaller percentage of students earning credit. Similarly, average scores on Advanced Placement (AP) exams indicate that far fewer than 90 percent of AP courses taken result in college credit earned, or even in what the College Board terms a “qualifying score” (3 and up). Why, then, are CCP’s passing percentages so high?

One reason may be that CCP’s eligibility requirements permit only college-ready students to enroll. Because enrollment is restricted to students who have demonstrated college readiness, usually through widely accepted measures such as the ACT and Compass, the readiness requirement may be acting as an effective gatekeeper that keeps passing percentages high. But this could also be considered suspect (i.e., the requirement may be a so-low-as-to-be-meaningless bar) considering how many of the participating post-secondary institutions are “open enrollment” campuses. It’s also possible that CCP courses—most likely those taught on a high school campus by a secondary instructor—just aren’t rigorous enough. Passing these courses is determined by the teacher, not by an external review such as Advanced Placement uses, and we live in an era of grade inflation.

How to ensure that courses taught on high school campuses are rigorous?

The majority of CCP courses (nearly 61 percent) were taught on a high school campus by a secondary instructor—an educator who is already teaching at the high school but has earned additional credentials. Although the state data showed that student GPAs varied only slightly based on class location, that’s no reliable gauge of course rigor. The state’s report notes that “monitoring quality and participation when the course is taken on the high school campus” is an “item to discuss.” Policymakers should talk to representatives from K-12 and higher education for ideas on how to maintain rigor, including how best to train secondary teachers to teach post-secondary classes and to evaluate student work by post-secondary standards rather than K-12 criteria.

Maintain entrance requirements for students

Although some folks have bemoaned the challenges that students face in qualifying for CCP, the college-readiness restriction is critical for two reasons. First, it ensures that only students who are academically prepared for the rigors of college are able to participate, a requirement that, if forcefully and dutifully applied, should prevent students from the double-whammy of a failing grade on both their high school and college transcripts. Second, students who are ineligible for CCP one year can still become eligible the following year if they are able to demonstrate that they have achieved college readiness; this could provide students with more motivation to work hard to reach the bar. A college freshman who isn’t college-ready, on the other hand, has no options except expensive, non-credit-bearing remedial courses. Still, we must keep in mind the softness of a “college readiness” criterion when determined and applied by an open-access college.

Keep CCP and co-requisite remediation separate

A bill now before the General Assembly (House Bill 474) would create a “CCP Co-requisite Remediation Pilot Program.” This would aim at high school seniors in need of remediation in math and English by allowing them to “simultaneously enroll in a remedial course and an introductory college course in the same subject area, or enroll in an introductory college course that incorporates remedial curriculum.” Whatever the merits (and weaknesses) of co-requisite remediation, it’s illogical to push college-level remediation into high school via a program that is, by law, intended only for college-ready students. Remediation on college campuses occurs because students didn’t learn what they needed to learn in high school. Why, then, would the state allow students who are still in high school—and thus still have a chance to prepare for college prior to enrollment—to take college level work for which they are unprepared? Why not encourage high schools to do a better job preparing their students for college, rather than shoving those same students further along the path? The risks for students, whose grades in CCP courses appear on both their high school and college transcripts, are just too high.

Take a deep dive into the underrepresentation of minority students in CCP

Participation gaps aren’t only a CCP problem; AP courses face a similar issue. Although a ton of analysis has been done on AP participation gaps, it’s more difficult to diagnose the cause of CCP’s participation gaps (mostly for African American students) based solely on the information released by the state. Analysts should gather more information and investigate what could be causing the discrepancy. We can assume—based on state test scores—that too few minority students are prepared to qualify as college ready while still in high school. But there could be additional factors at play. Is more and better outreach needed in particular schools? Are some schools subtly discouraging participation or less likely to facilitate the high school-located classes intended to minimize transportation challenges? Answers to these questions could begin to narrow participation gaps.


CCP is new and we should expect glitches and growing pains. There are no easy solutions to all the problems that it faces, but the initial uptake by students suggests that College Credit Plus is worth continued attention and improvement.   


Ohio’s charter school reform discussions have mostly focused on sponsors—the entities responsible for providing charter school oversight. Overlooked are the important changes in Ohio’s charter reform law (House Bill 2) around operators. Operators (aka management companies) are often the entities responsible for running the day-to-day functions of charter schools; some of the responsibilities they oversee include selecting curriculum, hiring and firing school leaders and teachers, managing facilities, providing special education services, and more. (To get a sense of the extent of operator responsibilities, read through one of their contracts.)

Extra sunshine on operators has been especially needed in a climate like Ohio’s, where operators historically have wielded significant political influence and power not only with elected officials but even over governing boards. For instance, one utterly backwards provision pre-HB 2 allowed operators to essentially fire a charter’s governing board (with sponsor approval) instead of the other way around—what NACSA President Greg Richmond referred to as the “most breathtaking abuse in the nation” in charter school policy.  

HB 2 installed much-needed changes on this front, barring the most egregious abuses of power and greatly increasing operator transparency. The legislation required that contracts between charter boards and operators be posted on the Ohio Department of Education (ODE) website; that operators collecting more than 20 percent of a school’s funding provide a detailed statement of expenditures; that ODE post a simple directory of operators so the public could know which operators were affiliated with which charter schools—information surprisingly difficult to come by outside of inside charter circles; and that ODE publish an annual academic performance report for operators. These new provisions were at once somewhat obvious, yet revolutionary. Such is the Ohio charter story. 

The new performance reports are out, and that’s a great step forward for Ohio where public information on operators has been historically lacking. But the reports are disappointing in their lack of depth and breadth. The image below shows one report in its entirety; fifty-three operators received a similar half-page report delineating academic performance, attendance, student demographics, and staffing data.

Here are a few observations about the reports and where they could be improved.

  • Operators are not matched with their affiliated schools. This information is available by viewing a separate spreadsheet on ODE’s website, but each management company’s schools should be listed within the report card itself to provide context. Readers should not have to search through multiple spreadsheets and documents to piece this information together.  
  • There are no data on individual schools. Along with the charter schools run by each operator, the performance report should provide key report card ratings for each school. What good is a report card that lists a score for “Center for School Improvement, LLC,” an operator with no known website, without knowing which schools it oversees or how they each perform in key areas like performance index and growth? District report cards contain links to their schools’ ratings; so should operator reports.
  • Academic ratings don’t effectively differentiate quality because almost every operator received a low rating. Nine operators received a “0” academic rating; seventeen received a “1” and five received a “2.” It appears that the scores (1-5 correlating with an A-F scale) were calculated in the same manner as academic ratings for sponsors. (The report does not include a methodology for calculating the operators’ academic rating.) If so, that means that student growth was counted as just 20 percent of the overall score. That’s a problem, because the other indicators composing the score are highly correlated with students’ socioeconomic backgrounds. Overall low ratings among charter operators are primarily a function of the fact that they serve so many at-risk students. The same would and will be true for traditional urban public school districts should the state calculate them in the same manner. The system fails to meaningfully distinguish between some of Ohio’s best operators—networks that get poor students who are behind grade level and move them to performing above the state average, like United Schools Network—and some of its lackluster ones. That needs fixing.
  • There is no distinction between for-profit and non-profit charter management companies. Charter opponents tend to paint the charter sector in broad brush strokes. They often generalize about the “privatized” or “corporate-run” charter industry while failing to acknowledge that a fair number of schools in Ohio contract with non-profit management organizations. Many choice critics seem to genuinely misunderstand the distinction or be unaware of which entities are which. A designation of non-profit versus for-profit status on each operator’s report card could help improve public understanding and either confirm or dispel people’s preconceived notions.
  • The reports focus heavily on inputs. This includes a plethora of data on teachers and staff while at the same time providing hardly anything about actual performance. Readers can see the number of music teachers staffing an operator’s schools, but have no idea which schools they are or how they perform. Student enrollment numbers are not even provided—they should be.
  • It isn’t clear how operator is defined. Fifty-three operators are listed in ODE’s operator database elsewhere, but only forty-nine received a report card. Why? The recent competitive facilities grant award for top-performing charter schools listed some eligible operators (defined earlier this year by ODE and the Ohio Facilities Construction Commission as such), yet not all of those operators received report cards. It’s unclear how the state is defining what constitutes an “operator” or why this definition would differ from the facilities grant eligibility list or from its own master spreadsheet.
  • Expenditures per pupil are listed, but lack context. Those figures range from $1,899 to $10,880 across operators, with no information on overall revenue and expenditures or school-by-school breakdowns. To be fair, HB 2 required any management company earning more than a 20 percent fee from a school’s annual gross revenues to provide a more detailed financial accounting, which includes information on salaries, wages, benefits, utilities, buildings, equipment, and more. But to the best of my knowledge, this information isn’t publicly available yet—at least not in a way that is easy to find and navigate. That should change.

Ohio evaluates sponsors in significant part based on their schools’ performance, and these evaluations include detailed information about schools’ academic results as well as compliance with various rules and laws. For operators, however—entities that are actually running schools day to day, and in some instances collecting more than 90 percent of schools’ public funds—there is very little information.

HB 2’s operator transparency provisions are necessary to provide valuable information to governing boards, sponsors, taxpayers, the public, and parents more broadly. Taken together, the newly available information on operators is a step forward for Ohio’s charter sector, and ODE deserves credit for creating the first operator performance reports and doing it on time. However, there is still much room to improve the report. In the interest of transparency, Ohio should move toward a much more robust and detailed 2.0 version. 


One in seven Ohio adults ages 18-24 lacks a high school diploma and faces bleak prospects of prospering in our economy. According to the U.S. Census Bureau, dropouts earn $10,000 less each year than the average high school graduate, are almost twice as likely to be unemployed, and earn an average annual income of $20,241, which hovers just above the poverty line for a family of three in Ohio. Dropouts also drag down the Ohio economy; over the course of their lives, they consume an estimated $292,000 in public aid beyond what they pay in taxes.

To mitigate the number and cost of dropouts, Ohio has permitted the creation of ninety-four dropout prevention and recovery schools. Collectively, these schools enrolled sixteen thousand students in the 2015-16 year. They serve at-risk and re-enrolling students—pupils who previously dropped out but are now re-entering the education system—with the aim of graduating students who might otherwise slip through the cracks.

To hold these schools accountable for successfully educating at-risk students, Ohio has created an alternative report card. This report card assigns an overall rating of “Exceeds,” “Meets,” or “Does Not Meet” standards based on the school’s state assessment passage rate, graduation rate, ability to achieve progress from year-to-year, and the achievement gaps between student groups. Prior to 2012-13, dropout-recovery schools were rated on the same report-card indicators as all public schools.

Whether this new alternative accountability framework appropriately captures the success of these schools is up for debate. This past summer, a committee of legislators and civic leaders debated the definition of quality, heard from community members and school leaders, and reviewed the components of the current report card. The committee failed to recommend any changes (it had to meet a legislative deadline of August 1), though a new committee convened in November to continue this important work. (Disclosure: Fordham’s Chad Aldis has been named to this newly reconstituted committee.)  

Should the committee choose to maintain the state’s recently created alternative report card, some adjustments are needed to ensure that high-performing dropout-recovery schools are distinguished from schools that continually fail to improve the learning of at-risk students.

Attention should be paid to one component of the accountability rating in particular—the progress measure, which, generally speaking, gauges whether dropout-recovery students are making at least one year of academic growth. The vast majority of dropout-recovery schools appear to be falling short of growth expectations. In 2015-16, just seven schools exceeded the progress standards, eighteen met them, and an overwhelming majority—sixty-nine schools—failed to meet the state’s standard for academic progress. In the previous year (2014-15), only one school exceeded standards, thirty-three met them, and fifty-nine failed to meet growth expectations. Given these results, the committee should review the measure’s methodology and confirm that the norm-referenced group used to calculate student growth on the NWEA’s Measures of Academic Progress test is appropriate for dropout-recovery students. Ohio law requires dropout-recovery schools to use a norm-referenced exam, not state exams, to gauge student growth over time. Ensuring that we accurately and fairly capture student progress, especially for pupils who may be years behind, should be a high priority.

Additionally, the way Ohio evaluates graduation rates should ensure that schools are not punished for taking in students who “drop in” years after their expected four-year graduation window has passed. (Four-year graduation rates, along with extended rates—up to eight years—are included in the alternative accountability system.)

Many also take issue with dropout-recovery schools being measured against the adjusted cohort graduation rate, as dropout-recovery schools face the consequences of a student’s previous school passing them on from one grade to the next without accomplishing adequate academic progress.  A student’s transcript may report that they are in the eighth grade when they have really only mastered reading and math skills at the sixth-grade level. Yet, dropout-recovery schools are held accountable for graduating that student in four years.  Testimony from school leaders during the summer’s dropout prevention and recovery school study committee emphasized the time crunch schools feel as soon as students who are academically far behind step through their doors. School leaders should not face perverse incentives to reject or rush students through the curricula because they are on the hook to meet four- or five-year graduation rates.   

Finally, the performance standards should be adjusted to reflect high yet attainable expectations. Currently, for dropout-recovery schools to attain a “Meets” rating on the graduation rate measure, they must graduate just 8 percent of all eligible students in four years. This standard is much lower than that of traditional schools, which must graduate 84 percent of seniors to earn a C rating and 93 percent to earn an A on the four-year graduation indicator. To be fair, dropout-recovery schools’ overall graduation rate standard may have been set so low to account for the challenges these schools face in meeting typical adjusted cohort graduation rate timelines. Moving forward, the committee should evaluate the performance standards alongside their respective measures to ensure they are aligned and appropriately rigorous. Should the committee decide to phase in higher performance standards, it should also consider what supports, like re-engagement programs, could be implemented to help schools meet those targets.

As the new dropout-recovery committee works into 2017 to define what quality means for these schools, they should give thought to Ohio’s current system and explore ways to better distinguish high-performing from low-performing dropout-recovery schools. The quality of education for at-risk students, and by extension, Ohio’s long-term economic condition, is at stake.


From the latest issue of the journal Economics of Education Review comes a fascinating paper in which author Metin Akyol creates mathematical models that simulate the effects of private school vouchers on the overall education system. It is not a study of an actual voucher program, but instead a thought experiment meant to test whether both universal and targeted voucher programs can increase the efficiency of the education system as a whole. As strange as this may seem to lay readers, there is in fact a long history of such econometric analyses—and their findings are often worthy of consideration.

Akyol’s complex model can’t be fully explained in this short review, but some features are worth noting. It incorporates the findings of empirical voucher studies to increase its reliability. It simplifies the real world in an effort to find the signal in the noise. Every household therefore has only one child, and the hypothetical school district has neither magnet schools nor charters. And one of its defining assumptions is that more efficient public school spending is an effective proxy for increased educational quality. In other words, it presumes that the money saved by greater efficiency can be reinvested in ways that improve outcomes.

Regardless of how one feels about all this, the model ends up producing outcomes that are very similar to empirical findings regarding actual programs. In one important example, the positive effects on voucher-eligible students who do not opt to leave their district school (found empirically by David Figlio in Ohio) are predicted in Akyol’s targeted-voucher model. Public schools in the model are observed to “up their academic game” to retain students when voucher competition is introduced. Additionally, the model predicts that students lowest on the income scale will be least likely to use vouchers, even in a model where vouchers are universally available and not means-tested. This stands to reason, considering that real-world vouchers often fall short of full private school tuition. (To some extent, it was also borne out in Figlio’s research.) 

Also interesting is the difference in effects between a universal voucher program and a targeted one. Akyol ran models that replicate the prime goal of vouchers—make private schools affordable for more children—in two ways: by manipulating the voucher availability and by simply changing the family income distribution. The results were not the same. The universal voucher model led to an observable decline in “peer group quality” for those students at the lowest end of the income spectrum who did not take the vouchers. This decline in quality was also present to a lesser extent in the model where vouchers were targeted at low-income students. But it was absent in a model that gave high-ability students lower voucher amounts than their lower-ability peers. This appears to be the theoretical sweet spot that results in the most favorable overall outcome: more students were able to access private schools, and public schools felt the competition keenly enough to improve their academics for the students who remained.

None of this means we must redesign real-world voucher programs based on any one of the mathematical models presented in this paper. But to the extent that modeling can predict real-world outcomes, choice advocates and policymakers ought to consider the results, which can elucidate the potential benefits and challenges of particular voucher designs. If the incoming Trump/DeVos education department is going to prioritize vouchers as a means for improving education, Akyol’s mathematical models have at the very least led him to offer some sage advice: “…the outcomes of a voucher program hinge on its design.”

SOURCE: Metin Akyol, “Do educational vouchers reduce inequality and inefficiency in education?” Economics of Education Review (December 2016).


We offer solutions to Ohio’s high school diploma dilemma and its teacher evaluation enigma, and show how KIPP Columbus provides a pathway to success for its students.

KIPP Columbus achieves extraordinary outcomes for its students, predominantly students in poverty and students of color—a fact worth celebrating by itself. In 2015-16 in Ohio’s Urban Eight cities, KIPP Columbus was in the top five percent of all schools (district and charter) on student growth and among the very best (top 3 percent) in Columbus. But it’s not just KIPP’s academic data that are impressive. KIPP Columbus, led by Hannah Powell and a visionary board, has a rare knack for forging powerful partnerships at every turn—ones that strengthen KIPP students, their families, and the entire community near its campus. This year, KIPP launched an early learning center in partnership with the YMCA of Central Ohio to serve infants, toddlers, and preschool-aged youngsters. In a neighborhood lacking high-quality childcare and early learning opportunities, it’s an investment not just for KIPP students, but for the community at large. KIPP Columbus also partners with the Boys and Girls Club of Columbus, Battelle Memorial Institute, and other community organizations.

This profile is about KIPP graduate Steve Antwi-Boasiako, an immigrant and first-generation college student now attending Vanderbilt University, whose entire family has been uplifted by the school. His story illustrates the depth of KIPP’s commitment to students’ long-term success. The “KIPP Through College” program tracks data on what colleges turn out to be a good fit for students (many of whom are first-generation college-goers), assists families with applications and financing information, and even partners with 83 colleges and universities nationally so that cohorts of KIPP students improve their odds of successful completion. In a world where just nine percent of low-income students attain a four-year college degree, KIPP students’ attainment rate (44 percent) is truly remarkable, and even ten percentage points higher than that of the population at large (34 percent).

Steve was an inspiration to younger students while attending KIPP Columbus; he now attends Vanderbilt University.

We urge you to read Steve’s story. In it, KIPP Columbus’s mission to “prove what’s possible” truly comes to life. Indeed, his remarkable trajectory reminds us that when we set expectations high, students will not only meet them but often exceed them beyond our wildest hopes.


The Every Student Succeeds Act (ESSA) has put the future of teacher evaluations firmly in the hands of states. Ohio is now in full control of deciding how to develop and best implement its nascent system.

It should come as no surprise to folks in the Buckeye State that the Ohio Teacher Evaluation System (OTES) has significant room for improvement. Since its inception in 2009, approximately 90 percent of Ohio teachers have been rated in the top two categories and labeled “skilled” or “accomplished.” Unfortunately, there isn’t significant evidence that the system has impacted the quality of Ohio’s teacher workforce, perhaps because there is no statewide law that permits administrators to dismiss teachers based solely on evaluation ratings. Meanwhile, OTES also doesn’t appear to be delivering on the promise to aid teachers in improving their practice.

A quick glance at the ODE-provided template for the professional growth plan, which is used by all teachers except those who are rated ineffective or have below-average student growth, offers a clue as to why practice may not be improving. It is a one-page, fill-in-the-blank sheet. The performance evaluation rubric by which teachers’ observation ratings are determined doesn’t clearly differentiate between performance levels, offer examples of what each level looks like in practice, or outline possible sources of evidence for each indicator. In fact, in terms of providing teachers with actionable feedback, Ohio’s rubric looks downright insufficient compared to other frameworks like Charlotte Danielson’s Framework for Teaching.

Another problem with OTES is the way it incorporates student learning. Ohio law requires that all teacher evaluations include a student growth component, which consists of test results. For teachers with a valid grade- and subject-specific assessment, that means value-added measures. Unfortunately, only 20 percent of Ohio teachers can be measured using the state assessment, either in whole or in part.[1] Another 14 percent receive growth scores from a separately administered vendor assessment that increases the testing burden placed upon students and schools. The student growth component for the remaining 66 percent is based on locally developed measures that tend to be both ineffective and unfair: shared attribution, which evaluates teachers based on test scores from subjects they don’t teach, and Student Learning Objectives (SLOs), which are extremely difficult to implement consistently and rigorously and often fail to effectively differentiate teacher performance. In short, the state hasn’t quite figured out how to fairly evaluate all teachers using student achievement data.

A meaningful overhaul of Ohio’s system should aim to solve four significant problems. First, it should address the current framework’s failure to fairly evaluate all teachers. Second, it should do a far better job of differentiating teacher performance. Third, it should provide actionable feedback to all teachers. And finally, it must positively impact the overall quality of the workforce. Crafting a system that does all this is easier said than done. Fortunately, there’s evidence that focusing solely on a rigorous classroom observation cycle, rather than student growth measures, could be the solution.   

In a recent piece for The 74, Matt Barnum examined research on teacher evaluation work in Chicago, including an analysis of a pilot system that focused solely on classroom observations and the system’s impact on the labor market. The analysis found that the first year of the pilot resulted in an 80 percent increase in the exit rate of the lowest-performing teachers; the teachers who replaced exiting educators proved to be higher performing than those who exited. Overall, the findings suggest that evaluation systems based solely on rigorous observations of teacher practice can impact the quality of the workforce. This type of system would also remedy the biggest problem with Ohio’s evaluation structure, which is that current student growth measures unfairly evaluate teachers in many subjects and grade levels.

The second and third problems with the current system—effectively differentiating teachers and offering better feedback—are also improved by zeroing in on an improved observation cycle. In general, observations provide more detailed information about the complex job of teaching than a list of raw scores ever could. More information means more opportunities to pinpoint variances in performance, but only if the system uses a high-quality rubric and takes advantage of multiple perspectives by including outside observers and peer observers. Improving observer training and ensuring a mix of announced and unannounced observations is also important.

When it comes to offering better feedback, it’s widely acknowledged that teachers find evaluations most helpful when they're given actionable feedback on their practice. This type of feedback only comes from observation of practice. Plenty of other sectors understand this. Professional football teams prepare for their next opponent by studying game film. Players study their future opponents, but they also study their own performance from the previous game—the choices they made, what they could have done better, and what they need to continue doing. Teacher evaluations should offer the same opportunities. Teacher coaching, teacher collaboration (which research says can lead to student achievement gains), and peer reviews—all of which have been found to improve teacher practice—are only effective if they include rigorous observation of practice.

It’s true that assessment results are a form of feedback. But as a former teacher, I can attest to the fact that studying test results (value-added or otherwise) was hardly the most effective way for me to improve my pedagogy. I needed to know why my students did or didn’t do well, and that answer couldn’t be found on a data spreadsheet no matter how hard I looked. A far better use of my time, and the best way to make me a better teacher faster, would have been actionable feedback that came from observing my practice—which, after all, is what most impacted my students’ test scores in the first place.

In summary, research shows that evaluation systems based solely on rigorous observations of teacher practice can impact the quality of the teacher workforce. Research also shows that improving teacher practice can be done through observations conducted by well-trained observers using high-quality frameworks and rubrics. Taken together, it seems that one way to improve Ohio’s teacher evaluation structure is to pilot a system that focuses solely on rigorous classroom observations. Stay tuned for a detailed explanation of what such a system could look like.

[1] The 20 percent comprises teachers whose scores are based entirely on value-added measures (6 percent) and teachers whose scores are based partially on value-added measures (14 percent).


As a form of credentialing, high school diplomas are supposed to signal whether a young person possesses a certain set of knowledge and skills. When meaningful, the diploma mutually benefits individuals who have obtained one—it helps them stand out from the crowd—and colleges or employers that must select from a pool of many candidates.

In recent years, however, Ohio’s high school diploma has been diluted to the point where its value has been rightly questioned. One of the central problems has been the state’s embarrassingly easy exit exams, the Ohio Graduation Tests (OGT). To rectify this situation, Ohio is phasing in new high school graduation requirements starting with the class of 2018. Under these new requirements, students must pass a series of seven end-of-course assessments in order to graduate high school, or meet alternative requirements such as attaining a remediation-free ACT score or earning an industry credential.

The end-of-course exams have proven tougher for students to pass than the OGT, leading to concerns that too many young people will soon be stranded without a diploma. One local superintendent called the situation an “apocalypse,” predicting that more than 30 percent of high school students in his district would fall short of the new standards. He wasn’t alone, as an estimated 200 superintendents and school board members recently voiced their concerns at a Statehouse rally. An analysis by the Ohio Department of Education suggests that statewide, almost one in three pupils from the class of 2018 is not on a sure track towards a diploma.

This has put the state in a bind. On the one hand, in an era of heightened standards, no one wants to backtrack and hand out meaningless credentials. On the other hand, policymakers are right to worry about leaving thousands of pupils without a diploma. In today’s economy, such students are likely to struggle to find employment and will be unable to join the military.

Can Ohio move forward on high standards—even re-inflating the value of the diploma—without leaving young people behind? Yes, but it will take some rethinking about how the state awards its high school credentials.

The most reasonable alternative, also suggested by several prominent education analysts (including our own Checker Finn), is for Ohio to pursue a multi-tiered approach to awarding diplomas. This would help Ohio maintain high achievement standards in the face of pressure to lower them while also building an incentive structure that could push students to achieve at higher levels. Ohio already has an honors diploma for students who go above and beyond in their coursework—a good start. Yet the honors diploma does not rely on state assessment results nor is it widely recognized as a measure of the accomplishments of Ohio’s highest achievers.

Here’s how a beefed-up, tiered system of awarding diplomas could work. At the base level, Ohio could create a standard-issue diploma signifying that pupils have persevered through thirteen years of school—a certificate of completion, more or less. These students would have met their core coursework requirements yet fallen short of the stringent benchmarks of Ohio’s end-of-course exams. If we’re being frank, this is where Ohio has been with tying diplomas to the OGTs over the past decade or so (and perhaps also with its predecessor exam, the Ninth Grade Proficiency Test). One step up would be a college- and career-ready diploma that indicates students have demonstrated readiness, either by meeting rigorous academic targets on state exams or completing a demanding industry certification. This lines up more closely with Ohio’s new graduation requirements. Finally, the state could award a third diploma—a certificate of exceptional accomplishment—that the most academically able students receive. This diploma’s benchmarks could be geared to the expectations of the state’s most selective colleges and universities and would be a cause for celebration for students, parents, schools, and communities. The state could also tie the diploma to a merit-based college scholarship program.

A tiered approach would have several benefits. First, the state could maintain high expectations for all graduates, yet simply award various diplomas depending on whether pupils fell short of, reached, or considerably exceeded the end-of-course exam standards. Second, by issuing at least a basic-level diploma, the state could avoid the repercussions of potentially denying one in three students a diploma. Third, by awarding a diploma with distinction, the state may incentivize some pupils to accumulate more “human capital.” For instance, consider a junior who has already secured a college- and career-ready diploma. She may not feel motivated in her senior year—a case of “senioritis.” But with an honors diploma in play, and clear benefits for earning one, there is more reason to work hard. Fourth, the state could allow pupils meeting the criteria for the college- and career-ready diploma as juniors (or earlier) to graduate, and then offer them the funds saved by forgoing their senior year to defray the cost of college. Fifth, a tiered diploma would help young people as they enter the workforce. For college students, it could help them in the competition for internships; likewise, a more meaningful diploma should aid those seeking full-time employment directly after high school.

Ohio’s old, outgoing high school diploma didn’t signal much of anything to anyone. The requirements were ridiculously easy and practically everyone earned one, which diminished the credential’s value. Now the state is ratcheting up its graduation requirements: As State Superintendent Paolo DeMaria told the Columbus Dispatch, “This is all about giving greater meaning to a high school diploma.” State leaders should not back down on rigorous graduation standards simply to accommodate more diplomas. But neither should they be hard-headed. The best way forward for Ohio is to remake the diploma and reject the notion that there is only one way forward.


Italy has an achievement gap—one that may sound familiar to Americans. PISA scores show a marked gap between Italian students and those of other OECD countries in both math and reading. Digging into the data, Italian education officials found their own in-country gap: Students in the wealthier north perform far better than students in the poorer south. As a result of all of this, starting in 2010, schools in Southern Italy were offered an opportunity to participate in an extended learning time program known as The Quality and Merit Project (abbreviated PQM in Italian). A new study published in the journal Economics of Education Review looks at PQM’s math and reading intervention, which consisted of additional teaching time after school in four of the poorest—and lowest-performing—regions in the country.

A couple of things to note: The PQM intervention was focused not on improving PISA test scores, but on improving scores on the typical tests taken by students in lower secondary school (equivalent to grades six to eight in the U.S.). The researchers do not enumerate which tests these students typically take, when, or how many, and they are not attempting to draw a connection between the intervention and PISA test scores. We as readers should not either. The poor performance of Italian students on PISA simply shone a light on poor performance elsewhere, and perhaps more importantly, unlocked the funding (from the European Union’s Regional Development Fund) that paid teachers to implement an intervention aimed at closing the detected gap.[1] Deciding to initiate the PQM intervention was voluntary on the part of schools. That allowed researchers to match schools that participated with similar schools that didn’t participate. Using results from the typical tests taken by lower secondary school students, the analysts compared changes in test scores before and after the intervention.

The report had two key findings. First, PQM had a positive effect on average test scores in math, but no impact on reading scores. Second, the impact differed depending on pre-intervention achievement: students in the lowest-achieving schools—in the bottom third—made significant gains on math due to the program. For students attending schools in the top two-thirds of achievement, the impact of the after-school program was null in both math and language. According to this evaluation, then, the program worked in a narrow sense—in just math and for the lowest-achieving students.

Researchers conclude that additional instruction time as an intervention in reading is not particularly helpful to students in grades six through eight. “This result is consistent with other studies in the literature showing that it is much harder to intervene on reading and comprehension skills,” they write, “rather than on skills involving practice, like maths, because a large part of literacy work takes place through general vocabulary training in the home environment.” In other words, improving the “skill” of reading is much more than a matter of spending more time on it once fluent decoding has been learned. (We would add that it also relates to content knowledge—something that certainly can and should be taught in school.) However, this research indicates that quantitative reasoning and mathematical knowledge—increased by repetition and “skill building”—respond positively to more time spent on task, especially for low-achieving students.

We need to be careful about the conclusions we draw given the numerous caveats and unknowns here (not to mention the differing culture and language), but a detailed look at the benefit to students of additional time on task is no bad thing. A longer school day is often seen as a cure-all for students with poor test scores and is sometimes the raison d'être of certain school types. Perhaps a more targeted approach to additional seat time is in order.

SOURCE: Erich Battistin and Elena Claudia Meroni, “Should we increase instruction time in low achieving schools? Evidence from Southern Italy,” Economics of Education Review (December 2016).

[1] As befits this particular journal, the economics of the PQM intervention is addressed in the report. There is not enough space to summarize it here, but it could be instructive to American policymakers about ways to use data to improve outcomes.


Many prior studies have found that low-income students have less qualified teachers based on measures such as years of teaching experience, teacher licensure test scores, certification status, and educational attainment. But those studies say very little about how these differences relate to closing the achievement gap, nor do they examine how much differences in access to effective teachers might impact performance.

Yet a new Mathematica study is full of surprises. It examines low-income students’ access to effective teachers in grades four through eight over five years (2008–09 to 2012–13). “Low income” is defined as being eligible for free and reduced-price lunch (FRPL), and “high income” includes everyone else (so not much nuance there). The sample includes twenty-six geographically diverse, large school districts across the country, with a median enrollment of 70,000. And analysts measure the effectiveness of each teacher in the district using a value-added model.

There are five key findings.

First, contrary to conventional wisdom, teachers of low-income students are nearly as effective as teachers of high-income students on average (a difference of one percentile point). Specifically, the average teacher of a low-income student is just below the fiftieth percentile, while the average teacher of a high-income student is at the fifty-first percentile.

Second, high- and low-income kids have similar chances of being taught by the most and least effective teachers. For example, 10 percent of both high- and low-income kids are taught by one of the top 10 percent of teachers in a district.

Third, teachers hired into high-poverty schools are as effective as those hired into low-poverty schools. Though new hires are less effective than the average teacher, and high-poverty schools have more new hires than low-poverty schools, neither fact makes much of a difference: the gaps are small, and the performance of new hires improves fast—on average, they become as effective as the average teacher after one year.

Fourth, not surprisingly, teachers who transfer to schools that are higher in poverty than the one they left are, on average, less effective than the average teacher. Yet those differences don’t impact equity much because just under 4 percent of all teachers transfer to schools in a higher or lower poverty category anyway (a little more than 4 percent move between schools with similar poverty rates).

Fifth and finally, teacher attrition doesn’t much affect access to effective teachers among high- and low-income kids because the leavers are equally effective among high- and low-poverty schools. Only in a small subset of districts (three out of twenty-six) did they find inequity in access to effective teachers—and it was in math only. In those three districts, if you provided high- and low-income kids with equally effective teachers from fourth to eighth grade, you’d see a reduction in the student achievement gap by at least a tenth of a standard deviation, which is equivalent to four percentile points over a five-year period.

With all that said, the sample the study uses is not nationally representative, even though it is geographically diverse and mostly includes large, lower-performing districts. It also mirrors the types of achievement gaps we see nationally, including in NAEP performance. Therefore, these findings may not hold in small districts or rural areas, for example.

Furthermore, it’s possible that the poorest children in the country (say, those at the tenth percentile of the income distribution) are in fact getting less-effective teachers than the richest kids (those at the ninetieth percentile, for example). But this study couldn’t examine that question because it relied on a binary definition of socio-economic status (i.e., whether a student was eligible for FRPL or not)—and again, findings are not nationally representative.

Still, analysts conclude with a simple summary: The achievement gap arises from factors other than students’ access to effective teachers.

Given that this bottom line finding is the result of an expensive study commissioned by a federal agency and conducted by a well-regarded research shop, it represents a big debunking of conventional wisdom.

SOURCE: Eric Isenberg et al., “Do Low-Income Students Have Equal Access to Effective Teachers? Evidence from 26 Districts,” Institute of Education Sciences, U.S. Department of Education (October 2016).


As another year ends, we want you to tell us what you think were the most important Ohio education stories in 2016 and what you predict will be the top story next year.

This is the easiest task you’ll be asked to do today. It’s only two questions and should only take a minute to complete. You can preview the questions below. When you’re ready to take the survey, click here or on the image below.

Just like the voting booth, whatever you submit will be confidential. Of course, if you want to write and tell us why, we may even feature your piece on our blog.  

Thanks for your participation.