Ohio Policy


NOTE: This is the introduction to Fordham Ohio's latest report—Pathway to Success: DECA prepares students for rigors of college, realities of life—researched and written by former Dayton Daily News editor and journalist Ellen Belcher. You can read the full report here. It is the first in a series of charter school student profiles.

Too much of what we hear about urban public schools in America is disheartening. A student’s zip code—whether she comes from poverty or economic privilege—often predicts her likelihood of educational (and later-life) success. Motivated by this unacceptable reality, some schools have worked relentlessly against the odds to deliver excellent educational opportunities to students no matter their background. Charter schools in particular have played a role in creating high-quality choices for urban students. Many are led and staffed by incredible visionaries who hold high expectations for all students and have made it their mission to ensure that more inner-city kids make it to (and through) college. When we hear about these schools, it behooves us to pay attention—to celebrate them, study them, and do our damnedest to support them. While there’s no silver bullet for fixing what ails urban public education, there are common undercurrents of success worth observing and learning from. Just as important, we should hear from students themselves. There’s no more compelling case for high-quality school choice than hearing about the life-changing impact it has had on students and their families.

In this instance, that choice comes in the form of a charter high school: the Dayton Early College Academy (DECA), an island of excellence in one of Ohio’s poorest and most academically challenged districts. Ninety-five percent of its students are non-white, and three out of four are economically disadvantaged; yet 98 percent of DECA’s students were proficient on the reading portion of the Ohio Graduation Test, and 100 percent were proficient in math, compared to 67 percent and 57 percent, respectively, for Dayton Public. The college attendance and completion rates further set it apart as a model of urban high school success. The unique opportunities and supports it provides to students—both academic and personal—are showcased briefly through the story of Khadidja, an inspiring young woman whose future is so bright, it nearly blinds.

Photo credit: Stephanie Henry

Khadidja's experience in an excellent charter school has helped forge a very different future than the one facing many of her urban peers. We hope that her story reminds readers that student voices are vital in day-to-day reform conversations and that expectation-shattering, odds-defying charter schools like hers are worth fighting for.


Earlier this week, the Ohio Department of Education announced a new award for schools that exceeded expectations for student growth, the “Momentum Award.” Any school or district earning straight As on the state’s value-added measures was eligible, assuming it had at least two value-added subgroups (an idea my colleague Aaron explored last year). One hundred and sixty-five of Ohio’s 4,200 schools earned the recognition in its inaugural round.[1] The state also recognized schools and districts earning all As on every report card measure—forty-six schools and two districts achieved this outstanding feat.

We’re most excited about the Momentum Award because it gives credit to schools that make significant contributions to student growth regardless of where students enter in terms of raw achievement. In addition to earning an overall A, winning schools made gains with at least two of the following subgroups: students with disabilities, students who are low-achieving, and gifted students—populations that are often underserved or overlooked.  

It’s been said time and time again that growth measures are essential to any state’s accountability system because they show the contribution a school makes to individual student learning and because they are not correlated with student poverty (as achievement measures are). But it’s worth repeating, given that Ohio is undergoing a review of its value-added measure and the last year has proven we are not immune from attempts to weaken or replace it.

A quick glance at this year’s Momentum Award winners illustrates that student poverty truly doesn’t matter when it comes to scoring well on value-added. This year’s winners include public charter schools serving very high percentages of students in poverty, like Columbus Collegiate West, KIPP Columbus, and Entrepreneurial Preparatory Academy (Woodland Hills) in Cleveland. The list also includes traditional public schools with similar demographics, like North Linden Elementary in Columbus. Schools with very low percentages of poor students also proved it was possible to deliver more than a year’s worth of learning for their students, including Bexley Middle School (9.7 percent economically disadvantaged) and Liberty Tree Elementary School (Olentangy Local in Delaware County – just 2.9 percent economically disadvantaged).

Graph 1 shows the number of schools winning the award, by their percentage of economically disadvantaged students. As you can see, schools serving very high percentages of economically disadvantaged students (80 to 100 percent) won the most Momentum Awards. But schools from all poverty quintiles are represented. About as many schools from the top two poverty quintiles (those serving poor student populations ranging from 60 to 100 percent) won Momentum Awards as those from the lowest two quintiles.

Graph 1: Number of Momentum Award–winning schools by poverty quintile (2014-15)

Graph 2 depicts the total percentage of schools statewide falling into each poverty quintile, as well as the percentage earning Momentum Awards, to illustrate how many schools exist in each category more broadly and whether the awards in general were earned proportionately.[2] For instance, schools serving student populations that were 60 to 80 percent disadvantaged earned the award at the lowest rate, but there are also the fewest such schools overall (just 11.6 percent of Ohio’s total). Schools with predominantly poor student populations are slightly overrepresented among Momentum Award winners—a reflection, perhaps, of how these schools must keep their feet on the gas pedal, focusing relentlessly on helping students achieve meaningful gains. Meanwhile, schools from the lowest poverty quintile are also slightly overrepresented. This dispels the notion that schools with wealthier (and typically higher-achieving) student populations have a harder time helping students achieve more than a year’s worth of growth.
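The quintile breakdown behind Graphs 1 and 2 is straightforward to reproduce. Here is a minimal sketch with made-up figures (the school records and helper name are ours, not the state’s), showing how schools can be bucketed into 20-point poverty bands and how each band’s share of winners compares to its share of all schools:

```python
# Hypothetical illustration of the Graph 1/Graph 2 comparison: bucket
# schools into five 20-point poverty bands, then compare each band's
# share of all schools with its share of Momentum Award winners.
schools = [
    # (percent economically disadvantaged, won Momentum Award?)
    (92.0, True), (85.5, False), (71.2, True), (64.0, False),
    (55.3, False), (41.8, True), (33.0, False), (9.7, True), (2.9, True),
]

def quintile(pct):
    """Map a 0-100 poverty percentage to one of five 20-point bands."""
    return min(int(pct // 20), 4)  # 100.0 falls in the top band

band_totals = [0] * 5
band_winners = [0] * 5
for pct, won in schools:
    band = quintile(pct)
    band_totals[band] += 1
    if won:
        band_winners[band] += 1

share_of_schools = [t / len(schools) for t in band_totals]
n_winners = sum(band_winners)
share_of_winners = [w / n_winners for w in band_winners]
```

Comparing `share_of_winners` against `share_of_schools` band by band is exactly the over-/underrepresentation question Graph 2 answers with the real statewide data.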

Graph 2: Percent of Ohio schools and Momentum Award–winning schools by poverty quintile (2014-15)

In contrast, the overwhelming majority of schools earning all As did not serve high or even moderate percentages of students in poverty: nine out of ten straight-A schools had student populations that were less than 40 percent economically disadvantaged. This underscores yet again what we already know about the persistent and well-documented relationship between poverty and achievement.

Unfortunately, Ohio’s current report cards are stacked heavily in favor of achievement-based metrics, and there is little opportunity for schools serving high percentages of poor students to earn high marks across the board. As the state determines how it will calculate overall letter grades in 2017–18, it’s important to keep data like these in mind. This year’s Momentum Award illustrates that any school—regardless of its student composition—can score well on value-added measures. As such, Ohio legislators and policy makers should consider how to more fairly weight each report card measure as part of the overall grade, and place more emphasis on student growth.

In the meantime, kudos to the Ohio Department of Education for creating this award recognizing high-growth schools, and congratulations to this year’s Momentum schools for making significant contributions to student learning no matter their background.

[1] Of all Ohio schools, about 1,400 earned value-added grades.

[2] This is an admittedly rough comparison. All Ohio school buildings are shown in blue, yet only schools with students in any/all grades 4–8 receive value-added scores and were therefore eligible for the Momentum award—those shown in orange.


The Every Student Succeeds Act (ESSA), which was signed into law by President Obama in December, has been hailed as a bipartisan effort to fix the most problematic provisions in NCLB.

Two oft-repeated criticisms of the old law were that it forced unfunded mandates onto schools and that its focus on reading and math achievement narrowed curricula. Congress responded by rolling dozens of federal grants into one block grant program called the Student Support and Academic Enrichment (SSAE) grant.

The SSAE grant is billed as a flexible funding source intended to empower states to improve student academic achievement by increasing capacity. In order to receive SSAE funds, states are required to submit applications to the department and then distribute the majority of funds to local districts through another application process. The activities that local districts are entitled to use SSAE funds for fall into three categories: efforts to promote a well-rounded education, safe and healthy students, and the effective use of technology. (These were also the areas of focus of the preexisting federal programs that were rolled into this block grant.)

The amount allotted to each state will depend on annual appropriations. According to the Foundation for Excellence in Education, if the program is fully funded at its authorization level—a total of $1.6 billion—grants will range from $3,700,000 (Wyoming) to $188,600,000 (California). Fordham’s home state of Ohio would have access to $61,500,000. That may be optimistic, though, since SSAE’s preexisting programs were funded in 2016 for only $400 million. (You can check out projections for every state here.)

While 95 percent of a state’s allocation must be reserved for district awards, states are permitted to use the remaining percentage for state activities.[1] Acceptable activities include: identifying and eliminating barriers to the integration of programs and funding streams; providing monitoring, training, technical assistance, and capacity building to districts that are awarded SSAE funds; and supporting districts in providing programs and activities that fall into one of the three previously mentioned categories.
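As a back-of-envelope illustration of that split, here is a quick sketch using the projected Ohio allocation cited above (the percentages come from the law; the dollar figure is only the full-funding projection, not an actual award):

```python
# Illustrative split of a state's SSAE allocation: at least 95 percent
# must flow to districts, the remainder funds state activities, and no
# more than 1 percent may cover administrative costs (see footnote [1]).
state_allocation = 61_500_000  # projected Ohio allocation at full funding

district_minimum = state_allocation * 95 // 100             # $58,425,000
state_activities_max = state_allocation - district_minimum  # $3,075,000
admin_cap = state_allocation // 100                         # $615,000
```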

Of course, the most interesting part of the SSAE grant is what districts can spend it on. Here’s a look at the three spending categories:

Well-rounded education opportunities

The emphasis on—and targeted funds for—a well-rounded education (WRE) is a clear response to curriculum narrowing. Programs and activities under this category can be coordinated with other schools and community-based programs. They can also be conducted in partnership with higher education institutions, businesses, nonprofits, community-based organizations, or other public or private entities with a demonstrated record of success. The law provides a host of examples—but no requirements—for what could constitute a WRE program. They include: college and career guidance and counseling; programs and activities that use music and the arts to support student success (programs like this come to mind); programming that improves instruction and engagement in STEM fields (including the creation and enhancement of STEM schools); raising achievement through accelerated learning programs; activities that promote the development and strengthening of programs to teach American history, civics, economics, geography, or government; foreign language instruction; environmental education; programs that promote volunteerism and community involvement; programs that support the integration of multiple disciplines; and any other activities that support student access and success in well-rounded education experiences. ESSA mandates that a district must use no less than 20 percent of its SSAE allocation for WRE programs.

Safe and healthy students

The funds in this category—which must also equal no less than 20 percent of a district’s allocation—are intended to develop, implement, and evaluate programs that make schools safer and healthier. In particular, programs are intended to foster supportive and drug-free school environments and promote parental involvement. Like the WRE category, the safe and healthy students section allows districts to conduct programs in partnership with higher education institutions, businesses, nonprofits, community-based organizations, or other entities with a demonstrated record of success. Activities include (but aren’t limited to): drug and violence prevention programs that are evidence-based; school-based mental health services based on trauma-informed practices and provided by qualified health professionals; programs that integrate health and safety practices into school or athletic programs; programs that prevent bullying or harassment; mentoring and counseling for all students; establishing or improving dropout and reentry programs; high-quality training for school personnel in areas like suicide prevention and trauma-informed classroom management; school-based violence and drug abuse prevention; designing and implementing a “locally tailored” plan to reduce exclusionary discipline practices; and implementation of schoolwide positive behavioral intervention.

Effective use of technology

Districts are also required to use a portion of an SSAE grant to improve their use of technology in increasing academic achievement, growth, and digital literacy. While this section doesn’t have a minimum spending percentage, districts are not permitted to spend more than 15 percent of funds on purchasing technology infrastructure. Potential spending areas include: providing educators and administrators with professional learning tools, devices, and content to personalize learning for students and develop and share high-quality educational resources; building technological capacity and infrastructure by procuring content or purchasing devices and software; developing innovative strategies for delivering curricula using technology; implementing blended learning projects; providing professional development in the use of technology; and providing students in rural, remote, and underserved areas with digital learning experiences and access to online courses.
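Taken together, the three categories impose a handful of arithmetic constraints on a district’s plan. A hypothetical checker might look like the following (the function name, rule reading, and sample figures are ours, not the statute’s):

```python
def check_ssae_plan(total, wre, safe_healthy, tech, tech_infrastructure):
    """Return a list of rule violations (empty if the plan passes).

    Reads the rules above as: at least 20 percent of the grant each for
    well-rounded education and for safe/healthy students, and no more
    than 15 percent of the technology funds spent on infrastructure.
    """
    problems = []
    if wre < 0.20 * total:
        problems.append("well-rounded education below 20% minimum")
    if safe_healthy < 0.20 * total:
        problems.append("safe and healthy students below 20% minimum")
    if tech_infrastructure > 0.15 * tech:
        problems.append("technology infrastructure above 15% cap")
    if wre + safe_healthy + tech > total:
        problems.append("categories exceed total allocation")
    return problems

# A plan meeting every threshold passes with no violations:
print(check_ssae_plan(100_000, 25_000, 30_000, 40_000, 5_000))  # []
```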


By addressing—and funding—important aspects of education that fall outside the purview of reading and math, ESSA has the potential to set a new normal for what kids have the opportunity to learn. As with many other provisions in ESSA, the success of the SSAE grant will depend largely on states’ and districts’ creativity and commitment to student achievement. But if it’s done right, it could make kids healthier, more well-rounded, and more tech-friendly.

[1] According to ESSA, states are not permitted to use more than 1 percent of their allocation for administrative costs. This includes costs related to publicly reporting how districts spent SSAE funds and the degree to which districts made progress on their identified objectives and outcomes.


Implementation of the Every Student Succeeds Act (ESSA) is looming on the horizon, and education leaders and policy makers are in need of accurate information regarding stakeholder perceptions and opinions. The Northwest Evaluation Association (NWEA) recently answered that call by releasing a comprehensive survey of perceptions of K–12 assessment. The survey asked a range of assessment-related questions to superintendents, principals, teachers, parents, and students.    

Some of the results are unsurprising. For instance, more than seven in ten teachers, principals, and superintendents say that students spend too much time taking assessments. Their opinions on specific tests vary, however. Six in ten teachers rate their states’ accountability tests as fair or poor, but most gave a thumbs-up to both formative assessments and classroom tests and quizzes developed by teachers. The approval gap between state tests and other assessments is most likely due to their perceived usefulness. While state tests give a summative picture of student performance, they aren’t designed to provide diagnostic information or inform instruction—functions that classroom tests and formative assessments perform well. (Of course, let’s not forget that NWEA makes millions of dollars selling a formative assessment.)

In contrast to teachers and administrators, three out of four students and approximately half of parents believe that students spend the right amount of time (or too little time!) taking assessments. Large majorities of parents consider all types of testing—including classroom tests and formative assessments—helpful to their children’s learning. A lack of communication between teachers and parents appears to be a problem: While 87 percent of teachers reported using assessment data to discuss student progress with parents, only 38 percent of parents said that their children’s teachers often or very often discussed their children’s assessment results with them. The fact that teachers and parents have such different views is troubling, but it could be explained by a lack of teacher training—most teachers claimed that although they’ve received training on how to use assessments, they have not received training on communicating assessment outcomes. As a result, only 38 percent of teachers feel very prepared to communicate results to parents. The gap could also be explained by the tense atmosphere surrounding accountability issues, which can sometimes put teachers’ and parents’ interests at odds.

Interestingly, low-income parents have different views about and experiences with assessment than their middle- and high-income counterparts. For instance, 33 percent of parents with a household income under $60,000 agree or strongly agree that state tests improve learning—compared to only 16–17 percent of families with a household income between $60,000 and $119,999. In addition, educators working in low-income districts are more likely than those in middle- or high-income districts to say that students spend too much time taking assessments. Principals in low-income schools are more likely to say that they have a data coach and have developed an assessment plan, and teachers in low-income schools are more likely to say that they modify teaching based on assessment results and use those results to collaborate with peers. 

Although 61 percent of parents say they believe that parents should have the right to opt their children out of state assessments, only 15 percent say they actually plan to opt out their own children. Furthermore, the vast majority of teachers (87 percent) said that they have never or rarely had a conversation with parents about opting out. Numbers are similar for principals (87 percent) and superintendents (82 percent).     

In its conclusion, NWEA offers a few recommendations—including a note on how important it is for states and education organizations to foster open dialogue and provide information on the new federal law to administrators, educators, students, and parents. In particular, states and education agencies should dedicate resources to training teachers on best practices for assessment and data usage so that the communication gap between teachers and parents will lessen.

SOURCE: “Make Assessment Work for All Students: Multiple Measures Matter,” Northwest Evaluation Association (May 2016).

I recently wrote about two studies whose results showed promise in the use of co-requisite remediation (students simultaneously taking a developmental and a credit-bearing course in the same subject). The strategy is aimed at getting college students up to speed faster, thus cutting time and costs associated with degree completion (both in two-year and four-year colleges). Now two more studies on this topic offer additional insights.

First up is Iris Palmer’s plan to scale up co-requisite remediation models based on the experiences of pilot programs in five states. These pilots either a) fully replaced traditional prerequisite remediation with a co-requisite model as described above or b) created two different tracks into which students were slotted based on ACT score cutoffs identified by the community colleges. She details the subtle variations that different colleges employed (class size, test cutoff points, integration of remediation with credit-bearing content, etc.) and identifies the stakeholders within college hierarchies who would have the best vantage point and leverage to make the needed systemic changes. Who knew that registrars had that kind of power? I jest, but Palmer insists that redesigning an institution’s remediation process “needs to be someone’s full-time job” to be done right—and that state-level support for institutions (funding, policy changes, best practices, etc.) is crucial as well. These are stolid institutions whose practices don’t change easily.

Second, the Community College Research Center (CCRC) at Columbia concurred with Palmer that co-requisite remediation worked better than the traditional prerequisite model at getting students to credit-bearing courses, but its researchers wanted to know whether it was more efficient. Their recent research brief built a cost-benefit model for both remediation approaches as practiced in Tennessee. Researchers found that even with big upfront costs for making the systemic changes Palmer described, co-requisite and prerequisite remediation models ended up costing nearly the same to run. The obvious higher cost of running two courses at once at the outset (credit-bearing courses must be offered to all students who are in remediation, requiring additional instructors to accommodate the increased enrollment) was generally balanced by fewer remedial courses being needed down the line. However small, the net difference is still an increase in expenditures. But when cost-per-successful-student is factored in, the efficiency of co-requisite remediation was significantly higher, especially in math. Those who want to blunt the remediation crisis in higher education will seize upon the cost-per-success numbers as the full picture; but we must tread carefully. The upfront costs for colleges are substantial, and it is a sea change for any college to commit even more resources to students who are, for all intents and purposes, unready for college. Additionally, co-requisite remediation, even at its best, does not work for every student. The CCRC report ends with this last point and calls for more investigation into improving student assignment and program delivery.
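The distinction the brief draws between total running cost and cost-per-successful-student is worth making concrete. A toy calculation with invented numbers (the CCRC brief reports the real Tennessee figures):

```python
# Two remediation models can cost nearly the same to run overall yet
# differ sharply in efficiency if one gets more students through the
# credit-bearing gateway course. All numbers below are illustrative.
def cost_per_success(total_cost, students_succeeding):
    """Total program cost divided by students who complete the gateway course."""
    return total_cost / students_succeeding

prereq_cps = cost_per_success(1_000_000, 200)  # $5,000 per success
coreq_cps = cost_per_success(1_050_000, 420)   # $2,500 per success
# Co-requisite costs slightly more to run but is far more efficient.
```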

If the co-requisite remediation movement is to gain traction, it will be important for colleges to keep a firm eye on that cost-per-successful-student figure rather than on the initial outlay and systemic upheaval that must happen first. Hopefully, promising reforms in K–12 education (higher standards, early college programs, CTE, mastery learning, etc.) will lessen the need for remediation of any kind in college—even if it takes many years.

SOURCE: Iris Palmer, “How to Fix Remediation at Scale,” New America (March 2016)

SOURCE: Clive Belfield, Davis Jenkins, and Hana Lahr, “Is Corequisite Remediation Cost-Effective? Early Findings from Tennessee,” Community College Research Center, Teachers College, Columbia University (April 2016)

The Ohio State Board of Education chose Paolo DeMaria as the next state superintendent of public instruction earlier this month. Mr. DeMaria is a former state budget director, education advisor to two governors, high-level staffer with the Ohio Department of Education and the Ohio Board of Regents, and a current principal with Education First Consulting. His dedication to improving education is obvious and is matched only by his impeccable qualifications.

Mr. DeMaria brings a calm, thoughtful, and analytical approach to the agency’s work. But there is even more to be glad about in terms of this choice: For the first time in many years, the sitting governor did not send a representative to sit in on candidate interviews for state superintendent. This deliberate move away from the politicization of the selection process is a positive step and may have played a small role in the usually fractious board unanimously selecting Mr. DeMaria (even with a number of other highly qualified candidates from which to choose). Just as impressive, DeMaria scored points with many by asking for a lower base salary than originally offered, to be supplemented by a performance-based bonus option. A class act with important implications.

The selection of Mr. DeMaria drew strong praise from many, including the editors at the Columbus Dispatch. Hopefully, the incoming state superintendent can draw on this goodwill as he tackles the many difficult decisions that come with his new position. He’s likely going to need that support sooner rather than later.
