Hard lessons from Ohio’s innovation fund


John Mullaney and Aaron Churchill

NOTES: John Mullaney is the Executive Director of the Nord Family Foundation. Both authors served on the Straight A Fund advisory board in FY 2014-15.

This piece originally appeared in a slightly different form in the Cleveland Plain Dealer.

In 2013, Governor Kasich and Ohio legislators enacted the Straight A Fund, one of the nation’s largest statewide competitive grant programs for K-12 education. The idea sprang from philanthropic foundations across Ohio that saw an opportunity for the state to co-invest in truly innovative approaches to teaching, learning, and assessment; a 2009 report, Beyond Tinkering, presented the legislature with specific recommendations to initiate such a fund. With $250 million in state funding over fiscal years (FY) 2014 and 2015, Straight A awarded sixty-one grants ranging from about $200,000 to $15 million. The fund was intended to spark innovative thinking and practices, with the goal of boosting student achievement and reducing costs.

Yet after an initial burst of excitement, enthusiasm for Straight A waned. Collaborators from the philanthropic sector expressed concerns about the legislative emphasis on cost savings, which some felt eclipsed the focus on innovation and achievement. In FY 2016-17, state lawmakers slashed funding to $30 million over the biennium; this June, the legislature pulled the plug, cutting off program funding entirely. Anecdotes suggest that some of the grants may have been successful, but, as we discuss in more detail below, we know of no hard evidence on whether the program improved student learning.

We both think it’s critical to incentivize promising education practices, and we’re rooting for Straight A to succeed. This can mean new approaches not only to instruction but also to assessment and even to organizational management in schools. As members of the Straight A grant advisory panel, we had a unique opportunity to see certain aspects of implementation up close. Based on our observations, we believe policymakers can draw important lessons about the design of an innovation fund. We offer our thoughts not as Monday-morning quarterbacks; rather, if Ohio decides to reboot an innovation fund (a worthy goal indeed), we believe that giving careful thought to the following issues is essential.

Funding amounts

Many of the Straight A grants were multi-million-dollar projects. Grants of $13 million to $15 million exceeded, by many millions, those made to districts and schools by Ohio’s private foundations. While some of the grants were awarded to multi-district consortia, the sizeable amounts seemed to present challenges. For one, the massive scale of the program and its awards likely created unrealistic expectations for immediate, home-run results. In addition, as lottery-winning horror stories remind us, managing an unexpected windfall is a surprisingly difficult task. Indeed, in an early round of grants, we heard that some grantees might have trouble spending down the funds on a relatively short schedule. It’s not clear whether districts ultimately felt rushed, but the large amounts plus tight timelines may have led to less-than-optimal spending decisions.

While the funding amounts surely made a splash, a more modest fund may have been more workable for grantees while still delivering the innovative spark lawmakers were aiming for. Though not in the realm of education, rigorous research on federal R&D grants for energy projects finds that pilot grants of about $150,000 pack as big a punch as a larger, more burdensome program awarding $1 million grants. In a future iteration, Ohio lawmakers could consider scaling back awards to more modest amounts. Such a program could offer educators a chance to test-drive new ideas without much fuss. For larger, more complex projects that require major investments, such as launching new schools or scaling successful pilot programs across multiple districts, lawmakers could create a separate grant, perhaps with different timelines and application procedures. A “tiered” structure, with different purposes, processes, and award sizes, would be more akin to the U.S. Department of Education’s i3 Fund.

Grant application

Having read dozens of grant applications, we can say that the application itself was cumbersome. Applicants had to answer a bevy of questions that asked, among other things, how the project would affect student learning; how it would be sustainable; how it would affect instruction; and how it would reduce costs. Predictably, many applicants, putting their best foot forward, offered long-winded, jargon-filled responses. The long, clunky application may have deterred some schools from applying, perhaps those lacking professional grant-writing help. It also seemed to favor complex, collaborative (and highly ambitious!) projects over more modest school- or classroom-level initiatives.

A slim, straightforward application should encourage a wider pool of applicants. State lawmakers can assist in this regard: in the appropriating legislation, they can place only bare-minimum parameters around the program. For example, one of the legislative goals of Straight A was to achieve “cost savings”; in turn, applicants had to answer fiscal questions, which meant engaging treasurers to fill out complicated financial forms. An open-ended law and a hassle-free application should be more inviting to classroom teachers and principals who want to try something different.

Application review

As grant advisors, we received briefings on the Ohio Department of Education’s (ODE) “blind” application review process. The emphasis on blind review was a political choice, meant to counter skepticism, voiced by several administrators we spoke with, that grants of this size would invite favoritism and a rigged system. In guarding against the appearance of favoritism, however, the process came to overemphasize statistical analyses and rankings, and it lacked a key element of a successful grant program: interaction between the grantor and potential grantee. Private philanthropy urged ODE to change the model to include site visits with potential grant recipients, stating clearly that no board in private philanthropy would ever approve a grant request without a personal site visit. As professionals in both the philanthropic and venture capital sectors would tell you, one learns as much about the true capabilities of an organization by meeting the principals involved as by examining the business plan.

That advice went unheeded; the second round continued with a purely mechanical review, statisticians crunching numbers and decisions made in a locked room in Columbus. Moving forward, policymakers should insist on incorporating an interview with prospective grantees as part of the process. State officials and/or members of a grant advisory board could interview those who make it through a screening stage. This would offer an opportunity to ask clarifying questions about the application, provide a better picture of the context in which the project would take place, and start building a trusting relationship between the grantor and potential grantee. Only after an interview would a final decision be made.

Rigorous evaluation

Although the legislation was lavish in its funding, there was no appropriation for evaluation. Shortly after implementation, we raised concerns about the apparent lack of a robust evaluation of Straight A. ODE has invested $200,000 in research on the program, which included a survey and a broad analysis of report card data. But we know of no rigorous empirical analysis of Straight A—either as a whole or at individual grantee sites. Any evaluation of Straight A has been kept in-house and certainly has not been well publicized. Without solid, widely disseminated evidence, it’s just not clear whether the program has met any of its goals to make a difference for students or provide cost efficiencies in doing so.

How can Buckeye policymakers better ensure proper evaluation? The state could prioritize randomized, experimental projects, perhaps giving them “extra credit” in the application review. These types of projects would be more conducive to what Harvard’s Tom Kane calls “short cycle clinical trials,” which offer a quick turnaround and quality evidence on the effectiveness of innovations. If randomization isn’t possible, or if such an evaluation would prove too expensive relative to the size of the project, then the grantee could be encouraged to conduct a quasi-experimental study, such as the one conducted on a modest-sized tutoring program in Youngstown. Any foundation making investments of this size would bring in outside expertise to evaluate these programs, and Ohio has many organizations of high repute more than capable of conducting such research.

As for transparency, ODE could create a central library of studies from individual grantees, a What Works Clearinghouse of sorts. This would give educators and the public easy access to findings on which innovations worked and which did not. Rigorous evaluation and this level of transparency would also help the public and policymakers gain a sense of the overall impact of the innovation fund, as well as provide evidence on which innovations hold promise for replication.


Ohio legislators were on the right track with the adoption of an innovation fund. Yet Straight A had design problems that limited its ability to meet, or prove that it met, program goals. If Ohio policymakers carefully consider the issues above, we believe that the next iteration of Straight A will better spur innovative practices that benefit both students and the professional educators who work with them for years to come.

Aaron Churchill
Aaron Churchill is the Ohio Research Director of the Thomas B. Fordham Institute.