Ohio Policy

EDITOR’S NOTE: This short review originally ran in Education Gadfly Weekly on July 23, 2014. Here we present the original review with an added Ohio perspective.

This new report from the University of Arkansas compares the productivity of public charter schools and district schools, both in terms of cost effectiveness and return on investment (ROI). For the cost-effectiveness analysis, the authors consider how many test-score points students gained on the 2010–11 NAEP for each $1,000 invested; to measure ROI, they use, among other data, student-achievement results from CREDO’s national charter school study (which matched students via a “virtual twin” methodology). The key finding: For every $1,000 invested, charter students across the United States earned a weighted average of an additional seventeen points in math and sixteen additional points in reading on NAEP, compared to traditional district students, controlling for student characteristics such as poverty and special-education status. This translates into charters nationwide being 40 percent more cost effective. Meanwhile, Buckeye State charters are less cost effective than national charters, though still more so than their district counterparts within the state. Ohio charters averaged nine additional NAEP points in both reading and math per $1,000 in funding relative to comparable districts. The researchers...

Daniel Navin

EDITOR’S NOTE: This blog post was first published on the United States Chamber of Commerce’s website on Wednesday, July 23, 2014, and is reprinted here by permission of the author.

Ohio has had statewide learning standards in mathematics and English language arts in the past, but those standards were neither rigorous nor aligned with the demands of college and the workplace. The outcome was low academic expectations, which left too many students unprepared for college and produced a short supply of graduates with the basic abilities needed for success in the workplace, including critical-thinking and problem-solving skills.

The dismal statistics below underscore the reality of the “quality of education” in Ohio:

  • Just 27% of Ohio fourth graders were proficient in reading on the National Assessment of Educational Progress (NAEP) test, compared to 83% who were deemed proficient on the state’s reading exam;
  • 31% of Ohio’s 2013 high school graduates who took the ACT exam met none of the college-ready benchmarks;
  • 41% of Ohio public high school students entering college must take at least one remedial course in English or math; and,
  • Nationally, ...

The Hispanic population in the United States continues to grow, with Hispanics making up nearly 17 percent of the total population. This population is young (33 percent is of school age) and is changing the demographics of schools in many states, Ohio among them. From 2000–10, the Hispanic population in Ohio grew to approximately 350,000 individuals, representing 3 percent of the state’s total population. That’s obviously smaller than in, say, Texas, but the number is rising.

Unfortunately, Hispanic students in Ohio schools are struggling. On the Ohio Achievement Assessment (OAA), administered in May 2013, Hispanic children scored lower than the state average in both reading and mathematics at every grade level tested. Similarly, on the National Assessment of Educational Progress (NAEP) in 2013, Hispanic students in Ohio scored, on average, seventeen points lower than their white peers in fourth-grade reading and fifteen points lower in fourth-grade math. Further, only 66 percent of Hispanic students in Ohio graduate from high school, compared to 80 percent for all students. These results indicate that the achievement gap remains wide...

Inter-district open enrollment often flies under the radar in discussions about school choice. Perhaps that’s because it has been around so long (established in 1989 and operating in its current form since 1998); perhaps because it is not universally available, or because many of the most desirable districts do not allow open enrollment; or perhaps because it is choice “within the family” (that is, the traditional district family). Despite its usually low profile, two recent newspaper stories shed light on open enrollment, revealing a disconnect between those administering this unsung school choice program and those who actually use it.

From a district’s point of view, open enrollment can easily devolve into “just business” – dollars in and dollars out to be accounted for year after year. Just check out this story from Hancock County in Northwest Ohio. Net financial “winners”—those districts with more open-enrollee students coming in than leaving—seem to be fine with the system, as might be expected. But net financial “losers” are objecting more strenuously as the losses mount. Their objections, however, often have very little to do with why students are attending a school outside of their “home” district. In...

Yitz Frank

Earlier this year, two articles published in the Columbus Dispatch claimed that students using vouchers to attend private schools in Ohio perform worse than their peers attending public schools. The focus of the March 8 article and the subsequent March 16 editorial was on extending the third grade reading guarantee to students using vouchers (a measure eventually signed into law). In an effort to bolster this argument, the article referenced data suggesting that 36 percent of third-grade voucher students would be retained compared to only 34 percent of public school students. Other articles in the Cincinnati Enquirer and the Canton Repository made similar comparisons that negatively portrayed the performance of students using an EdChoice Scholarship. However, Test Comparison Summary data released this week by the Ohio Department of Education shows a very different picture of how voucher students are performing. The key is using the right comparison group.

The data used in the articles referenced above incorrectly grouped the results of all public school students in the state, including many affluent public schools, and then compared their results with those of voucher students. However, these scholarships are not available to all students. Students...

Six inches of squish

On this week's podcast: A lunch fight, a School Choice Ohio lawsuit, the DOE's My Brother's Keeper initiative, and Amber reviews NCTQ's Roll Call report.

Amber's Research Minute

Roll Call: The Importance of Teacher Attendance by Nithya Joseph, Nancy Waymack, and Daniel Zielaski (Washington, D.C.: National Council on Teacher Quality, June 2014).

The National Council on Teacher Quality (NCTQ) released an alarming new report today on teacher absenteeism in America’s urban public schools. While teacher absences were unacceptably high across most of the school districts that NCTQ analyzed, Cleveland and Columbus public schools earned the unhappy distinction of having the most teacher absences of them all. NCTQ’s analysts used district-level data from 2012-13 to calculate the number of teacher absences in forty of the nation’s largest urban school systems. The results were, on the whole, woeful: teachers across these districts were absent, on average, eleven days during the school year. (The length of a school year is roughly 180 days.) NCTQ’s analysis excludes days missed due to major illness or maternity leave but includes days missed for professional development.

Teacher absenteeism borders on a crisis in Cleveland and Columbus. Cleveland’s teachers missed an average of sixteen days while in Columbus, teachers missed fifteen days—good for the highest and second-highest absentee rates in this study. Meanwhile, in Cincinnati—the only other Ohio district that NCTQ analyzed for this study—teachers missed an average of twelve days of school. (In a separate study, NCTQ found that Dayton’s teachers were absent nearly fifteen days.)...

Cleveland’s teachers union is up in arms over the district’s increased use of Teach For America (TFA) to fill teaching positions. Instead of griping, the union should consider the larger human-resource crisis the district faces. The district has myriad human-resource struggles and, as we’ll see, one of them is its aging workforce.

The backstory, in brief, is the following. For Fall 2014, the Cleveland Metropolitan School District (CMSD) has approved the hiring of forty new TFA teachers. This more than doubles the nineteen TFA corps members that the district hired for the 2013-14 school year. TFA is a highly regarded organization that recruits and trains talented young people to teach in high-need schools across the nation.

But, as the Cleveland Plain-Dealer reported recently, the teachers union doesn’t seem to be on board—and that’s too bad. In light of its opposition, here’s a fact the union should chew on.

In 2012-13, CMSD had the highest percentage of teachers with more than ten years of experience of all districts in Ohio. Indeed, 89 percent of its teaching force had more than ten years of experience.[1] As a reference point, the...

I joined the Twittersphere yesterday for a forum on blended learning moderated by Matt Miller, superintendent of Mentor School District in Northeast Ohio. (Find the tweets at #ohblendchat.) The conversation engaged, by my estimation, fifty or so educators who in 140 characters or fewer discussed what “blended learning” is, how they’re implementing it, what benefits they’re seeing, and what some of the barriers and misconceptions are.

The forum was a great opportunity to learn how blended learning is playing out in the field. From the chat, I came away with three takeaways:

1.)    There is growing agreement around what blended learning is and is not. First, what it is not: putting students in front of a computer and expecting them to learn. Nor does blended learning slavishly conform to a single method of instruction (e.g., lecture, online, project-based). What is blended learning, then? A few of the key phrases used to define it included personalized learning, a combination of instructional deliveries, collaborative learning, and even controlled chaos.

2.)    Teachers say that in a blended-learning environment, compared to a conventional one, their feedback on students’ work is swifter and their engagement with all students increases. Several educators...

Last week, the Ohio Senate passed House Bill 487, also known as the Education Mid-Biennium Review (MBR), with overwhelming support (by a vote of twenty-seven to five). The MBR contains a wide variety of education-policy changes, including some modifications that affect Ohio’s academic content standards and assessments.

Ohio’s current learning standards, adopted in 2010 by the State Board of Education, include standards for students in grades K–12 in English language arts, math, science, and social studies. When the standards were adopted four years ago, there was public input but little fanfare or controversy. That changed about a year ago, when critics began focusing on the math and English language arts standards, a.k.a. the Common Core State Standards (CCSS).

As opposition to the CCSS heated up all over the country (the standards were adopted by forty-five states), the focal point in Ohio was House Bill 237, which proposed repealing CCSS completely. The bill, sponsored by Representative Andy Thompson, received two hearings in the House Education Committee, with the last hearing in November 2013 drawing more than 500 people to the Statehouse.

The Senate’s changes in the MBR address some of the chief concerns raised at the November bill hearing. The key proposed changes are described below.

  • Reinforce local control: The bill introduces statutory language designating school-district boards as the sole authority in determining and selecting textbooks, instructional materials, and academic curriculum. It also requires local school boards to establish a parental advisory committee to review the selection of textbooks, reading lists, and academic curriculum. While CCSS supporters have consistently maintained that curriculum would remain a local decision, these changes add legal certainty to that assertion.
  • Protect state independence: Ohio, like every state that adopted the CCSS, did so willingly and has the ability to withdraw from the standards at any time. In fact, Indiana has done exactly that (for better or worse). The Senate language expressly prohibits the state from entering into any agreement that would give control over the development, adoption, or revision of academic standards to any other entity—including the federal government. It also prohibits the State Board of Education from entering a multistate consortium for the development of science or social-studies standards. (That’s just as well, because the “national” standards for science and social studies range from mediocre to awful.)
  • Allow for public review: While the public has always had the opportunity to weigh in when the state adopts academic content standards, the Senate’s language adds some additional structure to the process. It creates academic standards review committees for English language arts, math, science, and social studies. Each committee’s membership includes the state superintendent, the chancellor of the Board of Regents, an educator, a parent, and three content-area experts, with members appointed by the Speaker of the House, the president of the Senate, and the governor. Each committee may review the standards and assessments in its content area to determine whether they are appropriate and support improved student performance. However, the State Board of Education retains statutory responsibility for the adoption of content standards.
  • Protect student data privacy: In an era when identity theft and monitored communications (sometimes by our own government) have become commonplace, it’s reasonable for parents to express apprehension about how their children’s educational records are used and who has access to them. This issue became a rallying cry for CCSS critics. The Senate’s changes require the State Board of Education to provide strict safeguards to protect the confidentiality of personally identifiable student data. In addition, the language prohibits the collection or sharing, in the course of testing, of personal information about the student or the student’s family with any entity, including the federal or state government.
  • Smooth the transition to the new standards and assessments: The CCSS are far more rigorous than Ohio’s previous standards in math and English language arts (which were mediocre), and the new state assessments in those areas (PARCC exams) are also expected to be more challenging. As a result, proficiency scores around the state will likely fall considerably when the new tests are administered. This has prompted angst among educators, as many of the state’s accountability measures and sanctions are tied to academic performance. The Senate has proposed delaying any consequences for schools or districts that struggle on the state assessments and earn low grades on the state report card during the 2014–15 school year. These include sanctions related to No Child Left Behind, formation of academic-distress commissions, new eligibility for EdChoice Scholarships, and automatic closures for low-performing charter schools. It would also allow, but not require, districts and teachers to delay using student-achievement data for teacher evaluations next year. Finally, it softens the impact of the new assessments themselves by allowing districts to administer paper-and-pencil versions during the first year (in the future, they’ll be online) at no charge, giving districts more time to build the required technological infrastructure.

It’s too early to tell which of these changes will become law as the MBR still has to go to conference committee to allow the House and Senate to work out their differences. However, with these changes, the Ohio Senate appears to have effectively threaded the needle. It has reasserted Ohio’s commitment both to high-quality standards designed to prepare our students for success after high school and to rigorous assessments aligned to those standards. Meanwhile, the Senate has rightly listened to the reasonable concerns of parents and teachers across the state. Hopefully, educators around the state can breathe a little easier knowing that the standards they’ve been working hard to implement over the past four years won’t be changed in the final hour.

Like the Cleveland Browns on a Sunday afternoon, the Ohio General Assembly is fumbling about with the state’s value-added system. One month ago, I described two bizarre provisions related to value-added (VAM) that the House tucked into the state’s mid-biennium budget bill (House Bill 487). The Senate has since struck down one of the House’s bad provisions—and kudos for that—but, regrettably, has blundered on the second one.

To recap briefly, the House proposals would have (1) excluded certain students from schools’ value-added computations and (2) changed the computation of value-added estimates—the state’s measure of a school’s impact on student growth—from a three-year to a one-year calculation.

I argued then that the House’s student-exclusion provision would water down accountability and that reverting to one-year estimates would increase the uncertainty around schools’ value-added results.

The Senate has struck down the House’s exclusion provision. Good. But it has failed to rectify the matter of the one-versus-three-year computation. In fact, it has made things worse.

Here’s the Senate’s amendment:

In determining the value-added progress dimension score, the department shall use either up to three years of value-added data as available or value-added data from the most recent school year available, whichever results in a higher score for the district or building.

Now, under the Senate proposal, schools would receive a rating based on whichever VAM estimate is higher—either the one-year or the three-year computation. (Naturally, schools that just recently opened would not have three years of data; hence, the “as available” and “up to” clauses.)

Huh? How is this rational accountability? The Senate seems to have fallen into the Oprah zone: “you get an A, you get an A, everybody gets an A!”

I exaggerate, of course. Not everyone would get an A based on the “higher score” policy. But let’s consider what happens to school ratings under the three scenarios in play—the one-year value-added computation (House), the three-year computation (current policy), and the higher of the two scores (Senate).

Chart 1 compares the letter-grade distribution under the one-year versus three-year (i.e., multi-year) estimates. As you can see, the three-year scores push schools toward the margins (As and Fs) while diminishing the number of schools in the middle (Cs). This is to be expected, given what we know about the greater imprecision of the one-year value-added estimates. Greater imprecision tends to push schools toward the middle of the distribution, sans clear evidence to suggest they’ve had a significant impact, either positive (A) or negative (F). In short, when the data are “noisier”—as they are under the one-year estimates—we’re more likely to wind up with more schools in the mushy middle.[1]

Chart 1: Clearer view of value-added impact under multi-year scores: Multi-year scores push schools toward the margins (A or F); One-year scores push schools toward the middle (C)

Source: Ohio Department of Education. For 2012-13, multi-year VAM scores are available publicly; the author thanks the department for making schools’ one-year VAM scores accessible at his request. (One-year VAMs, school by school, are available here.) Notes: The one-year A-F ratings are simulated, based on schools’ one-year VAM scores for 2012-13. (The “cut points” for the ratings are here.) The multi-year A-F ratings for 2012-13 are actual letter grades, based on schools’ VAM scores (up to three years) from SY 2011, 2012, 2013. Chart displays the school-level (district and charter) distribution of letter grades (n = 2,558).

Now, let’s look at the Senate’s “higher-score” proposal—the real whopper of them all. Consider chart 2, which also includes the higher of the two value-added scores (the green bar). What you’ll notice is that the number of As would likely increase under the proposal, so that virtually half the schools in the state would receive an A. On the other end of the spectrum, the number of Fs would be cut in half, so that just one in ten schools in the state would receive an F.

Chart 2: Roughly half of schools would get A under “higher score” provision

Are half the schools in Ohio making significant—meaningfully significant—gains for their students? And are just one in ten schools failing to move the achievement needle in a significant way? Let’s get real.

As I’ve maintained, current policy—the three-year computation—is the best course for policymakers. It gives us the clearest look at a school’s impact, both good and bad, on student performance. The finagling of value-added isn’t just an academic exercise, either—it has considerable implications for the state’s automatic charter school closure law, voucher eligibility, academic distress commissions, and a number of other accountability policies. Can Ohio’s policymakers rectify this value-added mess? As with the Browns’ playoff chances, here’s hoping!

[1] Of course, not all schools in the C range are there because of imprecision—some may genuinely have had no statistically significant impact on learning gains, positive or negative.


Last week, School Choice Ohio sued two Ohio school districts for their failure to comply with a public-records request. The organization is seeking directory information for students eligible for the EdChoice Scholarship Program from the Cincinnati and Springfield Public Schools. Actions to enforce public-records requests are rarely exciting, but the outcome of SCO’s effort could have important ramifications for tens of thousands of students and their families across the state.

Although Ohio is a national leader in providing private-school choice options to students—the state has five separate voucher programs—there is no established mechanism for informing families eligible for the EdChoice Scholarship program (Ohio’s largest voucher initiative) about their eligibility. The law doesn’t require school districts or the Ohio Department of Education to perform this vital function.

Enter School Choice Ohio (SCO), a Columbus-based nonprofit organization, which has worked tirelessly since the beginning of the EdChoice program to conduct outreach to families across the Buckeye State who are eligible to send their child to a private school via a voucher. SCO typically sends postcards and makes phone calls letting families know that their children may be eligible, giving them a toll-free number to call for an information packet and answering any questions families may have about eligibility and the private-school options in their area.

This is critical work, as the EdChoice Scholarship is designed to provide students in Ohio’s very lowest-performing schools the option to attend a private school.

To conduct this outreach, SCO makes a public-records request for directory information to superintendents of school districts whose students are eligible for the EdChoice Scholarship. “Directory information” can encompass a number of district-chosen parameters but typically includes a student’s name, address, phone number, grade level, and school-building assignment. It is the kind of information you might find in a student directory handed out to families along with a student handbook at the start of each school year.

The statutory language clearly states that if directory information is collected and distributed at all, it is a public record that can be requested by a nonprofit, as long as it isn’t “for use in a profit-making plan or activity.” In other words, a request from a nonprofit like SCO is exactly what the law contemplates.

How do we know all this? We both worked at SCO for many years, making these public-records requests and helping interested families contacted via directory information.

In the main, districts grudgingly but professionally complied with public-records requests for directory information. One perennial resister, however, was Cincinnati City Schools. Every year, the district would send back a letter through its lawyer saying, in essence, “We know we’re supposed to collect directory information, but we don’t, so we can’t give it to you.”

That means that every year, thousands of Cincinnati families whose children were eligible for a scholarship to a private school of their choice stayed put in their bottom-of-the-heap schools simply because they didn’t know another option existed.

Springfield, meanwhile, has collected directory information of the type SCO requests and has provided it to SCO in the past. However, as the Springfield News-Sun notes, the board passed a policy change in 2013 that redefined “directory information” to exclude anything that would identify a student, so the district says it cannot comply with SCO’s latest request.

Unfazed by its own policy constraints, the district continued releasing identifiable directory information to what it calls its “partners” after the policy change. When asked about the legal action last week, Springfield Superintendent David Estrop said, “We are trying to protect our own students from false and inaccurate information.” The characterization of SCO’s work aside, such ad hoc provision of data is not allowed under Ohio’s public-records law. Hence, SCO’s legal action.

We commend SCO for taking this bold and much-needed action. The EdChoice Scholarship program provides opportunities to some of Ohio’s most disadvantaged students who, through no fault of their own, have been assigned to a school that is not, under the state’s criteria, effectively educating its students. To deny these students and their parents information about their private-school options strikes us as a particularly low blow. If SCO isn’t successful, we’d urge the legislature to say enough is enough and to step in and require either school districts or (preferably) ODE to notify these families of their eligibility. After all, in Ohio, when your local school isn’t performing well, you have a choice. You deserve to know about it.

A great deal of hand-wringing has occurred in recent years concerning the United States’ poor academic performance relative to other nations. The anxiety is no doubt justified, as students from countries like South Korea, Japan, and Hong Kong are beating the pants off American pupils on international exams. It’s not just the East Asian countries: even the Swiss, Canucks, and Aussies are cleaning our clocks. But what about Ohio’s students? How does their achievement compare to that of other industrialized nations? Like most states’, not well, according to this new PEPG/Education Next study. To determine how states rank compared to the rest of the world, researchers link 2012 PISA results—international exams administered in thirty-four OECD countries, including the U.S.—with state-level NAEP results for eighth graders in 2011. The researchers discovered that Ohio’s students fall well short of the world’s highest performers. When examining math results, Ohio’s proficiency rate (39 percent) falls 15 to 25 percentage points below the highest-achieving nations. (Korea, the worldwide leader in math, was at 65 percent proficiency; Japan was at 59 percent; Massachusetts, the U.S. leader, was at 51 percent). In fact, Ohio’s proficiency rate places us somewhere between Norway’s and Portugal’s achievement rates in this grade and subject. Moreover, Ohio’s weak international performance isn’t just a matter of our students having lower family resources relative to other nations. For example, among students whose parents had a high level of education, Ohio’s math proficiency rate (50 percent) still fell twenty points below the international leaders’ math proficiency rates (Korea, at 73 percent; Poland, at 71 percent). Ohio’s alarmingly mediocre achievement relative to the rest of the world only reinforces our need to raise educational standards so that students—from all family backgrounds—can compete with their international peers.

SOURCE: Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann, Not just the problem of other people’s children: U.S. student performance in global perspective (Program on Education Policy and Governance and Education Next, May 2014).

  • The EdChoice Scholarship Program received a record number of applications this year: over 20,800 students applied during the window, which closed on May 9, up more than 4,000 from last year.
  • The food-service chief of Lima City Schools testified before Congress last week on how well the Community Eligibility Provision is working for families in Lima. Said Ms. Woodruff, “It’s going well. The parents appreciate it, the students are participating and it’s a good fit.”
  • There is a puzzling gap in Ohio between the number of students identified as gifted and the number of gifted students actually being served. A journalist in the Zanesville area tried to demystify the numbers by digging deep into some local schools. The conclusion of her interview subjects is that the state “mandates we test for giftedness, but they don’t fund it.”
  • Piloting of the new PARCC tests is continuing through the end of the school year in Ohio. Few problems have been reported, and kids especially seem to like the online nature of the testing.