Ohio Policy

Last week, the Ohio Senate passed House Bill 487, also known as the Education Mid-Biennium Review (MBR), with overwhelming support (a vote of twenty-seven to five). The MBR contains a wide variety of education-policy changes, including some modifications that affect Ohio’s academic content standards and assessments.

Ohio’s current learning standards, adopted in 2010 by the State Board of Education, include standards for students in grades K–12 in English language arts, math, science, and social studies. When the standards were adopted four years ago, there was public input but little fanfare or controversy. That changed about a year ago, when critics began focusing on the math and English language arts standards, a.k.a. the Common Core State Standards (CCSS).

As opposition to the CCSS heated up all over the country (the standards were adopted by forty-five states), the focal point in Ohio was House Bill 237, which proposed repealing CCSS completely. The bill, sponsored by Representative Andy Thompson, received two hearings in the House Education Committee, with the last hearing in November 2013 drawing more than 500 people to the Statehouse.

The Senate’s changes in the MBR address some of the chief concerns raised at the November bill hearing. The key proposed changes are described below.

  • Reinforce local control: The bill introduces statutory language designating school-district boards as the sole authority in determining and selecting textbooks, instructional materials, and academic curriculum. It also requires local school boards to establish a parental advisory committee to review the selection of textbooks, reading lists, and academic curriculum. While CCSS supporters have consistently maintained that curriculum would remain a local decision, these changes add legal certainty to that assertion.
  • Protect state independence: Ohio, like every state that adopted the CCSS, did so willingly and has the ability to withdraw from the standards at any time. In fact, Indiana has done exactly that (for better or worse). The Senate language expressly prohibits the state from entering into any agreement that would give control over the development, adoption, or revision of academic standards to any other entity—including the federal government. It also prohibits the State Board of Education from entering a multistate consortium for the development of science or social-studies standards. (That’s just as well, because the “national” standards for science and social studies range from mediocre to awful.)
  • Allow for public review: While the public has always had the opportunity to weigh in when the state adopts academic content standards, the Senate’s language adds some additional structure to the process. It creates academic standards review committees for English language arts, math, science, and social studies. Each committee’s membership includes the state superintendent, the chancellor of the Board of Regents, an educator, a parent, and three content-area experts, with members appointed by the Speaker of the House, the president of the Senate, and the governor. The committees may review the standards and assessments in their content area to determine whether they are appropriate and support improved student performance. However, the State Board of Education retains statutory responsibility for the adoption of content standards.
  • Protect student data privacy: In an era when identity theft and monitored communications (sometimes by our own government) have become commonplace, it’s reasonable for parents to express apprehension about how their children’s educational records are used and who has access to them. This issue became a rallying cry for CCSS critics. The Senate’s changes require the State Board of Education to provide strict safeguards to protect the confidentiality of personally identifiable student data. In addition, the bill prohibits the collection or sharing, in the course of testing, of personal information about a student or the student’s family with any entity, including the federal or state government.
  • Smooth the transition to the new standards and assessments: The CCSS are far more rigorous than Ohio’s previous standards in math and English language arts (which were mediocre), and the new state assessments in those areas (the PARCC exams) are also expected to be more challenging. As a result, proficiency scores around the state will likely fall considerably when the new tests are administered. This has prompted angst among educators, as many of the state’s accountability measures and sanctions are tied to academic performance. The Senate has proposed delaying any consequences for schools or districts that struggle on state assessments and earn low grades on the state report card during the 2014–15 school year. These include sanctions related to No Child Left Behind, the formation of academic-distress commissions, new eligibility for EdChoice Scholarships, and automatic closures for low-performing charter schools. The Senate would also allow, but not require, districts and teachers to delay using student-achievement data for teacher evaluations next year. Finally, it softens the impact of the new assessments themselves by allowing districts to administer paper-and-pencil versions during the first year (in the future, the tests will be online) at no charge. This will give districts more time to build the required technological infrastructure.

It’s too early to tell which of these changes will become law, as the MBR still has to go to conference committee so the House and Senate can work out their differences. However, with these changes, the Ohio Senate appears to have effectively threaded the needle. It has reasserted Ohio’s commitment both to high-quality standards designed to prepare our students for success after high school and to rigorous assessments aligned to those standards. Meanwhile, the Senate has rightly listened to the reasonable concerns of parents and teachers across the state. Hopefully, educators around the state can breathe a little easier knowing that the standards they’ve been working hard to implement over the past four years won’t be changed in the final hour.

Like the Cleveland Browns on a Sunday afternoon, the Ohio General Assembly is fumbling about with the state’s value-added system. One month ago, I described two bizarre provisions related to value-added measures (VAM) that the House tucked into the state’s mid-biennium budget bill (House Bill 487). The Senate has since stripped out one of the House’s bad provisions—and kudos for that—but, regrettably, has blundered on the second one.

To recap briefly, the House proposals would have (1) excluded certain students from schools’ value-added computations and (2) changed the computation of value-added estimates—the state’s measure of a school’s impact on student growth—from a three-year to a one-year calculation.

I argued then that the House’s student-exclusion provision would water down accountability and that reverting to one-year estimates would increase the uncertainty around schools’ value-added results.

The Senate has stripped out the House’s exclusion provision. Good. But it has failed to fix the one-versus-three-year computation. In fact, it has made things worse.

Here’s the Senate’s amendment:

In determining the value-added progress dimension score, the department shall use either up to three years of value-added data as available or value-added data from the most recent school year available, whichever results in a higher score for the district or building.

Now, under the Senate proposal, schools would receive a rating based on whichever VAM estimate is higher—either the one-year or the three-year computation. (Naturally, schools that just recently opened would not have three years of data; hence, the “as available” and “up to” clauses.)
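To see the mechanics concretely, here is a minimal sketch of how the “higher score” rule would operate, written in Python. The value-added index values and the A-F cut points below are hypothetical placeholders for illustration, not the state’s actual ones.

```python
# Minimal sketch of the Senate's "higher score" rule.
# Cut points and index values are hypothetical, for illustration only.

CUT_POINTS = [(2.0, "A"), (1.0, "B"), (-1.0, "C"), (-2.0, "D")]  # assumed

def letter_grade(vam_index):
    """Map a value-added index to an A-F grade using the assumed cut points."""
    for threshold, grade in CUT_POINTS:
        if vam_index >= threshold:
            return grade
    return "F"

def senate_rating(one_year_index, three_year_index=None):
    """Return whichever grade is higher, the one-year or the three-year one.

    Schools without three years of data fall back to the one-year score,
    per the amendment's "as available" clause.
    """
    if three_year_index is None:
        return letter_grade(one_year_index)
    order = "ABCDF"  # lower position in the string = better grade
    return min(letter_grade(one_year_index),
               letter_grade(three_year_index),
               key=order.index)

print(senate_rating(2.3, -0.4))  # one-year A beats three-year C -> "A"
print(senate_rating(-2.5, 1.2))  # three-year B beats one-year F -> "B"
```

Note what the rule guarantees: a school can never be hurt by the extra computation, because the worse of its two estimates is simply discarded.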

Huh? How is this rational accountability? The Senate seems to have fallen into the Oprah zone: “You get an A! You get an A! Everybody gets an A!”

I exaggerate, of course. Not everyone would get an A under the “higher score” policy. But let’s consider what happens to school ratings under the three scenarios in play: the one-year value-added computation (House), the three-year computation (current policy), and the higher of the two scores (Senate).

Chart 1 compares the letter-grade distribution under the one-year versus the three-year (i.e., multi-year) estimates. As you can see, the three-year scores push schools toward the margins (As and Fs) while diminishing the number of schools in the middle (Cs). This is to be expected, given what we know about the greater imprecision of the one-year value-added estimates. Greater imprecision tends to push schools toward the middle of the distribution, absent clear evidence to suggest they’ve had a significant impact, either positive (A) or negative (F). In short, when the data are “noisier”—as they are under the one-year estimates—we’re more likely to wind up with more schools in the mushy middle.[1]
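The statistical intuition can be checked with a toy simulation. To be clear about assumptions: the sketch below invents a bell-curve distribution of true school effects, assigns grades from the gain index (the estimate divided by its standard error) with cut points at plus or minus one and two, and simply sets the one-year standard error larger than the three-year one.

```python
# Toy simulation: noisier estimates crowd schools into the middle grades.
# All distributions, cut points, and standard errors here are assumptions.
import random
from collections import Counter

random.seed(1)

def simulate(se, n_schools=2558):
    """Grade simulated schools whose impact is measured with standard error se."""
    grades = Counter()
    for _ in range(n_schools):
        true_effect = random.gauss(0.0, 1.0)      # a school's true impact
        estimate = random.gauss(true_effect, se)  # noisy measured impact
        index = estimate / se                     # gain estimate in SE units
        if index >= 2:
            grades["A"] += 1
        elif index >= 1:
            grades["B"] += 1
        elif index >= -1:
            grades["C"] += 1
        elif index >= -2:
            grades["D"] += 1
        else:
            grades["F"] += 1
    return grades

print("three-year (se = 0.5):", simulate(se=0.5))  # more As and Fs
print("one-year   (se = 1.5):", simulate(se=1.5))  # more Cs
```

With the larger standard error, the index is pulled toward zero, so more schools land in the C band; with the smaller one, more schools clear the A and F thresholds. That is precisely the pattern in chart 1.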

Chart 1: Clearer view of value-added impact under multi-year scores: Multi-year scores push schools toward the margins (A or F); One-year scores push schools toward the middle (C)

Source: Ohio Department of Education. For 2012-13, multi-year VAM scores are available publicly; the author thanks the department for making schools’ one-year VAM scores accessible at his request. (One-year VAMs, school by school, are available here.) Notes: The one-year A-F ratings are simulated, based on schools’ one-year VAM scores for 2012-13. (The “cut points” for the ratings are here.) The multi-year A-F ratings for 2012-13 are actual letter grades, based on schools’ VAM scores (up to three years) from SY 2011, 2012, 2013. Chart displays the school-level (district and charter) distribution of letter grades (n = 2,558).

Now, let’s look at the Senate’s “higher-score” proposal—the real whopper of them all. Consider chart 2, which also includes the higher of the two value-added scores (the green bar). What you’ll notice is that the number of As would likely increase under the proposal, so that virtually half the schools in the state would receive an A. On the other end of the spectrum, the number of Fs would be cut in half, so that just one in ten schools in the state would receive an F.

Chart 2: Roughly half of schools would get A under “higher score” provision

Are half the schools in Ohio making significant—meaningfully significant—gains for their students? And are just one in ten schools failing to move the achievement needle in a significant way? Let’s get real.

As I’ve maintained, current policy—the three-year computation—is the best course for policymakers. It gives us the clearest look at a school’s impact, both good and bad, on student performance. The finagling of value-added isn’t just an academic exercise, either—it has considerable implications for the state’s automatic charter school closure law, voucher eligibility, academic distress commissions, and a number of other accountability policies. Can Ohio’s policymakers rectify this value-added mess? As with the Browns’ playoff chances, here’s hoping!


[1] Of course, not all schools in the C range are there because of imprecision—some schools may have had a precisely estimated impact on learning gains that is genuinely close to zero.

Last week, School Choice Ohio sued two Ohio school districts for their failure to comply with a public-records request. The organization is seeking directory information for students eligible for the EdChoice Scholarship Program from the Cincinnati and Springfield Public Schools. Actions to enforce public-records requests are rarely exciting, but the outcome of SCO’s effort could have important ramifications for tens of thousands of students and their families across the state.

Despite Ohio’s status as a national leader in providing private-school choice options to students—the state has five separate voucher programs—there is no established mechanism for informing families eligible for the EdChoice Scholarship Program (Ohio’s largest voucher initiative) about their eligibility. The law doesn’t require school districts or the Ohio Department of Education to perform this vital function.

Enter School Choice Ohio (SCO), a Columbus-based nonprofit organization, which has worked tirelessly since the beginning of the EdChoice program to conduct outreach to families across the Buckeye State who are eligible to send their child to a private school via a voucher. SCO typically sends postcards and makes phone calls letting families know that their children may be eligible, giving them a toll-free number to call for an information packet and answering any questions families may have about eligibility and the private-school options in their area.

This is critical work, as the EdChoice Scholarship is designed to provide students in Ohio’s very lowest-performing schools the option to attend a private school.

To conduct this outreach, SCO makes a public-records request for directory information to superintendents of school districts whose students are eligible for the EdChoice Scholarship. “Directory information” can encompass a number of district-chosen parameters but typically includes a student’s name, address, phone number, grade level, and school-building assignment. It is the kind of information you might find in a student directory handed out to families along with a student handbook at the start of each school year.

The statutory language clearly states that if directory information is collected and distributed at all, it is a public record that can be requested as long as it isn’t “for use in a profit-making plan or activity.” In other words, it can be requested by a nonprofit exactly like SCO.

How do we know all this? We both worked at SCO for many years, making these public-records requests and helping interested families contacted via directory information.

In the main, districts grudgingly but professionally complied with public-records requests for directory information. One perennial resister to SCO’s requests was Cincinnati City Schools. Every year, the district would send back a letter through its lawyer saying, in essence, “We know we’re supposed to collect directory information, but we don’t, so we can’t give it to you.”

That means that every year, thousands of Cincinnati families whose children were eligible for a scholarship to a private school of their choice stayed put in their bottom-of-the-heap schools simply because they didn’t know another option existed.

Springfield, meanwhile, has collected directory information of the type SCO requests and has provided it to SCO in the past. However, as the Springfield News-Sun notes, the board passed a policy change in 2013 that redefined “directory information” to exclude anything that would identify a student, meaning the district could no longer comply with SCO’s latest request.

Unfazed by its own policy constraints, the district continued releasing identifiable directory information to what it calls its “partners” after the policy change. When asked about the legal action last week, Springfield Superintendent David Estrop said, “We are trying to protect our own students from false and inaccurate information.” The characterization of SCO’s work aside, such ad hoc provision of data is not allowed under Ohio’s public-records law. Hence, SCO’s legal action.

We commend SCO for taking this bold and much-needed action. The EdChoice Scholarship program provides opportunities to some of Ohio’s most disadvantaged students who, through no fault of their own, have been assigned to a school that is not, under the state’s criteria, effectively educating its students. To deny these students and their parents information about their private-school options strikes us as a particularly low blow. If SCO isn’t successful, we’d urge the legislature to say enough is enough and to step in and require either school districts or (preferably) ODE to notify these families of their eligibility. After all, in Ohio, when your local school isn’t performing well, you have a choice. You deserve to know about it.

A great deal of hand-wringing has occurred in recent years concerning the United States’ poor academic performance relative to other nations. The anxiety is no doubt justified, as students from countries like South Korea, Japan, and Hong Kong are beating the pants off American pupils on international exams. And it’s not just the East Asian countries: even the Swiss, Canucks, and Aussies are cleaning our clocks. But what about Ohio’s students? How does their achievement compare to that of other industrialized nations? Like most states’, not well, according to this new PEPG/Education Next study. To determine how states rank against the rest of the world, the researchers link 2012 PISA results—international exams administered in thirty-four OECD countries, including the U.S.—with state-level NAEP results for eighth graders in 2011. They find that Ohio’s students fall well short of the world’s highest performers. In math, Ohio’s proficiency rate (39 percent) falls 15 to 25 percentage points below those of the highest-achieving nations. (Korea, the worldwide leader in math, was at 65 percent proficiency; Japan was at 59 percent; Massachusetts, the U.S. leader, was at 51 percent.) In fact, Ohio’s proficiency rate places us somewhere between Norway and Portugal in this grade and subject. Moreover, Ohio’s weak international performance isn’t just a matter of our students having fewer family resources than their peers in other nations. For example, among students whose parents had a high level of education, Ohio’s math proficiency rate (50 percent) still fell twenty points below the international leaders’ (Korea, at 73 percent; Poland, at 71 percent). Ohio’s alarmingly mediocre achievement relative to the rest of the world only reinforces the need to raise educational standards so that students from all family backgrounds can compete with their international peers.

SOURCE: Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann, Not just the problem of other people’s children: U.S. student performance in global perspective (Program on Education Policy and Governance and Education Next, May 2014).

  • The EdChoice Scholarship Program received a record number of applications this year: over 20,800 students applied during the window, which closed on May 9, up more than 4,000 from last year.
  • The food-service chief of Lima City Schools testified before Congress last week on how well the Community Eligibility Provision is working for families in Lima. Said Ms. Woodruff, “It’s going well. The parents appreciate it, the students are participating and it’s a good fit.”
  • There is a puzzling gap in Ohio between the number of students identified as gifted and the number of gifted students actually being served. A journalist in the Zanesville area tried to demystify the numbers by digging deep into some local schools. The conclusion of her interview subjects is that the state “mandates we test for giftedness, but they don’t fund it.”
  • Piloting of the new PARCC tests is continuing through the end of the school year in Ohio. Few problems have been reported, and it seems that kids in particular really like the online nature of the testing.

Life Skills Centers, a group of fifteen dropout-recovery charter schools operated by White Hat Management, is on the decline. Last year’s enrollment (school year 2012-13) was less than half that of 2006. The erosion of Life Skills Centers’ enrollment bucks the steadily rising trend in Ohio’s overall charter enrollment. And within dropout-recovery charters—a special subset of schools that enroll at-risk high-school students—Life Skills Centers’ enrollment losses have also been atypical. Excluding Life Skills, the state’s sixty or so dropout-recovery schools experienced flat to increasing enrollment from 2006 to 2013, with the exception of 2012.[1]

Chart 1: Life Skills Center student enrollment, 2005-06 to 2012-13

Source: Ohio Department of Education. Notes: The number of Life Skills Centers has remained constant—fifteen schools—throughout this period, except for 2005-06, when there were fourteen schools. Three former Life Skills Centers (then operated by White Hat) changed management companies and school names effective July 2012; these schools are not included in the totals in chart 1 or table 1 for any year.

Perhaps the enrollment decline is no surprise, given the low performance of these schools. Table...

I had the good fortune of attending the Association for Education Finance and Policy (AEFP) conference last week. AEFP attracts some of the nation’s finest researchers, along with a smattering of policymakers and advocates. Cutting-edge research on topics ranging from parents and school choice to adequacy in school funding to value-added accountability was presented, and the working papers are online and well worth perusing.

The conference was a veritable buffet of dialogue on education research and policy, and the following are the three main ideas I took away:

  • First, there is a growing stable of researchers willing to tackle challenging but pressing policy issues. A few of the more ambitious projects came from graduate-student researchers making valiant efforts to answer thorny and (perhaps) impossible research questions. Interesting studies included preliminary work on a return-on-public-investment model for charter schools, on whether “adequacy and equity” court cases have contributed to achievement gains, and on whether value-added models of teacher effectiveness have “floor” and “ceiling” effects (i.e., biased VAM estimates for teachers with many low- or high-achieving students). It’s evident that the education-research community is moving in the right direction by making concerted efforts to answer
  • ...

Duplication is not always a good thing. Think about it: most of us don’t carry two cell phones. In a world with limited pants-pocket space, two phones would be senseless, right? Ohio’s school report cards have two essentially identical achievement components, both of which receive an A-F letter grade. It’s time to toss one of them for parsimony’s sake.

The first, the indicators-met component, is determined by whether 75 percent of a school’s test-takers reach proficiency on the state’s twenty-four assessments (85 percent for eleventh grade). The second, the performance-index component (PI), is a composite score weighted by the proportion of test-takers who attain each of the state’s five achievement levels.
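For readers who want to see how a performance-index score comes together, here is a back-of-the-envelope sketch. The weights below approximate the ones Ohio has published, but treat them as assumptions; ODE’s report-card guide has the official values.

```python
# Back-of-the-envelope performance-index (PI) calculation.
# Weights approximate Ohio's published ones and are assumptions here;
# with these weights, the theoretical maximum score is 120.

WEIGHTS = {
    "advanced":    1.2,
    "accelerated": 1.1,
    "proficient":  1.0,
    "basic":       0.6,
    "limited":     0.3,
    "untested":    0.0,  # untested students count in the base at weight zero
}

def performance_index(pct_by_level):
    """PI = sum over levels of (percent of test-takers at level * weight)."""
    return sum(pct_by_level[level] * weight for level, weight in WEIGHTS.items())

# A hypothetical school's percentages of test-takers at each level.
school = {"advanced": 20, "accelerated": 15, "proficient": 40,
          "basic": 15, "limited": 8, "untested": 2}

print(performance_index(school))  # 24 + 16.5 + 40 + 9 + 2.4 + 0 = 91.9
```

Unlike the pass/fail indicators-met component, the PI gives schools credit for moving students anywhere along the achievement scale, not just across the proficiency bar.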

Though the two indicators differ slightly, they produce very similar results for any given school. In other words, if a school gets a low PI letter grade, it is nearly assured of receiving a low indicators-met grade. The same is true in reverse: high-PI schools will very likely get a high indicators-met grade. Here’s the evidence.

Table 1 shows the letter grades of Ohio’s 3,089 schools by indicators met and PI. As you can tell, the grades correspond closely. For example, 99 percent of schools that received...

The weeping and gnashing of teeth from parents and community members who may be affected by the closure of seven Columbus City Schools is understandable. No one wants to lose institutions that are dear to the heart.

But I would ask this: Where was the outrage from parents and the community when these schools failed to deliver academic results? Why didn’t 700 people come out to the meetings when our own state department of education rated the schools as under-performing? Where were the protests; where were the posters; where were the demands?

For those who might be interested, here’s the dismal three-year performance record of the seven schools on the chopping block. Maybury is the only school for which a case could be made, on the basis of academics, that it’s worth keeping open.

Source: Ohio Department of Education. Notes: In 2012-13, no school received an overall rating. For 2010-11 and 2011-12, “academic emergency” is equivalent to an “F”; “academic watch” is equivalent to a “D”; “continuous improvement” is equivalent to a “C”; “effective” is equivalent to a “B.” High schools do not receive a value-added rating, hence the N/A....
