Ohio Policy

I joined the Twittersphere yesterday for a forum on blended learning moderated by Matt Miller, superintendent of Mentor School District in Northeast Ohio. (Find the tweets at #ohblendchat.) The conversation engaged, by my estimation, fifty or so educators who, in 140 characters or fewer, discussed what “blended learning” is, how they’re implementing it, what benefits they’re seeing, and what some of the barriers and misconceptions are.

The forum was a great opportunity to learn how blended learning is playing out in the field. From the chat, I came away with three takeaways:

1.)    There is increasing definition around what blended learning is and is not. First, what it is not: putting students in front of a computer and expecting them to learn. Nor does blended learning slavishly conform to a single method of instruction (e.g., lecture, online, project-based). What is blended learning, then? A few of the key phrases used to define blended learning included personalized learning, a combination of instructional deliveries, collaborative learning, and even controlled chaos.

2.)    Teachers say their feedback on students’ work is swifter and their engagement with all students increases in a blended-learning environment compared to conventional ones. Several educators tweeted about how they have a greater feel for the educational needs of their students. Others described how blended learning allows for more one-on-one instruction and student-teacher conferences. Meanwhile, a few other educators tweeted how blended learning enables them to reach all of their students (i.e., both struggling and advanced...

  • The EdChoice Scholarship Program received a record number of applications this year: over 20,800 students applied during the window, which closed on May 9, up more than 4,000 from last year.
  • The food-service chief of Lima City Schools testified before Congress last week on how well the Community Eligibility Provision is working for families in Lima. Said Ms. Woodruff, “It’s going well. The parents appreciate it, the students are participating and it’s a good fit.”
  • There is a puzzling gap in Ohio between the number of students identified as gifted and the number of gifted students actually being served. A journalist in the Zanesville area tried to demystify the numbers by digging deep into some local schools. The conclusion of her interview subjects is that the state “mandates we test for giftedness, but they don’t fund it.”
  • Piloting of the new PARCC tests is continuing through the end of the school year in Ohio. Few problems have been reported, and students in particular seem to like taking the tests online.

A great deal of hand-wringing has occurred in recent years concerning the United States’ poor academic performance relative to other nations. The anxiety is no doubt justified, as students from countries like South Korea, Japan, and Hong Kong are beating the pants off American pupils on international exams. It’s not just the East Asian countries: even the Swiss, Canucks, and Aussies are cleaning our clocks. But what about Ohio’s students? How does its achievement look in comparison to other industrialized nations? Like most states, not well, according to this new PEPG/Education Next study. To determine how states rank compared to the rest of the world, researchers link 2012 PISA results—international exams administered in thirty-four OECD countries including the U.S.—and state-level NAEP results for eighth graders in 2011. The researchers discovered that Ohio’s students fall well short of the world’s highest performers. When examining math results, Ohio’s proficiency rate (39 percent) falls 15 to 25 percentage points below the highest-achieving nations. (Korea, the worldwide leader in math, was at 65 percent proficiency; Japan was at 59 percent; Massachusetts, the U.S. leader, was at 51 percent). In fact, Ohio’s proficiency rate places us somewhere between Norway’s and Portugal’s achievement rates in this grade and subject. Moreover, Ohio’s weak international performance isn’t just a matter of our students having lower family resources relative to other nations. For example, among students whose parents had a high level of education, Ohio’s math proficiency rate (50 percent) still fell twenty points below the international...

Last week, School Choice Ohio sued two Ohio school districts for their failure to comply with a public-records request. The organization is seeking directory information for students eligible for the EdChoice Scholarship Program from the Cincinnati and Springfield Public Schools. Actions to enforce public-records requests are rarely exciting, but the outcome of SCO’s effort could have important ramifications for tens of thousands of students and their families across the state.

Despite being a national leader in providing private-school choice options to students—Ohio has five separate voucher programs—there isn’t an established mechanism for informing families eligible for the EdChoice Scholarship program (Ohio’s largest voucher initiative) about their eligibility. The law doesn’t require school districts or the Ohio Department of Education to perform this vital function.

Enter School Choice Ohio (SCO), a Columbus-based nonprofit organization, which has worked tirelessly since the beginning of the EdChoice program to conduct outreach to families across the Buckeye State who are eligible to send their child to a private school via a voucher. SCO typically sends postcards and makes phone calls letting families know that their children may be eligible, giving them a toll-free number to call for an information packet and answering any questions families may have about eligibility and the private-school options in their area.

This is critical work, as the EdChoice Scholarship is designed to provide students in Ohio’s very lowest-performing schools the option to attend a private school.

To conduct this outreach, SCO makes a public-records request for directory information...

Last week, the Ohio Senate passed House Bill 487, also known as the Education Mid-Biennium Review (MBR), with overwhelming support (by a vote of twenty-seven to five). The MBR contains a wide variety of education-policy changes, including some modifications that affect Ohio’s academic content standards and assessments.

Ohio’s current learning standards, adopted in 2010 by the State Board of Education, include standards for students in grades K–12 in English language arts, math, science, and social studies. When the standards were adopted four years ago, there was public input but little fanfare or controversy. That changed about a year ago, when critics began focusing on the math and English language arts standards, a.k.a. the Common Core State Standards (CCSS).

As opposition to the CCSS heated up all over the country (the standards were adopted by forty-five states), the focal point in Ohio was House Bill 237, which proposed repealing CCSS completely. The bill, sponsored by Representative Andy Thompson, received two hearings in the House Education Committee, with the last hearing in November 2013 drawing more than 500 people to the Statehouse.

The Senate’s changes in the MBR address some of the chief concerns raised at the November bill hearing. The key proposed changes are described below.

  • Reinforce local control: The bill introduces statutory language designating school-district boards as the sole authority in determining and selecting textbooks, instructional materials, and academic curriculum. It also requires local school boards to establish a parental advisory committee to review
  • ...

Like the Cleveland Browns on a Sunday afternoon, the Ohio General Assembly is fumbling about with the state’s value-added system. One month ago, I described two bizarre provisions related to value-added (VAM) that the House tucked into the state’s mid-biennium budget bill (House Bill 487). The Senate has since struck down one of the House’s bad provisions—and kudos for that—but, regrettably, has blundered on the second one.

To recap briefly, the House proposals would have (1) excluded certain students from schools’ value-added computations and (2) changed the computation of value-added estimates—the state’s measure of a school’s impact on student growth—from a three-year to a one-year calculation.

I argued then that the House’s student-exclusion provision would water down accountability, and that reverting to the one-year estimates would increase the uncertainty around schools’ value-added results.

The Senate has struck down the House’s exclusion provision. Good. But it has failed to rectify the matter of the one-versus-three-year computation. In fact, it has made things worse.

Here’s the Senate’s amendment:

In determining the value-added progress dimension score, the department shall use either up to three years of value-added data as available or value-added data from the most recent school year available, whichever results in a higher score for the district or building.

Now, under the Senate proposal, schools would receive a rating based on whichever VAM estimate is higher—either the one-year or the three-year computation. (Naturally, schools that just recently opened would not have three years of data; hence, the “as available” and “up to” clauses.)
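The amendment’s selection rule can be sketched in a few lines of code. This is a minimal illustration only: the function name is made up, and the simple average stands in for whatever multi-year composite the state actually computes.

```python
def value_added_rating(one_year_score, multi_year_scores):
    """Illustrative sketch of the Senate amendment's selection rule.

    one_year_score: the most recent year's value-added estimate.
    multi_year_scores: up to three years of estimates, as available
        (a simple average stands in for the state's actual composite).
    Returns whichever figure is higher -- i.e., whichever favors
    the district or building.
    """
    multi_year_score = sum(multi_year_scores) / len(multi_year_scores)
    return max(one_year_score, multi_year_score)

# A school with a strong recent year is rated on the one-year figure...
print(value_added_rating(2.5, [0.5, -1.0, 2.5]))   # -> 2.5
# ...while a school with a weak recent year is rated on the
# multi-year figure instead.
print(value_added_rating(-1.0, [2.0, 1.5, -1.0]))  # -> 0.833...
```

Either way, the school gets the more flattering number, which is precisely the asymmetry at issue.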

Huh?...

Life Skills Centers, a group of fifteen dropout-recovery charter schools operated by White Hat Management, is on the decline. Last year’s enrollment (school year 2012-13) was less than half that of 2006. The erosion of Life Skills Centers’ enrollment bucks the steadily rising trend in Ohio’s overall charter enrollment. And within dropout-recovery charters—a special subset of schools that enroll at-risk high-school students—Life Skills Centers’ enrollment losses have also been atypical. Excluding Life Skills, the state’s sixty or so dropout-recovery schools have experienced flat to increasing enrollment trends from 2006 to 2013 with the exception of 2012.[1]

Chart 1: Life Skills Center student enrollment, 2005-06 to 2012-13

Source: Ohio Department of Education
Notes: The number of Life Skills Centers has remained constant—fifteen schools—throughout this period, except for 2005-06, when there were fourteen schools. Three former Life Skills Centers (then operated by White Hat) changed management companies and school names effective July 2012; these schools are not included in the totals in Chart 1 or Table 1 for any year.

Perhaps the enrollment decline is no surprise, given the low performance of these schools. Table 1 shows the five-year cohort graduation rates for Life Skills Centers from 2009-10 to 2011-12. The graduation rates for their pupils are sometimes less than ten percent. The Life Skills Center in Dayton performs the highest among the group: 25 percent graduation rate in 2011-12; 22 percent in 2010-11....

I had the good fortune of attending the Association for Education Finance and Policy (AEFP) conference last week. AEFP attracts some of the nation’s finest researchers along with a small smattering of policymakers and advocates. Cutting-edge research on topics ranging from parents and school choice to adequacy in school funding and value-added accountability was presented, and the working papers are online and well worth perusing.

The conference was a veritable buffet of dialogue on education research and policy, and the following are the three main ideas I took away:

  • First, there is a growing stable of researchers who are willing to tackle challenging but pressing policy issues. A few of the more ambitious projects came from graduate-student researchers who are making valiant efforts to answer thorny and (perhaps) impossible research questions. Some of the interesting studies included preliminary work on a return-on-public-investment model for charter schools, whether “adequacy and equity” court cases have contributed to achievement gains, and whether value-added models of teacher effectiveness have “floor” and “ceiling” effects (i.e., whether VAM estimates are biased for teachers with many low- or high-achieving students). It’s evident that the education-research community is moving in the right direction by making concerted efforts to answer questions that matter for sound policy and practice.
  • Second, to cease testing and data collection would cripple promising research avenues. There is growing concern about testing and data collection among education policymakers and the public. The backlash is understandable. But make no mistake: if states backtrack on testing
  • ...

Duplication is not always a good thing. Think about it: most of us don’t carry two cell phones. In a world with limited pants-pocket space, two phones would be senseless, right? Ohio’s school report cards have two achievement components that are essentially the same, both of which receive an A-F letter grade. It’s time to toss one of them for parsimony’s sake.

The first, the indicators-met component, is determined by whether 75 percent of a school’s test-takers reach proficiency on the state’s twenty-four assessments (85 percent for eleventh grade). The second, the performance-index component (PI), is a composite score weighted by the proportion of test-takers who attain each of the state’s five achievement levels.
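The difference between the two components can be made concrete with a small sketch. The weights below are assumed for illustration (the state assigns higher weights to higher achievement levels), and the level names are shorthand, not the official labels.

```python
# Assumed illustrative weights for the five achievement levels; the
# actual values are set by the state and scale the final index.
WEIGHTS = {
    "limited": 0.3,
    "basic": 0.6,
    "proficient": 1.0,
    "accelerated": 1.1,
    "advanced": 1.2,
}

def performance_index(shares):
    """Composite score: each level's weight multiplied by the share of
    test-takers who attained it, summed across all five levels."""
    return sum(WEIGHTS[level] * share for level, share in shares.items())

def meets_indicator(proficient_or_above_rate, threshold=0.75):
    """Indicators-met is a pass/fail cutoff: did at least 75 percent of
    test-takers reach proficiency on a given assessment?"""
    return proficient_or_above_rate >= threshold

# A school where most test-takers reach proficiency or better:
school = {"limited": 0.05, "basic": 0.10, "proficient": 0.50,
          "accelerated": 0.20, "advanced": 0.15}
print(performance_index(school))              # composite of all levels
print(meets_indicator(0.50 + 0.20 + 0.15))    # 85% proficient -> True
```

The sketch also shows why the two track each other so closely: both are driven by the same proficiency shares, with the PI merely adding partial credit below the cutoff and extra credit above it.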

Though the two indicators differ slightly, they produce very similar results for any given school. In other words, if a school gets a low PI letter grade, it is nearly assured that it will receive a low indicators-met grade. The same is true in the reverse—high PI schools will likely get a high indicators-met grade. Here’s the evidence.

Table 1 shows the letter grades of Ohio’s 3,089 schools by indicators met and PI. As you can tell, the grades correspond closely. For example, 99 percent of schools that received an A for indicators met also received an A or B on PI. One hundred percent of schools that received a B on indicators met received a B or C on PI. Well over one thousand schools received an A/B grade combination. Very few schools received mixed, high-low ratings:...
