Innovation Ohio and OEA fail to help anyone 'know' anything

With any luck, the “Know Your Charter” website from Innovation Ohio (IO) and the Ohio Education Association (OEA) will go the way of Pets.com and Geocities.com. The new website’s stated aim is to increase transparency around charter-school spending and academic results by comparing them to those of traditional public schools. While greater transparency is a worthwhile goal, it looks like IO—a liberal advocacy group founded by former Strickland administration officials—and the OEA—the state-level affiliate of the nation’s largest labor union—let political spin get in the way of presenting information in a meaningful way.

The website misinforms the public by failing to report essential information about public schools, calling into question how much it actually helps anyone “know” anything. In particular, IO and the OEA make the following crucial omissions in reporting basic school information:

1.) They ignore district funding from local property taxes: You’ll notice that the IO-OEA website reports only state per-pupil revenue for districts and public charter schools. But remember, school districts are funded jointly through state and local revenue sources.[1] By reporting only state revenue, the website flagrantly disregards the fact that school districts raise, on average, roughly half their revenue through local taxes (mainly property taxes). Meanwhile, charters—with only a few exceptions in Cleveland—do not receive a single penny of local revenue, which leads to funding inequity between district and charter schools. When local, state, and federal revenue sources are combined, recent research from the University of Arkansas demonstrates that Cleveland charters received a staggering 46 percent less per pupil than the city’s district, while Dayton charters received 40 percent less. (These were the only two Ohio cities where a deep-dive funding analysis was conducted for FY 2011.) With this in view, is it any wonder that IO-OEA would want to conceal districts’ local revenue to create the illusion of district cost-effectiveness relative to charters? The key measure when it comes to the cost of education is how many taxpayer dollars—from all sources—support the schools’ efforts.

2.) They ignore student-growth measures: Website users will also notice that IO-OEA uses the state’s “performance index” (PI) as its primary measure for academic comparison. To be sure, the performance index is a critical component of school report cards—it indicates how students in a particular school achieve on state assessments. And when we look across Ohio’s large urban school districts, PI letter grades are depressed; at the same time, many charter schools also receive low PI ratings. For charter and urban district schools alike, achievement results (i.e., performance-index scores) are influenced by the characteristics of their students. The Toledo Blade went so far as to say that raw achievement measures “mislead” on school performance. The Blade is half right: By reporting student-achievement results alone, all we learn is that students from Ohio’s urban communities struggle with achievement and that too many are performing below grade level. We learn little about the actual performance of the school itself.

Outside observers therefore need to know whether public schools are making an impact on achievement, even when they enroll students who are grade levels behind. That is why the state’s “value-added” measure is also essential. By estimating a school’s impact on student growth over time, it gives a much clearer view of how the school itself performs. But IO-OEA ignored value-added. Instead, they should have reported, in a high-profile way, both indicators of school quality—the performance index and value-added ratings. When people clearly see both measures, they gain a more holistic view of school quality, whether the schools are charter or district. Nor is reporting both measures a matter of political bent: Urban schools of both types, district and charter, can and do receive strong value-added ratings. Reporting value-added ratings alongside a student-achievement measure is simply sound reporting practice.

3.) They ignore school-level data: School-level data are important for two reasons, one technical and the other practical. However, so far as I can tell, the IO-OEA website contains no school-level information for any district school in Ohio. First, the technical reason. To be clear, charter schools are schools. As such, they should first be compared to other schools, not to entire districts. For some charters, especially those in urban areas with large school districts, the comparison is especially odd: It is something like comparing the food quality of an individual Burger King restaurant to the food quality of the entire McDonald’s corporation. The scale is totally different. To a certain degree, a school-to-district or school-to-state comparison is appropriate when the idea is to benchmark a school’s performance. But by placing charter-school performance side by side with that of the local district—without any acknowledgement that school-level data exist—the website loses the opportunity to compare similarly situated schools.

Second, the practical matter: For school-shopping parents—particularly those in urban areas where school choice is most prevalent—the website is virtually useless. Parents select schools for their children, not districts. Moreover, students attend traditional public schools that, while part of a district, have their own strengths and weaknesses and are anything but uniform. The absence of school-level data for district schools thus restricts the information available to parents facing the important choice of where to send their child, whether to a charter or a district-run school. It is unconscionable for IO-OEA to claim that their website allows parents and citizens to compare charters to “traditional public schools.” To entire districts, yes; to schools, no.

In addition to these glaring data omissions (and misleading comparisons), the website uses average teacher experience as a high-level school-quality measure, reporting—and implicitly comparing—the average experience of teachers in charters against that of teachers in districts. It does no good to spread to parents and taxpayers the myth that teachers’ experience determines their impact on achievement. (Research has shown that, on average, a teacher’s impact plateaus after roughly three to five years of teaching.) Perhaps some parents will feel reassured that their children have experienced teachers; but as for us, there is more reassurance in knowing that a school’s students are achieving against a rigorous standard.

It is perfectly acceptable—and essential—for organizations to report school finances and academic results. But the reporting also has to be done responsibly. We at Fordham recently published a report, Poised for Progress, which details the academic results of charter and district schools in Ohio’s “Big Eight” urban areas. (It does not touch on school-finance issues, in part because the state will not report 2013–14 fiscal data until later this year.) In the end, IO-OEA missed an opportunity to constructively inform parents, taxpayers, and policymakers about public education in Ohio. They badly miss the mark when it comes to sound reporting on public schools; for that reason, parents, taxpayers, and policymakers should bookmark more informative sources of school information instead.

[1] A relatively small share of public-school revenue—approximately 10 percent statewide—comes by way of federal grant programs. 

Aaron Churchill
Aaron Churchill is the Ohio Research Director of the Thomas B. Fordham Institute.