Ratings without substance

Last month, much to my surprise, both the Education Gadfly and The Wall Street Journal touted the new Standard and Poor's School Evaluation Service. Such praise is premature. S&P has many strengths, but school evaluation has not yet proven to be one of them.

The S&P School Evaluation Service has posted online reports about each school district in Michigan and Pennsylvania, reports that include 1,500 education and financial variables under broad headings like student results, learning environments, spending and demographics. The reports describe each district's strengths and weaknesses and compare them with state averages and "peer districts." Participating states reportedly pay $2 million to $2.5 million per year for the service. (To see the reports for yourself, go to http://www.ses.standardandpoors.com.)

Since 1918, S&P has been a leader in providing no-nonsense information to the financial industry.  The firm's reputation for objective and hard-hitting analysis is such that many institutional investors, pension funds and the like are barred from investing in bonds that lack an "investment grade rating" from either S&P or one of its two main competitors.  The firm's recent foray into school assessment, on the other hand, is a bland and pricey rehash of information already available from many states and school districts.  It is not clear to me what value S&P is adding at this point in its product's development.  Praise for the S&P service seems to be based more upon wishful thinking than objective analysis of its value. 

It would be a great service to the public if S&P or some other private firm offered a hard-hitting, objective analysis of school and district performance. Many would appreciate a service that simplified the data already provided by many school districts and state education departments and that produced a simple, easy-to-understand rating of performance.

That is the sort of thing that S&P offers in its core business.  The firm is an industry leader in municipal bond ratings.  In that business, S&P's professionals analyze a myriad of complex financial, economic and political factors and determine the risk associated with the bonds of a state, county, city, school district or other public agency.  An investor can be confident that a California school district bond rated "A" carries generally the same risk as an "A"-rated bond from a New Jersey sewer district, and less risk than the bonds of an Illinois county holding a BBB rating. 

When S&P issues a bond rating, it does so with a limited amount of verbiage.  It most certainly does not try to explain the complexities of municipal government or offer much by way of contextual information.  One will not see statements like "this entity is rated B, but it is doing above average compared to cities with similar populations."  Unfortunately, the S&P school evaluation product offers far too much of that sort of editorial embroidery.  It compares schools and districts with others of "similar socio-economic status" and proximity to metro areas.  If you live in a city, and your child has "low SES," this product may very well reinforce low expectations for your child's school. 

Nor has S&P attempted to present a simplified rating of school district performance.  It simply reports that a particular district is well above average, above average, below average or well below average compared with other school districts in its own state.  Good luck trying to compare schools in Michigan to schools in Pennsylvania using the S&P service.  You are also on your own if you want to know whether "average" in either of those states means "good enough."

As a private firm, S&P is free to design its own product as it sees fit.  If there is demand, the firm will do well financially.  Well and good.  But what accounts for the willingness - even ardor - of governors to spend taxpayer money on this service as it now stands?  Michigan, for example, has a pretty good website of its own.  Since S&P is totally reliant on the state for the data it presents, there is nothing on the S&P site that is not also available on the Michigan site.  To my eye, the state's site is easier to navigate and more concise in its presentation of data.  All that S&P seems to add is the comparison to state averages and to districts of similar demographics. One imagines the state education department could easily add that to its own site for a lot less than the state is paying S&P. 

The official state site also does a better job of presenting data for individual schools.  S&P's service focuses on districts and presents a very limited set of indicators for individual schools.  In this regard, S&P appears to be behind the times, as most credible and promising improvement efforts are directed at school-level, rather than district-level, reform. 

Finally, there are some disturbing and unexplained anomalies in the S&P data presentation. For example, an unexplained bar chart reports that the Detroit public schools' graduation rate went from 30% to over 80% and back down to 40% over a three-year period.  This inexplicable and improbable swing appears on the state site as well.  S&P's attorneys have inserted the necessary disclaimers about potential inaccuracies in the data provided to S&P by the state, yet one wonders about the value of a system that simply redeploys the state's information, inconsistencies and all.

Let's hope that S&P is planning to become more forthright with this product and to speak more plainly about school performance. Of course, such candor might harm the existing business relationships that S&P and its corporate parent, McGraw-Hill, have with many school districts and states.  But it would be courageous and worthy of the Education Gadfly's - and The Wall Street Journal's - approval.   Similarly, let's hope that the leaders of states that are considering purchasing S&P's school evaluation service are willing to ask tough questions about the value that S&P is adding to school reform efforts. 

Raymond Domanico is Senior Education Advisor to the Metro NY Industrial Areas Foundation, a network of churches, parents, tenant associations and schools working to improve life in the New York City area.  He has studied the public education system in New York for over twenty years from a variety of perspectives, including director of data analysis for the New York City Board of Education, director of the Center for Educational Innovation at the Manhattan Institute, and executive director of the Public Education Association.  The opinions expressed herein are his own.

To read our blurb on the S&P school district reports in the October 24th issue of the Education Gadfly, go to http://www.edexcellence.net/gadfly/issue.cfm?issue=85#1287.

NB: Last week, Just for the Kids (an Austin nonprofit that uses state accountability data to examine school performance), the Education Commission of the States, and the University of Texas announced the establishment of the National Center for Educational Accountability. The new center will focus on the effective use of school and student data and the identification of best practices. For more, see "National Center for Educational Accountability Established to Achieve Excellence in Public Schools."

November 13, 2001