Having worked on educator evaluation reform at a state department of education, I do my best to keep up with developments in the extremely tough work of state-level implementation. I follow New Jersey’s progress especially closely because I took part in the work there (and I’m certainly biased in its favor).
If you also track such stuff, take a look at the “2013-14 Preliminary Implementation Report on Teacher Evaluation” recently released by the NJDOE.
There’s much to like here, including the way the state reports on the history of the program and its focus on district engagement and continuous improvement.
But two things really caught my eye. First, the report has some important data points. For instance:
- The pilot program included thirty districts and nearly 300 administrators.
- More than 25,000 educators took part in some kind of state training in 2013–14.
- The new program may have increased the number of teacher observations around the state by 180,000(!).
- More than half of districts are using some version of the Danielson observation instrument, and most of the remaining districts are using one of four other tools.
Second, the state is betting on “student growth objectives” (SGOs) and putting significant energy into implementing them well.
The state held forty-four SGO workshops from late 2013 through early 2014, followed by another thirty-nine “SGO 2.0” sessions this spring, and added more this summer and fall in response to demand. According to a state survey, teachers report that SGOs can be beneficial to their practice.
Things aren’t perfect by any means. According to the state’s review, only 70 percent of SGOs were deemed to be “high-quality,” “specific,” and “measurable.” Most of the other 30 percent were found to lack specificity. There was also inconsistency in...