Monday, December 10, 2012
Hot or Cold?
Have you ever played the children’s game “Hot or Cold”? Perhaps it goes by a different name where you live. Well, I’ve been playing the game with officials at the state department of education for the last several weeks. Let me explain.
“Hot or Cold” was a game we often played during indoor recess at school when inclement weather kept us off the playground. The game starts when one child leaves the room while the remaining children pick an object visible in the room. Once the child returns to the classroom, he or she must walk around looking for the chosen object. The “searcher” is directed by chants from classmates of either “warm” or “cold,” indicating proximity to the mysterious object. If the searcher is far from the object, he or she is redirected with calls of “cold,” or “colder” if he or she continues in the wrong direction. As the searcher moves in the general direction of the item, classmates offer encouraging shouts of “warmer” until he or she is very close to it, at which point the class exclaims, “Hot!” Then the searcher guesses the identity of the object.
Of course, it would be easier if the class simply told the person where the object was and what it was. But that wouldn’t make for a very interesting game. However, when you are submitting your school district’s state-mandated Annual Professional Performance Review plan to beat a state-imposed deadline or lose state aid to your school district, it sure would help if the state just told you what it wanted instead of merely issuing periodic edicts analogous to the game’s directions of “cold, colder, warm, warmer.”
Even though the state posted the plan on its web portal in a fill-in-the-blank format, the parameters it established left enough room for districts to expend considerable time and effort (which is really another form of money) trying to hit a moving target. The local paper published an article in which a state official was quoted as expressing surprise at the wide variety and scope of plans submitted by districts: http://blog.timesunion.com/schools/ny-state-needs-you/1880/. Our first submission was rejected and returned with a written explanation of the items where we were deficient. In other words, we were “cold.” I returned to the task of creating a document that would satisfy the state’s requirements.
The exactness the state requires in these documents, which carry significant legal weight in governing the evaluation of teachers and principals, often means several revisions to the original submission. Each revision requires sign-off from the superintendent, the board of education president, and the executives of the teacher and principal unions.
The interpretations appeared to vary among the examiners. Interestingly, one of the requirements in the plan involves “inter-rater reliability.” Here’s the Wikipedia definition of the concept: “In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings given by judges. It is useful in refining the tools given to human judges, for example by determining if a particular scale is appropriate for measuring a particular variable. If various raters do not agree, either the scale is defective or the raters need to be re-trained.”
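For readers curious how agreement among raters is actually quantified, one common statistic is Cohen’s kappa, which corrects raw agreement for the agreement two raters would reach purely by chance. The sketch below is only illustrative; the examiner verdicts in it are hypothetical, not drawn from any actual state review:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement: probability the raters coincide if each
    # assigned labels independently at their own observed frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical examiners scoring the same ten plan submissions.
examiner_1 = ["approve", "reject", "approve", "approve", "reject",
              "approve", "reject", "approve", "approve", "approve"]
examiner_2 = ["approve", "reject", "reject", "approve", "reject",
              "approve", "approve", "approve", "reject", "approve"]
print(round(cohens_kappa(examiner_1, examiner_2), 2))  # → 0.35
```

A kappa of 1.0 means perfect agreement; values near zero mean the raters agree no more often than chance, which is roughly the experience the rest of this post describes.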
It is ironic that in the process of having our submission reviewed, we experienced differing analyses. The initial submission was returned with several deficiencies. For example, I listed the state-approved assessment as Measures of Academic Progress, when in fact the official name is Measures of Academic Progress for the Primary Grades. It did not matter that the shortened name of the test appeared on the same line as the designated grade of Kindergarten. Logic would ordinarily prevail: the evaluator could readily conclude, since the name of the test from the same vendor sat on the same line as a primary grade, that we had the right test but had failed to add the proviso “Primary Grades.” I could almost understand that. What baffled me, however, was the rejection of language in our plan that was extracted verbatim from the plans of districts that had already been approved by the state and posted on the state education department website as examples. The reviewer expressed empathy and acknowledged that he had heard the same refrain from other superintendents, but we nonetheless went on with the review of our plan during a phone conference. It wasn’t an issue of context; these were sentences placed in the same boxes of the plan as in other districts’ state-approved submissions. It again illustrated how the state lacked inter-rater reliability while demanding such a protocol from each district. After making the necessary adjustments, I sent in our second submission.
Soon thereafter, I was notified by phone by a different state examiner that despite the required adjustments, our plan remained deficient. This time, two of the four citations were for items that had been approved by the first examiner. I pointed out my concern about the apparent lack of inter-rater reliability among the examiners and was again met with an understanding and empathetic tone from the assessor. That aside, we discussed the tweaks needed to secure approval. Frustrated with the process, I stated that I did not appreciate that each very small deficiency required fresh signatures from the heads of stakeholder groups in the district. The examiner tried to assuage my apprehension by suggesting that I email him my revisions, outside of an official submission on the portal, and he would check them and let me know whether the changes would be acceptable. I was still “cold,” but likely moving in a “warmer” direction toward the educational Holy Grail. Hooray! Later that day I received an email indicating that he had reviewed the revisions and they looked good!
Ah, but it was not to be. Despite his informal approval, the second examiner phoned me again and explained that his supervisor had found some issues requiring attention. Again, I expressed my displeasure with the outcome and requested that we reach a point where new examiners are not introduced into the process to discover yet one more infinitesimal point to dispute. We arranged another phone call so he could discuss the issues with me. This time he was joined by a supervisor, who was patient as I echoed my earlier complaints about the review process. The changes were so tiny that they proved irritating: add four words to one statement, and delete the reference to the same statement on the rating scales included in the submission. It was like moving some coins from your left pocket to your right pocket. It’s very simple, but without any discernibly valid reason other than that the state says so.
Having made the corrections to the satisfaction of the officials, I can collect the required signatures AGAIN and re-submit, with hopes of an early Christmas present in the form of final approval so we can move forward to the actual implementation of the plan.