It's not often you'll find a teachers union critic (and I am one) saying, "UFT president Michael Mulgrew is right." But Mulgrew is.
Last week, a state appellate court ruled that New York City can release reports revealing data on student achievement gains on a teacher-by-teacher basis, with teachers' names attached. Mulgrew was right to term this a bad decision and announce that the UFT will appeal the ruling.
The ratings in question cover about 12,000 fourth- through eighth-grade teachers whose kids have taken the state reading and math assessments. Applying a statistical model that incorporates a variety of factors, including student absenteeism, race, class size and so forth, the Education Department determines a teacher's "value-added" score. Using these numbers, teachers across the system could theoretically be ranked, from first to last, based on student gains on the reading and math tests.
Several media organizations (including the Daily News) had sued for access to the individual teacher data. The appellate court ruled for the media outfits, explaining its decision as follows:
"Balancing the privacy interests at stake against the public interest in disclosure of the information ... we conclude that the requested reports should be disclosed. Indeed, the reports concern information of a type that is of compelling interest to the public, namely, the proficiency of public employees in the performance of their job duties."
The court got it wrong. These data, released in this fashion, do not serve a compelling public interest.
The dispute is a replay of one that transpired on the other coast a year ago. Last August, the Los Angeles Times unveiled a dramatic analysis, contracting with researchers to conduct a value-added study of thousands of L.A. teachers. The stories, and the elaborate data set accompanying them, yielded a throwdown marked by questions of accuracy, missing data, statistical reliability and the rest.
But in that case, the L.A. Times could plausibly make the case that it was pushing an inert district to start paying attention to value-added data. Plus, the newspaper got ahold of the information; it wasn't affirmatively released by school officials. In New York City, where the district has been itching to use the data and no such shock therapy is necessary, the courts are about to undermine that effort.
Don't get me wrong. Student achievement should be incorporated into teacher evaluation and compensation, and transparency is a vital tool for recognizing excellence and shaming mediocrity. But a public data release is the wrong way to get there.
First, at the most technical level, there are enormous questions about the "right" way to construct a value-added model, and teacher evaluations can move markedly depending on the decisions that are made.
Second, in the substantial number of cases where students receive considerable pull-out instruction (working, for instance, with a designated reading instructor), value-added calculations aren't going to effectively isolate the impact of a particular classroom teacher. Her results might be pulled down by inept colleagues, or lousy teachers might wind up looking better than they are.
Third, there's a profound failure to recognize the difference between responsible management and this sort of public transparency. It's fair for taxpayers to want to know exactly how their money is spent and to expect leaders to report on organizational performance. It typically doesn't make sense, however, for the public to get the number of citations each cop in the NYPD issues or all the performance reviews a National Guardsman was given by his commanding officer.
Why? Because we recognize that these data are imperfect, limited measures and that using them sensibly requires judgment. Sensible judgment becomes much more difficult when decisions are made in the glare of the media spotlight.
Finally, while it makes good sense to incorporate these data into well-designed performance evaluations, teachers have some justification for feeling like there's been a bait-and-switch. In 2008, then-Chancellor Joel Klein wrote a letter to the city's teachers, assuring that the new Teacher Data Initiative was a "tool to help teachers learn about their own strengths and opportunities for development" and that data reports were "not to be used for evaluation purposes."
In 2010, Klein shifted his stance, writing in an op-ed, "We believe that the public has a right to [individual Teacher Data Reports, with teacher names attached] under the Freedom of Information Law." You can hardly blame teachers for feeling sucker-punched.
Where Chancellor Dennis Walcott stands on all this is not yet clear. He would do well to make known that these data are a sensitive professional evaluation and performance tool, not fodder for the front page of the city's newspapers.
Frederick M. Hess is director of education policy studies at AEI.