Teachers, Administrators Give Mixed Reviews to New Evaluations after Test Run
But survey of educators who took part in tryout of new system shows most not worried about impact on jobs, tenure
For all the debate going on outside classroom walls, New Jersey schoolteachers who have actually been through the new state-mandated evaluation system have not found it to be as nerve-wracking as is widely assumed.
In a survey conducted by a team of Rutgers researchers, teachers and administrators who took part in the two-year pilot rollout of the evaluation system had mixed reactions to the new rules and the potential consequences for their careers.
On one hand, there was a wide range of opinion regarding whether the system was entirely fair and accurate, with administrators expressing much more faith than teachers -- by a more than 2-to-1 margin.
On the other hand, three-quarters of teachers surveyed by the Rutgers team said they were not worried that the new evaluations -- including those newly tied to student performance -- would have a negative impact on their tenure protections.
Even among teachers working to attain tenure, a majority said the new metrics would have little impact on their job security, or might actually help more than hurt their chances of keeping their jobs. There were some pockets of anxiety over job security, to be sure, but the Rutgers researchers said it was not widespread -- at least not yet.
“There has been a lot of concern about the effects of the new evaluation procedures and changes in the tenure law: Would lots of teachers get fired?” said William Firestone, the lead researcher and professor at Rutgers Graduate School of Education.
“In our interviews, teachers in some districts were very anxious about new procedures that were ambiguous and therefore unsettling,” Firestone said in an email. “However, our survey data suggest that at the end of the day, most teachers are not really worried about losing tenure, and … the (pre-tenure) teachers think the changes will probably help them get tenure.”
The teachers were drawn from more than two dozen districts – from Alexandria Township in exurban Hunterdon County to urban Elizabeth -- that took part in the two-year pilot from 2011-13, leading up to the statewide adoption of the system this school year to meet the requirements of the state’s new teacher-tenure law, TEACHNJ.
The pilot was touted by the Christie administration as a crucial step in the development of the new evaluation system, which requires districts to follow approved models for observing teachers in the classroom and to use student growth measures for up to one-third of the teacher rating.
About 39 percent of the teachers in the pilot districts responded to the survey, the results of which were included in the final report, released last week, on Rutgers University’s evaluation of the pilot.
The research cited a number of concerns and recommendations, including the following:
One-third of surveyed teachers agreed that the evaluations were helpful in improving instruction, while another one-third said they were not. The rest of those surveyed were neutral.
More than one-third of the teachers -- more than 40 percent, in some cases -- said they found their specific student performance data, including test scores, helpful in improving instruction.
A majority of administrators said they had at least some trouble with the new evaluations’ reliance on computerized data-management systems. “Only about a quarter of respondents reported rarely encountering any significant technical problems,” the report read. “However, anywhere from 5% to almost half the respondents might encounter problems, with items like short log-in periods, problems saving information, and crashing the tool being mentioned frequently.”
Michael Yaple, spokesman for the state Department of Education, said the final report provided useful feedback and information as the department prepares to implement the new system.
“The Rutgers report – as well as the input from educators in the field, focus groups, and our statewide advisory group – all give valuable information that can help guide our efforts,” he said. “It’s another tool that gives us insight into the challenges and strengths of the program.”