Revision of State Test Scores Leaves Some Puzzled

John Mooney, Education writer | September 16, 2010 | Education
Few complaints after tweaked tallies lift pass rates, but plenty of questions

The return of student test scores to districts is an important rite of summer for New Jersey public schools. They’re the numbers that help place students in specific classes, measure schools against federal standards, and could someday be used to evaluate and even grant tenure to teachers and principals.

But this summer’s release itself provided drama, as the state, without explanation, pulled back the elementary school language arts scores delivered in July and released new, slightly higher ones in the waning days of August.

In the end, the difference may not have been that great for most students — a point or two on the scale — but state officials said the adjustments did raise the statewide pass rate for these tests by up to 5 percentage points across New Jersey.

What exactly those new pass rates are remains a mystery, and some local districts still reported considerable declines. State officials said the scores will not be released publicly for a few more weeks at least.

‘Psychometrics’ and ‘Anchor Items’

But even in higher performing districts, it wasn’t exactly a confidence booster for a state testing system that drives state and local policy — and that costs close to $27 million a year through an independent contractor.

The state’s new director of assessment, Jeffrey Hauger, said the adjustments were due to a technicality in how the contractor, Measurement Inc., equated the language arts section of the NJASK test for grades 3 to 5 across multiple years, so as to keep the scoring consistent for comparisons.

Psychometrics — the science of educational and psychological testing — is arcane stuff, but in short, according to Hauger, Measurement Inc. chose not to include a certain category of test questions as so-called “anchor items” across years, skewing the final numbers. It was caught by outside reviewers, as well as through some unexpected drops in scores, he said.
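The article does not spell out which equating method Measurement Inc. used, but the general idea behind anchor items can be illustrated with a common linear (mean/sigma) equating approach. In the hypothetical sketch below, scores on the anchor items shared across two administrations are used to map the new year's scores onto the old year's scale; excluding a category of anchor items would change these means and standard deviations, and with them every equated score.

```python
# Illustrative sketch only: the NJASK's actual equating procedure is not
# described in the article. This shows generic mean/sigma linear equating,
# where "anchor items" common to both years link the two score scales.
from statistics import mean, stdev

def linear_equating_constants(anchor_old, anchor_new):
    """Return (slope, intercept) mapping new-year scores to the old scale.

    anchor_old / anchor_new: examinee scores on the shared anchor items
    in the reference (old) and current (new) administrations.
    Equated score = slope * new_score + intercept.
    """
    slope = stdev(anchor_old) / stdev(anchor_new)
    intercept = mean(anchor_old) - slope * mean(anchor_new)
    return slope, intercept

# Hypothetical data: anchor scores run one point lower in the new year,
# so equating nudges new-year scores up by a point to keep scales comparable.
old_anchor = [12, 14, 15, 16, 18, 20]
new_anchor = [11, 13, 14, 15, 17, 19]
slope, intercept = linear_equating_constants(old_anchor, new_anchor)
print(round(slope * 15 + intercept, 2))  # a raw 15 on the new form → 16.0
```

Because every equated score flows through these constants, dropping a block of anchor items from the calculation shifts the whole scale, which is consistent with the point-or-two adjustments the state described.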

Hauger, a former assessment director in Moorestown schools, conceded the late changes likely didn’t help districts through what is already a high-pressure situation of analyzing and acting on the results.

“I recognize it’s a big deal because of the timing where reports are released and then needed to be re-released,” he said. “It’s nice to have this information in a timely manner for the placement of students. But I also would rather have information that is correct.”

Few Complaints, But Unease Persists

Districts weren’t complaining much, state officials said, since the changes mostly boosted their pass rates. But with the importance of test scores only rising nationally and statewide, the episode left some uneasy.

“The change for us wasn’t that cataclysmic, but it did require us to go back and see that everything was all right,” said Robert Gratz, superintendent of Hackettstown schools.

A handful of Hackettstown students’ placements needed to be changed. “Not that many, but we still needed to see that everything was in order,” he said.

“I realize this isn’t an exact science, yet we’ll base a student’s placement or a staff person’s professional standing on these numbers,” he said. “Sometimes it makes you ask, ‘Is it really the student or is it the test?’”

Occasional anomalies in New Jersey test scores are nothing new, and deliberate changes in scoring have produced even bigger swings in passing rates over the last 20 years. An adjustment two years ago in the so-called “cut scores” — the score needed for passing — led to a big drop in passing rates.

“This is the thing that drives people crazy,” said Christopher Tienken, an assistant professor at Seton Hall University’s College of Education and Human Services, who has studied state assessments nationally.

“To have something like this occur, and this isn’t the first time, it shows that these things are anything but stable,” said Tienken, a former assistant superintendent in Monroe schools.

Tienken said he has seen it time and time again in other states as well, and he repeated his caution to principals about relying too much on the numbers.

“Administrators should not be using them to make high-stakes decisions about kids,” he said. “It’s just bad practice.”