Opinion: PARCC -- More Accurate than ASK, and the Numbers Prove It

The real problem with pre-PARCC student assessments is that ASK scores trended high, giving us an inflated sense of student proficiency

Laura Waters

Last year some New Jerseyans were outraged by the implementation of PARCC, a new set of assessments that gauges student proficiency to meet state standards in language arts and math. Now that test scores are in and vitriol is down, what can we glean about the contextual value of PARCC?

As a recap, remember that last year’s arguments against PARCC testing had far less to do with the purported accuracy of new tools to measure student learning than with concerns about a new evaluation system that links student growth to teacher and administrator job security.

For better or worse, a compromise between state legislators and union lobbyists has diminished that linkage. PARCC was not “high stakes” for students, and now it’s not high stakes for educators.

This elimination of a major anti-testing talking point gives us an opportunity to strip away the political arguments against PARCC and extract some preliminary comparisons among three different ways we’ve gauged student academic growth.

New Jersey has administered a variety of annual standardized tests since 1975, when the Legislature passed the Public School Education Act “to provide to all children of New Jersey, regardless of socioeconomic status or geographic location, the educational opportunity which will prepare them to function politically, economically and socially in a democratic society.”

The most recent New Jersey pre-PARCC assessments were called ASK and HSPA. A sampling of students also takes proficiency tests administered by the National Center for Education Statistics. These NAEP tests, often referred to as the “gold standard of assessments,” evaluate a group of randomly selected students in fourth, eighth, and 12th grades. (NAEP’s sampling methodology precludes school-to-school and district-to-district comparisons and, hence, no one who understands this suggests that NAEP is a substitute for annual statewide testing.)

And everyone, from national union leaders to anti-test crusaders, loves NAEP. For example, here’s October’s press release from the NJEA: “NJEA President Wendell Steinhauer lauded the results. ‘This year’s NAEP results are yet another data point supporting what we have known for a long time. New Jersey’s public schools are among the very best in the nation.’”

If NAEP is the true yardstick of student proficiency, then we should see some preliminary correspondence between NAEP and PARCC scores. And if New Jersey’s old ASK tests reasonably reflected student learning, then we’d see some correlation with those aureate NAEP assessments as well.

While there are differences among the subjects and grades tested, as well as scoring metrics, all three assessments measure student mastery of state content standards in language arts and math. Let’s, then, look at comparisons in fourth-grade and eighth-grade proficiency levels for math and reading for ASK (last given in 2014), NAEP, and PARCC:

  • In 2014 59.8 percent of fourth-graders scored at or above proficiency in language arts on ASK tests.

  • In 2015 42 percent of fourth-graders scored at or above proficiency in language arts on NAEP tests.

  • In 2015 52 percent of fourth-graders scored at or above proficiency in language arts on PARCC tests.

  • In 2014 74.6 percent of fourth-graders scored at or above proficiency in math on ASK tests.

  • In 2015 47 percent of fourth-graders scored at or above proficiency in math on NAEP tests.

  • In 2015 40 percent of fourth-graders scored at or above proficiency in math on PARCC tests.

  • In 2014 79.4 percent of eighth-graders scored at or above proficiency in language arts on ASK tests.

  • In 2015 41 percent of eighth-graders scored at or above proficiency in language arts on NAEP tests.

  • In 2015 53 percent of eighth-graders scored at or above proficiency in language arts on PARCC tests.

  • In 2014 71.3 percent of eighth-graders scored at or above proficiency in math on ASK tests.

  • In 2015 46 percent of eighth-graders scored at or above proficiency in math on NAEP tests.

  • In 2015 24 percent of eighth-graders scored at or above proficiency in math on PARCC tests.

(Note from the New Jersey Department of Education: “PARCC Math 8 outcomes are not representative of grade 8 performance as a whole” because 30,000 students took the Algebra 1 test in seventh grade.)

With the exception of eighth-grade math, then, NAEP and PARCC scores are broadly similar: fourth-graders found PARCC a little easier than NAEP in language arts (52 percent proficient versus 42 percent) and a little harder in math (40 percent versus 47 percent). Eighth-graders likewise found PARCC easier in language arts (53 percent versus 41 percent) but harder in math, though that math comparison is skewed for the reason the Department of Education notes above.

But ASK scores were significantly higher, artificially inflating our perceptions of statewide student proficiency.

This hard truth, perhaps, is why New Jersey Teacher of the Year Maryann Woods-Murphy recently described “a near unanimous agreement” among teachers of the year that “consortia assessments -- PARCC and Smarter Balanced -- do a better job of measuring student understanding, based on what they need to know to become ready for college and careers, and that they better reflect what teachers are teaching to meet higher academic expectations.”

Parents have every right to challenge student assessments. Teacher union leaders and anti-test activists have every right to resent linking student outcomes to teacher evaluations. But we owe families an honest and accurate measurement of their children’s progress towards academic goals and readiness for life after high school.

Right now, that tool is PARCC.

Laura Waters writes about education politics and policy for NJ Spotlight and other publications. She also blogs at NJ Left Behind and has been a school board member in Lawrence Township (Mercer County) for 10 years.
