Every Bergen County school administrator we spoke with cringes at the school rankings presented to the public. Many publications rank schools, but none of the rankings are really about education; they never focus on the big educational picture.
School rankings have become an important measurement for the public. Real estate firms use them. Community members use them to evaluate the return on their property taxes. They sell magazines. People depend on them, and they are very likely here to stay.
Teachers and school leaders are committed to the whole child. Beyond academic proficiency, every student has factors and needs that cannot be ranked, and in the context of those individual needs, ranking schools is meaningless. Education is about creative and critical thinking; it is often about individual solutions to individual educational needs and situations, and it cannot be forced into a one-size-fits-all evaluation. The public should not treat school ranking data as a definitive measure of quality when comparing schools.
Newsweek bases its rankings on how well a district prepares students for college, but college isn’t for everybody. A really good school prepares students for whatever life course they choose. U.S. News & World Report says it recognizes that not all students are college bound; yet to receive a gold or silver medal, or even a numerical ranking, a district must score at a certain level on the magazine’s “College Readiness Index.” Workplace readiness is just as critical an indicator.
Some question the reliability of the data. One New Jersey district found that, from one year to the next, its high school had gone from a gold medal listing to no listing at all. When an administrator inquired, he was told, “U.S. News is currently experiencing some problems with the site, which is causing the discrepancy in the rankings.” He checked the site regularly; the rankings never changed.
There are also questionable criteria. Does a school day that is 10 minutes longer mean a school is serving its students better? These and many other concerns suggest that magazines’ school rankings don’t tell the true story.
It is important that we offer constructive alternatives to support the challenge we are making. An existing system, if implemented accurately, would meet the need, and building on the existing work of others makes change an achievable goal.
The New Jersey Department of Education (NJDOE) has released School Performance Reports, the “new and improved” New Jersey School Report Card. The solution we offer builds on these reports.
The intent of the Performance Reports is to inform parents about how well the public schools are educating their children. Parents and schools must be able to understand what is being presented, and they must have confidence in the report’s accuracy. The first release accomplished neither: concern was expressed about the results and the metrics used, the report was difficult to interpret, and communication broke down when errors were first noted.
Many school administrators continue to have concerns about accuracy. Some found the forms didn’t allow for full reporting of a district’s programs. The NJDOE purchased testing information from the College Board, but the one-year statistics (10th grade) it received didn’t account for students who took the PSAT exam in 9th grade.
Traditional “District Factor Groups” (DFG) are no more; “Peer Groups” have taken their place, and group members can come from other counties. At one Bergen County district, the high school’s peer group includes schools from twelve different counties; the middle school’s peer group also spans twelve counties, different from those at the high school. This has caused confusion, because most parents and taxpayers want to compare their schools to neighboring towns.
The School Performance Reports offer a great deal of information. Each covers elementary and middle schools as well as high schools, and Peer Groups do provide a school-by-school comparison rather than the district-by-district comparison of the DFG. A very broad framework, developed by very able people, is already in place; the investment has already been made. To maximize that investment, the shortcomings of the first release must be addressed, and communication between districts and the NJDOE has to be regular and reliable.
When statistical reporting became digital, we traded away individuality, flexibility, and the reliability of information in return for small cost savings. It’s time to reconsider. In the not-too-distant past, people did the information gathering; the U.S. Census Bureau comes to mind. Interviewers made the phone calls and went into towns armed with preset questions and a section for comments. We suggest the NJDOE take a similar approach. This would allow for verification of data: information used in a School Performance Report would be approved by the district’s administration, and the special circumstances and necessary variations that exist in every school and district would be explained in an understandable way. Confidence in the reliability of the report could then be built.
This would give us a ranking system we can all live with, and the NJDOE could minimize the negative impact of the rankings magazines provide. Parents will know how their schools measure up, home buyers will know where to buy, and industries will know where the most educated workforce resides.