Republican state legislators are criticizing a recent change in how the state reports school test score data, saying that obscuring a small number of student results is masking failures at low-performing schools.
State education officials say their actions are intended to comply with a federal student privacy law and have vigorously disputed a news station’s allegations that they were intentionally hiding data about failing schools from the public.
In a flurry of letters to one another over the course of the past week, legislators and state officials have sparred over a very technical, statistical matter, but one that determines what gets publicly reported.
Here’s what you need to know to make sense of the dispute.
How did this conflict start?
The debate began in late January when a vendor working for the Maryland State Department of Education prematurely released Maryland Comprehensive Assessment Program (MCAP) data on the Maryland School Report Card website, where the state reports test scores. Several news outlets, including The Baltimore Banner, downloaded the data, analyzed it, and reported the data in news stories.
Fox 45 used the raw data to calculate the number of Baltimore City schools — 23 — where no student had passed the math test.
In airing its report naming the 23 schools, Fox 45 made public information that could be used to identify how individual students in those schools scored on the test, violating the Family Educational Rights and Privacy Act (FERPA). In other words, if a Fox viewer knew that a student attended one of those 23 schools, they would know that the student had failed.
The state took down the data from its website and made adjustments so that students couldn’t be identified as having failed the test.
“This isn’t something we did on the fly last month. This is something we have been working on for a number of years,” said Chandra Haislet, the education department’s head of the division of assessment, accountability and performance reporting.
How did the education department change the way it reported the data?
The department continued to report results as it has in the past for most schools. But if fewer than 5% of students in a grade had passed a test and fewer than 30 students had taken it, the department replaced those failing results with asterisks.
By making these changes, the department made it harder to piece together how an individual student scored on MCAP tests.
Let’s say your child is one of 15 fifth graders at a school in Maryland. Prior to the change, the school’s data in the Report Card would show that there were 15 fifth graders and fewer than 5% of them were proficient in math. But 5% of 15 is .75. So saying that fewer than 5% of students are proficient is the same as saying that no students in the class are proficient. Anyone who knows your child is a fifth grader at a specific school can figure out they aren’t proficient in math, and that violates the federal law that protects student privacy.
The state chose to raise the threshold for suppressing, or obscuring, test results when fewer than 5% of students are passing, from groups of 10 to groups of 30, which it contends better reflects a previous rule the department had for avoiding identification. “Our rationale for using 30 was really precedent. That is what Maryland has done historically,” Haislet said.
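The logic described above is simple enough to sketch. The following is an illustrative example, not the department’s actual code; the function name, parameters, and output labels are hypothetical, and only the two conditions stated in this article (a pass rate under 5% and a tested group smaller than 30) are taken from the source.

```python
def suppress_result(n_tested: int, n_passed: int,
                    min_group_size: int = 30, rate_floor: float = 0.05) -> str:
    """Return a reportable value for one school's grade-level test result.

    Mirrors the rule described above: if fewer than 5% of students passed
    AND fewer than 30 students took the test, the result is replaced with
    an asterisk, because "<5%" of so small a group can only mean zero.
    """
    pass_rate = n_passed / n_tested
    if pass_rate < rate_floor and n_tested < min_group_size:
        return "*"  # suppressed: would reveal individual students' failure
    if pass_rate < rate_floor:
        return "<5%"  # large group: a rate band no longer pinpoints anyone
    return f"{pass_rate:.0%}"

# The 15-student fifth grade from the example above: 5% of 15 is 0.75,
# so "<5%" would mean zero proficient students, and the cell is masked.
print(suppress_result(15, 0))   # suppressed
print(suppress_result(200, 4))  # reported as a band
```

Under the old 10-student threshold, the same 15-student class would have been reported as “<5%,” which is exactly the disclosure the change is meant to prevent.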
Why are some lawmakers and parents upset about the change?
In a letter, eight Republican delegates led by Del. Kathy Szeliga, who represents Baltimore County, argued that the newly revised reporting guidelines make the test score data useless, especially in low-performing schools where that data is critical for improving education. “Maximum transparency in student and school achievement data is essential for parents, schools, educators, and lawmakers to help improve learning. Hiding the data should not be the response to failing schools,” they wrote.
In an interview, Szeliga said she is particularly troubled by the department’s failure to report, in some cases, how many students took a test in a particular grade in a school. In addition, she said she objects to the state taking down test data from previous years and applying the new rules to mask information that has been public for years.
“Going back and changing historical data — that is crazy,” Szeliga said. It would be very difficult for anyone to identify a student who was in a particular school or grade five years ago, she said.
She said “there has been a complete head-in-the-sand response to failing schools in the city.”
But the city wasn’t the only school system that had schools where all, or nearly all, the students failed the math test. Nearly 300 schools in 20 of the state’s 24 school systems had fewer than 5% of students passing the MCAP math test, according to the state.
The legislators have asked the state to reverse course and go back to the original way MCAP data was reported.
A parent group, the Maryland Alliance of Parents and Students, issued a statement saying that the changes “should have been preemptively messaged to stakeholders. Addressing it after the fact creates doubt in the community and watchdog groups who rightly flagged this discrepancy.”
What do experts say about the changes?
Leroy Rooker, a FERPA expert and a senior fellow at the American Association of Collegiate Registrars and Admissions Officers, said the state was right to adjust the data.
“What they are saying is we are trying to mitigate the possibility of someone identifying students and the fact that they failed. That is what they should be doing,” he said. “From a FERPA perspective, that is not information you can put out there.”
Andrew Ho, a Harvard University psychometrician in the School of Education, said that the state “had a duty to maintain student privacy and monitor school performance.” Whenever either zero or 100% of students pass a test, then student privacy is at issue, he said, “because the public would know the proficiency status of any tested student in that group. So, I believe that some suppression of proficiency percentages is reasonable.”
Ho, whose research aims to improve the design and interpretation of test scores, would prefer that states report average scores instead of proficiency rates, but he said averages are technically more difficult to produce and harder to explain to the public.
In addition, Stephen G. Sireci, executive director of the Center for Educational Assessment at the University of Massachusetts at Amherst, said he too believes the state made the right call.
“It sounds like the state is doing everything they can to protect [students’] privacy. One thing I can say is that other states do have minimum threshold criteria of 30. So Maryland would not be alone,” said Sireci, who is on the state’s technical advisory panel.
Critics of the state are correct in saying that the changes in reporting test scores are most likely to mask the results at schools where more students failed.
How have public officials responded to the criticism?
In the past week, Fox 45 has been reporting that the state is intentionally hiding data about failing schools from the public and confronted the superintendent to try to get information. The department shot back with pages of retorts.
“These inquiries remain rooted in inaccurate information. To dispel inaccurate and misleading public reports,” the department said, it was releasing lengthy descriptions of the process it used and the reasons for doing so.
Asked by reporters about the test scores on Thursday, Gov. Wes Moore said he did not know about the state’s plans to redact certain results.
“The fact that there is not a level of coordination between the department and the governor’s office is something that needs to be fixed, something that needs to be addressed,” Moore, a Democrat, said.
“It’s an independent structure, and while we respect that, we also understand that we’ve got to do better when it comes to being able to present accountable, transparent results in our public education system,” Moore added.
State School Superintendent Mohammed Choudhury responded by forwarding emails to The Banner that showed the governor’s office had been notified on April 20 and April 25 about the changes and the statements the department was releasing to public officials and the press on the issue.
Pamela Wood contributed to this report.