The rollout of Florida’s standardized tests was so hampered by glitches that students “shut down” after having their answers lost over and over again, and others may have accidentally gotten a peek at questions a day before they had to answer them.
A recently released survey of Florida school districts that use the controversial Florida Standards Assessments has provided a more detailed look into what it was like as districts tried to administer the computer-based tests in the spring.
In anonymous responses gathered as part of a testing postmortem ordered by the Legislature, district officials outlined issues with the computerized exams that ranged from the annoying to the alarming. One district said some students may have been clued in that they hadn’t supplied enough correct answers when the test refused to let them move on until they added another response. Another district worried there might be lawsuits from parents. And yet another resorted to capital letters and exclamation points to register the full extent of problems that cropped up:
“With SO many error messages and issues, frustration and stress levels were through the roof!”
Florida Education Commissioner Pam Stewart has touted the results of the study as “welcome news,” highlighting the overall conclusion that the exams were found to be an accurate way to measure what students have learned. The study found that only between 1 and 5 percent of students who took each test were affected by computer glitches. With that blessing, Florida plans to use results from the FSAs to assign schools a letter grade and evaluate teachers — which could affect their pay.
But the assurances from the state have done little to smooth over concerns from school districts across the state, which say the chaotic environment surrounding the testing can have immeasurable effects on how students performed.
“It’s almost asking people to suspend their personal experience for a moment and just trust us that it went well, when people know for a fact that it didn’t,” Andrea Messina, executive director of the Florida School Boards Association, told the Miami Herald.
Miami-Dade and Broward leaders have both said the tests should not be used to issue letter grades or evaluate teachers — a contention that has been echoed by the state’s superintendents association.
“What you saw was just inconsistency,” said Sally Shay, a district director of assessment for Miami-Dade schools. “One district may have done one thing, one school may have done one thing.”
Survey results show a majority of districts felt they had enough time and information to launch the new tests, but others describe a mad dash to keep up with changing requirements. Districts wrote there was “great confusion” because information was often given at the “last minute.” One district wrote they were “essentially flying blind” since they didn’t even know what computer screens would look like on testing day.
Mere weeks before the tests, school districts were told that text-to-speech accommodations for children with special needs wouldn’t be available and districts scrambled to find adults who could read prompts aloud during testing.
The report notes that in some schools, vague instructions meant entire grade levels had their math scores thrown out because students were allowed to use calculators when they shouldn’t have, or used calculators that weren't allowed. One district reported that new drag-and-drop features didn’t work and another said students weren’t able to review their work.
And on questions where students were instructed to “‘check all that apply,’ if a student selected only one option and two options were correct, the system would not let the student continue to the next item,” according to notes from one focus group.
“This cued the student to select another option,” the report said.
In an email, Florida Department of Education spokeswoman Cheryl Etters called the issues “isolated.” She chalked up the multiple-choice issue to students being confused, but said it wasn’t a “systemic issue” with the test. All problematic questions were removed from student scores after a customary post-test review that happens after exams are given, Etters wrote. And the follow-up study, conducted by the companies Alpine Testing Solutions and edCount, found that all the remaining questions were fair.
“If any test item is found to be flawed, it is not used to calculate student scores,” she wrote.
Survey results show the most frequent bug that schools had to contend with was unexpected computer crashes. Over and over again, students lost work in cyberspace when they were booted off the test. Sometimes their answers were recovered, sometimes they weren’t. Though Etters said preliminary data shows all test responses were captured, the problem knocked confidence in the exams.
“No matter if you lose one or a million responses, that is a major impact... the trust in the system to capture responses is gone,” one school district wrote.
Many students also lost work because, the districts said, the save function didn’t work the way students were told it did. Another issue: students were able to continue typing when the connection was lost, but then the text would disappear once the connection was reestablished.
“There are students who just shut down after the third or fourth time and said, ‘I’m not writing this again,’” one school district wrote. “They would type in a few words and submit and say they were good to go. I cannot imagine any of these tests can be considered reliable.”
Because students had to keep coming back to finish the tests, some essentially got multiple attempts to take them, according to some survey responses.
“This definitely gives an unfair advantage to students because they have more time to think about how to answer than their peers who took the test in one sitting,” one district wrote.
Test questions were also exposed because students were allowed to access parts of the exam before they were required to take it, according to the responses. Many districts reported that it was unclear when a student had finished one portion of an exam, so the students moved on accidentally.
Once the problem was discovered, some students were forced to take two days of tests in one sitting. Other times, students were allowed to return another day to answer questions — creating “havoc” because it was difficult to reschedule test sessions and labor-intensive to log students in and out of the exams, school districts wrote.
The result, according to one district, was a major departure from the usual testing protocol.
“...We have never allowed students to come back to a test on a different day to finish, and that had to be done a countless number of times due to the problems encountered,” a response said.
Regardless of the study’s overall findings, school boards and superintendents remain unconvinced it’s fair to make high-stakes decisions based on a test that few trust.
“These were real problems that existed and because of the nature of the circumstances under which they emerged, it would be difficult for us to determine the magnitude,” Broward County Superintendent Robert Runcie said. “It’s always going to leave a cloud around this test.”