Florida’s new standardized tests for students administered last year were fair.
Or were they?
An independent review of the Florida Standards Assessment, released Tuesday, has done little to quiet questions about whether the exams are valid.
The answers to those questions are critically important, since so much rides on these test scores — from student promotion to school grades.
State education officials say the new study proves the tests are an accurate way to measure student performance. Among the findings: the state followed best practices in creating its tests, and individual exam questions were error-free.
As a result, scores will be baked into state-issued grades for schools and teacher evaluations.
“I believe it is in the best interest of our students that we move forward based on the results of this year’s FSA,” said Florida Education Commissioner Pam Stewart.
But local education leaders point to the very same study to confirm concerns about the exams.
“...Superintendents stand firm behind their initial position that the results of the Florida Standards Assessment (FSA) cannot fairly be used in teacher evaluations or to calculate A-F grades for public schools,” John Ruis, president of the Florida Association of District School Superintendents, said in a statement.
Lawmakers ordered the analysis of the FSAs after a rough debut last school year. Technical woes and even a cyber attack prevented students from logging on to the computerized exams; others were booted off mid-test.
In May, the education department awarded an almost $600,000 contract to study whether the FSAs were fair. In 90 days, the companies Alpine Testing Solutions and edCount sorted through hundreds of documents and conducted interviews to come to their conclusions.
The main findings:
▪ Test scores for some students may be “suspect” because of the technical glitches, so the results shouldn’t be the only factor schools consider when making high-stakes decisions such as whether a student should be held back a grade, denied a high school diploma or placed in remedial classes.
▪ On the other hand, only a small percentage of students were impacted, so the test results can be used to issue schools grades and evaluate teachers. Between one and five percent of students taking each test on a computer were affected, the study estimates.
Perhaps the greatest concern superintendents have: the study’s finding that “many” questions on last year’s tests were not aligned with Florida’s education standards. The standards dictate what a student should learn in each grade level.
The questions were geared to what students in Utah are taught, though, because that’s where the questions used on Florida’s test were developed.
“This should be of grave concern to all of us,” Miami-Dade Superintendent Alberto Carvalho said. “Kids were taught Florida standards by Florida teachers in the state of Florida, but they were assessed with questions that fully matched Utah’s standards.”
Of the tests studied in the independent review, the exams with the highest stakes for students were found to be the least consistent with Florida standards. Only 65 percent of the questions used on the third and tenth grade English exams were found to match Florida’s education standards. Students typically have to pass the tests to be promoted to fourth grade or earn a high school diploma.
School leaders also point to the chaotic rollout of the test to question how fair it really was.
Computer glitches meant the testing environment was anything but standardized. Some students were able to see test questions before computer problems took over, and came back to finish the test another day. Having to administer the tests on computers also meant it took weeks for all students to sit for the exams. Both situations allow for students to share questions or try to look up answers, education officials say.
The study conceded that the administration of the FSA “did not meet the normal rigor and standardization expected with a high-stakes assessment program.”
It added: “scores for some students will be suspect.”
“This is hardly a blanket confirmation of FSA validity,” Bob Schaeffer, public education director for FairTest, a national organization that opposes what it calls the misuse of standardized testing, wrote in a statement.
Commissioner Stewart said in a Tuesday morning conference call with reporters that suspect student scores were thrown out by school districts.
But Carvalho pointed out that there’s no guarantee the invalidation will affect all schools equally. For example, if more scores of high-performing students were invalidated at one school, then throwing out the scores could actually have a negative impact once school grades are assigned.
“As I read that report, 21 times I read the words ‘appears’ or ‘likely,’ when speaking in terms of validating the reliability of the assessment,” Carvalho said. “Are we ready to say that the school grade etched on the school’s front wall ‘appears’ to be an A or is ‘likely’ a C?”
Miami-Dade leaders are especially worried about the state’s decision to move forward with assigning school grades because half of the testing data normally used to come up with the grade isn’t available this year. Typically, the formula counts a student’s learning gains — how well a student does from one year to the next.
Such comparisons can’t be made this year because students took completely different tests from one year to the next. Without student improvements included in grading formulas, districts like Miami-Dade — where more than 70,000 students are learning English as a second language and more than half of kids live in poverty — could be disproportionally impacted.
“What we’re asking for is a greater examination of the findings. We should not be sugar-coating the findings,” Carvalho said.