Preliminary results for Florida’s controversial new standardized tests are in, but education leaders who don’t trust the exams questioned the usefulness of the scores.
Student performance in South Florida was more or less “what you’d expect,” said Gisela Feild, Miami-Dade’s administrative director of assessment.
For starters, the initial results released Wednesday by the Florida Department of Education include no method for comparing the supposedly harder tests to past years. The scores for each county were broken only into quartiles, ranking students statewide from lowest to highest. If a district's students mirrored the state as a whole, about 25 percent of them would fall into each category.
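That comparison can be sketched in a few lines of code. The numbers below are hypothetical, not actual district figures; the point is only that a district's share in each statewide quartile band is judged against the roughly 25 percent baseline.

```python
# Illustrative sketch with hypothetical numbers: statewide quartiles split all
# students into four 25% bands, so a district that mirrors the state would
# place about 25% of its students in each band.
def band_deviation(district_shares, baseline=25.0):
    """Return each band's deviation (in percentage points) from the ~25% baseline."""
    return {band: round(share - baseline, 1)
            for band, share in district_shares.items()}

# Hypothetical district: 36% of its students in the lowest statewide band
# (11 points above baseline) would suggest below-average performance.
shares = {"lowest": 36.0, "second": 26.0, "third": 21.0, "highest": 17.0}
print(band_deviation(shares))
# {'lowest': 11.0, 'second': 1.0, 'third': -4.0, 'highest': -8.0}
```

This is the same reading applied in the article: a district with well over 25 percent of students in the lowest band did worse than the statewide norm on that test.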
By that measure, Miami-Dade students scored slightly lower than expected on most of the five tests. In neighboring Broward, performance also was slightly below average — except for algebra exams, on which students did a few percentage points better.
One area in which Miami-Dade students performed considerably below average was the algebra 2 exam, with 36 percent landing in the bottom ranking. By comparison, only 22 percent of Broward students were in the lowest category for that test.
Feild said it’s hard to know why students struggled more on the algebra 2 exam, since the tests are new and school districts haven’t even been allowed to look at the questions. The algebra tests, along with those in other subjects, were given on computers and included “technology-enhanced” questions that require students to do things like drag and drop items with a mouse.
“We don’t know in terms of the rigor, or the content, or if the technology-enhanced items were more difficult,” she said. “I don’t know, because I don’t know what the tests looked like.”
In Broward, students did the worst on the geometry exam, with 27 percent scoring at the bottom.
What was evident from the data: There remain huge differences in student performance among schools in Miami-Dade. That was hardly a surprise. Typically, studies have shown, test results largely correlate with poverty levels. For example, 92 percent of students at the mostly affluent Pinecrest Academy charter school performed better than their peers statewide on the math tests. By contrast, only 17 percent of students at Brownsville Middle, in one of Miami’s poorest neighborhoods, bested statewide scores.
The percentages the state released are not scores that determine whether students pass or fail. They show only how each district’s students fared relative to other students across the state.
It also has not yet been determined how many students passed or failed, because the state has yet to finalize cut-off scores. Florida Education Commissioner Pam Stewart released her “cut score” recommendations this week, giving the Legislature 90 days to review them.
Under those recommendations, for example, slightly more than half the state’s third-graders would likely pass the test, part of a goal of “raising the bar” on student performance. The State Board of Education is expected to formally adopt cut-off scores in January.
The FSA results carry massive weight in Florida’s education system, which has made them a lightning rod for criticism. They are used, in part, to evaluate teacher performance, set pay and determine school grades for the 2014-15 school year. Teachers unions, the state PTA and the Florida Association of District School Superintendents have all protested rushing the new tests into use as measuring sticks.
Because the exams were given for the first time this year and were plagued by widespread technical glitches, Miami-Dade Superintendent Alberto Carvalho on Wednesday reiterated the position of critics: that the scores should be considered only as “baseline information to be used in future performance comparisons.”
“Conclusions beyond that would probably be inadequate,” he said in a statement.
Both the state PTA and the superintendents association recently declared they have “lost confidence” in the exams and have urged the state not to issue school grades this year.
Among the top complaints: the quick rollout of the tests was confusing and stressful for districts; the exams were developed using questions written for another state with different education standards; and the administration of the computerized exams was marred by massive technical glitches.
Despite the hiccups, an independent study released this month found the FSA results could still be used for group-level evaluations, such as school grades. The Department of Education has relied on that finding to explain why it’s forging ahead amid a new wave of criticism.
On Wednesday, the superintendents association reiterated its position with a new statement.
“The overall purpose of the accountability system should be to improve student performance and inform instruction. However, our current accountability system is based on a flawed and incomplete process that ultimately yields flawed and incomplete data upon which high stakes decisions are being based for students, schools and our communities,” the statement read.
The formula for calculating school grades is still being worked out, Department of Education spokeswoman Meghan Collins said.
“It’s too soon to say precisely how much of an impact [the FSA will have] or how we’ll be determining the school grades,” but there will be opportunities for public input, she said.