Technology

After twin Florida tragedies, online trolls and bots pounce

Students are evacuated by police from Marjory Stoneman Douglas High School in Parkland after a shooter opened fire on the campus. Online trolls and bots, many of which experts have tied to Russia’s campaign to spread disinformation on American social media, are capitalizing on breaking news such as the shooting. Sun Sentinel

A proliferating army of online bots and trolls has homed in on two South Florida targets: the FIU bridge collapse and the Parkland school shooting. And some of the malicious accounts may have Russian ties.

Within hours of the Parkland shooting, hundreds of online bots were spewing false information about alleged shooter Nikolas Cruz, according to Wired and other news outlets. Their focus was gun control — one of the hottest issues in the U.S.

Now, the trolls are spinning the bridge collapse into a divisive narrative about gender and immigration.

Less than 24 hours after the FIU tragedy, an article appeared on a website called Squawker, which bills itself as a “platform for conservatives of all schools of thought.” Its misleading headline: “A Female-Led Construction Company Built The Florida Bridge That Collapsed.” The article was accompanied by a photo montage of women wearing uniforms for MCM Construction, the project’s lead contracting firm.

“There are some things that women shouldn’t do. One of these is construction,” stated the article, written by Alisha Sherron. Sherron’s personal Twitter account states she “write(s) controversial things on the internet for a living.”

As the Miami Herald has reported, MCM is actually run by five brothers.

But the notion that the bridge collapsed because the company behind it was led by women was almost immediately picked up by conservative troll accounts. A “parody” account, @DrProctorMusic2, which openly espouses white supremacist views, turned Squawker’s article into a tweet that also singled out Leonor Flores, an MCM employee, as a “Mexican feminist” who had worked on the project. The claim was false; Flores was not involved in the project.

“Import the third world, become the third world,” DrProctor tweeted, using hashtags including #miamibridgecollapse and #maga (short for Make America Great Again).

The tweet received 26 retweets, 39 likes, and 11 replies. From that point on, tweets about Flores multiplied. By 11 a.m. on March 16, the day after the collapse, FIU had added a note to its bridge update stating that Flores did not work on the project in any capacity. The fact-checking website Snopes.com also took up the task of debunking the Squawker report.

The accounts spreading this misinformation “are not trying to focus on one candidate or one policy win,” said Karen North, director of the Digital Social Media program at the University of Southern California’s Annenberg School for Communication and Journalism. “It’s really an active attempt to create discord and discontent in American society. They want people to disagree and be more polarized.”

When a Twitter user looks at breaking news events or searches for news using hashtags, they see tweets from all across Twitter, not just from accounts they follow. That exposes them to false or inflammatory reports as well as true ones, said Ash Bhat, a University of California, Berkeley student who runs RoBhat Labs, a project that tracks unusual online behavior.

A vigil was held at FIU for the victims of the collapsed pedestrian bridge on March 21, 2018.

RoBhat Labs ran a scan for the Miami Herald on March 21 of 1,500 Twitter accounts the project classifies as political propaganda bots. The scan showed that about one in seven tweets using the hashtags #fiubridge or #fiubridgecollapse still exhibited “bot behavior” more than a week after the collapse. Such accounts may have been tweeting every few minutes around the clock, a pace unlikely for a human, or promoting polarizing political propaganda and fake news.

“By bots and trolls tweeting about these events, their content gets wider exposure,” Bhat said.

He said he does not know whether the suspicious accounts his project tracks have links to Russia.

But another project, Hamilton68, set up by the nonprofit Alliance for Securing Democracy in conjunction with the German Marshall Fund, a public policy research group, specifically tracks Russia-linked troll activity online. Within hours of the Parkland shooting, Hamilton68 noted that Twitter accounts with suspected links to Russia “released hundreds of posts taking up the gun control debate,” according to the New York Times.

Using the hashtag #parklandshooting, the bots began posting false reports, including that suspected shooter Nikolas Cruz had searched for Arabic phrases on Google, the Times said.

An echo-chamber effect immediately kicked in. Pamela Geller, a far-right commentator, published a false report claiming the shooter had been motivated by Islam and left-wing hate. Her story was then tweeted out numerous times. The story remains live on her website, though it now carries multiple updates disavowing claims she had previously posted.

In 2016, two known Russian Internet troll accounts were found to have retweeted a post from Geller about a false story she has since removed from her site, according to the American Jewish news site Forward.com. There is no evidence that Geller has actively collaborated with Russia’s efforts.

Another false rumor — that Cruz was undocumented — also received wide exposure on social media, according to Snopes.

Last month, Twitter suspended thousands of accounts it deemed suspicious after special counsel Robert Mueller unsealed an indictment against the Russia-based Internet Research Agency, accusing it of engaging in “operations to interfere with the elections and political processes” of the U.S.

But the Hamilton68 project shows that many malicious accounts remain active.

“Most (of these accounts) are operated by real users, some of whom are paid (Russian) government trolls, some with less clear connections to the Russian government, and some who are real users who heavily promote pro-Kremlin propaganda,” Hamilton68 spokeswoman Kelsey Glover said in an email.
