The Election Is Over, But Russian Disinformation Hasn’t Gone Away

Sen. Amy Klobuchar (D-Minn.) speaks next to a poster depicting an online ad that attempted to suppress voters during a hearing about Russian election activity on Oct. 31, 2017. (AP Photo/Andrew Harnik)

November 1, 2017

The internet was, once again, blowing up.

It was late September, and President Donald Trump had recently spoken out against NFL players who refused to stand during the national anthem, calling them unpatriotic and saying they should lose their jobs for taking a knee.

“Wouldn’t you love to see one of these NFL owners, when somebody disrespects our flag, to say, ‘Get that son of a bitch off the field right now’?” Trump said during a rally in Alabama.

For weeks, buzz over the anthem controversy continued on Facebook and Twitter. Americans on both sides of the issue stormed social media, hurling hashtags from #NewNFL to #takeaknee.

Yet Americans weren’t the only ones chiming in on the debate. Many of the posts, experts say, were coming from Russian internet trolls.

“I’ve never seen an issue have that kind of shelf life,” said Bret Schafer, coordinator for communication, social media and digital content with the Alliance for Securing Democracy, a public policy research group that tracks in near real-time “hashtags, topics and URLs promoted by Russia-linked influence networks on Twitter.”

Ordinarily, these accounts will latch onto a particular story “for a few days, and then it’s onto the next week,” Schafer said. “But, clearly they found something that works with the NFL.”

In total, between Sept. 23, the day of the president’s speech, and Oct. 20, more than 11 percent of the URLs most shared by these accounts related to the NFL controversy, according to data gathered by the group. As recently as last week, Schafer said, the NFL still appeared as a top topic on the group’s Hamilton 68 dashboard, which tracks more than 600 Twitter accounts.

“They don’t care about that issue one way or another,” Schafer added. “What they care about is it’s a great vehicle to hop on, to take a side, to get an impassioned audience connected to an account where they can then push other agendas.”

In January, U.S. intelligence agencies concluded with “high confidence” that Russian President Vladimir Putin sought to influence the 2016 presidential election through an aggressive campaign that drew on government agencies, state-funded media, third-party intermediaries and paid social media trolls “to undermine public faith in the U.S. democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency.”

However, Russian disinformation didn’t come to an end on Election Day, according to public policy and cybersecurity experts. It is as present as ever in the post-election landscape, they say, working to divide Americans by exploiting the country’s most contentious issues — while providing Russia cover to pursue its geopolitical interests.

“They’re still active, they’re still present, they’re not waiting for the 2018 cycle to be active in American political discourse,” said James Ludes, executive director of the Pell Center for International Relations and Public Policy. “The threat is bigger than the election of 2016, the threat is bigger than the election of 2018. The threat is really a more fundamental assault on liberal democracy in the West.”

Russian disinformation is not always explicitly tied back to the Kremlin, according to Schafer. While the Hamilton 68 dashboard does monitor state-sponsored media outlets like Sputnik and RT, it also tracks automated bots that amplify stories from Russian accounts, as well as third-party users that post in support of pro-Russian policies and themes.

Taken together, these accounts contribute to an extensive influence operation that spreads propaganda and disinformation across the web to sow discord, blurring the lines between fact and fiction as Americans unknowingly consume fake or Russian-influenced content.

Molly McKew, an information warfare expert who served as adviser to the former president of Georgia, said that Russian cyber campaigns in the U.S. mimic similar efforts in Europe and the former Soviet republics.

“Anything they find divisive within a society, these are issues where they tend to play. And, the logic behind it being if you can weaken the social fabric of a country or a target of any kind, that gives you a greater ability to influence the outcome or sort of have an advantage in a weakened playing field,” she said.

In September, Facebook revealed that between June 2015 and May 2017, a Russian firm linked to the Kremlin purchased approximately $100,000 worth of ads on a slew of contentious issues.

“The ads and accounts appeared to focus on amplifying divisive social and political messages across the ideological spectrum — touching on topics from LGBT matters to race issues to immigration to gun rights,” wrote Alex Stamos, Facebook’s chief security officer, in a post, adding that they have since shut down the fake accounts and pages.

On Tuesday, officials from Facebook, Google and Twitter began two days of congressional hearings on the role their platforms played in Russian disinformation efforts during the 2016 elections. In prepared testimony, Facebook disclosed that as many as 126 million users were exposed to Russian-influenced content between 2015 and 2017. The Kremlin-affiliated Internet Research Agency, a St. Petersburg “troll farm” that employs hundreds of Russians to distribute pro-Russian messages, had posted around 80,000 pieces of divisive content that directly reached around 29 million Americans.

On Wednesday, Facebook upped those figures, acknowledging that Russian-linked content may have reached around 150 million people, citing the additional reach of Instagram, which Facebook owns.

Twitter reported that it had found more than 2,700 accounts traced back to the Internet Research Agency, which posted around 131,000 tweets between September and November 2016, in addition to more than 36,000 automated Russian bot accounts that tweeted 1.4 million times over the same period.

Since the election, there has been a shift away from completely automated bot accounts toward “cyborg” accounts that are half human, half bot, according to Samantha Bradshaw, a researcher with the Computational Propaganda Project at Oxford University. These accounts combine automated activity with human input, occasionally posting an original comment or responding to a message.

“That makes it really hard to tell if an account is a bot and then therefore take it down,” she added.

According to experts, what may be most alarming is that stories promoted by Russian-influenced accounts are rarely complete fabrications. Instead, they feed on already brewing fissures and conflicts, like the NFL anthem controversy, and use them to their advantage.

“Russian disinformation is this dark mirror of our own societies,” McKew said. “They can’t create these things, they can’t make something true that is not true in any way, shape, or form, but they can amplify the doubts and divisions and things that are already there and that’s kind of the key.”

Schafer described the typical stories seen across the Hamilton 68 dashboard as a mix of whatever issue du jour is gripping the U.S. on any given day, some sort of conspiracy theory, and content that promotes Russia’s foreign policy interests.

For example, between Oct. 14 and Oct. 20, the most prominent story promoted by Russian-influenced accounts was the unproven allegation that Hillary Clinton, during her time as secretary of state, lent support to a business deal that gave Russia control over 20 percent of uranium production in the U.S. in exchange for donations to the Clinton Foundation. The week prior, the Harvey Weinstein scandal was a popular topic, with specific focus on his links to prominent Democrats.

Schafer said that around 90 to 95 percent of all the domestic content the group has tracked appears to target audiences on the far right.

But Russian disinformation networks are not necessarily ideological, according to Schafer. Rather, they find traction within networks like the alt-right and use it to promote content around topics they actually care about, including Russia’s geopolitical interests, particularly in Syria.

Between Aug. 2 and Oct. 20, Syria was consistently one of the most discussed geopolitical topics on the Hamilton 68 tracker, and by a wide margin. On any given day, Syria is almost always among the top three hashtags and topics discussed by the network.

“This is the end game for the Russians: to turn Americans against Americans, to so distract us with internal divisions that we’re not able to pay attention to things happening abroad,” Ludes said. “And, that will give Russia a freer hand internationally and domestically.”

Looking ahead to future elections, experts worry that other countries could mimic Russian disinformation efforts and launch their own attacks on the U.S. And as the American electorate continues to grow increasingly polarized, they worry the country is ignoring the larger issue: Russia as a strategic threat to Western democracy.

“The way that Russia looks at it, the American style of democracy is the anomaly in a long timeline of history where the strongman rule is going to be the thing that dominates. And, they still think this period of history is coming to an end,” McKew said. “They want to create a parallel system where they have advantage, where they can continue to steal and do well and prosper on the backs of their own people while nobody notices what it is they’re doing. And, the only way to do that in their view is to rip everything else apart.”


Nicole Einbinder, Former Abrams Journalism Fellow, FRONTLINE/Columbia Journalism School Fellowships
Twitter: @NicoleEinbinder
