Facebook announced on Thursday that it has taken down three “coordinated inauthentic behavior” networks promoting disinformation that included nearly 300 Facebook and Instagram accounts along with dozens of Facebook Pages and Groups. While the efforts were seemingly run independently, and focused primarily outside of the US, each has ties to Russian intelligence—and they collectively provide a sobering echo of the social media assault that roiled the 2016 election.
The networks Facebook tackled dated back at least three years, but most had few followers at the time they were caught. They primarily promoted non-Facebook websites in an apparent effort to get around the platform’s detection mechanisms, focusing on news and current events, particularly geopolitics. They targeted users in a number of countries, including Syria, Ukraine, Turkey, Japan, the UK, and Belarus, as well as the United States to a lesser extent.
Given Russia’s impact through digital influence operations during the 2016 United States presidential race and in democratic elections around the world, state and federal officials and researchers—not to mention tech companies—have been bracing for activity in the US during 2020. Earlier this month, Microsoft announced that it had caught Russia’s Fancy Bear hackers targeting hundreds of campaign-adjacent organizations. Facebook warned repeatedly on Thursday that despite the successful takedown, it’s still bracing for whatever might come next.
“It’s not new. These are tactics and techniques we’ve seen before,” Nathaniel Gleicher, Facebook’s head of security policy, said on a Thursday call with reporters. “But the increasing reliance of these actors on these techniques is another sign that operating networks with fake accounts on Facebook and, quite frankly, elsewhere on major social media platforms is getting harder and harder for them.” Russian efforts have evolved, he added, to enlist unwitting users to amplify their messages and to build websites outside the social media platforms to avoid detection. “The good news about this is that both of these techniques are difficult, they are slower, and they are less guaranteed to be successful than the techniques we saw them use in 2016,” Gleicher said. “In short, they’re being forced into using less effective techniques but they are still trying.”
Facebook attributed one of the disinformation distribution networks to “actors associated with election interference in the US in the past, including those involved in ‘DC leaks’ in 2016.” In other words, the actors were likely tied to Fancy Bear, also known as APT 28, the group responsible for hacks of the Democratic National Committee and Hillary Clinton’s presidential campaign.
Facebook attributed the second network to “individuals associated with past activity by the Russian Internet Research Agency,” the so-called troll farm that wreaked havoc on Facebook in 2016. The company noted that it is unclear whether the IRA is still an active entity or what form it takes at this point. The third network had “links to individuals in Russia, including those associated with Russian intelligence services.”
None of the networks focused solely on the US. Instead, they engaged with a broad array of topics connected to Russian interests, including the war in Ukraine, the Syrian civil war, the election and protests in Belarus, Russia’s relationship with NATO, and politics in Turkey.
The Fancy Bear-linked campaign developed phony personas, posed as journalists, and built groups that purported to be local in target regions. They all pushed Facebook users to external, Russia-controlled sites as well as alternative social media accounts. Among other things, those destinations contained content about alleged leaks of sensitive or compromising information. The network tied to IRA-linked individuals included accounts and groups collectively posing as a Turkey-based think tank. The third network focused particularly on topics related to Russia’s neighboring countries, including Belarus, and involved the creation of phony personas that pretended to be researchers and editors soliciting articles and other content.