Facebook Reveals Ongoing Political Influence Campaigns
Propaganda Efforts - and Adversary OPSEC - Continue to Improve, Experts Warn
Facebook has suspended eight pages and about 24 accounts for what it describes as "coordinated inauthentic behavior." But the social networking giant stopped short of saying which individuals or organizations might have been behind the political influence campaign (see Facebook Removes 'Bad Actors' for 'Inauthentic' Activity).
Facebook says 290,000 Facebook accounts followed at least one of the eight pages, the earliest of which was created in March 2017, and the latest in May. The pages collectively posted 9,500 pieces of content, and since May 2017, had been used to plan or organize 30 real-world events.
Despite the takedowns, however, it's not clear how many more such pages and accounts might remain active on Facebook, never mind other social networks. Some experts in combating propaganda have suggested that Facebook should be acting more quickly to counter such activities.
Facebook announced the takedowns on Tuesday. "We're still in a very early stage of our investigation and we don't have all the facts, including who may be behind this," Facebook COO Sheryl Sandberg told reporters in a press briefing. "We're sharing what we know today because of the connection between these bad actors and an event planned in Washington next week." She also promised to issue updates as more information comes to light.
Facebook has stopped short of attributing the pages to any specific individuals or entities. But its move comes after the U.S. government assessed that Russia's military intelligence agency, the GRU, used phishing and hacking to steal and leak emails in a bid to undercut the 2016 presidential campaign of Hillary Clinton (see More Indictments in Russian Election Interference Probe). The U.S. government also accused Russia's Internet Research Agency, an alleged troll factory, of attempting to manipulate social media to amplify divisions in U.S. society.
Facebook says that while it concurs with both of those assessments, it's not clear that the same groups were behind these latest efforts.
"It's clear that whoever set up these [now-blocked] accounts went to much greater lengths to obscure their true identities than the Russian-based Internet Research Agency, IRA, did in the run up to the 2016 U.S. presidential election," Sandberg said. "Security is an arms race and it's never done."
Facebook says new procedures it has put in place to make this type of behavior more difficult do appear to have blocked outright some attempts to create or operate such pages (see Facebook's Security and Privacy Overhaul Comes at a Price).
At the same time, groups running influence operations have refined their operational security, aka OPSEC. "They use VPNs and internet phone services to hide their identity and paid third parties to run ads on their behalf," Nathaniel Gleicher, head of cybersecurity policy at Facebook, told reporters on Tuesday.
Facebook: No Attribution
Alex Stamos, Facebook's CSO, says that while there is some evidence tying some of the 32 now-blocked pages and accounts to individuals who previously worked for the IRA, the evidence is too circumstantial to attribute the activity to any individual or organization. In addition, he says that many of the group's tactics are now public record, meaning they could be copied.
"We have proactively reported our technical findings to U.S. law enforcement because they have much more information than we do, and may in time be in a position to provide public attribution," Stamos said.
"Facebook is moving cautiously - and responsibly - on attribution here, likely indicating a better-than-IRA level of adversary OPSEC in this op," Thomas Rid, a professor of security studies at Johns Hopkins University and expert on disinformation campaigns, says via Twitter.
Some propaganda experts, however, suggested Facebook should be acting more quickly to battle political influence campaigns.
"It's understandable that the scale of the problem is huge - but the budgets of social media firms are also massive," Sam Woolley, director of the digital intelligence lab at the think tank Institute for the Future, tells the Wall Street Journal.
Analysis: Digital Forensic Research Lab
About 24 hours before the takedown, Facebook shared eight of the pages with the think tank Atlantic Council, which says it's continuing to study them.
In May, Facebook said it was working with the Atlantic Council's "ElectionWatch" program, which is designed "to identify, expose and explain disinformation during elections around the world," according to the think tank.
The Atlantic Council says it's also been exploring leads from the Justice Department's indictment of 12 GRU officers last month for 2016 U.S. presidential election interference. Based on information contained in the indictment, the Atlantic Council has identified at least one Facebook group, with 4,000 members, that appeared to have been created by the Russian government.
On Tuesday, the Atlantic Council's Digital Forensic Research Lab published a preliminary report on some of the blocked pages.
"It's clear that whoever set up these accounts went to much greater lengths to obscure their true identities than the Russian-based Internet Research Agency has in the past," its researchers say in a blog post.
The researchers caution, however, that they do not have access to the full set of data that Facebook used to determine that the pages were not legitimate.
But Facebook's Stamos, discussing the takedown in broad detail, revealed that the social network was able to tie some account activity to previously seen activity.
"We have found evidence of connections between these accounts and previously identified IRA accounts," Stamos said. "For example, in one instance a known IRA account was an administrator on a Facebook Page controlled by this group. These are important details, but on their own insufficient to support a firm determination, as we have also seen examples of authentic political groups interacting with IRA content in the past."
Lawmakers Blame Kremlin
U.S. intelligence chiefs have continued to issue public warnings that Russia's attempted interference in the country's political system has not diminished at all since 2016. With the U.S. midterm elections this November now just three months away, some states say the federal government has been doing too little to combat hacking efforts against states' electoral systems as well as information warfare campaigns disseminated via social media (see Will Congress Lose Midterm Elections to Hackers?).
Following Facebook's Tuesday takedown announcement, some lawmakers didn't hesitate to point fingers.
"I am glad to see that Facebook is taking a much-needed step toward limiting the use of their platform by foreign influence campaigns," said Republican Sen. Richard Burr of North Carolina, who heads the Senate Intelligence Committee.
But he said more needs to be done across all social media channels to combat such efforts. "The goal of these operations is to sow discord, distrust, and division in an attempt to undermine public faith in our institutions and our political system," he said. "The Russians want a weak America."
Sen. Mark Warner of Virginia, the top Democrat on the Senate Intelligence Committee, said the "disclosure is further evidence that the Kremlin continues to exploit platforms like Facebook to sow division and spread disinformation, and I am glad that Facebook is taking some steps to pinpoint and address this activity."
The Russian government continues to deny that it interfered in the 2016 U.S. presidential election or that it continues to do so.
Aim: 'To Promote Divisions'
Regardless of the identity of the pages' creators, their intent appeared obvious, Digital Forensic Research Lab's researchers say. "The pattern of behavior by the accounts and on the pages in question make one thing abundantly clear: they sought to promote divisions and set Americans against one another," they say.
"Their approach, tactics, language, and content were, in some instances, very similar to accounts run by the Russian 'troll farm' or Internet Research Agency between 2014 and 2017," the researchers say. "Similarities included language patterns that indicate non-native English and consistent mistranslation, as well as an overwhelming focus on polarizing issues at the top of any given news cycle with content that remained emotive rather than fact-based."
The accounts appear to have been designed to help build an online audience that could be used to promote real-world events, such as protests. "This specific set of accounts was focused exclusively [on] engaging and influencing the left end of the American political spectrum" as well as "designed to catalyze the most incendiary impulses of political sentiment."
Resisterz Planned Washington Event
One of the blocked accounts was "Resisters" - @Resisterz - which described itself as being concerned with "online and offline feminist activism against fascism." Digital Forensic Research Lab notes that the account didn't list any managers or moderators, offering only a Facebook Messenger address as a contact point.
The account promoted a variety of events, some of which were hosted by what appear to be legitimate groups. "Events that it promoted - but did not necessarily host or co-host - included protests against U.S. Immigration and Customs Enforcement (ICE), U.S. President Donald Trump's tax plan, protests against Trump's Muslim ban, and a 'March against rapist cops,'" Digital Forensic Research Lab's researchers say.
One upcoming event it organized was a protest in Washington called "No Unite the Right 2, D.C.," scheduled to run from Aug. 10 to 12.
Resisterz Writers: Russian?
Whoever posted content to Resisterz, however, appeared to be Russian. The account "repeatedly made linguistic errors which are uncharacteristic of American colloquial language yet characteristic of native Russian-speakers, especially an inability to use grammatical articles - 'the' and 'a/an' - and difficulties with singular and plural verb forms," Digital Forensic Research Lab says. "This was one of the most telling identifiers of the troll accounts which targeted the U.S. from Russia in 2014-2017."
Other pages taken down by Facebook on Tuesday echoed previously seen IRA efforts in other ways. For example, the "memes, tone and posts" to @warriorsofaztlan "strongly resembled those of known Russian troll account 'Brown Power,'" Digital Forensic Research Lab's researchers say.
Target: Left-Wing Communities
Previous IRA efforts appeared to focus on amplifying concerns held by the right side of the political spectrum. But the Facebook takedown revealed pages that instead seemed to be focused on left-leaning groups.
Digital Forensic Research Lab says that when it comes to information warfare campaigns, all ideology is a target.
"Such online activity poses a danger of both disinformation, which we define as deliberate spread of false information, and misinformation, which we define as the unintentional spread of false information," the researchers say. "The Russian operation in 2014 through 2017 showed how easily disinformation actors could seed their falsehoods into genuine American communities on the right and the left; Americans thus became the unwitting amplifiers of Russian information operations."
The eight pages that Digital Forensic Research Lab's researchers reviewed appeared to have been designed to achieve similar ends - namely, amplifying existing divisions in U.S. society as part of an effort to further polarize and destabilize it.
But the pages also reveal that whoever is behind these information operations appears to have learned from the IRA activities of recent years, including better disguising their efforts.
"Information operations, like other asymmetric threats, is adaptive," the researchers say. Thankfully, the most recent efforts were "not enough to stop Facebook finding them, but it does reveal the challenge facing open source researchers and everyday users."