At a glance.
- Influence operations in Georgia.
- A perspective on Russian disruption.
- No evidence found of foreign bots behind recent social-media trolling.
- Impostors may be trying to pose as the US Democratic National Committee.
- Choking off Internet access as a means of information control.
- Naming and shaming threat actors.
- How to organize influence operations.
Crude, but consistent with Moscow's style.
Many governments are prepared to call Russia out for a disinformation campaign driven by rough-handed website vandalism.
A number of countries have joined the US, the UK, and Georgia in condemning what they characterize as a large-scale GRU defacement attack against Georgian websites last October, Fifth Domain and others report. Naming and shaming are thought to be part of a broader effort to reinforce international norms of conduct in cyberspace. Other allied governments, including some with strong institutional memories of Russian hybrid operations, like Estonia and the Czech Republic, have also joined in the criticism of Moscow’s operations against Georgia.
Josep Borrell, the High Representative of the European Union for Foreign Affairs and Security Policy and the EU’s top diplomat, said last Friday, according to Eurasia Review, “Georgia was the victim of a targeted cyber-attack causing damage to their social and economic infrastructure.” Western intelligence services, notably those of the UK and the US, have attributed the influence campaign to Russia’s GRU. Georgia’s government has thanked the EU for the expression of solidarity.
Russia’s Foreign Ministry has denied any involvement in the attack ("Russophobic lies and fakes," as TASS quotes Russia's ambassador to Ottawa--it's a representative sample of the Kremlin's reaction) and has put the whole matter down to a coordinated propaganda campaign run from Washington, London, Tbilisi, and an unspecified “elsewhere.” The Ministry has also deplored Georgia’s decision to “demonize” Russia, Georgia Today reports, and just when relations between the two peoples were getting so much better (says they). But most observers still see Fancy Bear’s paw prints all over the Caucasus.
A perspective on Russian disruption.
The Georgian operations were almost purely disruptive, sand in the gears of civil society. With that in mind, it’s worth reviewing the current state of the US elections, and the attempts to influence them. The Atlantic looks at Russian influence operations directed against the 2020 US elections and concludes that the Americans themselves are doing a good job of creating divisive content all on their own, and that the Russians seem to have moved from creation to curation. It’s impossible to resist the temptation to quote Pogo Possum on this: "We have met the enemy and he is us," as he famously said more than half a century ago. There's enough ill-will and paranoia in domestic production to leave the troll farms of St. Petersburg with little to do beyond retweeting it. As the Atlantic observes, "The U.S. doesn’t need Russians to erode faith in its elections—one buggy app at the Iowa caucus did that just fine."
Moscow remains interested in weakening American civil society, and can be expected to continue its efforts along those lines, but we may not see a revival of 2016-style hacking and creative disinformation. Amplification and curation may well do it. The Atlantic talked to Graham Brookie, director of the Digital Forensic Research Lab at the Atlantic Council (no relation to the Atlantic magazine, by the way). They quote Brookie as saying, of Russia’s Internet Research Agency, the highest-profile troll farm of them all, that at this point “They could spike the football and say, ‘Mission accomplished.’” Maybe they will.
Right, right: absence of evidence isn't evidence of absence...
We get it. And negative existentials are notoriously impossible to prove. But on the other hand they say that about UFOs, too, when the UFOlogists scoff at Fermi's Paradox. Don't get us started on Sasquatch.
So here's some absence of evidence. Senator Sanders suggested twice last week (on grounds of a priori probability) that online nastiness apparently emanating from his supporters might well have been the work of Russian bots. Experts the Daily Beast polled think this unlikely. The Wall Street Journal subsequently reported that Facebook has been unable to substantiate claims that some ill-behaved supporters of Senator Sanders were in fact either Russian or Republican trolls. Had Menlo Park found evidence of coordinated inauthenticity, Facebook says, they’d have taken down the offending sites, pages, posts, and so on. But they didn't, and the social media presence that some had objected to will have to stand on its own. Much of what the campaign thought might have been the work of hostile bots impersonating supporters was directed against the Culinary Workers, a union that had drawn fire for its unwillingness to support Medicare for All, one of Senator Sanders' key proposals.
One of the things observers have noted is that Russian trolling is commonly picked up and amplified by authorized mouthpieces of the Russian state, like RT and Sputnik. That hasn't seemed to happen with the putative Brobots. So is it possible that Russian bots have been trolling US political campaigns, impersonating supporters? Sure. Lots of things are possible. Attribution is always difficult, but it’s worth remembering that America is great in lots of ways, including her ability to generate loud-mouthed invective, at scale and in quantity. And at some point in the absence of evidence you've got to be content with an argument to best explanation.
Still, even the morbidly suspicious have real enemies.
According to the Washington Post, persons (possibly foreigners) impersonating the Democratic National Committee have sought to establish contact with Presidential campaigns. The impersonation was initially reported to the DNC by Senator Sanders’ campaign. The national party would like all campaigns to regard contacts purporting to be from the DNC with appropriate skepticism. It's still early, but this does look like a genuine attempt at social engineering.
Shutting down the Internet.
Lying can be accomplished by suppressing the truth as well as by suggesting falsehood. But influence can sometimes be exercised by choking off information altogether. The Wall Street Journal finds that restricting Internet access has become a common measure governments of various stripes (democratic, theocratic, oligarchic, and so on) adopt when they wish to control popular opinion or limit a population's ability to organize spontaneously. Internet control has become what seizing newspapers and radio stations was in the mid-twentieth century.
Naming and shaming?
Retaliation for state-sponsored hacking is often thought of in terms of returning a cyberattack in tit-for-tat fashion. But documents obtained from US Cyber Command by Motherboard suggest that Fort Meade sees publication of hostile activity as itself a contribution to disrupting adversaries' efforts. Publication of attack code to VirusTotal, as the Command recently did with some North Korean tools, raises awareness and speeds the development of countermeasures.
There's some evidence that naming and shaming threat actors can disrupt an adversary. The Chinese organizations named in the Equifax breach indictment seem to have vanished from cyberspace. It appears that Chinese services, at least, are sensitive to this kind of treatment. CrowdStrike co-founder Dmitri Alperovitch said Wednesday at RSAC 2020 that China's Ministry of State Security appeared to have had to "reset and retool." Comment Panda, Stone Panda, and Gothic Panda have all gone quiet. Whether this amounts to more than a restructuring or reorganization remains to be seen, but even a reorganization can be disruptive enough.
Alperovitch added that the Chinese seem unusual in this respect. The Russians, the Iranians, and the North Koreans, to consider the three other familiar adversaries, tend to shrug off American indictments and move on.
What's the right organization for influence operations?
Defense One ran an op-ed this week that makes the case for the creation of a high-level post to oversee influence operations, a "Secretary for Influence Operations." Such matters are currently the responsibility, within the US Department of Defense, of the Assistant Secretary of Defense for Special Operations/Low-Intensity Conflict, the "SOLIC." Some former SOLICs argue that the post is simply too junior for the gravity of its responsibilities, and that the Department as a whole is too inward-looking and risk-averse to manage influence operations effectively. Those operations have historically gone by many names--most civilians will recognize the now obsolete term "psychological operations"--but they're now called "MISO," or "military information support operations." The US Department of Defense Dictionary of Military and Associated Terms defines this as "Planned operations to convey selected information and indicators to foreign audiences to influence their emotions, motives, objective reasoning, and ultimately the behavior of foreign governments, organizations, groups, and individuals in a manner favorable to the originator’s objectives."
It's a tough problem. The US is the country that basically invented modern mass marketing, and "military information support operations" are essentially marketing in battledress. But as successful as the Americans have been at arousing every imaginable factitious desire in consumers (soda, financial planning, insurance, cars, beer, etc.), they've been justly diffident about their ability to persuade on the battlefield.