It can be difficult to determine whether an apparent information operation is in fact that, as opposed to a prank or an act of digital vandalism. And even should it turn out to be a state-directed operation, specific attribution can be difficult. One such case occurred recently in Spain. That country's state-owned broadcaster TVE says that a portal it had inadvertently left open was exploited last week by parties unknown to air an RT-produced interview with self-exiled Catalan separatist leader Carles Puigdemont. Reuters asked, and RT (that is, Russia Today) says it didn't do it, and doesn't know who did. “I give you my word,” said RT’s editor-in-chief Margarita Simonyan. In fairness to RT, anyone can stroll in through an open portal. The presence of RT content points toward Moscow, especially since promoting a separatist point of view in a NATO country is the sort of thing that aligns with general Russian objectives, but on the other hand it's at least equally probable that the prank was the work of Catalan separatists themselves. In any case TVE believes few viewers actually received the broadcast.
Foreign Affairs runs a take on disinformation that traces the root of the problem to "unrestricted broadcasting": we wouldn't see the present "misinformation overload" if only, back in the 1960s, the European model of state-controlled broadcasting had prevailed over the American preference for relatively open access to the spectrum. That's one way of looking at it, but it's unclear that a command economy of ideas would tend to produce a better outcome, from the points of view of both civil liberties and epistemic quality, than the relatively free marketplace of ideas it might have preempted.
The concern about willful deception, particularly insofar as it's capable of influencing public opinion and electoral outcomes, will only increase as the US 2020 elections approach. Such concerns formed part of the background of the impeachment the US House voted Wednesday against President Trump, but other events have contributed to public worries. Prominent among those developments are the increasing persuasiveness and plausibility of "deep fakes," a general term for fabricated material, mostly video, designed to deceive, were it possible, the very elect. Deep fake technology has become increasingly commodified, too, and it's now well within the reach of private citizens. For nation states the costs don't even amount to a rounding error--under US Federal procurement rules, you could buy a deep fake with your Government credit card and the inspector general wouldn't so much as sniff at you. To see how easy it is to construct such fakes, see Ars Technica, where reporter Timothy Lee writes that he made his own deep fake in just two weeks for only $552. That's less than a Microsoft Surface Pro 7 costs, or about the same in time and money that updating your home lighting fixtures would cost. Social media platforms, under pressure from governments and consumers to do something about the problem, are looking for solutions. Computing says that Facebook is piloting a project in which it would crowdsource deep fake detection.
An atmosphere in which disinformation is in play seems to lend itself to conspiracy theories, to the tendency to perceive wheels within wheels. This week the Guardian, quoting Open Media, a fairly independent and opposition-friendly Russian news outfit, says that official photographs taken in both the President’s Kremlin office and his official Novo-Ogaryovo residence show that the President's machines are still running Windows XP. It's possible that this might in fact be the case. For one thing, XP is the last version of Windows the Russian government approved for use on machines that handle state secrets. It's a pattern one sees elsewhere. Industrial control systems, for example, in which reliability and availability are at a premium, frequently run modified, tailored versions of older operating systems that are valued for precisely those qualities. There may be some analogy with systems used for official purposes in Russia. (Mr. Putin may also simply not care. He's said to dislike the Internet in general, which he regards with suspicion as a dodgy Yankee invention.) But Forbes and the security people they're talking to see another possibility, and point out that the pictures of Mr. Putin at work may well be deceptive, an instance of what the Russians call maskirovka, a deceptive appearance put forward to induce the adversary to make a bad move. Thus, Forbes speculates, a deeper game may be afoot--maybe the Russians simply want everyone to think they’re still running XP on the President’s machines, when in fact they’re really quite up-to-date with the latest homegrown OS. That would make the Kremlin an irresistible honeypot, wouldn't it?
Maskirovka is not a particularly exotic concept. It's a common term in the Russian military lexicon that covers what Americans would call camouflage and deception. Camouflage would be, for example, covering tactical vehicles with netting that makes them more difficult to spot from the air. Deception might be bogus radio traffic intended to be picked up by the enemy so they'll form a false picture of your actual situation. Those of you who've seen the recently released movie Midway will have seen a Hollywood depiction of deception in the US radio message, sent in a cipher the Americans knew the Japanese had broken, indicating that Midway Island's water purification plant had broken down, and that the Marines stationed there were running short of water. Deception is practiced by most military and espionage services, but historically it hasn't been an American strong suit. One US Air Force leader, General Charles Q. Brown, who commands the Pacific Air Forces, would like to see that change. He told a writers' breakfast, Defense One says, that "gadget culture" (and American military culture has been a gadget culture since the Revolutionary War, as the bemused Redcoats and their Hessian hirelings noted as they advanced down Long Island) won't be sufficient to contain the Chinese threat in his theater. He'd like to see information operations designed to undermine and delegitimize the Communist government in Beijing. (He'd also like to see better electronic warfare capabilities to help do so, because, being an American, he can't really shake the love of gadgets any more easily than the rest of us can.)
Why do disinformation narratives work? They work when they're good stories. Phys.org even goes halfway to C.G. Jung and suggests that the archetypal story of a journey through a hostile country, arriving in the end at self-knowledge, is the basic template for successful disinformation.