At a glance.
- The European assessment of China's disinformation efforts was harsher than public statements suggested.
- Poland reports Russian disinformation campaign. Latvia sees Russian business as usual.
- The personal toll of state-sponsored disinformation and the madness of crowds.
- Astroturfing, political advocacy, and the propagation of tendentious news.
- Doxing WHO, and others.
The European assessment of China's disinformation efforts was harsher than public statements suggested.
The European External Action Service's internal memorandum on disinformation efforts surrounding the COVID-19 pandemic reported substantially the same conclusions as the US State Department: Russia, China, and Iran have engaged in "highly harmful disinformation" that's "gone viral," especially in smaller media markets.
The assessment linked above is the EEAS internal report. According to the New York Times, EU officials, under pressure from Beijing and hoping for more amicable relations, softened the harder conclusions before rendering their much less direct, much more placid public statement Friday. In the Times' judgment the original report "was not particularly strident: a routine roundup of publicly available information and news reports." "'Such appeasement will set a terrible precedent and encourage similar coercion in the future,' an analyst, Monika Richter, wrote to her colleagues and supervisors in an email seen by The Times. She said that European Union diplomats were 'self-censoring to appease the Chinese Communist Party.' She also wrote that it was a lie to claim that the document had not been scheduled for release."
In the meantime, the BBC reports that China has also rejected an Australian-led call for an investigation into the origins of COVID-19, dismissing it as a "politically motivated" effort that "would serve nobody any good." As the BBC paraphrased Chen Wen, a senior diplomat in China's mission to the UK, "there were lots of rumours about the origins of the virus but such misinformation was dangerous, she claimed, and said it was like a political virus and as dangerous as coronavirus itself, if not even more so." For a summary of Chinese active disinformation about the coronavirus, see the EU External Action Service's original report, especially pages 7 and 8.
A British official speaking with the BBC on condition of anonymity said there was "nervousness" about confronting China, since relations with Beijing are presently "delicate."
Poland reports Russian disinformation campaign. Latvia sees Russian business as usual.
According to the New York Times, Polish security services say that the country has been subjected to a "complex disinformation operation" whose structure and apparent goals are consistent with earlier Russian influence operations. The centerpiece of the campaign is a fake letter posted to the (hacked) website of the War Studies Institute, a defense academy for senior Polish military leaders. Purporting to be from the Institute's director, the letter calls for Polish soldiers to fight "American occupation." The attribution to Russia is circumstantial but convincing nonetheless: the disinformation is entirely consistent with earlier Russian influence campaigns in both content and the ways in which its messaging has been amplified.
A report in the Baltic News Network says that Latvia is seeing Russian influence operations continuing at their customary steady pace: no surge, but no particular relaxation, either.
The personal toll of state-sponsored disinformation and the madness of crowds.
It's easy to become accustomed to thinking of disinformation and misinformation in terms of their large-scale effects on societies and the politics that surround them. But they have personal effects, too, in what may serve as a reminder that the familiar New Left slogan, "the personal is political," might be reversed.
First, the madness of crowds. In India, a man whose family suffered from COVID-19, properly quarantined themselves, and recovered, is being hounded online by people who claim (falsely) that one of the family had in fact died of the virus, and moreover that harsh local restrictions are the family's fault. "When a health officer called me I explained the matter to him. Later, an online news portal carried fake news that the district administration had announced a 'double lockdown' at Manacaud and Ambalathara areas which fall under hotspot due to the irresponsibility of my family. We won't do that," Bin Sagar told the Indian Express. He's taken legal advice.
And then state-run disinformation finds amplification when it finds its audience. The Chinese Communist Party's claims that COVID-19 was brought to Wuhan in October by US service members participating in the World Military Games (a kind of goodwill Olympics among the world's military services) have been widely broadcast in Chinese official statements (usually in the form of a call for investigation, sometimes with the suggestion that the virus was an American bioweapon). US Secretary of Defense Esper calls this allegation "completely ridiculous...and irresponsible," and we're with him on that.
But not everybody is, and "everybody" in this case includes some YouTubers. CNN reports that one US Army Reservist who participated in the games has been called out as the source of infection, and is receiving all the hostile attention one would expect. The charge that the Reservist, Sergeant First Class Maatje Benassi, is the patient zero of the infection and the prime mover in the pandemic is of course absurd, but that hasn't prevented YouTubers from pushing it, acting in effect as a kind of cyber mob.
Prominent among the YouTubers flacking the story is George Webb, whom CNN calls a "misinformation broker" but who describes himself as an "investigative journalist." Mr. Webb has propounded numerous conspiracy theories in the past, to the extent that Google has stopped running ads on his channel. He is, as he would put it, only asking questions, but the questions are specific and damaging, especially to the Benassis, who have nothing to do with the virus at all. Only asking questions, mostly, but one might reflect that a traditional moralist would have treated false suggestion as itself a form of false witness.
The US Army is providing Sergeant First Class Benassi with support against the attention. Colonel Sunset R. Belinsky told the Army Times, “The Army is providing support to help Sgt. 1st Class Benassi with the public attention. As a matter of policy, the Army would neither confirm nor deny any safety or security measures taken on behalf of an individual; however, as we would with any soldier, the Army will work with the appropriate authorities to ensure that she and her family are properly protected.”
Astroturfing, political advocacy, and the propagation of tendentious news.
As we noted last week, there's been a surge in the registration of domains related to resuming ("reopening") normal activity in the United States. KrebsOnSecurity reported earlier this week that a great deal of it looked like astroturf.
Domain Tools this morning published their own study of how the domains came to be, and who registered them. Many of the sites, a number of them with Second Amendment themes, appear, in Domain Tools' analysis, to have been established by Aaron Dorr, a consultant who advises political movements on advocacy and organization. The sites' use of a small set of common templates, which seem to derive from another political consultancy, One Click Politics, further raised suspicion that the apparently local, ostensibly grassroots sites were in fact astroturf.
There's also some countersquatting going on, with other political advocates quickly and preemptively registering domains that seemed to have the kinds of names that would draw Mr. Dorr's attention. Those doing so seem to be in general on the political right, and view association with the operations Domain Tools ascribes to Mr. Dorr as undesirable.
Domain Tools emphasized in a conversation with us that one common feature on the astroturfed sites is a prominent and functioning donation button. This suggests to them that a nontrivial goal of the operation is making money.
In that conversation, Domain Tools also suggested two areas that warrant some attention. First, deep fakes have been generally associated with faked audio or video content. Domain Tools points out that one of the problems of astroturfing and influence operations generally is the production of useful content, at scale. Sometimes this is done through plagiarism or repurposing, sometimes (and this is something Domain Tools noticed in connection with Mr. Dorr's operation) by having some lone Stakhanovite crank out a number of bylined pieces (using the same byline does tend to blow the gaff, but it happens). Domain Tools suggests that deep-learning tools can be adapted to rapidly produce good-enough written content in the service of influence. This could involve impersonating real persons or simply generating articles attributed to various sockpuppets.
Second, while most of the astroturf seems based domestically in the United States, there are indications that a few of them may have infrastructure in Hong Kong. That's curious, and deserves further investigation.
BuzzFeed has a report on what it characterizes as "black PR" firms, companies whose specialty is to collect and amplify spin. Their goals are not wholly unfamiliar—advertising agencies, public relations firms, and especially political consultants would recognize them as spin—and even the methods aren't utterly novel. It's possible to discern an ancestor in late-twentieth-century direct mail campaigns. But the scale that automation makes possible does seem new. In this case quantity may have a quality all its own.
Doxing WHO, and others.
Misinformation, especially when it leads to conspiracy theories that in turn lead their adherents into investigations, can provide a motivation for doxing. Following up on the release of credentials belonging to the World Health Organization, the Gates Foundation, and other groups involved in one way or another with attempts to control the COVID-19 pandemic, the Washington Post cites a study by the SITE Intelligence Group that connects the doxing to an American conspiracy theorist. The identity of the conspiracy theorist is so far unknown.
The Post characterizes the evidence as follows: “Based on comments and links on various social media sites that appear to be from the same person, however, SITE determined that the initial poster probably was an American who espoused conspiracy theories popular on the political right, including that government officials and news organizations are exaggerating covid-19 death counts to manipulate the public.”
SITE speculates, again on the basis of comments and links, that the goal of the doxing was to facilitate further compromise of organizations the conspiracy theorist believed complicit in various forms of misbehavior with respect to the pandemic. It’s of course entirely possible that there’s little or no hidden misbehavior of the sort the conspiracies envision. The leaked credentials aren’t new: they’re believed to derive from material posted online as early as 2016. But the goal of the recent republication may have been to inspire more DIY investigation.