At a glance.
- Specious chatter surrounding ongoing US civil unrest.
- A look at lessons learned on information operations during the COVID-19 pandemic.
- The Cyberspace Solarium weighs in on public engagement and rumor control.
- The difficulty of content moderation.
- And the Executive Order on Preventing Online Censorship has arrived.
Specious chatter surrounding ongoing US civil unrest.
Anonymous resurfaced during the Minnesota-centered unrest, although distinguishing the real Anonymous (insofar as an anarchist collective can be said to have a real, enduring identity) from imitators is, as Motherboard notes, difficult. Anyone can claim to represent Anonymous. It’s perhaps significant that a lot of the chatter nominally from Anonymous is amplified through K-Pop social media fan accounts. Both the Washington Post and CyberScoop dismiss the claimed Anonymous operations as derivative fizzles, either an attempt to regain relevance or the work of wannabes and re-enactors.
Anonymous, or more precisely people saying they’re acting in the name of Anonymous, have for years over-promised and under-delivered. The videos posted in the name of Anonymous are appropriately menacing, but the group seems generally to have been more cosplayer than superhero. So far the material they claim to have stolen from police sites seems to be old, recycled stuff from publicly known breaches, much of it already up on Have I Been Pwned.
As of today, episodic, nuisance-level hacktivism continues to accompany protests in the US. According to KXAN, Anonymous has claimed responsibility for taking down an Austin, Texas, public website in an anti-police gesture, and Variety reports that K-Pop fans remain an odd force in social media hashtag-jamming. Anonymous, as we’ve had occasion to remark, is now probably better regarded as a lifestyle brand than as an identifiable group. And in that respect, come to think of it, it’s a lot like K-Pop, more style than conspiracy, more propaganda of the (sort-of, not-too-taxing) deed than actual insurrection.
There’s also inauthenticity in the chatter related to the unrest, some from foreign intelligence services, and some from rival extremists flying false flags, NBC reports. Racial fissures in American society have long been favorite points of attack for foreign, especially Russian, disinformation campaigns. And while there have been calls from Antifa urging the extremist group’s followers to regard the National Guard as “easy targets” (as reported by Minnesota Public Radio), there have also been spoofed Antifa messages attributed by Twitter to the white supremacist fringe group Identity Evropa. Thus one reprehensible group dresses up as another.
There has also been some fairly wild chatter about social media blackouts in, for example, Washington, DC. These have been easily debunked by reporters on the ground, but not before, as the Washington Post reports, much misinformation was tweeted under the hashtag #DCBlackout.
A look at lessons learned on information operations during the COVID-19 pandemic.
The pandemic is still with us, of course, but with the growing relaxation of some of the social isolation rules we've lived under, it seems a good time for a first look back. One of the lessons the pandemic has taught is the distinction between, and the interplay of, disinformation and misinformation, or, to put it another way, the relationship between lies and folly.
The difference between disinformation (that is, lies deliberately formulated and spread to serve political ends) and misinformation (that is, delusions that often arise spontaneously without any original political purpose) has been on display during the pandemic. There's also an interplay between the two. Misinformation can be used to render disinformation more effective, and disinformation can escape its masters' control and propagate as misinformation.
The US State Department's Global Engagement Center described disinformation in circulation about the origins of COVID-19. The European External Action Service's internal memorandum on disinformation efforts surrounding the COVID-19 pandemic reported substantially the same conclusions as the US State Department: Russia, China, and Iran have engaged in "highly harmful disinformation" that's "gone viral," especially in smaller media markets.
Two styles of disinformation: constructive and destructive; work or friction.
Chinese doctrine has, under the Communist Party's current leadership, emphasized the importance of "discourse power," roughly speaking positive propaganda, and an insistence on that propaganda's receiving an international hearing. An essay in Foreign Affairs described how Beijing has sought to apply discourse power during the COVID-19 pandemic. It sees China as pursuing traditional influence by, for example, posing as a reliable partner and a valued source of friendship and humanitarian aid during a difficult time. But it also argues that Chinese influence operations have been "more aggressive" than usual, and that Beijing has "even experiment[ed] with tactics drawn from Russia’s more nihilistic information operations playbook." "Negative" might be better than "nihilistic," however, because even as they go negative, Chinese operators have been more interested in persuasion than in generating confusion and doubt, which have been the typical goals of Russian influence operations (and those might fairly be described as nihilistic). Thus Beijing's disinformation has tended to seek to convince, Moscow's to confuse.
Chinese disinformation was preceded by censorship.
WIRED recounted how quickly and comprehensively the Chinese government moved to suppress social media posts that dealt with the initial outbreak of COVID-19 in Wuhan. The efforts at suppression go back at least as far as the first week of January. How have reporters become aware of them? By following the maxim “Cover China as if you were covering Snapchat.” The posts have a brief life, so when you see something interesting, take a screenshot before the post is quashed and the account blocked for "spreading malicious rumors." Weibo and WeChat Moments are the most commonly used platforms on which ephemeral posts appear.
Avoiding embarrassment would surely have been a principal goal of the censorship campaign, but it may also have had a more direct, practical objective. The suppression of the news may in part have been motivated by plans to stockpile necessary medical supplies. The AP and POLITICO reported seeing a US Department of Homeland Security report that says, in part, “We further assess the Chinese Government attempted to hide its actions by denying there were export restrictions and obfuscating and delaying provision of its trade data.” Before informing the World Health Organization of the epidemic's outbreak, Beijing significantly cut back exports and increased imports of such basic medical equipment as facemasks, gloves, and gowns.
And Chinese censorship was succeeded by dissemination of a false origin story (taken up in turn by Russia and Iran).
POLITICO reviewed a report by the State Department's Global Engagement Center that concludes three governments—those of Russia, China, and Iran—are pushing complementary lines of disinformation:
- COVID-19 is an American bioweapon.
- The US is making political capital from the pandemic.
- The virus did not originate in China.
- US Army troops spread the virus.
- US sanctions are killing Iranians during the pandemic.
- China responded to the crisis effectively and responsibly, but the US response was marked by negligence.
- Russia, Iran, and China are handling the pandemic well.
- The US economy cannot withstand the toll COVID-19 is exacting.
The false stories are being distributed by a mix of official, semi-official, and cooperating outlets. Some of the official outlets aren't shy about disseminating surprisingly tabloidesque stories: the Russian military paper Zvezda, for example, in March began retailing the story that the novel strain of coronavirus was developed by the Bill and Melinda Gates Foundation, an unspecified secret laboratory, and a cabal of pharmaceutical companies. Their goal was evidently profit. (This particular accusation, facially preposterous, was nonetheless picked up by "unknown activists," the Washington Post reported, who then distributed it through 4chan.)
The New York Times claimed that Russia has been running a long campaign aimed at undermining the authority of US scientific consensus on a range of topics, but especially on health-related and biomedical research. The “decade-long” disinformation campaign is said to have promoted quack treatments and questionable research, “undermining major institutions” and rendering outbreaks of disease more serious.
Misinformation as popular delusion.
WIRED reported that the US Department of Homeland Security warned local authorities to watch for vandalism against cellular infrastructure, and particularly against 5G towers. The crank theory that's animated vandals in northwestern Europe (and that now appears to have a beachhead in North America) comes in two varieties. The first holds that 5G electromagnetic signals actually carry the virus. The second variant, only marginally less plausible, holds that 5G electromagnetic radiation impairs the human immune system, thereby rendering populations near the towers more susceptible to infection. There's no evidence for either view.
There's also another crank view, exemplified by the film Plandemic, which the Washington Post described as retailing a complex and implausible conspiracy theory about corporate and governmental interests the filmmakers claim are behind the pandemic. It's often cited as an example of the dangerous potential of misinformation. Its recent distribution also affords an example of the difficulty of controlling such misinformation's spread.
The Cyberspace Solarium weighs in on public engagement and rumor control.
The US Cyberspace Solarium Commission this week issued its promised white paper on lessons learned about cybersecurity from the COVID-19 pandemic. For the most part those lessons reinforce the Commission’s policy recommendations, but the Commissioners also see interesting analogies between a pandemic and a major cyberattack. Both are global crises that call for a whole-of-nation response. Both call for an environment that makes it possible for solutions to emerge. And in both cases, “prevention and preestablished relationships” are better than deterrence and response.
In particular, the Commissioners think establishment of a National Cyber Director is more clearly indicated than ever. They call upon Congress to send digitization grants to state, territorial, tribal, and local governments, and to do so as part of COVID-19 relief packages. They urge planning for continuity of the economy, and they repeat their recommendation that the nation work toward building “societal resilience to disinformation.”
The Solarium Commissioners include four new recommendations:
- First, they urge Congress to pass an Internet of Things Security Law.
- Second, they recommend increasing support to not-for-profit organizations that assist law enforcement agencies’ efforts to combat cybercrime and support victims.
- Third, they advocate establishing a Social Media Data and Threat Analysis Center.
- And, finally, they urge increasing nongovernmental capacity to identify and counter foreign disinformation and influence campaigns.
Two risks seem to have dominated the Solarium's conclusions: the threat to infrastructure (especially industrial control systems) and the risks of disinformation. Addressing the second will be the more challenging of the two.
The difficulty of content moderation.
For all their efforts at deplatforming conspiracy theorists, offering rumor control, and stopping the distribution of disinformation, social media have had little success at any of these. The ability of social media accounts to monetize their content by maximizing clicks, views, and other engagement has outrun the ability of the platforms to moderate content and exclude fringe theories from their services. MIT Technology Review sees conspiracy theory as being especially deeply rooted in "YouTube culture."
Companies like Facebook have had more success identifying and blocking "coordinated inauthenticity" used to push and amplify disinformation than they have had in directly correcting erroneous information posted by users. Twitter has moved toward a kind of marketplace-of-ideas approach to content moderation. Facebook announced a different approach, a kind of misinformational contact-tracing. It will be coupled with a kind of online rumor control Facebook is calling "Get the Facts," and with the introduction of some accurate information about the virus into the news feeds of users who've interacted with dubious content.
And the Executive Order on Preventing Online Censorship has arrived.
As expected, late last Thursday US President Trump signed an Executive Order on Preventing Online Censorship intended to address ways in which social media are applying "selective censorship that is harming our national discourse." It addresses Section 230 of the Communications Decency Act, which affords civil liability protection to online service providers that act as "neutral platforms" as opposed to "editors." The Secretary of Commerce will lead a "petition for rulemaking" to clarify Section 230. Federal agencies will evaluate spending on platforms that engage in "viewpoint discrimination," and the Federal Trade Commission will investigate unfair trade practices related to content moderation.
Among the points that stand out in the order are its observation that the protections in Section 230 were designed, narrowly, to provide certain protections for minors. It also emphasizes the Act’s provision that restrictions on content be made in “good faith.” And it asks the Federal Trade Commission to take a close look at social media companies’ outsourcing of content moderation to third parties that themselves arguably engage in viewpoint discrimination.
The Order has been widely viewed as a response to the President's recent experiences with Twitter. At issue is the difficult question of what counts as a neutral supplier of a service and what counts as being a publisher with responsibility for content. Should Facebook, Instagram, and Twitter be treated like sellers of newsprint or like newspapers, like telephone companies or like television stations?