At a glance.
- Disinformation for hire.
- Attention to a different set of facts and the shaping of climates of opinion.
- How effective, really, is Chinese government disinformation?
Influence mercenaries, spin doctors, and front groups.
The New York Times has a long piece on a growing, albeit shadowy, industry that traffics in influence on behalf of clients, often political candidates and causes, frequently by placing disinformation in sources that consumers would assume to be independent. The campaign the Times story leads with was organized by a public relations firm, Fazze, which approached French and German social media influencers with offers of payment to disseminate falsehoods about the Pfizer COVID-19 vaccine. Fazze soon thereafter scrubbed its social media accounts, and the London address it gave was revealed to be fictitious.
As the Times points out, such "[p]rivate firms, straddling traditional marketing and the shadow world of geopolitical influence operations, are selling services once conducted principally by intelligence agencies." Or, one might add, also by lobbying and political consulting firms. Their services are valuable in part because they're deniable.
Disinformation-for-hire has deep roots in propaganda. Cold War covert operations on both sides of the Iron Curtain used front organizations to push narratives and influence politics. Some of these were "proprietaries," others were directly run by intelligence services, and still others were operated by contractors. (We note in passing that there's nothing in the Times coverage to suggest that Fazze was not itself a front organization: its talking points tended to run to the advantage of the Pfizer vaccine's Russian rival, Sputnik V.) The practice also has roots in political campaigning, with the direct-mail organizations that emerged in the 1970s and 1980s providing a pre-Internet foreshadowing of some of today's operations. And as deceptive marketing, such campaigns surely owe something to the tradition of "public service" advertising run by corporations and others seeking to burnish their image, or to create an astroturf simulacrum of grassroots opinion. Some of the companies themselves would deny any impropriety, arguing that they're simply hired to present a point of view, although it's difficult to defend the use of inauthentic personae on those grounds.
Social media's role in disinformation.
The Times argues in its piece on disinformation mercenaries that social media are particularly adaptable to exploitation in this way. Whether that's a natural feature of social media or simply a sign that they're too new for people to consume with appropriate skepticism isn't clear. Most of us are able to toss out alarmist junk mail (recognized as such and therefore unread), and many of us hang up easily on the robocall voice asking us to contribute to help law enforcement officers, veterans, and others, but fewer seem ready to ignore influencers twerking their way through policy talking points. In any case, social media have yet to find their McLuhan, still less their Aristotle.
A piece in WIRED suggests that in some respects the role social media have played in fostering, for example, hesitancy about COVID-19 vaccines has been exaggerated. Anti-vaccine sentiment has, as the article points out, been a "top-down" phenomenon driven by elites. Nor is it founded exclusively or even largely on factual disinformation, but rather on a selection of different facts.
A study of vaccination rates did seem to find a correlation between relying on Facebook as one's principal source of news and declining to be vaccinated, but found no comparably strong correlation among those who got their news from generally anti-Administration sources like Fox. It seems, rather, that mistrust of institutions (like traditional news media, or Government agencies) tends to be associated with looking for news in non-traditional places, and Facebook seems to have an immediacy that may lend it authenticity and credibility in the eyes of its users. It's as if an old alternative local newspaper, the kind that told neighborhood stoners what kind of shoes the narcs were wearing nowadays, had somehow managed to scale to a platform reaching hundreds of millions.
Is Chinese disinformation actually working?
A piece in Just Security argues that Beijing's success at influence operations is much exaggerated, and that it would be a mistake to overreact to them. "First," the essay argues, "there is little evidence that shows China’s digitally mediated efforts at influencing public opinion overseas is actually working. Research warning against the scale of China’s external propaganda often acknowledges that its efficacy is mixed, if not counterproductive." In part this is an attribution problem: which campaigns are actually being directed by, say, China's Ministry of State Security (MSS)?
"Second," the piece continues, "while concerns over China’s growing influence are valid and worthy of interrogation, inflating such threats can potentially lead to disproportionate measures that unduly target content, individuals, and entities of Chinese origin." Thus inflating the threat can lead to unreasonable and unwarranted retaliatory measures.
"Third, policies that resort to blanket bans based on political stance or country of origin are only strengthening China and its allies’ cyber sovereignty agenda, which has been used to solidify digital borders and justify surveillance and suppression practices." China and other authoritarian states want sovereign Internets and control over information, and it would be unwise to abet them in this. A related fourth point is that "democracies should focus on putting their own house in order and pay as much attention to homegrown disinformation campaigns and issues pertaining to domestic information environments as to foreign influence operations." (And, per the third point, they should presumably do so without censorship or heavy-handed content moderation.)
And the final point is worth quoting in full:
"Last but not least, discourses or agendas that seek to securitize disinformation (i.e., frame it as a national security threat) have led to censorship-enabling policies and legislation across the globe. Ruling parties in authoritarian and democratizing countries have been capitalizing on Western media’s justified concern over “fake news,” framing it as a threat to national security that in turn justifies extreme measures including passing laws and regulations that would criminalize the creation and dissemination of “fake news” or “rumors.” The securitization of misinformation and disinformation has led to increasingly illiberal policies, such as urging the private sector to police speech or even adopting China’s model of “significant monitoring and speech control.” If we have learned anything from studying authoritarian actors in-depth, it is that their model of regulating information is the antithesis of good governance."
It's worth noting that China, whose disinformation tends to have a positive goal in that it seeks to persuade its audience of a relatively small set of specific propositions, has set itself a much tougher influence task than has, say, Russia, whose aims are typically negative: Russian operators don't care what you believe as long as believing it harms the adversary (usually your government and the civil society it serves). Information entropy is their friend.