At a glance.
- Burisma redivivus, and the problems of content moderation.
- Domestic influence operations in West African elections.
- Signalling or staging in Nagorno-Karabakh?
- Coordinated authenticity take-downs (this time with a domestic focus).
- An unambiguous accusation that Huawei is working for China's intelligence services.
A "smoking gun," but whose gun?
Reports by the New York Post that alleged “smoking gun” emails involving US-Ukrainian relations have been found on a computer belonging to Hunter Biden, son of former US vice president and current Democratic presidential candidate Joseph Biden, raise questions of influence operations (potentially foreign, and demonstrably commercial).
At issue is the long-running and much-investigated nature of the relationship between Biden-fils and various foreign business interests, notably Ukrainian energy firm Burisma, and whether such relationships amounted to influence peddling, or at least the invidious appearance of influence peddling. The elder Biden has denied detailed knowledge of his son’s business relationships, and the younger Biden has periodically regretted any appearance of impropriety.
The provenance of the emails the Post reported is disputed, coming as they did from a laptop of uncertain origin, but with some appearance of connection to the younger Biden. The Johns Hopkins University’s Thomas Rid points out the ways in which the emails could amount to a disinformation operation, and that cannot be ruled out. The story’s details have been difficult, so far, to corroborate, and some of the emails give the appearance of having been either reconstructed or fabricated.
But the treatment of the Post’s reporting has also raised questions about content moderation. Ars Technica has a summary of the issues the case raises for social media content moderation.
Twitter and Facebook were quick to inhibit sharing of the Post’s coverage, and that’s aroused more questions about the ways in which they attempt to control alleged disinformation or misinformation. Twitter simply blocked it, and blocked some accounts that had shared the story. Twitter CEO Jack Dorsey tweeted some regrets about his company’s handling of the material: “Our communication around our actions on the @nypost article was not great. And blocking URL sharing via tweet or DM with zero context as to why we’re blocking: unacceptable.” Thus, what we have, Mr. Dorsey says, is a failure to communicate, or specifically a failure to communicate context.
Facebook didn’t block sharing or discussion of the content. Instead, it deprecated sharing, which is to say that it reduced the likelihood that the platform’s algorithm would amplify the story.
In any case the two platforms seem to have enmeshed themselves in a lose-lose approach to the story, with Republicans incensed by what they characterize as censorship, and Democrats upset by what they see as an instance of the Streisand Effect where an attempt to downplay information has the unwelcome and paradoxical effect of drawing attention to it.
Domestic influence operations in West Africa reported.
Bloomberg reports that African governments are actively using social media to spread what it characterizes as “disinformation” during the run-up to this year’s elections in order to “dominate the narrative around campaigns.” In these cases—Bloomberg cites Guinea and Ghana—the influence operations are domestic, not foreign. Government-aligned operators are said to have been particularly active on Facebook.
Deniable battlespace preparation, or intervention signalling?
Bellingcat, while noting that evidence for the Russian mercenary company Wagner Group's involvement in the conflict over Nagorno-Karabakh has been lacking, says chatter in Reverse Side of the Medal (RSOTM) suggests that some battlespace preparation (in the form of trolling) may be in progress. So far the more credible reports of foreign fighters (in Bellingcat's estimation) have placed them on the side of Azerbaijan, but RSOTM posts seem to be signalling the arrival of Russian mercenaries in the Armenian interest.
Coordinated inauthenticity, with domestic audiences.
Last Thursday both Facebook and Twitter disclosed the discovery and suspension of politically-motivated or state-connected networks of inauthentic accounts.
Facebook’s takedowns involved coordinated inauthenticity that sought to engage mostly domestic audiences. One was a US-based network of “thinly veiled personas” associated with the Rally Forge marketing firm, which appears to have worked on behalf of Turning Point USA and another conservative political organization that favored the re-election of President Trump. The network’s audience was primarily a US domestic one, with secondary audiences in Botswana and Kenya. Those secondary audiences were sent content that, oddly, favored big-game hunting.
Facebook also dismantled a network in Myanmar that consisted of seventeen Pages, fifty Facebook accounts and six Instagram accounts. Their line was critical of the National League for Democracy and political leader Aung San Suu Kyi; there was also some anti-Rohingya content. The network was linked to members of Myanmar’s military.
The social network also removed 589 Facebook accounts, 7,906 Pages, and 447 accounts on Instagram based in Azerbaijan. These were engaged in praise of President Ilham Aliev and the New Azerbaijani Party, criticism of the opposition (with accusations of treason), and denials that human rights were being abused in Azerbaijan. They also included patriotic content about the ongoing fighting with Armenia over Nagorno-Karabakh.
Finally, in Nigeria, seventy-nine Facebook accounts, forty-seven Pages, ninety-three Groups and forty-eight Instagram accounts were suppressed. The networks supported Ibrahim Zakzaky and Nigeria’s Islamic Movement; they were critical of the government.
Twitter’s cancellations showed little overlap with Facebook’s most recent round, although some of them did coincide with Facebook’s September enforcement round. Twitter cancelled inauthentic Iranian accounts that aimed principally at deepening US social fissures during the election season. The company also removed more than 500 Cuban accounts. It also cancelled Saudi accounts that operated principally against regional rival Qatar. The most interesting takedowns were of a network of accounts associated with the Royal Thai Army that “amplified” pro-government and anti-opposition content. Stanford’s Internet Observatory called the Army’s operation “low-impact” and “cheerleading without fans.” The Bangkok Post reports that the Royal Thai Army has denied any involvement in disinformation.
Facebook, in its account of the coordinated inauthenticity most recently dealt with, summarized some new features of the networks against which it took action:
“Two of the networks we’re sharing today engaged primarily in commenting on content — relying on real people, not automation — to create the perception of wide-spread support of their narratives by leaving comments on posts by media entities and public figures. Other campaigns — like the ones from Russia (that we removed in late September) — focused on tricking unwitting freelance journalists into writing on behalf of these operations.
“Deceptive campaigns like these raise a particularly complex challenge by blurring the line between healthy public debate and manipulation. Our teams will continue to find, remove and expose these coordinated manipulation campaigns, but we know these threats extend beyond our platform and no single organization can tackle them alone. That’s why it’s critical that we, as a society, have a broader discussion about what is acceptable political advocacy and take steps to deter people from crossing the line.”
The House of Commons defence committee accuses Huawei of working for the Chinese government.
The BBC reports that a British Parliamentary committee last Thursday released a report that concluded there was “clear evidence of collusion” between Huawei and the Chinese Communist Party. While tut-tutting a bit to inoculate itself against charges of “ill-informed, anti-China hysteria,” the House of Commons defence committee supported its conclusions by noting the subsidies the company has received from the Chinese government: some $75 billion over the last three years. That subsidy enabled Huawei, the report said, to lowball its competition and secure substantial market share by selling its equipment at a "ridiculously low price point.”
The report also cites research that alleges that the Shenzhen hardware giant has "engaged in a variety of intelligence, security, and intellectual property activities." In sum, the Parliamentary study concludes, "It is clear that Huawei is strongly linked to the Chinese state and the Chinese Communist Party, despite its statements to the contrary. This is evidenced by its ownership model and the subsidies it has received."
The report is expected to have the effect of advancing the replacement of Huawei equipment in the UK’s telecommunications infrastructure. For its part, Huawei expressed its confidence that “people will see through these accusations of collusion and remember instead what Huawei has delivered for Britain over the past 20 years.”
Fortune sees the report as harsher than any official statement so far offered by other critics of Huawei, including the US and Australian governments. It represents a direct, official accusation that Huawei is actively working for the Chinese government. Previous warnings have concentrated on the company’s susceptibility to Beijing’s influence; this report goes beyond that.