At a glance.
- The Cyberspace Solarium report addresses disinformation.
- China criminalizes online negativity.
- Coronavirus disinformation.
- Georgia moves closer to NATO.
- Advances in trolling.
- Huawei's view of backdoors.
- Wikipedia and crowd-sourced fact-checking.
The Cyberspace Solarium on disinformation.
The US Cyberspace Solarium Commission reported this week, and one of its recommendations addresses disinformation. Key Recommendation 3.5 says "The U.S. government should promote digital literacy, civics education, and public awareness to build societal resilience to foreign malign cyber-enabled information operations." This is accompanied by Enabling Recommendation 3.5.1: "Reform Online Political Advertising to Defend against Foreign Influence in Elections." The report contains a brief but clearly stated account of Russian trolling. That its recommendations count on public awareness and a culture of critical thinking to repel malign information operations amounts to a tacit acknowledgement that the problem may not be amenable to an easy technical solution.
It's a crime to be too negative online.
In China, anyway, as of March 1st. The Eurasia Review reports that Beijing is enforcing broadly worded but stringent censorship over what must, may, and may not be said online. Prohibited are “sensationalizing headlines,” “excessive celebrity gossip,” and “sexual innuendo,” but also forbidden are reports on “the country’s religious policies” and the promotion of “cults and superstitions.” These rules are, according to reports, having an immediate effect on discussions of Christian persecution (reports of church closings, arrests of priests, destruction of crosses), but they can also be presumed to apply to the roughly similar situations of other ideologically disfavored groups, including Muslim Uyghurs, Tibetans, and Hong Kong dissidents. The goal is said to be the promotion of China’s “unity and stability,” including a requirement that online content portray the thoughts of President Xi Jinping in an “accurate and vivid way.” The Chinese Communist Party unsurprisingly likes the new rules, saying it’s time to “eradicate the weeds from cyberspace.”
Coronavirus disinformation (and counter-disinformation).
Some of the misinformation circulating about the coronavirus, alas, is state-driven disinformation. And this, we hasten to add, isn’t among the oddball conspiracy theories, like the ones that say the coronavirus was produced in some top-secret government lab, or that it’s the work of space aliens, and so forth.
The US State Department warned late last week, according to the Washington Post, that the familiar apparatus of Russian trolling has been at work pushing coronavirus scare stories. The goal of the information operation is, as usual, disruption and chaos, confusion to the enemy, that enemy being, unfortunately, Mr. and Ms. United States, and civil societies in other countries that aren’t reliably aligned with Russian interests.
Lea Gabrielle, coordinator of the State Department’s Global Engagement Center, an organization charged with counteracting disinformation, told Congress last week that threat actors tied to Russia were working through what she called “state proxy websites” as well as official state-owned media and inauthentic online accounts, forming a coordinated effort to “take advantage of a health crisis, where people are terrified worldwide, to try to advance their priorities.” Moscow’s general objective, she said, is “to weaken its adversaries by manipulating the information environment in nefarious ways, by polarizing political conversations, and attempting to destroy the public’s faith in good governance, independent media, and democratic principles.”
That goal should be familiar from earlier discussions of election influence operations. But the problem, of course, is fighting pernicious rumors, whether malicious or simply ill-informed, at scale and in a timely fashion.
In the UK, Whitehall has established a team of experts from across the government to work with social media firms to counter coronavirus misinformation, the Daily Mail reports. And the European Union is reviving the self-reporting system it established with US Big Tech in the hope of finding some way of muting disinformation on the disease. The Wall Street Journal says the pandemic has prompted what some are calling an "infodemic," and the European Union wants help containing the cognitive as well as the physical contagion.
Georgia gets closer to NATO.
Georgia announced at the end of last week that it had joined the Atlantic Alliance's Malware Information Sharing Platform (MISP). While not a NATO member, Georgia has found common cause with NATO in its confrontation of Russian cyber threats. Civil.ge, reporting the news, describes the MISP as "a threat-sharing defense initiative functioning under the aegis of NATO and co-financed by the European Union." Georgia has received considerable diplomatic support from Western nations over the Russian cyberattack it sustained last October. The US, the UK, the Czech Republic, Denmark, Estonia, Lithuania, the Netherlands, Norway, Poland, and Sweden have all condemned the Russian operation.
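In practice, participating in a MISP community is largely a matter of automated sharing over the platform's REST API. As a rough illustration only (the instance URL, API key, and tag below are placeholders, not details of the Georgian or NATO deployment), here is how a member organization might prepare a search for recently updated threat events:

```python
# Hedged sketch: building (not sending) an authenticated request to MISP's
# /events/restSearch endpoint, which member organizations use to pull
# shared indicators. URL and API key are hypothetical placeholders.
import json
from urllib.request import Request

MISP_URL = "https://misp.example.org"  # hypothetical MISP instance
API_KEY = "YOUR_API_KEY"               # per-user MISP auth key

def build_event_search(tag: str, days: int = 7) -> Request:
    """Prepare a POST querying events with a given tag, updated recently."""
    body = json.dumps({
        "returnFormat": "json",
        "tags": [tag],
        "timestamp": f"{days}d",  # relative filter: last N days
    }).encode("utf-8")
    return Request(
        f"{MISP_URL}/events/restSearch",
        data=body,
        headers={
            "Authorization": API_KEY,
            "Accept": "application/json",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

A real deployment would more likely use the official PyMISP client, but the sketch shows the shape of the exchange: authenticated, structured queries rather than ad hoc email sharing.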
Advances in trolling.
Super Tuesday may have gone off without much incident, but a recently released study by New York University’s Brennan Center for Justice argues that the US ought not relax its guard. The researchers concluded that disinformation operations directed against the 2020 election began last year, and that the operators behind the Internet Research Agency (IRA) troll farm have returned, using many of the same accounts. The study finds that the trolls have gotten better at impersonating candidates and parties, and are prepared to go beyond the simple amplification tactics seen so far. It will be interesting to see how successful exposure and blocking of such accounts prove to be. Facebook, for one, seems to be devoting considerable attention to identifying and stopping coordinated inauthentic behavior, and how Menlo Park and others fare against the current versions of the Saint Petersburg troll farms will be worth watching.
Huawei explains backdoors...
Huawei continues its charm offensive with a too-earnest-to-be-slick video in its Twitter feed that offers a sparkling little cadenza on what counts as a backdoor. Some backdoors, it says, are good, like those used for lawful interception of traffic, and there’s no real cause to be concerned about these, because they’re used only by duly constituted authority for narrowly defined purposes. That of course is a conceptual backdoor big enough to drive a busload of Shenzhen operators through, so few commentators seem to have been reassured. Does Huawei have a point about backdoors? Well, sure, but as so often happens the trees in this particular forest have stories that the forest itself knows not.
Crowdsourced fact-checking in Wikipedia.
It would be too much to say that Wikipedia has solved the problem of handling disinformation at scale, but the online encyclopedia's crowd-sourced approach has earned it a reputation for fact-checking somewhat better than that of traditional print encyclopedias. While Wikipedia is certainly imperfect (but what isn't?), Fast Company has an account of its operation that's worth considering by any organization concerned with countering misinformation.
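Part of what makes Wikipedia's crowd-sourcing auditable is that every edit is public and queryable. As a hedged sketch (the article title and field selection are illustrative choices, not anything from Fast Company's account), the standard MediaWiki API can surface who changed a page, when, and with what stated rationale:

```python
# Hedged sketch: constructing a MediaWiki API URL for an article's recent
# revision history, the audit trail behind Wikipedia's crowd-sourced
# fact-checking. This only builds the URL; nothing is fetched here.
from urllib.parse import urlencode

API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def revision_history_url(title: str, limit: int = 10) -> str:
    """Return a URL listing recent edits to a page: editor, time, summary."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|comment",  # who, when, and why
        "rvlimit": limit,
        "format": "json",
    }
    return f"{API_ENDPOINT}?{urlencode(params)}"
```

Fetching such a URL returns a JSON list of edits with editor names and edit summaries, which is exactly the kind of transparent revision trail that makes crowd-sourced correction possible to review.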