At a glance.
- Influence operations as political consultation.
- Disinformation by suppressing the truth.
- COVID-19: origin stories.
- Exploiting COVID-19.
- Twitter's appeal to reason.
- Facebook appoints a content oversight board.
- Facebook takes down coordinated inauthenticity.
Influence operations as political consultation.
The US Department of Homeland Security and the FBI have warned in a memorandum, the Voice of America says, that Russian influence operators may seek to advise candidates in this November's US elections. That hasn't been observed in the US so far, but oligarchs close to Russian President Putin are believed to have provided such advice to various campaigns in Africa.
The memorandum, unclassified but initially marked "For official use only," was obtained by the AP in a Freedom of Information Act request. It outlines several classes of Russian influence tactics, some characterized as "high threats," others as "moderate threats." The high threats include doxing (essentially a replay of the forced transparency the Clinton campaign experienced in 2016), the use of "state-controlled media arms to propagate election-themed narratives to target audiences" (a tactic that's become familiar), attempts to use economic or business leverage for influence on a campaign or within an administration, and the use of inauthentic social media personae to influence American opinion (also an ongoing, familiar tactic).
The moderate threats include directly interfering with or manipulating election infrastructure (like voter registration lists or vote-tallying systems), giving money to candidates or parties (through loopholes in American election laws, if possible), or, finally, the novel threat of providing covert advice to candidates and campaigns.
Disinformation by suppressio veri.
WIRED describes how quickly and comprehensively the Chinese government moved to suppress social media posts that dealt with the initial outbreak of COVID-19 in Wuhan. The efforts at suppression go back at least as far as the first week of January. How have reporters become aware of them? By following the maxim “Cover China as if you were covering Snapchat.” The posts have a brief life, so when you see something interesting, take a screenshot before the post is quashed and the account blocked for "spreading malicious rumors." Weibo and WeChat Moments are the most commonly used platforms on which ephemeral posts appear.
Avoiding embarrassment would surely have been a principal goal of the censorship campaign, but it may also have had a more direct, practical objective: the suppression may in part have been driven by plans to stockpile necessary medical supplies. The AP and POLITICO say they have seen a US Department of Homeland Security report that states, in part, “We further assess the Chinese Government attempted to hide its actions by denying there were export restrictions and obfuscating and delaying provision of its trade data.” Before informing the World Health Organization of the epidemic's outbreak, Beijing significantly cut back exports and increased imports of such basic medical equipment as facemasks, gloves, and gowns.
Intelligence services continue to investigate the source of the outbreak. The Washington Examiner reported over the weekend that a majority of the agencies in the US Intelligence Community now believe "with high confidence" that the COVID-19 pandemic originated in the Wuhan Institute of Virology. Any release is believed to have been accidental, and the virus is not thought to have been engineered. The Examiner also quotes US Secretary of State Pompeo as saying that there's "enormous evidence" of the lab's role in the initial spread of the virus.
The World Health Organization says (according to Agence France-Presse) that the US has not so far provided it with any evidence of the laboratory's involvement, nor, Foreign Policy reports, have the other Five Eyes offered any confirmation. Canadian and British officials said the question remained open, "still too early to offer firm conclusions," as Canadian Prime Minister Trudeau put it. Australian Prime Minister Morrison, according to the Sydney Morning Herald, said his government's view was that the virus probably jumped to human populations in a Wuhan wet market; a lab accident was possible, but Australian services put its probability low, at about 5%. In any case, Prime Minister Morrison called for an independent G20 investigation into COVID-19's origins.
Dr. Anthony Fauci, Director of the US National Institute of Allergy and Infectious Diseases, told National Geographic that the virus probably emerged in the wild and made the jump to humans from there. More investigation remains to be done, as the BBC reports, noting as it does so the difficulty of working out attribution without running afoul of American and Chinese sensibilities.
The Chairman of the US Joint Chiefs of Staff, US Army General Mark Milley, offered reporters on Tuesday an assessment of where the ongoing US investigation into COVID-19's origins stood. As the Hill reported, General Milley said, "The weight of evidence — nothing’s conclusive — the weight of evidence is that it was natural and not man-made. The second issue is, was it accidentally released, did it release naturally into the environment or was it intentional? We don’t have conclusive evidence in any of that, but the weight of evidence is that it was probably not intentional.” He called upon China to cooperate with international investigators.
So the current state of the question seems to be that the virus was not artificially engineered, but rather emerged naturally, and was not intentionally released. Whether the outbreak originated in human contact with infected animals (more widely believed, as CNN reports in an account of views prevailing in the other four of the Five Eyes intelligence services) or in an accident at a Wuhan laboratory (possible, but with evidence inconclusive) remains undetermined.
US Secretary of Defense accuses Russia and China of exploiting the COVID-19 pandemic.
Secretary of Defense Mark Esper yesterday said that Russia and China were exploiting the COVID-19 pandemic to gain influence in Europe, Reuters reports. The matter came up in an interview with La Stampa during which Secretary Esper was asked about Russian and Chinese offers of aid to Italy during the emergency. Such aid, he said, represents a play to enhance their position in both Italy and Europe as a whole. In the case of China, the Secretary added that Beijing was also using the crisis as an opportunity to couple its exports more tightly to European supply chains. (Russian prospects of capturing such supply chains are not generally reckoned as favorable.)
Foreign Policy describes China's "playbook" for enhancing its stature during the pandemic. The basic goal is to show that the Chinese Communist Party's relatively authoritarian system is better at managing crisis than democratic alternatives.
Twitter's appeal to reason against misinformation.
Twitter is still trying to control the rumor that 5G causes COVID-19. One would have hoped the odd belief that cell towers somehow cause coronavirus infections would by now have passed its expiration date. Alas, no: Twitter is still grappling with the dissemination of that particular theory, which the credulous often tie to suspicion of a deeper conspiracy to cull the herd and prepare the way for some horrendous world order of social control (a fear that exists in left, right, and center forms). The Telegraph says that Twitter's most recent approach to the rumor is to prompt people who tweet it to read an official British report debunking the cell-service origin theory, an approach so direct and almost charmingly naive (and we mean “naive” in the best possible sense of the word) that one wishes them all success. Why not give the invisible hand of the marketplace of ideas a chance to work its magic? Give reason a chance?
This particular bit of misinformation is dangerous not because it affects treatment or compliance with public health advice, but because it has inspired people to vandalize cell towers. Nor has the problem been limited to vandalism: some telecom maintenance workers in the UK were attacked by locals who accused them of setting up the virus infrastructure. An ex-Googler told the Telegraph, in an earlier piece, that he sees structural problems with social media that tend to cause misinformation cascades. He's concerned mostly with YouTube, and sees the algorithmic push to "optimize watch time at all costs" as fostering the propagation of spectacularly false and spectacularly attractive content. Substitute "engagement" for "watch time" to generalize the problem.
Facebook's oversight board.
In contrast with Twitter's offer of the straight dope to those doing their own research, Facebook advanced its plans for an oversight board that would assist with content moderation. Four members of the board took to a New York Times op-ed (paywalled) to explain their charter. They wrote, in part:
"[W]e know that social media can spread speech that is hateful, harmful and deceitful. In recent years, the question of what content should stay up or come down on platforms like Facebook, and who should decide this, has become increasingly urgent.
"So in November 2018, recognizing that no company should settle these issues alone, Facebook committed to creating an independent oversight body that will review Facebook’s decisions about what content to take down or leave up. Over the past 18 months, more than 2,000 experts and other relevant parties from 88 countries have contributed feedback that has shaped the development of this oversight board, which will have 20 members (ultimately growing to 40) and is scheduled to become operational this year.
"The oversight board will focus on the most challenging content issues for Facebook, including in areas such as hate speech, harassment, and protecting people’s safety and privacy. It will make final and binding decisions on whether specific content should be allowed or removed from Facebook and Instagram (which Facebook owns)."
CNBC published a list of the board's initial twenty members, and reports Facebook's characterization of the board as representing an ideologically, religiously, and culturally diverse selection of experts. The twenty members on the list CNBC provided are:
- Afia Asantewaa Asare-Kyei, human rights advocate at the Open Society Initiative for West Africa
- Evelyn Aswad, University of Oklahoma College of Law professor who formerly served as a senior U.S. State Department lawyer
- Endy Bayuni, journalist who twice served as the editor-in-chief of the Jakarta Post
- Catalina Botero-Marino, Facebook Oversight Board co-chair, dean of the Universidad de los Andes Faculty of Law
- Katherine Chen, communications scholar at the National Chengchi University and former national communications regulator in Taiwan
- Nighat Dad, digital rights advocate who received the Human Rights Tulip Award
- Jamal Greene, Facebook Oversight Board co-chair, Columbia Law professor
- Pamela Karlan, Stanford Law professor and United States Supreme Court advocate
- Tawakkol Karman, Nobel Peace Prize laureate named as one of “History’s Most Rebellious Women” by Time
- Maina Kiai, director of Human Rights Watch’s Global Alliances and Partnerships program
- Sudhir Krishnaswamy, vice chancellor of the National Law School of India University
- Ronaldo Lemos, technology, intellectual property and media lawyer who teaches law at Universidade do Estado do Rio de Janeiro
- Michael McConnell, Facebook Oversight Board co-chair, Stanford Law professor who previously served as a federal circuit judge
- Julie Owono, digital rights and anti-censorship advocate who leads Internet Sans Frontieres
- Emi Palmor, former director general of the Israeli Ministry of Justice
- Alan Rusbridger, former editor-in-chief of The Guardian
- Andras Sajo, former judge and vice president of the European Court of Human Rights
- John Samples, helps lead a libertarian think tank and writes extensively on social media and speech regulation
- Nicolas Suzor, Queensland University of Technology Law School professor
- Helle Thorning-Schmidt, Facebook Oversight Board co-chair, former Prime Minister of Denmark
One of Facebook's goals in establishing the board is to clear itself of accusations of bias in content moderation. It seems safe to assume that this will be difficult to achieve, as a critical op-ed in the Telegraph already indicates. The board itself agrees. As they write at the conclusion of their New York Times op-ed:
"We also know that we will not be able to please everyone. Some of our decisions may prove controversial and all will spur further debate.
"But we speak for all the members of the oversight board when we say that we are committed to demonstrating the value of an independent, principled and transparent oversight process and to serving the online community."
Facebook takes down coordinated inauthenticity.
It is interesting, then, to see how relatively uncontroversial Facebook's culling of coordinated inauthenticity has been. Facebook has removed "hundreds" of disinformation accounts. According to Menlo Park's April report on coordinated inauthenticity, many of the accounts were state-sponsored; others were associated with political groups. Georgia leads, with almost a thousand suspect Pages, accounts, and Groups taken down, for the most part associated with domestic political groups. Russia and Iran showed high levels of state-directed activity aimed at foreign targets. Takedowns in the US removed inauthentic accounts associated with the QAnon conspiracy network. Accounts taken down in Mauritania and Myanmar focused on domestic audiences, and the Myanmar operations were associated with that country's police. The breakdown:
- Russia (46 Pages, 91 Facebook accounts, 2 Groups, and 1 Instagram account, with complicated associations and goals)
- Iran (118 Pages, 389 Facebook accounts, 27 Groups, and 6 Instagram accounts, with an international focus and a connection to the Islamic Republic of Iran Broadcasting Corporation)
- Mauritania (11 Pages, 75 Facebook accounts, and 90 Instagram accounts, with a focus on domestic audiences)
- Myanmar (3 Pages, 18 Facebook accounts, and 1 Group, also with a domestic focus and aligned with the Myanmar police)
- Georgia (511 Pages, 101 Facebook accounts, and 122 Groups, and 56 Instagram accounts linked to the media firm Espersona; and 23 Facebook accounts, 80 Pages, 41 Groups, and 9 Instagram accounts connected with the United National Movement, a political party)
- The US (5 Pages, 20 Facebook accounts, and 6 Groups associated with the QAnon conspiracy theory network, and 19 Pages, 15 Facebook accounts, and 1 Group linked to anti-immigration groups VDARE and the Unz Review)
The QAnon types Facebook gave the boot for misrepresenting themselves have already migrated to other, less exclusive platforms, according to Business Insider.