At a glance.
- Lessons on influence operations from a hybrid war.
- Disinformation themes in Russia's hybrid war.
- A look at Russia's NewsFront and related media operations.
- Not all coordinated inauthenticity is Russian.
- US Department of Homeland Security shuts down its Disinformation Governance Board.
- A taxonomy of falsehood.
Lessons on influence operations from a hybrid war.
The Atlantic Council has published a set of twenty-three lessons to be learned from half a year of Russia's war against Ukraine. Two of them have particular relevance to influence operations:
- "Lesson for wartime strategic communications: Influence operations are a day-in, day-out job." (Offered by Jennifer Counter, nonresident senior fellow at the Scowcroft Center’s Forward Defense practice.) Russia has not succeeded in influence operations, but Ukraine has. "The beauty of what the Ukrainians have accomplished is that a vast network of people who follow the government’s messaging lead and further spread the campaign in ways that their individual networks can understand—thus building new advocates and reinforcing Ukraine’s base of support." The Russian influence campaign has relied upon large-scale disinformation, and it hasn't worked. The Ukrainian approach offers a clear contrast. "In large part, the Ukrainian government uses firsthand accounts and video clips as evidence, which further reinforces its message; and crucially, it has not resorted to large-scale mis- and disinformation as Russia has. Overall, the cohesion and duration of the Ukrainians’ campaign can, and should, be used as a template for what the United States and its allies can accomplish with an influence strategy, communications discipline, and a willingness to grind day-in, day-out to meet the end goal."
- "Lesson for US homeland security: Ignoring the home front is a serious mistake." (From Thomas S. Warrick, nonresident senior fellow at the Scowcroft Center for Strategy and Security’s Forward Defense practice.) The inherently deniable and ambiguous character of cyber conflict tends to spread its effects beyond the immediate theater of operations. The US got off to a good start, but emphasis may have faded in recent months. "After an initial burst of activity culminating in late April and early May, efforts by the US Department of Homeland Security (DHS) to counter Russia’s hybrid war in the United States appear to have faded—even amid a Russian “avalanche of disinformation,” as the Atlantic Council’s Digital Forensic Research Lab has documented. The last update to the Cybersecurity and Infrastructure Security Agency’s “Shields Up” webpage was dated May 11, and the most recent entry in CISA’s “Russia Cyber Threat Overview” was dated April 20. The last Russia-specific public alert, “Russian State-Sponsored and Criminal Cyber Threats to Critical Infrastructure,” was revised May 9. While DHS and the FBI are in frequent communications with agencies, companies, and individuals targeted by Russian cyberattacks, the public is often unaware of this quiet but vital activity. So more needs to be done by DHS and others to get the American people to understand and better resist the Russian hybrid-warfare campaigns that promote divisive propaganda and social-media manipulation. Russia’s hybrid-warfare strategy, which uses disinformation even more than cyberattacks, seems designed to wear down Western democracies’ opposition to Russia’s aggression. Senior DHS and administration officials should speak out more publicly on what Americans can do to counter Russian disinformation, cyber threats, and other Russian hybrid-warfare targeting of the civilian population. The home front—specifically, unity in the United States and NATO in opposing Russian aggression against Ukraine—is a vital source of national power. Ignoring it, or treating Ukraine as almost entirely a military and diplomatic crisis, could be a perilous mistake."
Disinformation themes in Russia's hybrid war.
The US State Department yesterday marked Ukraine's Independence Day with an overview of the recurrent themes in Russian disinformation directed against Ukraine. "President Putin regularly denies that Ukraine is a true nation," State's summary begins. "In 2008, he told President Bush “Ukraine is not a country,” and in July 2021, he publicly identified Russians and Ukrainians as “one people” and declared “the true sovereignty of Ukraine is possible only in partnership with Russia.” Putin has repeatedly attempted to depict Ukraine as “entirely created by Russia” on Russia’s “historical lands,” arguing that Ukraine is an “inalienable part of…[Russia’s] own history, culture and spiritual space.” In a February 2022 missive, he blamed the first Soviet leader, Vladimir Lenin, for “creating Lenin’s Ukraine” and called it “worse than a mistake,” days later launching a full-scale war in an attempt to redraw the map according to his own distorted version of history. Over the past six months, Russia’s disinformation and propaganda machine has used Putin’s false claims as a blueprint for campaigns aiming to deny Ukraine its right to independence and even existence."
The specific themes include:
- The use of maps that depict an imagined imperial past as defining Ukraine's legitimate boundaries, with the country shrunken to a small area around Kyiv and the rest of its territory divided among Russia and Ukraine's other neighbors.
- An attempt to show deep continuity between Ukrainian and Russian history, with a view to the erasure of any independent Ukrainian culture.
- "Dehumanizing rhetoric plus denials and threats," with commentators on Russia television providing some of the starkest examples. In this case the rhetoric has been backed with filtration camps and forced deportations.
- Staged plebiscites designed to provide a simulacrum of popular support for Russian annexation of occupied provinces. Such referenda have been slow to get under way: "U.S. intelligence indicates that Russian officials are concerned about low voter turnout in these sham referenda and know their efforts to legitimize the illegal land grab will lack legitimacy and will not reflect the will of the people."
- Claims that Ukrainians are really, and deeply, convinced of their unity with Russia, that they believe themselves to be one people with Russians. (State doesn't mention this instance, but a recent talk show on Rossiya 1 provided a particularly implausible example of such insistence.)
A look at Russia's NewsFront and related media operations.
The Stanford Internet Observatory has published A Front for Influence: An Analysis of a Pro-Kremlin Network Promoting Narratives on COVID-19 and Ukraine, by researcher Christopher Giles, which describes the background to Twitter's takedown of several long-running disinformation operations centered on the Russian media outlets NewsFront, Kherson Live, and Ukraine Today. The first of these is well-known and widely recognized as a Russian government operation. The other two, less prolific, have had more success at flying under the radar. "The influence tactics used in this network are not novel," the study reports. "The network included accounts created in bulk, which then coordinated and amplified specific narratives or were used to promote propaganda news sites." Among the recurring themes are NATO responsibility for Russia's war against Ukraine and American responsibility for the COVID pandemic.
Not all coordinated inauthenticity is Russian.
Stanford's Internet Observatory this morning blogged about its investigation of the takedown, by Twitter and Facebook's parent Meta, of two coordinated networks of inauthentic accounts. "In July and August 2022," the Internet Observatory wrote, "Twitter and Meta removed two overlapping sets of accounts for violating their platforms’ terms of service. Twitter said the accounts fell foul of its policies on 'platform manipulation and spam,' while Meta said the assets on its platforms engaged in 'coordinated inauthentic behavior.' After taking down the assets, both platforms provided portions of the activity to Graphika and the Stanford Internet Observatory for further analysis."
The investigation found "an interconnected web of accounts on Twitter, Facebook, Instagram, and five other social media platforms that used deceptive tactics to promote pro-Western narratives in the Middle East and Central Asia." These efforts amounted to disparate campaigns conducted over approximately five years. "These campaigns consistently advanced narratives promoting the interests of the United States and its allies while opposing countries including Russia, China, and Iran. The accounts heavily criticized Russia in particular for the deaths of innocent civilians and other atrocities its soldiers committed in pursuit of the Kremlin’s 'imperial ambitions' following its invasion of Ukraine in February this year. A portion of the activity also promoted anti-extremism messaging."
The study draws two lessons in particular. First, the range of tactics available to, or at least used by, coordinated campaigns using inauthentic personae is limited; the tricks have been seen before. "The assets identified by Twitter and Meta created fake personas with GAN-generated faces, posed as independent media outlets, leveraged memes and short-form videos, attempted to start hashtag campaigns, and launched online petitions: all tactics observed in past operations by other actors." And, second, these campaigns seem not to have had much reach. "The vast majority of posts and tweets we reviewed received no more than a handful of likes or retweets, and only 19% of the covert assets we identified had more than 1,000 followers."
The largely successful Ukrainian operations described above provide an instructive contrast with this covert, and far less effective, pro-Western network.
The Washington Post offers two observations on the research. First, the research itself came from a Western country and a Western university; earlier descriptions of Western influence operations have tended to come from adversaries, notably Chinese or Russian sources. Second, the researchers attribute the network to no specific government.
US Department of Homeland Security shuts down its Disinformation Governance Board.
US Secretary of Homeland Security Alejandro Mayorkas yesterday announced that his Department was canceling plans to establish a Disinformation Governance Board. “In accordance with the HSAC’s [Homeland Security Advisory Council's] prior recommendation, Secretary of Homeland Security Alejandro N. Mayorkas has terminated the Disinformation Governance Board and rescinded its charter effective today, August 24, 2022,” the Department's announcement said. "With the HSAC recommendations as a guide, the Department will continue to address threat streams that undermine the security of our country consistent with the law, while upholding the privacy, civil rights, and civil liberties of the American people and promoting transparency in our work.” The Disinformation Governance Board had drawn criticism as a step toward the erosion of freedom of speech and thought, criticism the Department was at pains to dispute, but which nonetheless induced a pause in the Board's formation and a request for advice.
His decision followed a recommendation issued earlier in the day by the Homeland Security Advisory Council's Disinformation Best Practices and Safeguards Subcommittee. "We previously recommended to the full Council—and the Council has accepted our recommendation—that there is no need for a separate Disinformation Governance Board." That said, the Subcommittee sees a need for work to counter disinformation. "It is our assessment that the underlying work of Department components on this issue is critical. The Department must be able to address the disinformation threat streams that can undermine the security of our homeland."
DHS should be able to do this work without controversy, the Subcommittee thinks. "While the components of the Department of Homeland Security have some of the most compelling reasons to address disinformation that could undermine their missions, other agencies of government have for decades taken similar steps without controversy." One model is the way the National Oceanic and Atmospheric Administration (NOAA) disseminates accurate information about weather, climate, and related matters. Should misinformation about these areas circulate in ways that affect the operations of organizations that depend upon accurate information, NOAA takes care to correct it. The National Highway Traffic Safety Administration (NHTSA) performs an analogous role with respect to motor vehicle safety. And the Subcommittee also commends the Office of the US Surgeon General for its work on accurate health information. (That office has run into episodes of controversy over the last few decades, most recently during the COVID pandemic, but the Subcommittee no doubt has its longer-running successes in mind, like the campaign to disseminate accurate information about the dangers of smoking.)
The Department of Homeland Security, the Subcommittee said, "cannot render effective service to the American people without being able to speak authoritatively and accurately to the public. Critically, this work can and must be undertaken consistent with the law and best practices. To address its Congressionally mandated missions, the Department needs the ability to identify, analyze, and, where necessary, address certain incorrect information, especially but not limited to information that tends to undermine public safety and malicious efforts by foreign governments and foreign actors to manipulate the American public." But this does not constitute authority to take the sort of general action some had seen as the mission of the Disinformation Governance Board. The relevant section of the report said:
"We emphasize, in this regard that the Department of Homeland Security does not have a broad remit to address all inaccurate information or disinformation, nor does it have the authority to silence or sanction anyone’s speech. Rather, its efforts should focus on (a) assessing whether publicly disseminated disinformation impedes missions assigned to the agency by law and (b) disseminating correct information. The Department can and should speak publicly to accurately inform the public of disinformation. The Department already engages in much of this activity to good effect. These public communications efforts can include publishing a factual correction, publishing additional context, identifying the source of the disinformation as a foreign actor, informing the public about issues relating to the credibility of the disseminator—such as by revealing facts that show a motive to lie or a conflict of interest—disclosing that the author is using a false identity, or revealing past falsehoods, for example. Apart from these public rebuttals, the Department can and should also bring such disinformation to the attention of other government agencies for appropriate action and to platforms hosting the falsehoods. It is for the platforms, alone, to determine whether any action is appropriate under their policies."
The Subcommittee recommended that the Department of Homeland Security's Office of Intelligence and Analysis take a leading role in furnishing that sort of information to government and private organizations. In doing so, it should concentrate on:
- "Foreign influence operations;
- "Domestic Violent Extremist-driven disinformation that elevates risks of violence;
- "The identity of high-volume disinformation purveyors;
- "Emerging focal points of disinformation, such as dangerously inaccurate health advice; and
- "Emerging technologies that intensify dissemination and the targeting of disinformation."
A taxonomy of falsehood.
In the course of its discussions, the Homeland Security Advisory Council's Disinformation Best Practices and Safeguards Subcommittee offered a guide to different varieties of falsehood. "Disinformation is, in essence, a particularly pernicious form of inaccurate information. As the Inspector General observed, a 'disinformation campaign occurs when a person, group of people, or entity (i.e., a ‘threat actor’ or a hostile nation) coordinates to distribute false or misleading information while concealing the true objectives of the campaign.'" Inaccurate information, in the Subcommittee's taxonomy, comes in three forms:
- Disinformation proper: "the deliberate dissemination of falsehoods."
- Misinformation: "the unintentional propagation of falsehoods."
- Malinformation: "the intentional spreading of genuine information with the intent to cause harm, for example, by moving private and personal information into the public sphere."
None of these, the report notes, is a recent development, but technology has rendered them more pernicious: "While lies and gossip are ancient vices, changes in telecommunications and social media permit disinformation to spread at an unprecedented scale, speed, and scope. Images and other forms of media can be forged to look real, making it difficult for individuals and organizations to know what is true and what is false."