Ukraine at D+162: Ukraine's counteroffensive, and the story of a sub-JV Russian troll farm.
Aug 5, 2022

Ukraine claims to have scored against Russian ammunition supply points, bridges, and air defense units as its counteroffensive gains urgency. Meta takes down a large Russian influence network whose "coordinated inauthenticity" hasn't risen above a stumblebum level.

Ukraine at D+162: Ukraine's counteroffensive, and the story of a sub-JV Russian troll farm.

The nuclear power plant at Zaporizhzhia, on the Dnipro River in southeastern Ukraine, has been under Russian occupation for months, and its security and safety remain a matter of concern. This morning's situation report from the UK's Ministry of Defence focused on the role Russian forces in and around the installation are playing in Russia's war. "Following five months of occupation, Russia’s intentions regarding the Zaporizhzhia Nuclear Power Plant remain unclear. However, the actions they have undertaken at the facility have likely undermined the security and safety of the plant’s normal operations." First, the areas adjacent to Zaporizhzhia are serving as active artillery parks. "Russian forces are probably operating in the regions adjacent to the power station and have used artillery units based in these areas to target Ukrainian territory on the western bank of the Dnipro river." And the plant has provided a shield behind which Russian forces can assemble without a high risk of attack. "Russian forces have probably used the wider facility area, in particular the adjacent city of Enerhodar, to rest their forces, utilising the protected status of the nuclear power plant to reduce the risk to their equipment and personnel from overnight Ukrainian attacks." The Zaporizhzhia plant is overbuilt, designed to withstand a range of disasters, the Telegraph reports, and Ukrainian authorities say they'll plan and execute any strikes carefully to avoid damaging the facility.

CyberFront Z's failed influence operation.

Facebook's corporate parent Meta released its Adversarial Threat Report for the second quarter of 2022 yesterday. Prominently featured in the report is Meta's account of its monitoring of, and action against, a large Russian troll farm marshaled to support Moscow's narrative concerning Russia's war against Ukraine. The operation is connected to the notorious Internet Research Agency, itself implicated in Russian influence operations during recent US elections. (It's also one of the enterprises in the business empire of Yevgeniy Prigozhin, who runs the Wagner Group of contract combat units. Early this week the US State Department offered a $10 million reward for information on Mr. Prigozhin and his activities, should any of his associates or employees be interested in ratting him out.) In this case the flagship of the influence operation is "CyberFront Z":

"We’re also sharing our threat research into a troll farm in St. Petersburg, Russia, which unsuccessfully attempted to create a perception of grassroots online support for Russia’s invasion of Ukraine by using fake accounts to post pro-Russia comments on content posted by influencers and media on Instagram, Facebook, TikTok, Twitter, YouTube, LinkedIn, VKontakte and Odnoklassniki. Our investigation linked this activity to the self-proclaimed entity CyberFront Z and individuals associated with past activity by the Internet Research Agency (IRA). While this activity was portrayed as a popular 'patriotic movement' by some media entities in Russia, including those previously linked to the IRA, the available evidence suggests that they haven’t succeeded in rallying substantial authentic support."

"Coordinated inauthentic behavior" is Meta's, and previously its subsidiary Facebook's, term of art for organized trolling in the service of disinformation. The term is self-explanatory. Instead of attacking disinformation on the basis of content, and thereby seeking directly to moderate and control content, the company has typically gone after campaigns that use false personae (inauthentic identities) with evidence of coordination or central direction.

Meta's report explains what it found, and what it did about its discovery. "We took down a network of Instagram accounts operated by a troll farm in St. Petersburg, Russia, which targeted global public discourse about the war in Ukraine. This appeared to be a poorly executed attempt, publicly coordinated via a Telegram channel, to create a perception of grassroots online support for Russia’s invasion by using fake accounts to post pro-Russia comments on content by influencers and media." CyberFront Z was, in Meta's estimation, "the Z team," that is, definitely not the A team, not even the junior varsity. "This deceptive operation was clumsy and largely ineffective — definitely not 'A team' work," the report says. "On Instagram, for example, more than half of these fake accounts were detected and disabled by our automated systems soon after creation. Their efforts didn’t see much authentic engagement, with some comments called out as coming from trolls. We also found instances of the 'trolls' who sprinkled pro-Ukraine comments on top of the paid pro-Russia commentary, in a possible attempt to undermine the operation from within."

While CyberFront Z's operations were labor-intensive (they concentrated on social media comments written by human operators), they seem to have included only perfunctory gestures toward building convincing personae. The overall goal was to create an impression of grassroots opinion. The one-note concentration on the many evils of what CyberFront Z characterized as Ukraine's "Nazi" regime, however, seems to have proven largely unpersuasive: in several channels the pro-Ukrainian and anti-Russian replies the comments attracted outnumbered CyberFront Z's own posts.

Meta researchers were able to link members of CyberFront Z to the Internet Research Agency (IRA). In some respects the tactics, techniques, and procedures used were throwbacks. "The Z Team’s tactics closely resembled those of the IRA in its earliest days, in 2013, when it focused on targeting the Russian opposition domestically, including now-jailed activist Alexei Navalny. The earliest exposés of the IRA’s Russian-language activity spoke of an office in the Olgino suburb of St. Petersburg, where teams of Russians were paid to mass-post pro-government comments on online forums, including LiveJournal. Back then, that operation advertised widely for writers, tipping off several investigative journalists, who responded to the ads, worked in the building and then exposed the now infamous troll-farm."

In all, Meta evaluates CyberFront Z as a fizzle. It offers one caution, though. Influence operations seek to become self-reinforcing, and they do so in part by creating an impression of success. The growing public awareness of (and fear of) disinformation can contribute to such reinforcement. "These examples underscore the importance of analyzing attempted influence operations according to the evidence," Meta writes, "and not taking any claims of viral success at face value. Some threat actors try to capitalize on the public’s fear of influence operations by trying to create the false perception of widespread manipulation, even if there is no evidence — a phenomenon we called out in 2020 as 'perception hacking.'" Besides, it helps the trolls look good to the boss, so there are local, self-interested incentives to shine it on. "It is possible that operators may also do this in an attempt to convince their funders or employers of their effectiveness in corralling large-scale authentic movements, while 'faking' this engagement on the back end. However, the available evidence suggests that they haven’t succeeded in rallying substantial authentic support online as part of this operation."

The story of CyberFront Z is an instance of a familiar phenomenon. Russian disinformation has been most effective when it's been negative and entropic, engaged in increasing the adversary's friction. When it's had the positive aim of persuasion, it's been far less effective. Confusion has been easier to achieve than conviction.

Rolling Stone indelicately (but with a primly placed asterisk) sums up the career of CyberFront Z's IRA parent: "Russia’s Infamous Troll Farm Is Back — and Sh*tting the Bed."

A quick look at morale.

Positive propaganda often has a domestic audience, and some of the amplification used with CyberFront Z sought to frame the operation as a patriotic movement. The Atlantic Council reports polling data (which should be received with caution when collected in a closed and repressive society like Russia, but which in this case seem to offer some evidence) suggesting that popular support for Russia's war remains relatively solid among the Russian civilian population. There have been relatively few demonstrations, and not many resignations in protest. That support, however, hasn't translated into volunteers for military service. And in the military things seem to be different. High casualties appear to be affecting morale, and Russia has had difficulty recruiting soldiers while fighting in Ukraine continues. The morale problems are surfacing at the front, where, according to the Guardian, there are credible reports of combat refusals.