At a glance.
- Trends in Iranian disinformation.
- Beijing's presentation of Uyghur life.
- Content farming as influencers grow in importance to advertising.
- Civil liberties as an instrumental good in (metaphorical) influence wars.
Iranian disinformation operations: small, slow, and patient.
Recent Iranian disinformation operations targeting Israel took pains to remain small and stay patient. Tehran's operators concentrated on making quiet inroads into Telegram channels, WhatsApp groups, and other messaging apps used by activists in Israel, the New York Times reports, the better to remain undetected. The payoff, such as it was, lay in serving the entropic goal of spreading polarizing content, particularly in the wake of former Prime Minister Netanyahu's electoral defeat. The images and memes, traceable to Iranian sources, were distributed by inauthentic accounts. The operation seems not to have progressed much beyond trolling, but it represents an interesting shift in Tehran's influence campaigns, now at least as interested in inducing friction (in the familiar Russian manner) as in pushing a specific positive line (positive from Iran's point of view). The effort also showed an ability to work through the various safeguards and filters the platforms have in place.
China's rosy presentation of Uyghur life: "We are very free."
The New York Times has an account of how China's government has engaged in counter-propaganda in response to reporting that described the harsh, repressive treatment of the Muslim Uyghur minority in Xinjiang. Videos distributed through various media represent a coordinated, large-scale astroturfing effort in which nominal Uyghurs describe how good they have it. People in Xinjiang are "happy" and "very free," say the videos, in what rhetoricians might call "unlikely insistence." The productions don't, of course, bear such marks of government media as logos, but they do have an awful lot of English subtitles. These, and other stigmata of state organization, will move many viewers to skepticism.
Content farming for influencers.
State-nurtured content farming in China appears to have assumed an important role in influencer culture on TikTok, Rest of World reports. A lot of the content shows industrial production, and a lot of people seem to be watching.
SEO and content moderation.
We've seen that inauthenticity seems easier to detect and manage than falsehood. Google has undertaken a project, however, in which it will caution users when it doesn't really have a reliable result to report. How this works out, particularly with respect to rapidly evolving topics, will be interesting to see. It doesn't seem likely that this will amount to censorship, but rather to a kind of epistemic modesty.
Doxing as a tactic in influence wars.
Principled opposition to doxing seems as difficult to find, if this Washington Post piece or this Atlantic Council post is any reliable guide, as principled free-speech absolutism. It's not Lenin's kto kogo, perhaps, but it does seem that privacy and other civil liberties hold a more instrumental place in the culture war than some might hope. That there are indeed fringe and arguably dangerous theories out there about the 2020 US election (and, for that matter, UFOs) is undeniable. But many seem comfortable with a quick move to direct viewpoint censorship as a way of countering them.