At a glance.
- Russian electioneering (and Western complicity therein).
- Domestic political spin.
- Algorithms designed to reward engagement don't much prize truth.
- Separating editorial decisions from engagement?
Electioneering and voter suppression, Russian-style.
Russia conducted elections for the Duma over this past weekend, and the Putinist United Russia party has retained its comfortable majority. The effective leader of opposition to President Putin, Alexei Navalny, is in prison on a variety of charges ranging from fraud to extremism (external observers generally regard the charges as trumped up), but the Russian government isn’t interested in seeing the opposition maintain a presence online, either.
Setting a precedent that WIRED called “troubling,” Apple and Google acceded to the Kremlin's request that they remove the opposition “voting app” prepared by Navalny’s Smart Voting project from their stores. The app in question was a voting guide, not a mechanism for casting votes. “Created by associates of imprisoned opposition leader Aleksei Navalny, it offered recommendations across each of Russia’s 225 voting districts for candidates with the best shot of defeating the dominant United Russia party in each race.”
Radio Free Europe reported that Telegram has done likewise, blocking chat bots Smart Voting had used to endorse candidates. Telegram said that it was following Russian “election silence” laws, represented as similar to laws in other countries that restrict various forms of campaigning during the elections themselves. (Here in Maryland, for example, it’s illegal to buttonhole people standing in line outside polling places. If you want to talk to them or hand them a leaflet, you’ve got to do it outside the parking lot, or at some comparable distance.) But, significantly, Telegram’s founder said, according to Radio Free Europe, that developer outfits like his own had little choice but to follow the lead of Apple and Google, so the decision taken in Silicon Valley seems to have flowed downstream to other outlets.
The Atlantic Council summarizes the issue as follows:
“The Russian government has reacted to this voter guide as if facing a serious national security threat—a reaction that has stirred international controversy. The furious (and ultimately successful) efforts to suppress this voter guide not only demonstrate the Russian government’s determination to assert broad control over both the outcome of Russian elections and the information Russian citizens can access online, but also how the underlying dynamics of Russia’s censorship agenda can become an international problem, forcing companies based outside its borders into complicity with domestic repression.”
Investigating misinformation in US elections.
That Russian intelligence organs have engaged in attempts to manipulate US elections (or perhaps so to engender mistrust along American social fault lines that Russia's main adversary would find its civil society significantly weakened) is beyond serious dispute. The details of those influence operations remain controversial, however. And it should surprise no one that purveyors of homegrown political whoppers took opportunistic advantage of the capers of Cozy Bear and Fancy Bear.
Special counsel John Durham, tasked with investigating potential FBI misconduct during the 2016 election, has secured the indictment of Michael Sussmann, a former Federal prosecutor then working at the Democratic Party-connected law firm Perkins Coie, who presented the FBI with information alleging connections between then-candidate Trump and a Russian bank, Alfa Bank. The indictment alleges that Mr. Sussmann lied to the FBI when he “stated falsely that he was not acting on behalf of any client,” which led the Bureau to understand that he “was conveying the allegations as a good citizen and not as an advocate for any client.” The indictment states that Mr. Sussmann billed the time he spent researching the matter to the Clinton campaign. He now faces one Federal charge of making a false statement. His counsel have said they're confident he'll be vindicated.
One of the attractions of running an influence campaign whose aims are the negative ones of increasing the adversary's friction is the readiness with which the adversary will spontaneously cooperate in its own confusion.
ABC News describes Los Angeles County's efforts to control mis- and disinformation during the recent vote on whether to recall California Governor Gavin Newsom. That recall attempt failed, and the Governor remains in office. It will be interesting to see what lessons may be drawn from Los Angeles County's experience. The county's approach seems to have been a traditional one: responsive rumor control, adapted to social media.
The fault is not in our stars, but in ourselves (or at least in our algorithms).
MIT Technology Review reports that Facebook's engagement-maximization algorithms automatically pushed inflammatory, often false, troll-farmed content into American users' news feeds during the 2020 election season, reaching as many as 140 million people a month. An internal Facebook study concluded, “Instead of users choosing to receive content from these actors, it is our platform that is choosing to give [these troll farms] an enormous reach.” The social network did seek to put “guardrails” in place to keep content from veering too far from some approximation of truth and normality, and it continued its work against coordinated inauthenticity, but its own algorithms were stacked against it.
The troll farmers, Russian for the most part, followed the targeting lead of the Saint Petersburg-based Internet Research Agency (IRA), prospecting mostly “Christians, Blacks, and Native Americans.” “This is not normal. This is not healthy,” the report's principal author, Jeff Allen, wrote. “We have empowered inauthentic actors to accumulate huge followings for largely unknown purposes ... The fact that actors with possible ties to the IRA have access to huge audience numbers in the same demographic groups targeted by the IRA poses an enormous risk to the US 2020 election.”
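The dynamic the report describes can be reduced to a toy model. The sketch below is purely illustrative, not Facebook's actual system; the `Post` fields and scoring are invented for the example. The point is structural: when a ranker scores only on predicted engagement, accuracy never enters the ordering, so inflammatory false content outranks sober true content whenever it is expected to draw more clicks.

```python
from dataclasses import dataclass

@dataclass
class Post:
    source: str
    predicted_engagement: float  # clicks, shares, comments the model expects
    is_accurate: bool            # known to fact-checkers, invisible to the ranker

def rank_feed(posts):
    """Toy engagement-maximizing ranker: orders a feed purely by
    predicted engagement. Note that is_accurate never enters the
    score, so it cannot affect what users see first."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = [
    Post("local newsroom", predicted_engagement=0.02, is_accurate=True),
    Post("troll farm",     predicted_engagement=0.09, is_accurate=False),
]
for post in rank_feed(feed):
    print(post.source, post.predicted_engagement)
# The troll-farm post prints first: engagement is the only signal consulted.
```

On this toy reading, "guardrails" amount to filters applied before or after the sort; the objective function itself remains indifferent to truth, which is why the internal study could say the platform, not the users, was choosing to give troll farms their reach.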
Social media, news, and misinformation.
A study by the Pew Research Center finds that 48% of American adults “often” or “sometimes” get their news from social media. That seems high, but it actually represents a decline of about five points from last year's findings. Many have called for government regulation of social media with a view to pushing the platforms away from the misinformation that cumbers them (and the First Amendment be damned; that was just some 18th-century document anyway, etc.). WIRED has suggested a different approach: separation of editorial from business decisions, which its essayist plausibly suggests was the self-regulatory move most responsible for moving an openly partisan press in the general direction of trustworthy objectivity. Such standards haven't caught on yet, in part because engagement and the way it's achieved are only imperfectly understood. But if the optimization algorithms could be kept away from shaping or presenting content, that might represent a step in a positive direction. Every William Randolph Hearst deserves an Ambrose Bierce.