At a glance.
- A helicopter crash, and indifference to truth.
- Fining social media.
- Redefining "democracy."
- Individual responsibility for culpably false belief.
- Cultivating intellectual humility.
Disinformation as the child of misfortune and accident.
On December 8 a helicopter crash in Tamil Nadu, shortly after take-off, took the lives of India's Chief of Defence Staff, General Bipin Rawat, and thirteen others, NDTV reported. The Indian Air Force is investigating. The Economic Times, citing research by Logically, says Pakistani disinformation networks were quick to retail the theory, generally thought to be false, that the helicopter was shot down by Tamil insurgents. USA Today fact-checked a widely circulated video of a helicopter going down that's said to show General Rawat's last moments. It doesn't: the video was shot in Syria in 2020.
Russia fines Twitter and Facebook for failure to remove prohibited content.
Computing reports that a Russian court has levied fines on behalf of communications regulator Roskomnadzor against the US-based multinational social networks for their failure to remove proscribed content. Facebook was fined 13 million rubles ($176,345), Twitter 10 million ($135,650). Roskomnadzor has also been slowing Twitter down in Russia since March. The content that prompted both the fines and the slowdown is stuff that few would defend on anything but the most rigorous, fundamentalist free-speech grounds—"child pornography, drug abuse information or calls for minors to commit suicide," as Computing sums them up—but the restrictions also represent the entering wedge of a more general encroachment by the state on Internet content.
China's framing of democracy.
Chinese influence operations continue their practice of pursuing positive persuasion as opposed to simply sowing confusion, which is more the Russian style. In a partial response to US President Biden's Summit for Democracy (to which China was pointedly not invited), the State Council of the People's Republic has announced the release of a white paper, "China: Democracy That Works." (Xinhua has the full English-language version.)
In most respects Beijing's vision of democracy amounts to a redefinition (or at least a reversion to something out of Rousseau). "China's whole-process people's democracy integrates process-oriented democracy with results-oriented democracy, procedural democracy with substantive democracy, direct democracy with indirect democracy, and people's democracy with the will of the State," the State Council says, adding: "There is no fixed model of democracy; it manifests itself in many forms. Assessing the myriad political systems in the world against a single yardstick and examining diverse political structures in monochrome are in themselves undemocratic."
Unsaid but suggested: China has a democracy that works, unlike whatever the Americans are fooling around with. And, by the way, foreigners, we really don't need your advice or consent.
We have met the enemy, and he is you.
Axios interviews Facebook executive Andrew Bosworth, who argues that political and medical misinformation are societal problems, not something that can simply be laid at the door of Facebook and other platforms. "Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing," Bosworth told Axios, adding, "I don't feel comfortable at all saying they don't have a voice because I don't like what they said." Whatever role online communities and the platforms that mediate them may play in misinformation, Bosworth's not wrong about responsibility for belief.
We have met the enemy, and he is us.
The University of Chicago's Center for Practical Wisdom also has a take on the individual's role in fighting mis- and disinformation. We should all, the Center says, cultivate intellectual humility, that is, the habitual sense that we might, after all, have it wrong.
Intellectual humility is rational in the sense that we can’t all be right in most of our disagreements, we are often irrationally overconfident, and the evidence on which our beliefs and viewpoints are based is often rather flimsy. So, why would rational people be as sure of themselves as most of us are?
None of us thinks that our beliefs and attitudes are incorrect; if we did, we obviously wouldn’t hold those beliefs and attitudes. Yet, despite our sense that we are usually correct, we must accept that our views may sometimes turn out to be wrong. This kind of humility isn’t simply virtuous—the research suggests that it results in better decisions, relationships, and outcomes. So, the next time you feel certain about something, you might stop and ask yourself: Could I be wrong?