CSO Perspectives is a weekly column and podcast where Rick Howard discusses the ideas, strategies and technologies that senior cybersecurity executives wrestle with on a daily basis.
Election propaganda part 1: How does election propaganda work?
According to Renée DiResta, author of "Invisible Rulers: The People Who Turn Lies into Reality," propaganda is a deliberate presentation of inflected information that attempts to further the agenda of those who create it. Assuming she's right, and I think she is, everything is propaganda. Wherever you get your information, whether you fall to the left or the right on the political spectrum, no matter how legitimate you think the information you consume is, rest assured that it's passing through some kind of inflected prism that bends toward the purpose of the people pushing it.
I read the New York Times and the Washington Post. Those papers bend to the left. I also read the Wall Street Journal. That paper bends to the right. This isn't bad or good. It just is. The trick is to know that going in and to evaluate the value of the information with that in mind.
This isn't a new thing. According to Susan Wise Bauer, author of "The History of the Ancient World: From the Earliest Accounts to the Fall of Rome," the art of propaganda can be traced as far back as 515 BCE with Darius the Great, one of the most important rulers of the ancient Persian Achaemenid Empire. He used it to legitimize his rule.
In modern times though (before the internet, but after Gutenberg invented the printing press around 1440, after the first modern newspaper appeared in 1605, after Marconi developed radio in the late 1890s, and after Philo Farnsworth demonstrated electronic television in the late 1920s), successful propaganda efforts were in the hands of the few:
- Government leaders who could step behind the bully pulpit like US President Theodore Roosevelt.
- Government influence operators like Hitler's chief propagandist Joseph Goebbels.
- Media organizations like the printed press (The Times of London).
- Radio shows like CBS's "Hear It Now" with Edward R. Murrow.
- TV shows like “The CBS Evening News” with Walter Cronkite.
- Advertising organizations that produced ad content to sell their products like "The Marlboro Man" selling cigarettes in the 1950s.
Before the internet, all of these propagandists broadcast their messages using a one-way, one-to-many model. People got their information about the world from a handful of sources they trusted. And the sources they didn’t trust, like the crazy dude wearing the chicken suit spouting his manifesto in the public square, could easily be ignored. His message didn't spread. It lived and died there. His reach was small.
And yes, chicken suit propagandists were mostly dudes. We should look into that.
But after the internet came online (1969) and social media platforms started to emerge in the late 1990s, purveyors of propaganda got an exponential lift in broadcast capacity. Propaganda efforts were no longer restricted to the one-way, one-to-many model. Now your crazy Uncle Joe, who always caused family drama at the annual Thanksgiving dinner spouting conspiracy theories about UFOs, could easily find his people online. They could exchange information with like-minded people and they could now broadcast their collective propaganda message with a new model: two-way, many-to-many. If they learned how to use it correctly, they now had an exponential broadcasting mechanism at their fingertips that was free of charge.
After the internet, the public square, in the form of social media platforms, evolved into the source of news and debate in the United States and around the world. And it has massive reach. According to Simon Kemp in his Digital 2024 report:
- There are approximately 331 million internet users in the United States, roughly 97% of the population.
- 72% are active on social media.
- 72% use YouTube.
- 57% use Facebook.
- 51% use Instagram.
- 45% use TikTok.
- 31% use X.
But all of these platforms are rife with misinformation, disinformation, rumors, opinions, and fake news; in short, plain old-fashioned propaganda. Unless you are a self-proclaimed culture warrior looking for a fight, which I fundamentally believe the average American citizen isn't, you are left bewildered and uncertain about what to believe and what to think whenever the current viral event of the day emerges.
Pundits and scholars have suggested self-imposed technology improvements that social media platforms could adopt to mitigate the spread of propaganda. Policy wonks have floated government regulations to force change. But none of these ideas are likely to be in place in time for the 2024 US Presidential election. How, then, does the average American citizen separate the signal from the noise when both sides are launching propaganda bombs into the ether?
I'm glad that you asked.
This three-part mini-series (essays and podcast episodes) on election propaganda will try to address that question. We aim to give you a toolkit for identifying the propagandist tropes that have been around for centuries, long before social media platforms were a thing. We're going to show how platform designers built these diabolical systems to algorithmically amplify the messages of paid influencers, unpaid wannabe influencers, culture war policy propagandists, and nation-state chaos instigators in order to hype up the rage machine on both sides of the political spectrum. The systems are designed to convince you, the average American voter, to hit that like button and broadcast your rage to your friends, family, and colleagues, all in the service of making more money for the platform owners who are already bringing in billions of dollars of revenue.
The goal of this series then is to help the average citizen, who isn’t any of those things, navigate the 2024 Presidential election information storm by providing a toolkit that helps distinguish between deceptive narratives and legitimate content in the ever-evolving world of election propaganda.
You may be asking yourself, why is N2K doing a series on election propaganda? This is a topic that is, strictly speaking, not about cybersecurity. That's true. But, let’s call it security adjacent. It’s an anti-propaganda toolkit for everybody, not one side of the political spectrum or the other; thus, it’s a Rick the Toolman essay. And, it’s an effort on our part to separate the signal from the noise, which is the N2K motto.
Let’s start by examining how social media works. It’s way more complicated than you think.
How does social media work?
Two recently published books address this subject. One is from author Nina Jankowicz: "How to Lose the Information War: Russia, Fake News and the Future of Conflict." The other is from the aforementioned Renée DiResta. Let's start with DiResta.
She identifies five distinct propaganda agents that interact with each other on every social media platform to enable the viral spread of propaganda. I call them the Pentad:
- The Platform
- The Algorithm
- The Influencers
- The Crowd
- The Media
These Pentad elements are distinct, but their incentives are inseparable. Each agent thrives on the proliferation of attention-capturing pseudo-events. All reward and reinforce sensationalism, spectacle, and tribalism. DiResta credits Daniel Boorstin with the definition of pseudo-events: "'synthetic' media moments, events that exist solely to be reported on."
Common indicators that a message has gone viral are:
- The number of likes the original message received (tens of thousands or more).
- The number of times members reshared the message (a thousand or more).
- The number of comments the message received (hundreds to thousands).
To be truly viral though, engagement metrics should significantly outperform the account's follower count. For instance, if an account with 100 followers receives thousands of likes and shares, that message is likely viral.
But even when a message receives hundreds of thousands of comments, that represents a very small number of actual American citizens engaging with the content. Just do the math: a hundred thousand comments divided by 331 million internet users is roughly 0.03% (see the sketch below). When the New York Times reports that a social media viral event is important, it's equivalent to them giving significance to the crowd cheering a touchdown at last night's Giants / Cowboys football game. Not to put too fine a point on it, but it's also likely that those engaging with the content are culture warriors from one side or the other of some issue anyway. It's not the average American.
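To make these rules of thumb concrete, here is a minimal sketch in Python that combines the thresholds above with the back-of-the-envelope population math. The cutoffs and the follower-ratio test are illustrative heuristics only, not an industry standard.

```python
# Rough virality check built from the rule-of-thumb thresholds above.
# These cutoffs are illustrative, not an industry standard.

US_INTERNET_USERS = 331_000_000  # Simon Kemp, Digital 2024

def looks_viral(likes: int, reshares: int, comments: int, followers: int) -> bool:
    """True if engagement clears the rough floors above AND outperforms
    the account's own follower count."""
    clears_floors = likes >= 10_000 and reshares >= 1_000 and comments >= 100
    outperforms_followers = followers > 0 and (likes + reshares) > followers
    return clears_floors and outperforms_followers

def share_of_us_internet_users(engagements: int) -> float:
    """Fraction of US internet users a given engagement count represents."""
    return engagements / US_INTERNET_USERS

# "Just do the math": 100,000 comments is a rounding error.
print(f"{share_of_us_internet_users(100_000):.4%}")  # prints 0.0302%
```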
So, when the Washington Post publishes a story on how viral a TikTok Story went, that’s a pseudo-event. A small number of people talking about an event is not an event. It's not valuable information. The viral event creates way more noise than it does signal. It’s meaningless unless you happen to be part of the tribe that cares about the issue.
The point is that each element in the Pentad contributes to this notion of virality. Let’s examine each to see how they contribute individually.
The platform.
According to DiResta, social media platforms provide crowds (another Pentad element that we'll discuss in a bit) with tools that encourage the formation of groups and influence how those communities behave. They provide opportunities for spontaneous crowd formation, which carries with it the potential to suddenly devolve into a mob: an unruly collection of angry people.
But all platforms have some content moderation policy, from X's bare-minimum policies against violent content, hateful conduct, abuse/harassment, and child exploitation to YouTube's more nuanced set of clearly defined policies on hate speech, harassment, violent extremism, and misinformation. I'm not arguing here that YouTube is better than X at content moderation. My argument is that every platform has a content moderation policy, and you should pick the platform that most closely matches your tastes. If you like X's bare-minimum approach, go with that platform. If you like YouTube's more nuanced approach, go with that one. Either way, it makes sense to at least know what the policies are so that you know what you are getting yourself into.
But when you post something that flies in the face of the platform's content moderation policy, and the company deletes the message (let's say), that's not a violation of the US First Amendment. The platform companies aren't the US government. They're not empowered to protect your civil rights, nor are they obligated to. Unless some obvious law is being broken by the company or by the user of the platform, platform owners can do whatever they want with their content policy. Users can like it or not, and they can change platforms if they wish, but American citizens have no inherent right to publish whatever nonsense they want on a public company's platform, and the platform leadership has no obligation whatsoever to amplify that message. If that were true, Uncle Joe could just walk into the lobby of any corporation and start yelling about his UFO conspiracy stories. When security comes to escort him out of the building, he could just point out that he was exercising his First Amendment rights. I don't see that happening anytime soon.
Platform leadership is trying to bring in revenue. They all craft their own content moderation policies designed to appease and cultivate ad buyers, to cater to certain audience demographics, and to comply with international laws all with revenue generation in mind. Under the covers, what makes all of that work is the algorithm.
The algorithm.
Again, according to DiResta, algorithms exploit the human brain's attraction to divisiveness. Platform owners believe that encouraging the rage machine is much more profitable than encouraging holding hands and singing Kumbaya. Think of algorithms as the wedge splitter: the tool that splits and pushes American citizens into opposing sides and compels them to respond. Algorithms determine what content is shown to the crowd; they shape information flow and influence the narratives the crowd encounters based on individual preferences and engagement patterns. Algorithms reinforce existing beliefs and sometimes nudge the crowd down some horrifying rabbit holes by catering to their biases and boosting relevant influencers (another Pentad element that we'll get to in a bit) in order to keep the crowd on the platform. They steer the influencers' attention and encourage the crowd's output.
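No platform publishes its ranking code, but the general shape DiResta describes, scoring content by engagement and by similarity to what a user already reacts to, can be sketched in a few lines. Every weight and field name below (including the outrage_score) is a hypothetical illustration, not any platform's real formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    comments: int
    reshares: int
    outrage_score: float  # hypothetical: predicted emotional intensity, 0..1
    topic: str

def rank_feed(posts: list[Post], user_topics: set[str]) -> list[Post]:
    """Order posts by a toy engagement score. Comments and reshares weigh
    more than likes because they keep users on the platform longer, content
    matching the user's past interests gets an echo-chamber boost, and
    divisive content gets an extra multiplier. All weights are invented."""
    def score(p: Post) -> float:
        engagement = p.likes + 3 * p.comments + 5 * p.reshares
        affinity = 2.0 if p.topic in user_topics else 1.0
        return engagement * affinity * (1.0 + p.outrage_score)
    return sorted(posts, key=score, reverse=True)

feed = rank_feed(
    [Post(500, 20, 10, 0.1, "gardening"), Post(400, 90, 80, 0.9, "politics")],
    user_topics={"politics"},
)
print([p.topic for p in feed])  # the divisive post wins: ['politics', 'gardening']
```

The point of the sketch is the multiplication: engagement, affinity, and divisiveness compound, so the most enraging in-bubble content floats to the top.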
If the algorithm creates a viral event that the traditional media (cable TV and newspapers) report on, that's an indicator of success. If the viral event spreads to other social media platforms, that's also an indicator. It means that the algorithm has elevated the viral event to such an extent that it has broken out of the platform and spread, like a contagion, to other mediums, the way the bird flu infects humans.
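The contagion language is more than a figure of speech. If each person who sees a message passes it along to r others on average, reach compounds when r is greater than 1 and fizzles when it is less than 1, exactly like an epidemic's reproduction number. A toy branching model, with made-up numbers, shows the dynamic:

```python
def total_reach(seed_viewers: int, r: float, generations: int) -> int:
    """Toy branching model of message spread: each viewer exposes r others
    on average. Like an epidemic, spread compounds when r > 1 and dies out
    when r < 1. All numbers are illustrative."""
    reach, current = seed_viewers, float(seed_viewers)
    for _ in range(generations):
        current = current * r  # the next "generation" of viewers
        reach += int(current)
    return reach

print(total_reach(100, 1.5, 10))  # r > 1: roughly 17,000 people reached
print(total_reach(100, 0.5, 10))  # r < 1: the message fizzles out below 200
```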
The algorithm targets several Pentad members to achieve a viral event, like influencers and the media, but liftoff is attained by targeting the crowd.
The crowd.
DiResta says that "virality is a crowd collective behavior: each user makes a deliberate choice to post or retweet content because they find the post or messaging appealing, they believe in it, or they're outraged by it." In a kind of enclosed system, algorithms feed the crowd with content generated by inside-the-bubble trusted spokespeople (influencers), and the crowd amplifies the rage machine with its response. That response, in turn, tells the algorithms and influencers which ideas are working and which aren't, helping them generate and highlight more content in the same vein, which causes the crowd to engage with the rage machine all over again: a massive self-sustaining do-loop designed to keep the machine humming (sketched below).
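That do-loop can be written down directly. In this toy simulation, with invented parameters, the algorithm promotes content in proportion to the crowd's last response and the crowd responds to what it is shown, so a small engagement edge compounds every cycle:

```python
def rage_loop(rounds: int = 10, promote: float = 1.5, fatigue: float = 0.9) -> list[float]:
    """Toy model of the self-sustaining do-loop: the algorithm promotes
    content in proportion to the crowd's last response (promote), while
    the crowd engages a bit less than it is shown (fatigue). As long as
    promote * fatigue > 1, outrage compounds. All numbers are invented."""
    exposure, history = 1.0, []
    for _ in range(rounds):
        crowd_response = exposure * fatigue  # crowd reacts to what it sees
        exposure = crowd_response * promote  # algorithm boosts what got reactions
        history.append(round(exposure, 2))
    return history

# 1.5 * 0.9 = 1.35x growth per cycle: the loop sustains itself.
print(rage_loop())
```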
The most diabolical feature of this system is that new ideas that exist outside the bubble, even if they somehow miraculously make it inside for a bit, do not live long within the bubble.
DiResta says that “Consensus reality—our broad, shared understanding of what is real and true—has shattered, and we’re experiencing a Cambrian explosion of subjective, bespoke realities. A deluge of content, sorted by incentivized algorithms and shared instantaneously between aligned believers, has enabled us to immerse ourselves in environments tailored to our own beliefs and populated with our own preferred facts.”
Once you’re inside a bespoke reality, it’s hard to disengage. We’ve all talked to friends, relatives, and colleagues who are so deep within their own information bubble that no matter what counterfactual you present, they see it as fake news or some organized conspiracy to stifle the truth. That is the very definition of a bespoke reality.
Before the internet, crowds in the real world were local and fleeting. They’d show up, protest, maybe engage in various degrees of violence, and then disappear. The same members of that crowd would likely never see each other again. After the internet, crowds became persistent and global. DiResta says “they engage symbiotically with influencers but don’t require a leader or physical space to assemble. Crowds decide independently what to amplify. When they do this, they help influencers rapidly identify the memes and messages that truly resonate and capture public attention.” Yes, algorithms manipulate them and influencers monetize their rage, but in exchange, the crowd finds community, entertainment, and camaraderie; a sense of shared identity which leads to collective behavior and the formation of online factions in which they actively participate.
Sometimes crowds turn ugly. They transform into mobs; think pitchforks and torches like in the old Frankenstein movies; but digital. The 2014 Gamergate controversy is a good example. A group of mostly Twitter and Reddit members organized to harass female game developers and critics. The mob blended coordinated attacks, harassment, and the spread of false narratives. DiResta says that this mob behavior “requires fuel to sustain its outrage; to sustain its bespoke reality.” Each Pentad element provides that fuel in its own way, but the element that leads the charge is the influencer.
The influencer.
A social media influencer is an individual who has built a significant following on one or more social media platforms and leverages that audience to influence opinions, behaviors, or purchasing decisions. By a "significant following," I mean at least 10,000 followers, but the most successful have millions. For example, Charli D'Amelio has approximately 156 million followers on TikTok. Influencers often have a niche or area of expertise, such as fashion, fitness, travel, technology, or beauty, and they create content that resonates with their followers.
DiResta says they seek virality, a word lifted from epidemiology, the study of how diseases spread. They use targeted ads and paid boosts that appeal to the algorithm, the crowd, or both. They dynamically test messaging strategies designed to grow followers or lift engagement, and they are always aware of how the algorithm and the crowd respond.
World-class influencers innately understand how to connect with their base. They manage to develop seemingly intimate, trusted relationships while achieving a massive reach that rivals what TV networks commanded before the internet.
And let’s be clear, they cater to the algorithm and the crowd because the success of that effort determines how much revenue they bring in. This is not bad or good either. It just is. Some influencers are just in it to make a buck by selling their brand and their products. Taylor Swift is a good example of this and good on her. She has found a way to connect to her audience to sustain her career. If only all of us could be that successful.
Other influencers, though, the culture warrior influencers, bring in revenue by fanning the flames in their bespoke realities, pointing the finger at "those people over there" as the cause of all the problems for the true believers, the members of their bespoke community. Their methodology is to look for the weak spot in the national culture and drive a wedge through it. They broadcast rumors as if they were true and never retract them when they turn out to be false. They just move on to the next rumor in order to keep fanning the flame; to keep driving the wedge.
And with all of that effort, sometimes, the message breaks out of the bespoke reality and into the mainstream media.
The media.
What I mean by the media is the collection of old guard, traditional TV and newspaper companies like Fox, CNN, the New York Times, the Washington Post, and the Wall Street Journal.
As Daniel Boorstin said back in the early 1960s, "the media" sometimes reports on a viral event as if it were important, but in truth, it is a mirage that looks real and substantive but isn't. To be fair, though, mainstream media companies are incentivized to cover sensational content in much the same way culture warrior influencers are: to generate revenue. DiResta observes that the media's practice of 24/7 news causes it to create content for content's sake and to elevate people who are famous for being famous. The media generates manufactured "important" moments that capture public attention but are in fact meaningless.
What’s in the anti-propaganda toolkit?
Now that you understand how propaganda works in the modern digital age (thanks to the books from Renée DiResta and Nina Jankowicz), and assuming you have no desire to be a practicing culture warrior yourself, what can you do to inoculate yourself against the spread and influence of propaganda and avoid getting sucked into the rage machine? It turns out that there are several tools average Americans can use for each element of the Pentad.
Anti-propaganda tools for the platform.
- Know the content moderation policies of your social media platforms of choice. Make a conscious decision about whether you agree with what they are trying to do. For bonus points, when you notice behavior that violates the policy, report it to the platform administrators.
- Understand that the execution of content moderation policies is not a violation of the US First Amendment. Don’t believe it when culture warrior influencers try to tell you that it is. It’s just another way to hype the rage machine.
- Remember that platform owners don’t care about your culture warrior issue except that they make money by hyping your rage about it.
Anti-propaganda tools for the algorithm.
- Know that algorithm designers leverage the brain's attraction to divisiveness. If you reach for the rage button to like, comment on, and rebroadcast a message that makes you steaming mad, that's the algorithm pulling your strings. The algorithm wants to manipulate you like that. When that happens, the algorithm wins. If you're OK with that, fine. Hit the rage button. But maybe a better solution is to sit back, cool off for a day, and reevaluate. If 24 hours go by and you're still stomping mad, you might have to come to terms with the idea that you have slipped into a culture warrior role living in your own bespoke reality. This toolkit isn't for you.
- Unless you’re a culture warrior seeking to gain followers, your engagement with the rage machine isn’t essential. Nobody cares what you have to say about [pick your culture warrior issue]. Your like, reshare, or comment is not fundamental to the debate. Consider stepping back from the rage machine. The algorithm can’t create a viral event if the crowd doesn’t respond.
Anti-propaganda tools for the crowd, or: am I in my own bespoke reality?
- Ask yourself if you have slipped into a hermetically sealed information bubble, a bespoke reality:
- Do you reject authority figures in science, government, and academia out of hand in favor of what Kevin from the Bronx has to say on the subject?
- Do you favor scoring points against the other side instead of actually considering the issues?
- Have your friends, colleagues, and family members stopped talking to you because of your ideas?
- Do you only consume content from your side of the culture war?
- Do you immediately assume that Kevin’s rumors about [pick your culture warrior issue] are true without any source material?
- Do you immediately assume that Kevin’s rumors about [pick your culture warrior issue] are true because you want it to be true?
- Do you often think that you and your side are the only ones who know the truth?
- Do you engage in digital mob behavior?
Anti-propaganda tools for the influencer.
- Remember that some influencers' main motive is to drive a wedge through the culture; to create sides.
- Think about the source of the information and why you trust what Kevin from the Bronx has to say about [pick your culture warrior issue]. You might say that Kevin makes good points. Fine, but maybe also consider the motive behind his message. Consider how Kevin makes money or other ways he might benefit in his effort to fan the flames of the culture wars.
- Does Kevin regularly broadcast rumors, saying things like, "If this turns out to be true, it could be bad," or "I'm just asking questions"? If you hear those kinds of caveats from Kevin, consider not mashing the rage button until you know for sure. If that's Kevin's go-to get-out-of-jail-free card when he is accused of broadcasting misinformation or disinformation, consider that Kevin's motives might not be pure.
- Does Kevin routinely point to some other individual or group as the source of all the problems in his bespoke reality? Is "the other" the focus of his rage? Does it make you feel good to focus your rage on this "other"? If so, it might be time to step back, take a breath, and consider why you feel this way.
Anti-propaganda tools for the media.
- Remember that the mainstream media, just like social media influencers, needs to generate revenue too. It is looking for stories that grab your attention.
- Remember that a viral event, although noisy on a social media platform, is just a relatively small crowd (compared to the US population) of like-minded people yelling into the ether; think NY Giants fans cheering for a touchdown. You probably don’t have to pay attention to it.
- When you hear the mainstream media reporting on viral events, be suspicious. If they do it regularly, consider changing your source of mainstream news.
Takeaways.
At the beginning of this essay, I promised to provide you with at least the beginnings of a toolkit to identify the propagandist tropes that have been around for centuries but have been algorithmically amplified in the modern age on social media platforms. My purpose was to help the average citizen, not the culture warrior thriving in their own bespoke reality, navigate the 2024 Presidential election information storm by distinguishing between deceptive narratives and legitimate content. My theory is that if we all understand how the system works, we can become self-aware and at least recognize the mechanism. We still might decide to engage, but at least we will be aware of who is profiting from the endeavor and how we are being used by the Pentad. I think I have done that. In part two of this essay series, we will examine how propaganda efforts have influenced countries in the past.
References:
David Ehl, 2024. Why Meta is now banning Russian propaganda [News]. Deutsche Welle.
Jeff Berman, Renée DiResta, 2023. Disinformation & How To Combat It [Interview]. YouTube.
Rob Tracinski, Renée DiResta, 2024. The Internet Rumor Mill [Interview]. YouTube.
Yascha Mounk, Renée DiResta, 2022. How (Not) to Fix Social Media [Interview]. YouTube.
Renée DiResta, 2024. Invisible Rulers: The People Who Turn Lies into Reality [Book]. Goodreads.