CSO Perspectives (public) 10.2.24
Ep 5573 | 10.2.24

Election Propaganda Part 1: How does election propaganda work?

Transcript

Rick Howard: It's 2024 and in the United States, we're rolling up on the next presidential election, November 5th, between Vice President Kamala Harris and former President Donald Trump. And although social media isn't new for this election, social media platforms have matured enough to be the presumptive source of news and debate around the world. But they're also rife with misinformation, disinformation, rumors, opinions, fake news, conspiracy theories, and advocacy of absolutely abhorrent ideas and evil behavior. See your side of the political spectrum for what they think the other side's evil things are. Now, none of this is new. The Greeks and the Romans almost made a science out of it. They didn't call it propaganda, but that's what it was. And people from all sides of the political spectrum engage in it. But unless you're a self-proclaimed culture warrior looking for a fight, which the average American citizen certainly isn't, I know I'm not, most of us are left bewildered and uncertain as to what to believe and what to think whenever the current viral event of the day emerges. Since the 2016 election between then-candidate Trump and former Secretary of State Hillary Clinton, pundits and scholars have suggested sweeping fixes to combat propaganda efforts on social media platforms from both sides. They usually come in the form of self-imposed technology improvements on the various platforms and/or government regulations to force change, but none are likely to be in place in time for this election. How then does the average American citizen pick between the signal and the noise when both sides are launching propaganda bombs into the ether? [ Music ] That's the question we're going to try to answer in this three-part miniseries on "Election Security Propaganda." We're going to provide you a toolkit to identify the propagandist tropes that have been around for decades before social media was a thing but are now exponentially amplified by the underlying platform architecture and a system of paid influencers, unpaid wannabe influencers, culture war policy propagandists and nation-state chaos instigators, hyping up the rage machine on both sides of the political spectrum. The goal of the series is to help the average citizen, who isn't any of those things, navigate the 2024 presidential election information storm by providing a toolkit that helps distinguish between deceptive narratives and legitimate content in the ever-evolving world of election security. [ Music ] According to Renee DiResta, author of Invisible Rulers: The People Who Turn Lies into Reality, propaganda is a deliberate presentation of inflected information that attempts to further the agenda of those who create it. Assuming she's right, and I think she is, everything is propaganda. Wherever you get your information, whether you fall to the left or the right on the political spectrum, no matter how legitimate you think the information you consume is, rest assured that it's passing through some kind of inflected prism that bends toward the purpose of the people pushing the information. I read the New York Times and the Washington Post. Those papers bend to the left. I also read the Wall Street Journal. That paper bends to the right. This isn't good or bad, it just is. The trick is to know that going in and to evaluate the value of the information with that in mind. This isn't a new thing.
According to Susan Wise Bauer, author of The History of the Ancient World: From the Earliest Accounts to the Fall of Rome, the art of propaganda can be traced as far back as 515 BCE with Darius the Great, one of the most important rulers of the ancient Persian Achaemenid Empire. He used it to legitimize his rule. In modern times though, before the internet, but after Gutenberg invented the printing press in 1440, after the first modern newspaper in 1605, after Marconi invented the radio in the late 1890s and after Philo Farnsworth invented the TV in the late 1920s, successful propaganda efforts were in the hands of the few: government leaders who could step behind the bully pulpit like US President Theodore Roosevelt, government influence operators like Hitler's chief propagandist, Joseph Goebbels, print media organizations like the Times of London, radio shows like CBS's Hear It Now with Edward R. Murrow, TV shows like the CBS Evening News with Walter Cronkite, and advertising organizations that produced ad content like the Marlboro Man selling cigarettes in the 1950s. Before the internet, all of these propagandists broadcast their message using a one-way, one-to-many model. People got their information about the world from a handful of sources they trusted. And the sources they didn't trust, like that crazy dude wearing the chicken suit spouting his manifesto in the public square, could easily be ignored. His message didn't spread. It lived and died there. His reach was small and yes, chicken suit propagandists were mostly dudes. We should look into that. After the internet came online in 1969, and social media platforms started to emerge in the late 1990s, purveyors of propaganda got an exponential lift in broadcast capacity. Propaganda efforts were no longer restricted to the one-way, one-to-many model. Now, your crazy Uncle Joe, who always caused family drama at the annual Thanksgiving dinner spouting conspiracy theories about UFOs, could easily find his people online. They could exchange information with like-minded people and they can now broadcast their collective propaganda message with a new model: two-way, many-to-many. If they learned how to use it correctly, they now had a megaphone broadcasting mechanism at their fingertips that was free of charge. After the internet, the public square in the form of social media platforms evolved as the source of news and debate in the United States and around the world, and it has a massive reach. According to Simon Kemp's Digital 2024 report, there are approximately 331 million internet users in the United States, roughly 97% of the population, and 72% are active on social media. 72% use YouTube, 57% use Facebook, 51% use Instagram, 45% use TikTok and about 31% use X. But if you agree with me that a lot of the content spread on those platforms is mostly propaganda and you don't consider yourself a culture warrior looking for a fight, how then do you, the average American citizen, pick between the signal and the noise when it seems there's no escape from it? I'm glad that you asked. [ Music ] We're going to show you how platform designers created these diabolical systems, designed to algorithmically amplify messages through a system of algorithms, influencers, crowds and the media in order to hype up the rage machine on both sides of the political spectrum.
They are designed to convince you, the average American voter, to hit the like button and broadcast your rage to your friends, family, and colleagues, all in the service of making more money for the platform owners who are already bringing in billions of dollars of revenue. Now, you may be asking yourself, why is N2K doing a series on election propaganda? This is a topic that is, strictly speaking, not about cybersecurity. That's true. But let's call it security adjacent. It's an anti-propaganda toolkit for everybody, not one side of the political spectrum or the other, and it's an effort on our part to separate the signal from the noise, which is an N2K motto. With that explanation out of the way, let me start with examining how social media works. It's way more complicated than you think. [ Music ] And we will get to that after a quick break. There are two recently published books that address this subject that I've found. One is from author Nina Jankowicz, called How to Lose the Information War: Russia, Fake News and the Future of Conflict. The other is from the aforementioned author, Renee DiResta. DiResta identifies five distinct propaganda agents that interact with each other on all social media platforms to enable viral propaganda spread. I call them the pentad. The agents are the platform, the algorithm, the influencers, the crowd and the media. These pentad elements are distinct, but their incentives are inseparable. Each agent thrives on the proliferation of attention-capturing pseudo-events in an effort to go viral. All reward and reinforce sensationalism, spectacle and tribalism. DiResta credits Daniel Boorstin, the late great American historian, and by the way, the 12th Librarian of Congress, how great is that, with a definition of pseudo-events: synthetic media moments, events that exist solely to be reported on. And let me set the boundary for what constitutes virality. Common indicators are the number of likes the original message received, which should be in the tens of thousands; the number of times members reshared the message, which should be over 1,000; and the number of comments the message received, which should be in the hundreds of thousands. But to be truly viral, engagement metrics should significantly outperform the influencer's follower count. For instance, if an account with 100 followers receives thousands of likes and shares, that message is likely viral. But even when a message receives hundreds of thousands of comments, that represents a very small number of actual American citizens engaging with the content. Just do the math: 100,000 comments divided by 331 million internet users is a really small number. When the New York Times reports that a social media viral event is important, it's equivalent to them giving significance to the crowd cheering a touchdown at last night's Giants-Cowboys football game. Not to put too fine a point on it, but it's also likely that those engaging with the content are culture warriors from one side or the other of some issue anyway. It's not the average American. So when the Washington Post publishes a story about how viral a TikTok post went, that's a pseudo-event. A small number of people talking about some culture warrior issue is not an event. It's not valuable information. The viral event creates way more noise than it does signal. It's meaningless unless you happen to be part of the tribe that cares about the issue. The point is that each element in the pentad contributes to this notion of virality.
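To make the "just do the math" point concrete, here is a minimal Python sketch of the informal virality check described above. The thresholds and the 331 million figure come straight from the episode; the function names, the ten-times-followers heuristic, and the example numbers are illustrative assumptions, not any platform's actual metric.

```python
# A minimal sketch (not any platform's real metric) of the back-of-the-envelope
# virality check described above. Thresholds are the rough figures from the episode;
# the 10x-followers heuristic and the names below are illustrative assumptions.

US_INTERNET_USERS = 331_000_000  # approximate figure cited from the Digital 2024 report


def looks_viral(likes: int, reshares: int, comments: int, followers: int) -> bool:
    """Rough check against the episode's informal thresholds."""
    clears_raw_bar = likes >= 10_000 and reshares >= 1_000
    # "Truly viral" engagement should significantly outperform the follower count.
    outperforms_followers = followers > 0 and (likes + reshares + comments) > 10 * followers
    return clears_raw_bar or outperforms_followers


def share_of_us_internet_users(comments: int) -> float:
    """Even a 'huge' comment count is a tiny slice of the online population."""
    return comments / US_INTERNET_USERS


if __name__ == "__main__":
    # An account with only 100 followers pulling tens of thousands of likes: likely viral.
    print(looks_viral(likes=45_000, reshares=3_200, comments=12_000, followers=100))  # True
    # 100,000 comments is a tiny fraction of US internet users -- noise, not signal.
    print(f"{share_of_us_internet_users(100_000):.3%}")
```

Running it shows that even 100,000 comments works out to roughly 0.03% of US internet users, which is the point: a loud crowd, but a tiny one.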
Let's examine each to see how they contribute individually. First up, the platform. According to DiResta, social media platforms provide crowds, and the other pentad elements that we'll discuss in a bit, with tools that encourage the formation of groups and influence how those communities behave. They provide opportunities for spontaneous crowd formation, which carries with it the potential to suddenly devolve into a mob, an unruly collection of angry people. Still, all platforms have some content moderation policy, from X's bare-minimum policies against violent content, hateful conduct, abuse, harassment, and child exploitation, to YouTube's more nuanced set of clearly defined policies on hate speech, harassment, violent extremism, and misinformation. Now, I'm not making an argument here that YouTube is better than X in terms of content moderation. My argument is that all platforms have a content moderation policy. You should pick the platform that more closely matches your taste. If you like X's bare-minimum approach, go with that platform. If you like YouTube's more nuanced approach, go with that one. It makes sense though that you at least know what the policies are, so you know what you're getting yourself into. But when you post something that flies in the face of the platform's content moderation policy and the company deletes the message, let's say, that's not a violation of the US First Amendment. The platform companies aren't the US government. They're not empowered to protect your civil rights, nor are they obligated to. Unless there is some obvious law being broken by the company or by the user of the platform, platform owners can do whatever they want with their content policy. Users can like it or not, or they can change platforms if they wish. But American citizens have no inherent right to publish whatever nonsense they want on a public company's platform, and the platform leadership has no obligation whatsoever to amplify that message. If that were true, Uncle Joe could just walk into the lobby of any corporation and start yelling about his UFO conspiracy theories. When security comes to escort him out of the building, he could just point out that he was exercising his First Amendment rights. I don't see that happening anytime soon. And keep in mind, the platform leadership's sole purpose is to bring in revenue. All platforms craft their own content moderation policies designed to appease and cultivate ad buyers, to cater to certain audience demographics and to comply with international laws, all with that capitalist idea of generating revenue in mind. Again, this isn't bad or good, it just is. And I love capitalism. It drives the country, but it's good to keep that in mind when we're yelling at the other side while defending our culture warrior issue. Let me say that again. The platform owners don't care about your issue. What they care about is making money. So under the covers, what makes all of that work is the algorithm. Again, according to DiResta, algorithms exploit the human brain's attraction to divisiveness. Platform owners believe that encouraging the rage machine is much more profitable than encouraging holding hands and singing Kumbaya. Think of algorithms as the wedge splitter, the tool that partitions and pushes American citizens into opposing sides and compels them to respond.
Algorithms determine what content to show to the crowd, and they do it by shaping information flow and by influencing the narratives the crowd encounters, based on their individual preferences and engagement patterns. Algorithms reinforce existing beliefs and sometimes nudge the crowd down some horrifying rabbit holes by catering to their biases and boosting relevant influencers, another pentad element that I will get to in a bit. In order to keep the crowd on the site, they steer the influencers' attention and encourage the crowd's output. If the algorithm creates a viral event that the traditional media, cable TV and newspapers report on, that's an indicator of success. If the viral event spreads to other social media platforms, that's also an indicator. It means that the algorithm has elevated the viral event to such an extent that it has broken out of the platform like a contagion to spread to other mediums, like the bird flu infecting humans. The algorithm targets several pentad members to achieve a viral event, like influencers and the media, but lift-off is attained by targeting the crowd. DiResta says that virality is a crowd collective behavior. Each user makes a deliberate choice to post or retweet content because they find the post appealing, they believe in it or they're outraged by it. In a kind of enclosed system, algorithms feed the crowd with content generated by inside-the-bubble trusted spokespeople, the influencers, and then the crowd amplifies the rage machine with their response, which in turn gives the algorithms and influencers indicators about which ideas are working and which ones aren't, to help them generate and highlight more content in the same vein, which causes the crowd to engage with the rage machine again. This is a massive self-sustaining do loop designed to keep the machine humming. The most diabolical feature of the system is that new ideas that exist outside the bubble, even if they somehow miraculously make it inside for a bit, do not live long within the bubble. DiResta says that consensus reality, our broad shared understanding of what is real and true, has shattered, and we're experiencing a Cambrian explosion of subjective bespoke realities. A deluge of content sorted by incentivized algorithms and shared instantaneously between aligned believers has enabled us to immerse ourselves in environments tailored to our own beliefs and populated with our own preferred facts. Once you're inside the bespoke reality, it's hard to disengage, too. We've all talked to friends, relatives, and colleagues who are so deep within their own information bubble that no matter what counterfactual you present, they see it as fake news or some organized conspiracy to stifle the truth. That is the very definition of bespoke reality. [ Music ] Before the internet, crowds in the real world were local and fleeting. They'd show up, protest, maybe engage in various degrees of violence and then disappear. The same members of that crowd would likely never see each other again. After the internet, crowds became persistent and global. DiResta says they engage symbiotically with influencers, but don't require a leader or physical space to assemble. Crowds decide independently what to amplify. When they do this, they help influencers rapidly identify the memes and messages that truly resonate and capture public attention. So yes, algorithms manipulate members of the crowd and influencers monetize their rage.
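As a thought experiment, here is a toy Python model of the self-sustaining "do loop" just described: the algorithm boosts whatever the crowd engaged with in the last round, the crowd engages most with divisive content, and the divisive post's engagement snowballs. Every number and name here is an assumption made up for illustration; this is not any real platform's ranking code.

```python
# A toy model of the self-sustaining "do loop" described above: the algorithm boosts
# whatever the crowd engaged with last round, the crowd engages most with divisive
# content, and the loop reinforces itself. Every number here is a made-up assumption;
# this is not any real platform's ranking code.
import random
from dataclasses import dataclass


@dataclass
class Post:
    topic: str
    divisiveness: float  # 0.0 = holding hands and singing Kumbaya, 1.0 = maximum rage bait
    engagement: int = 0
    boost: float = 1.0   # multiplier the "algorithm" applies next round


def crowd_reacts(post: Post, crowd_size: int = 1_000) -> int:
    # Model the crowd as more likely to hit the rage button on boosted, divisive posts.
    chance = 0.02 + 0.2 * post.divisiveness * post.boost
    return sum(random.random() < chance for _ in range(crowd_size))


def algorithm_reranks(feed: list[Post]) -> None:
    # Reward last round's engagement with a bigger boost, steering the next round.
    top = max(p.engagement for p in feed) or 1
    for p in feed:
        p.boost = 1.0 + 2.0 * (p.engagement / top)


if __name__ == "__main__":
    random.seed(0)
    feed = [Post("kumbaya", divisiveness=0.1), Post("culture war wedge", divisiveness=0.9)]
    for round_number in range(5):
        for p in feed:
            p.engagement = crowd_reacts(p)
        algorithm_reranks(feed)
        print(round_number, {p.topic: p.engagement for p in feed})
    # The divisive post's engagement snowballs each round; that's the rage machine.
```

The specific probabilities and boost multipliers are invented; the only point is that a feedback loop wired to engagement keeps rewarding the wedge.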
But in exchange, the crowd finds community, entertainment and camaraderie, a sense of shared identity, which leads to collective behavior and the formation of online factions in which they actively participate. And sometimes crowds turn ugly. They transform into mobs. Think angry people wielding pitchforks, torches and hunting dogs, like in the old Frankenstein movies, but digital. The 2014 Gamergate controversy is a good example. A group of mostly Twitter and Reddit members organized to harass female game developers and critics. The mob blended coordinated attacks, harassment, and the spread of false narratives, the digital equivalent of dogs, torches and pitchforks. DiResta says that this mob behavior requires fuel to sustain its outrage, to sustain its bespoke reality, and each pentad element provides the fuel in its own way. But the element that leads the charge is the influencer. [ Music ] A social media influencer is an individual who has built a significant following on one or more social media platforms and leverages this audience to influence their followers' opinions, behaviors, or purchasing decisions. By a significant following, I mean they have at least 10,000 followers, but the most successful have millions. For example, Charli D'Amelio has approximately 156 million followers on TikTok. Influencers often have a niche or area of expertise too, such as fashion, fitness, travel, technology, or beauty, and they create content that resonates with their followers. DiResta says they seek virality, a word lifted from epidemiology, the study of how diseases spread. They use targeted ads and paid boosts that appeal to either the algorithm, the crowd, or both. They dynamically test messaging strategies designed to grow followers or uplift engagement, and they're always aware of how the algorithm and the crowd respond. World-class influencers innately understand how to connect with their base. They seem to develop intimate, trusted relationships with their followers while, at the same time, achieving a massive reach that rivals what TV networks used to achieve before the internet. And let's be clear, they cater to the algorithm and the crowd because the success of that effort determines how much revenue they bring in. This is not bad or good either. It just is. Some influencers are just in it to make a buck by selling their brand and their products. Taylor Swift is a good example of this, and good on her. She has found a way to connect to her audience to sustain her career. If only all of us could be that successful. Other influencers, though, the culture warrior influencers, bring in revenue by fanning the flames in their bespoke realities, pointing the finger at those people over there as the cause of all the problems for the true believers, the members of their bespoke community. Their methodology is to look for the weak spot in the national culture and drive a wedge through it. They broadcast rumors as if they are true and never retract them if they turn out not to be true later. They just move on to the next rumor in order to keep fanning the flame, to keep driving the wedge. And with all of that effort, sometimes the message breaks out of the bespoke reality and into the mainstream media. What I mean by the media is the collection of old guard traditional TV and newspaper companies like Fox, CNN, the New York Times, the Washington Post, and the Wall Street Journal.
If Daniel Boorstin were alive today, he might say that the media sometimes reports on a viral event as if it were important, but in truth, it's a mirage: it looks real, but it isn't. To be fair though, mainstream media companies are incentivized to cover sensational content in the same way that social media influencers are incentivized on the platforms, to generate revenue. DiResta says that the media practice of 24/7 news causes it to create content for content's sake and to elevate people who are famous for being famous. The media generates manufactured important moments that capture public attention but are in fact meaningless. [ Music ] Now that we understand how propaganda works in the modern digital age, thanks to the books from Renee DiResta and Nina Jankowicz, and how each element of the propaganda pentad functions, the platform, the algorithm, the influencers, the crowd and the media, each working in a mostly closed system of systems designed to efficiently broadcast propaganda messages, and assuming that you're just an average citizen who has no desire to practice culture warrior stuff yourself, what then can you do to inoculate yourself from the spread and influence of propaganda and to not get sucked into the rage machine? It turns out that there are several things you can do as an average American for each element of the pentad. For the platform, know the content moderation policies of your social media platforms of choice. Make a conscious decision as to whether you agree or don't agree with what they're trying to do. For bonus points, when you notice behavior that violates the policy, report it to the platform administrators. Understand that the execution of content moderation policies is not a violation of the US First Amendment. Don't believe it when culture warrior influencers try to tell you that it is. It's just another way to hype the rage machine. Remember that platform owners don't care about your culture warrior issue, except that they make money by hyping your rage about it. For the algorithm, know that algorithm designers leverage the brain's attraction to divisiveness. If you reach for the rage button to like, comment on, and rebroadcast a message that makes you steaming mad, that's the algorithm pulling your strings. The algorithm wants to manipulate you like that. When that happens, the algorithm wins. If you're okay with that, fine, hit the rage button. But maybe a better solution is to sit back, cool off for a day and reevaluate. If 24 hours go by and you're still stomping mad, you might have to come to terms with the idea that you have slipped into a culture warrior role living in your own bespoke reality. This toolkit isn't for you. Unless you're a culture warrior seeking to gain followers, your engagement with the rage machine isn't essential. Nobody cares what you have to say about --

Computer-generated Voice: Pick your favorite culture warrior issue.

Rick Howard: Your like, reshare or comment is not fundamental to the national debate. Consider stepping back from the rage machine. The algorithm can't create a viral event if a crowd doesn't respond. For the crowd, ask yourself if you have slipped into a crowd's hermetically sealed information bubble, your own bespoke reality. Do you reject authority figures in science, government and academia out of hand in favor of what Kevin from the Bronx has to say on the subject? I'm not saying that authority figures are always better informed and never make mistakes; sometimes they aren't, and sometimes they do. But if your go-to move is to reject them out of hand without any thought, you might be in your own bespoke reality. Do you favor scoring points against the other side instead of actually considering the issues? Have your friends, colleagues, and family members stopped talking to you because of your ideas? Do you only consume content from your side of the culture war? Do you immediately assume that Kevin's rumors about --

Computer-generated Voice: Pick your favorite culture warrior issue.

Rick Howard: -- are true without any source material? Do you immediately assume that Kevin's rumors about --

Computer-generated Voice: Pick your favorite culture warrior issue.

Rick Howard: -- are true because you want them to be true? Do you often think that you and your side are the only ones who know the truth? And do you engage in digital mob behavior? One or more of these traits might indicate that you're in your own bespoke reality. And finally, for the influencer, remember that some influencers' main motive is to drive a wedge through the culture to create sides. Think about the source of the information and why you trust what Kevin from the Bronx has to say about --

Computer-generated Voice: Pick your favorite culture warrior issue.

Rick Howard: You might say that Kevin makes good points. Fine. But maybe also consider the motive behind his message. Consider how Kevin makes money and other ways he might benefit from his effort to fan the flames of the culture wars. Does Kevin regularly broadcast rumors, saying things like, if this turns out to be true, it could be bad, or, I'm just asking questions? If you hear those kinds of caveats from Kevin, consider not mashing the rage button until you know for sure. If that's Kevin's go-to get-out-of-jail-free card when he is accused of broadcasting misinformation or disinformation, consider that Kevin's motives might not be pure. Does Kevin routinely point to some other individual or group as the source of all the problems in his bespoke reality? Is the other the focus of his rage? Does it make you feel good to focus your rage on this other? If that's true, it might be time to step back, take a breath and consider why you feel this way. For the mainstream media, remember, just like social media influencers, they need to generate revenue too. They're looking for stories to get your attention. Remember that a viral event, although noisy on a social media platform, is just a relatively small crowd compared to the US population. This is a group of like-minded people yelling into the ether. Think New York Giants fans cheering a touchdown. You probably don't have to pay attention to it. When you hear the mainstream media reporting on viral events, be suspicious. If they do it regularly, consider changing your source of mainstream news. [ Music ] At the beginning of this episode, I promised that I would provide you with at least the beginnings of a toolkit that will enable you to identify the propagandist tropes that have been around for centuries but have been algorithmically amplified in the modern age on social media platforms. My purpose was to help the average citizen, not the culture warriors thriving in their own bespoke realities, navigate the 2024 presidential election information storm by enabling them to distinguish between deceptive narratives and legitimate content. My theory is that if we all understood how the system works, we could become self-aware and at least recognize the mechanism. We still might decide to engage, but at least we are aware of who is profiting from the endeavor and how we are being used by the pentad. I think I've done that. [ Music ] Next week in Part 2 of the series, we will examine how propaganda efforts have influenced countries in the past. So stay tuned. This special CSO Perspectives episode on election propaganda is brought to you by N2K CyberWire, where you can find us at thecyberwire.com. On the show notes pages, I've added some reference links to help you do more of a deep dive if that strikes your fancy. And believe me, the well is deep here. We've only just scratched the surface in this episode. And don't forget to check out our book, Cybersecurity First Principles: A Reboot of Strategy and Tactics, that we published in 2023. And by the way, we'd love to know what you think of our show. Please share a rating and review in your podcast app, but if that's too hard, you can fill out the survey in the show notes or send an email to csop@n2k.com. We're privileged that N2K CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector, from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies.
N2K makes it easy for companies to optimize your biggest investment, your people. We make you smarter about your teams while making your teams smarter. Learn how at n2k.com. One last thing, here at N2K, we have a wonderful team of talented people doing insanely great things to make me sound good. I think it's only appropriate that you know who they are.

Liz Stokes: I'm Liz Stokes. I'm N2K CyberWire's associate producer.

Tre Hester: I'm Tre Hester, audio editor and sound engineer.

Elliott Peltzman: I'm Elliott Peltzman, executive director of sound and vision.

Jennifer Eiben: I'm Jennifer Eiben, executive producer.

Brandon Karpf: I'm Brandon Karpf, executive editor.

Simone Petrella: I'm Simone Petrella, the president of N2K.

Peter Kilpe: I'm Peter Kilpe, the CEO and publisher at N2K.

Rick Howard: And I'm Rick Howard. Thanks for your support, everybody.

Unison: And thanks for listening. [ Music ]