8th Layer Insights
Ep 2 | 6.8.21

The Battle for Truth: Disinformation, Misinformation, & Conspiracies

Transcript

Samantha North: We have to remember that disinformation is not new at all.

Allie Wong:  Disinformation is not a new thing. This has been a well-worn strategic aspect, particularly of Russian information operations, pretty much since the Bolshevik era in the early 20th century. The way that we're seeing it now, in the advent of the digital age, is amplified that much more, because information is so much more accessible, and we're seeing it on a larger scale. It's one of the more prominent aspects of what we're experiencing geopolitically.

Perry Carpenter:  If you've been living on Planet Earth for the past few years, you may have had the feeling that there's an escalation in things like disinformation and misinformation and conspiracy theories, and things that might make us question the very nature of reality itself. Certainly the nature of truth or the information that we're seeing.

Samantha North:  Everyone is susceptible to disinformation as humans, regardless of where we are on the political spectrum.

Mick West:  It's hard to measure the degree of penetration of conspiracies into popular culture. The tools that we have are relatively simple and blunt tools, like public opinion surveys. One researcher studied the Letters to the Editor in The New York Times over 100 years to see whether there was a drift toward increasingly conspiratorial thinking, and he found it to be relatively constant. I think what's happened over the last few years is that conspiracy theories have become more visible, in part by becoming more acceptable to talk about and by certain conspiracies becoming more mainstream.

Perry Carpenter:  And this is really something that transcends simple things like political affiliation or what country someone holds allegiance to. What we're seeing is that everything that can be questioned or confused is being questioned and confused. In today's episode, I want us to explore this. We'll put aside the trappings of politics and all the other things that go with that, and look at the general problem, the motivators behind that problem, the financial incentives behind that problem, and where technology comes in. And then, ultimately, what we can and should be doing about this. Participating in today's discussion, we have disinformation experts Samantha North and Allie Wong. We'll be hearing from conspiracy theory expert Mick West, and, to help set the stage, here's a clip from a recent interview that I did with cybersecurity expert Bruce Schneier.

Bruce Schneier:  If you go back far enough, being a publisher was hard and being a recipient of publications was hard. You had a world of information that didn't really flow at all. Then you move into the world of the printing press and increased literacy: publishing was hard, but being the recipient was easier. And that moved into radio and then television and, again, one-to-many was the way it all worked. Now we're going halfway backwards. It's very easy to publish and it's very easy to receive information. So now we're in this world where everybody is speaking. For most of history, nobody was speaking and nobody was listening; 100 years ago, a few people were speaking and everyone else was listening. Today, everyone's speaking and everyone's listening. This is new, and I think we don't fully understand the ramifications of the world we're in. But this is something people way smarter than me are studying.

Perry Carpenter:  On today's episode, we wrestle with the concepts of disinformation, conspiracy, and truth, and how we can find relief from the deluge of manipulation that we all find ourselves in. Hi there. My name is Perry Carpenter. Join me for a deep dive into what cybersecurity professionals refer to as the 8th Layer of Security: humans. This podcast is a multidisciplinary exploration into the complexities of human nature and how those complexities impact everything from why we think the things that we think, to why we do the things that we do, and how we can all make better decisions every day. Welcome to 8th Layer Insights. I'm your host, Perry Carpenter.

Perry Carpenter:  As we think about today's episode, I want to start us off by framing it a little bit for you. And I'll reflect back on a meme that I've seen floating around Facebook and other social media for a while. This is a quote from Mark Twain and it says, "It is easier to fool people than to convince them that they have been fooled." Think about that for a minute. Wait, just a second.

Mark Twain:  Pardon the interruption my young audio journalist. But I must offer a humble correction, lest I leave you ignorant and your skull a drafty void for lack of true knowledge.

Perry Carpenter:  Oh, wow, you look just like the guy in the meme.

Mark Twain:  While it may oft bear my name by way of attribution on the interwebs, that quote never appeared in my writings.

Perry Carpenter:  Really? What are you doing with that chair? Why are you standing on my chair and Coral, why are you starting the music again? I've lost control.

Mark Twain:  Here is what I wrote. "The glory which is built upon a lie soon becomes a most unpleasant encumbrance. How easy it is to make people believe a lie, and how hard it is to undo that work again!" Autobiographical dictation, 2 December 1906. Published in Autobiography of Mark Twain, Volume 2 (University of California Press, 2013).

Perry Carpenter:  Oh, okay. Thank you for that and down you go off my chair and out of my house. Thanks, Mark Twain. I appreciate it. We all like to be corrected.

Mark Twain:  I have a question for you. Just what is a meme anyway?

Perry Carpenter:  Yeah, we don't have time for that. Bye, now. So, I've been wrestling with this episode for quite a while now, and the problem really hasn't been so much the topic as it's been my fear that people might misinterpret the intent of this episode. Let me explain. Anytime you're talking about cognitive bias or things like disinformation or conspiracy, people are naturally going to feel like you're talking about them, or their bias, or their specific buy-in to some alternate view of the way that the world works. And my hope is that we fly above all of that and don't dive into those things specifically. I want to talk about the mental processes and the way that attackers and bad actors might use these types of techniques.

Perry Carpenter:  But, here's the thing. When you really study this topic, what you'll see is that virtually every political group and every country uses these types of tactics to further their agenda. So, this is a universal thing, and everyone, and I mean each one of us, is vulnerable to a manipulation campaign or to falling down a conspiracy rabbit hole. That's because each one of us is searching for the truth. And it actually turns out that we're not that good at finding it. In fact, research shows that those of us who are on social media a lot are not as good as we think we are at determining if what we're seeing is trustworthy, whether an article is true or false, or whether a website or article is partisan. That finding comes from a study conducted by The Reboot Foundation. This is a foundation that focuses on understanding the state of critical thinking in the world, and the study is titled "Is There a Fake News Generation?"

Perry Carpenter:  Here's a really disturbing correlation, especially with the pandemic and all of us being in various forms of lockdown and spending more time than ever on social media. What they found is that the more time a user spends on social media, the worse their judgment in determining what was real and what was fake news. And that was true regardless of a person's age, income, or political affiliation. So, to put it simply, the more time someone spends on social media, the worse their news judgment becomes. The study also found that it's not just fake news and clickbait; users are susceptible to a wide variety of disinformation techniques. Here's another disturbing thing. Of those that participated in the study, only 1% exhibited real fact-checking techniques. For example, participants often assumed that a .org website indicates that it has reliable information, while some of the other participants really only cared if they could find the website on Google. I encourage you to take a look at the report. It's called "Is There a Fake News Generation?" and I'll have a link in the show notes.

Perry Carpenter:  If you listened to our last episode on “Trojan Horses for the Mind," you'll recall how powerful the Trojan horses of emotion and story can be. And now, think about the disinformation campaigns and conspiracy theories that you're aware of. What does each leverage? I bet, when you look at them, you'll find that they use quite a bit of emotion and story, so, as a bridge between our last episode and today's, let's bring back a familiar voice.

Joe Lazauskas:  Hi. My name is Joe Lazauskas. I am the co-author of The Storytelling Edge, and the head of marketing at Contently.

Perry Carpenter:  In my interview with Joe, he made a statement that really stuck with me, and that statement relates directly to our topic today. This will be the only time that you hear Joe's voice on today's show, but it fits too well to pass up. So, let's hear from Joe.

Joe Lazauskas:  Storytelling is a weapon that can be used for evil or for good, and I think we've seen that more than ever with the rampant acceleration of disinformation and propaganda on Facebook. Those are classic storytelling techniques that a lot of folks are using to change minds, to persuade people, to present little pieces of information in a story, playing on our brain's natural inclination to want to connect the dots, to want to tell a story and map associations between seemingly random pieces of information.

Perry Carpenter:  And now, let's be honest with ourselves: which of us, over the past several years, hasn't reacted emotionally to some kind of social media post? And then, based on that emotion, clicked to share that post without critically thinking, without verifying the credibility of the article or the facts? And if you were in the room with me when I said facts, I used air quotes. And that's really the danger when it comes to disinformation and misinformation and conspiracy and so on. It's this idea of story and narrative and how they work to build a reality. Based on these falsehoods, or these misinterpretations of truth, we then build our reality, our understanding of our place in the world, or of the world itself, on shaky ground and confirmation bias.

Perry Carpenter:  This topic also becomes interesting and important from an information security practitioner perspective. Because, in so many ways, the same principles and psychology that are involved in why people fall for disinformation campaigns or promote or fall for conspiracy theories, those same principles apply when it comes to why we fall for social engineering attacks, click on phishing emails, get taken in by scams, or a whole host of other things that create risk in our lives. So, at this point, it would probably be helpful to define some terms, to make sure that we're all using the same lexicon whenever we refer to these things.

Samantha North:  The terms misinformation and disinformation are different. One means just untrue information; the other means deliberately untrue information spread with the intention to deceive, and that's disinformation.

Perry Carpenter:  That was Samantha North.

Samantha North:  I'm a self-employed disinformation researcher and consultant, working with clients around the world, while also doing my PhD at the University of Bath, which is in computational social science.

Perry Carpenter:  So, Samantha, if disinformation and misinformation are two different things, how are they used differently? What does that look like in real life?

Samantha North:  What's interesting is that I've seen examples of, especially in Facebook groups, people sharing what is disinformation but when they share it, their intention is not to disinform. They think what they're sharing is true and they share it to help others. It might be like something false about the vaccines for COVID. It's disinformation that's come from the source, but when it gets to the old lady in the village, who shares it with her village Facebook group, she's doing it to help other people because she thinks it's true. It's the creators who have the intent to deceive, not necessarily the sharers.

Allie Wong:  Disinformation is information that's deliberately designed to weaken your adversaries and purposely planted in newspapers or other media stories.

Perry Carpenter:  That was the voice of Allie Wong. Allie is the VP of Disinformation, Misinformation, and Malinformation Response and Resiliency with a company called Limbik, which partners with research centers, the private sector, and government security entities to do threat assessment, mitigation, and counter-messaging for mis-, dis-, and malinformation. In addition to that...

Allie Wong:  I'm a strategic communications officer at the United Nations Institute for Disarmament Research. My research areas have included information operations between the U.S. and the Soviet Union during the Cold War, and then the United States and Russia in the post-Cold War space. And I'm particularly interested in looking at weaponized information now.

Perry Carpenter:  Allie agreed to speak with me and to share her personal perspectives about this problem, so it's important to note that these are her own views and may not represent the views of any organization that she is affiliated with. So, Allie, can you tell us a little bit about how all of this works? Let's start with disinformation. Where does that come from, and what's the effect?

Allie Wong:  There's a saying that disinformation is 80% truth and 20% fiction, and that's why it's really appealing: it's not just a completely random fabrication. It's something that resonates with some aspect of truth for people, but it takes that truth and warps it. It polarizes an issue. Misinformation is information that's false, but it's not necessarily spread with the intention of causing harm. The people who are spreading this information are oftentimes doing so unwittingly.

Perry Carpenter:  That's the thing about misinformation: it gets picked up and amplified by unsuspecting people. People that are in our lives, people that we know and respect. And in this world, where social media dominates and everyone is a publisher, it becomes so easy for misinformation to spread and be believed.

Allie Wong:  That's something that I saw in my own Facebook feed: things spread all over the place by people that I really respected, and a lot of times it was because people saw this and were like, oh my gosh, not realizing that it was a piece of disinformation that had been created. But, because they were spreading it, it was misinformation. And finally, malinformation is information that's accurate but is oftentimes misrepresented and shared with the intent to cause harm, or it's something that's published out of context.

Perry Carpenter:  Now, let's see how something like conspiracy theories ties into this. To do that, we're going to need to bring in another expert.

Mick West:  My name is Mick West. I am a former video game programmer and now I am a writer and a debunker of conspiracy theories, and I specialize in how to talk to people who believe in a conspiracy theory and how to help them escape the rabbit hole.

Perry Carpenter:  So, Mick, from the outside, it seems like conspiracy theories and disinformation and misinformation are pretty related. How should we think about these things in relation to one another?

Mick West:  A conspiracy theory isn't necessarily disinformation, per se, because a lot of conspiracy theories are organic, and the people who are promoting them actually believe what they are saying. They're not deliberately spreading disinformation, it's more like misinformation. Those two words, misinformation and disinformation are a little bit too close and are, naturally, going to be a little bit ambiguous. One of them is just information that is wrong, and the other one is information that is deliberately wrong, and conspiracy theories can be both.

Perry Carpenter:  Now, before we go any further, we need to talk a little bit about some science. We need to explore what it is within human nature that makes us susceptible to all of this. We need to understand just what is going on within our minds when we're tricked into sharing disinformation or when we buy into a conspiracy. We'll start with the work of psychologist Daniel Kahneman. In Kahneman's book, Thinking, Fast and Slow, he describes two types of thinking, System 1 and System 2. System 1 is very fast, it's emotion driven, it takes shortcuts, and it's great, but taking shortcuts means that our minds are constantly making assumptions, and that can lead to errors. So, System 1 is fast, but it's error prone, and System 1 is driven by heightened emotion or just relaxing into a decision and doing what feels right in the moment.

Perry Carpenter:  System 2 is much slower. It's more methodical, it takes effort, and we often don't like the mental process of putting in the effort. But it leads to better, more reliable results. The problem with all of this is that System 1 accounts for about 95% of our thinking and actions, and System 2 accounts for only 5% of our thinking. Think about that. About 95% of our thoughts and actions are governed by emotion and shortcuts, with a tendency to be error prone, and that's not a good ratio. Here's an example that just about everybody can relate to. Imagine yourself sitting at your desk, maybe drinking your morning coffee and scribbling down a few notes to remember later that day, and you set your pen down for a minute to focus on something else. And then, out of the corner of your eye, you notice your pen rolling towards the edge of the desk, about to plummet to the floor. At that point, without any conscious thought or effort, your hand shoots out, with Ninja-like reflexes, to snatch the pen from the precipice.

Perry Carpenter:  These are reflexes you didn't even know you had, reflexes other people would certainly be impressed by if they saw them. But there's just one problem. While in Ninja reflex mode, you accidentally swept away your coffee cup, causing a huge mess. Seriously, coffee everywhere, ruining the next five to ten minutes of your life. All because your mind never logically assessed the situation. It saw a stimulus and it reacted by taking a shortcut. It's these same types of mental shortcuts that are in effect when we're tricked into doing things like clicking on links in phishing emails, sharing clickbaity social media posts, or propagating misinformation, or even when we're enticed into diving into that message board that claims to have all the answers that will help us make sense of the puzzle pieces of our lives and the way that the world really works.

Perry Carpenter:  And what we know is that disinformation, misinformation, conspiracy theories, phishing and social engineering attacks, and all of the things like that, are most effective when they tap into an emotion or a pre-established system of thought, a bias. Because, in essence, cognitive bias is really just a mental shortcut. And here is where the proverbial rubber meets the proverbial road: these mental models intersect with the attention economy that we all live in, the attention economy of social media and the algorithmic encouragement of viral sharing, combined with the dopamine rush that we get when we share and when we see the "likes" pour in and the re-shares happen. And then we move into what's known as the filter bubble.

Bruce Schneier:  It's pretty clear that technology is increasing polarization just by allowing more discrimination. By that word, I mean the ability for people to segment themselves and to be segmented.

Perry Carpenter:  That's cybersecurity expert, Bruce Schneier.

Bruce Schneier:  You can live a lot of your life now and not come into contact with an idea you disagree with, except in a divisive way. That is something that just wasn't true before the modern Internet technologies. So, that is affecting things. We're not sure how but it, obviously, is.

Allie Wong:  Obviously, we're living in a really significant inflection point in U.S. political history. We are incredibly fractured and polarized in terms of right and left, and that, again, has just been a well-worn strategy of the Soviet Union, where, even throughout the Cold War, they would stoke and inflame both the right and left sides of an argument, whether it was something that had to do with civil rights issues or nuclear weaponry, or that kind of thing. That's pretty common. But we're seeing it now just because of social media and because of how accessible media at large is. People who otherwise might not be involved in political discourse, your proverbial aunt on Facebook, can now have an opinion and now have access to information, whether true or false, that she wouldn't have had access to in previous generations.

Mick West:  When you get this critical mass of people, then it becomes less something that loners do in their spare time and more something that's actually infecting the main population, which is rather a troubling thing, I think.

Allie Wong:  I think that's where you can do A/B testing as a disinformation creator. You can play around and see how something grows over the course of 24 hours and, if it's not getting traction, then you just create something else. The long game can be something that takes a week, rather than five years, like it has in the past. You have more opportunities to plant stories or to use [UNSURE OF WORD] trolls or things like that to help move a story.
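To make that iteration loop concrete, here is a minimal, hypothetical sketch of the dynamic Allie describes: float several variants of a message, measure which one gets traction over a 24-hour window, keep the winner, and repeat. Every name and number below is invented for illustration; this is a toy model of the selection pressure, not any real operation's tooling.

```python
import random

def simulate_day(engagement_rate: float, impressions: int = 1000) -> int:
    """Count how many shares a variant earns in one day of impressions."""
    return sum(random.random() < engagement_rate for _ in range(impressions))

def iterate(days: int = 7, variants_per_day: int = 2) -> float:
    """Greedy A/B loop: each day, keep whichever variant performed best."""
    best_rate = random.uniform(0.01, 0.05)        # the starting message
    for _ in range(days):
        # New candidate messages, each with an unknown (hidden) engagement rate.
        challengers = [random.uniform(0.01, 0.10) for _ in range(variants_per_day)]
        scored = [(simulate_day(r), r) for r in [best_rate, *challengers]]
        best_rate = max(scored)[1]                # the observed winner survives
    return best_rate

if __name__ == "__main__":
    random.seed(42)
    print(f"Hidden engagement rate after a week of testing: {iterate():.3f}")
```

The point of the sketch is the timescale: a greedy feedback loop like this converges on high-engagement content in days, which is why the long game can now take a week rather than years.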

Samantha North:  I think we all focus on politics a lot, and disinformation in elections gets lots of media coverage. But I believe that there's a real rise in disinformation for profit, and it's very closely related to commercial marketing. People are monetizing this in different ways. That could be anything from online advertising, where, as we all know, the more clicks you get, the more money you make. But I've also seen things like conspiracy theory websites that solicit donations from their fan base.

Mick West:  YouTube's algorithm used to be, essentially, blind. Like a blind monster just trying to guide you towards more clicks. And so, what happened quite naturally was that if you watched a video of one particular type, it would show you more videos of the same type. YouTube, in the past, guided people towards these more and more extreme and more compelling conspiracy theories, and it did so quite naturally; it wasn't really designed to work this way. It did it incrementally, so it would gradually lead you into it. Perhaps if it had shown you the more extreme video at the start, you wouldn't have fallen for it, but because you'd been led to it, it sucked you in. The good news is that YouTube doesn't really do that so much anymore with conspiracy theory videos. It probably does it with whatever the popular viral videos are now, but it's something they've taken deliberate steps to stop, and they try very much not to recommend videos in that vein anymore.
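Mick's "blind monster" can be captured in a few lines. Here is a deliberately naive, purely illustrative recommender, not YouTube's actual system: content sits on a 0-to-1 "intensity" axis, and the system always serves something slightly more intense than whatever you just watched. The escalation is never designed in; it falls out of optimizing each next click.

```python
import random

def recommend_next(current: float, catalog: list[float]) -> float:
    """Serve the item closest to 'a bit more intense than last time'."""
    target = min(1.0, current + 0.05)            # incremental escalation
    return min(catalog, key=lambda item: abs(item - target))

random.seed(1)
catalog = [random.random() for _ in range(500)]  # 500 items of random intensity
position = 0.1                                   # viewer starts on mild content
for click in range(20):
    position = recommend_next(position, catalog)
print(f"After 20 clicks, content intensity is roughly {position:.2f}")
```

Run the loop and the viewer drifts from 0.1 to near 1.0 in twenty clicks, each step small enough to feel unremarkable, which is exactly the "gradually lead you into it" effect described above.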

Pete Leyden:  There was a time not too long ago when the Internet was supposed to bring a democratization of media, a plethora of choices.

Perry Carpenter:  That's Pete Leyden. He's the founder of a company called Reinvent, and they focus on driving communications about what's coming within the next ten years and how we can fundamentally reinvent strategies and systems to deal with these new realities.

Pete Leyden:  We were supercharged; this was going to really transform the way media worked, all for the better. People would have more choices, the individual would be able to get all kinds of different perspectives on things, they'd be better informed, they'd be better consumers and, certainly, they'd be better citizens. But it didn't totally turn out that way.

Perry Carpenter:  This audio clip is from Pete giving an intro of Eli Pariser, who wrote about the filter bubble way back in 2010. Pete was gracious enough to give me permission to use these clips, and this is from a video that was taken in 2018 with Eli talking to a group of people about how to pop the filter bubbles that we all live in. And, of course, we'll have a link to the video in the show notes.

Pete Leyden:  Turns out that the individuals with all those choices would actually pick and choose exactly what they wanted, and the technology companies and the media companies learned very quickly how to give them exactly what they wanted. And because of that, we ended up with this situation where we really started to polarize, and got to the point where individuals weren't really understanding what other people were doing on the other side of their bubble. And, over time, this was getting to be a big problem. One of the people that saw this early was Eli Pariser, who's our speaker for tonight. He came out with his book in 2010, called The Filter Bubble. He started to see what was really starting to happen, how dangerous it could actually be, and how, over time, this could get very poisonous. This has gotten to be a problem for business, for society, and for our politics at large.

Perry Carpenter:  As Pete described, the Internet and many of the algorithms that fuel social media started with very good intentions: to share information freely and to allow people to find the things that they're interested in. But, as Eli describes, that algorithmic encouragement, finding the things that we're already interested in and showing us more of that, created the filter bubble. Eli describes the filter bubble like this.

Eli Pariser:  Increasingly, we're all experiencing media through the lens of algorithms that understand, or think they understand, who we are and what we want to consume, and that are, essentially, trying to serve us more of what we're most likely to engage with. This has a couple of pernicious effects. One is it's passive and it's invisible, so we don't know who these algorithms think we are, and we don't know on what basis they're ruling information in or out. And so, we don't know how to adjust for that skew. If I pick up a magazine, I know what the editorial viewpoint of the magazine is, but I know much less what the editorial viewpoint of Facebook is relative to me. And then I think the second challenge here is that we don't know what we don't see. The stuff that makes it through this membrane, at least we encounter all of that, but amid all of this diversity of ideas, we can lose sight of how far off of a common set of information we are.
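The feedback loop Eli describes, serve what the reader engaged with, then watch the reader engage with more of it, can be sketched in a few lines. This toy model, with invented topics and weights, is only meant to show how a feed with no editorial intent still narrows on its own:

```python
topics = ["politics", "sports", "science", "cooking"]
profile = {t: 1.0 for t in topics}               # reader starts with no preference

def build_feed(stories: list[str], profile: dict[str, float], k: int = 3) -> list[str]:
    """Serve the k stories whose topics the profile currently weighs most."""
    return sorted(stories, key=lambda s: profile[s], reverse=True)[:k]

stories = topics * 5                             # a uniform supply of stories
for _ in range(10):
    feed = build_feed(stories, profile)
    clicked = feed[0]                            # reader clicks the top story
    profile[clicked] += 0.5                      # engagement reinforces that topic
print("Profile after ten sessions:", profile)
```

After ten sessions the profile is dominated by a single topic, even though the supply of stories never changed; that invisible skew is the "membrane" Eli is pointing at.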

Perry Carpenter:  And Eli makes it clear that the filter bubble isn't something that is only technology driven. It's also being driven, as Bruce Schneier said, by self-selection in the way that we are deciding to live.

Eli Pariser:  So, this whole thesis that social media is driving polarization, I think it's a piece of this conversation, but I actually think, when we back up a couple of steps, it's not the only factor and maybe not even the most important factor in what's driving this phenomenon in our society.

Perry Carpenter:  What Eli is saying is very reminiscent of what Bruce said earlier: we are self-selecting, using our own biases, probably unconsciously, to choose the places that we live, the communities that we engage in, the conversations that we have, the news that we read, the Internet sites that we go to. So, we are intentionally creating some of these bubbles, and those bubbles are also being facilitated by the technology around us as it learns our preferences. In other words, we are creating, and allowing technology to move us into, both physical and digital tribes.

Eli Pariser:  Americans increasingly live in communities with people who think like them, with people who share their beliefs, with very little exposure to people who think differently. It creates this situation where people don't have relationships, not just across political divides, but also across class divides. Increasingly, we just live with people who are very much like us, which I say as a Brooklynite who, like, goes to my little coffee shop and drinks my fair trade coffee with my friends who are doing activism. I'm part of the problem.

Perry Carpenter:  This becomes very dangerous when you think about a problem like disinformation or misinformation, or conspiracy mindsets, because we're not allowing ourselves to be exposed, in healthy ways, to broad sets of ideas that might better inform our set of facts. Our trust is so limited and fractured, and our source base is so limited, that these filter bubbles create echo chambers that are extremely hard to get out of. And worse than that, we don't even want to get out of them. We want to be part of an in-group and we want to identify an out-group. We trust the people in the in-group, and we distrust, and may even demonize, people in the out-group.

Perry Carpenter:  It's this idea of secret knowledge, a deeper understanding of the way the world really works and the things that are really going on within it, coupled with this idea of segmentation, the creation of an in-group and an out-group, where people in the in-group really know things and people in the out-group really don't know what's going on. All of those things come together to form the main components of a conspiracy mindset. So, let's spend a minute talking about conspiracy mindsets and rabbit holes.

Samantha North:  People seek a framework to make sense of the world and on top of that, there's also this need that humans have to belong to an in-group, to be accepted by a tribe.

Allie Wong:  There's something called in-grouping and out-grouping, where you basically take an issue and create people on the inside and people on the outside. We understand, you don't understand. If you're not for us, you're against us.

Perry Carpenter:  And that's where something like belief in a disinformation campaign, or really holding fast to a conspiracy mindset, comes in. It's: I have this information, I have seen the truth, and anyone that doesn't believe it is just on the outside. They can't see it.

Allie Wong:  Conspiracy theories are falsehoods that are presented as truth. Truth that a particular group of people has a unique insight into, that everyone else doesn't understand.

Mick West:  To get drawn down the rabbit hole you essentially have to consume a lot of information.

Perry Carpenter:  That's conspiracy theory expert, Mick West.

Mick West:  You have to consume enough information to change your world view and, usually, this requires having a good degree of spare time. If you look at the people who are conspiracy theorists, they're all across the spectrum, but I think you disproportionately get people who don't have a full-time job. And the fact that they don't have a full-time job has meant, in the past, before the pandemic, that their social framework and their understanding of the world through other people is very limited. Because, if you're not actually engaging with people on a daily basis, you've just, essentially, got yourself and the Internet.

Perry Carpenter:  This idea of social isolation and disgruntlement feeding into conspiracy mindsets really makes a lot of sense when you think about the past several years: the increased political divide, people feeling like social issues have gone unaddressed, and then the pandemic, everyone at home, more social isolation than ever before, and then algorithms and systems that are constantly suggesting content and drawing people in. The more time people watch and share and click and like, the more the algorithmic gravity sucks them down the rabbit hole.

Mick West:  You've got these people who are often just sat at home and spending a lot of time reading things on the Internet, and not actually talking to other people in the real world.

Allie Wong:  Conspiracy theory, oftentimes, is bred in disgruntlement, with disgruntled groups of people who feel like they don't matter, who feel like there's been a chronic issue that has gone unrecognized.

Mick West:  Has it changed? I think, to a degree, it has, because I talked about the lack of social contact with people, and social contact gives you context about how the world works. But if you reach a critical mass of conspiracy theorists, where there are lots of people around you who believe these things, then it's really easy to get sucked into that, because it becomes, in a way, the default position. I think people in clusters, in social groups, where a number of people start talking about these conspiracy theories, maybe starting with the less extreme ones, get sucked into them just because it becomes part of the social circuit. And that's a stepping stone into believing all kinds of conspiracy theories.

Perry Carpenter:  Let's recap. We know that there are a lot of different kinds of lies being spread and traps being set. We've got disinformation, which is the intentional creation of a falsehood, usually done for some kind of geopolitical purpose, but it can also be broader than that. We've got misinformation, which is when you, I, or our proverbial cousin on Facebook unintentionally shares disinformation or false information. And we have malinformation, which is information that's true but is posted out of context, in a way that serves to further a false narrative. And then we've got conspiracy theories, which can contain any and all of the others. We also learned that our minds are extremely vulnerable to these types of deceptions because of System 1 thinking, that mode of thinking that we are in about 95% of the time, which is extremely fast but error prone, because it likes to take shortcuts and can be easily hijacked by emotion or laziness.

Perry Carpenter:  We know that technology, and the way that social media companies use algorithms to serve up the things that we're interested in or to keep us on the platforms longer, has contributed to the problem by creating filter bubbles, enhancing tribal thinking, and serving people increasing amounts of polarizing and conspiratorial content based on their prior engagement with that type of content. But, what can we do about it? A great question, glad you asked. We'll talk about that next time on 8th Layer Insights... Just kidding. Okay, here we go. Here's the reality. People like you and I can't do much about how these begin. We're not the ones creating and launching disinformation campaigns or coming up with entirely new conspiracy theories. And most of us don't have control over the technology that might be able to stop these types of campaigns. So, by the time you or I see or hear about any untruth, like a disinformation campaign, it's most likely already morphed into misinformation or a conspiracy theory that's already gained a foothold.

Perry Carpenter:  In this next section, we'll hear from our experts about what can be done, from having better conversations, to the hope of having better technical controls, to when and where government regulation comes into play. So, let's first explore the power of a simple conversation.

Samantha North:  There's this whole effect when you get behind your keyboard that really adds fuel to the fire, and people don't hesitate to get really offensive towards others, in a way they would not do face to face.

Allie Wong:  What does dialog look like? What does having a civil conversation look like?

Mick West:  I think that basic politeness is something that's very important to having a good conversation. If you meet a stranger in a strange land, you might address them as a friend: "My friend, could you help me with this?" This is perhaps something that other cultures do as part of their communication style, addressing someone as a friend. If you think of someone automatically as being on your side, then you view a conversation as working together towards a mutual goal, which is figuring things out. Whereas if you see the person you're talking to as some kind of adversary, or somebody who is stupid or crazy or being brainwashed, some kind of negative connotation, then it just doesn't work.

Samantha North:  Encouraging people with different views to talk to each other in person more often. I realize you can't do this on a mass scale, but I've always found that when people with very different views meet in person, they're usually not hostile in the way they are online. I think after COVID, people are going to want more physical contact. We've been stuck behind our computers for over a year.

Mick West:  There are certainly people who you can't reach that day. If you talk to them, there's nothing whatsoever you could say to them, or that you could show to them, on that day, or maybe even that week, or maybe even that month, that is going to make them change their mind. But that doesn't mean that you've got to give up, because people do change over time. You can help them change in the right direction and they can, eventually, get out of whatever conspiracy theory rabbit hole they are in, and I've seen it happen many times.

Perry Carpenter:  One of the most important things to keep in mind in our fight against disinformation, misinformation, malinformation, and conspiracy mindsets, is that our enemy really isn't the other person. The enemy is the false information. And so, as Mick said, there's a lot to gain by viewing the other person as our friend. When we adopt this lens, and when we realize that we're on the same side, then the enemy becomes very clear. The enemy is the falsehood. And the way to defend against that is to expose the truth, and adopting that mindset may even open new ways to help discover the truth. Let's listen in, as Eli Pariser describes an experiment that his company, Upworthy, conducted.

Eli Pariser:  I run a media company called Upworthy. We decided to actually test this out. About a third of our audience isn't into vaccines, or isn't fully convinced about vaccines, and we were running an article that was an argument as to why you should care about vaccinating your child. For half of the people who came to the page, at random, we showed this little box. It was almost ridiculous. It was this little box that just said, "Hey, you're a great person for consuming this kind of content. Good for you," and then we sent them on to read the article. The group of people who got that affirmation were much more likely to be open-minded about the information that was presented than the group of people who didn't. I think it sounds really cheesy and kumbaya to say, why don't we all just affirm each other? But I think, actually, psychologically, it's a really important part of the equation that we often skip over: how do you provide people that sense of warmth and security so that then, in a domain that's difficult, they can actually have a hard conversation?
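For readers who want to see how a randomized split like Upworthy's is typically evaluated, here is a standard two-proportion z-test. The visitor counts and response rates below are entirely made up for illustration; Eli doesn't report exact numbers in the talk.

```python
from math import erf, sqrt

def two_proportion_z(hits_a: int, n_a: int, hits_b: int, n_b: int):
    """Two-tailed z-test for a difference between two proportions."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Hypothetical numbers: 5,000 visitors per arm; 31% of the affirmed group
# and 26% of the control group responded open-mindedly.
z, p = two_proportion_z(1550, 5000, 1300, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")   # a gap this size would be highly significant
```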

Perry Carpenter:  Being open to real conversations is the key to success, and that means that we need to be able to admit where we are and are not experts, where we get our opinions from, what our sources are, and so much more.

Allie Wong:  I think that as long as we have conversation, there is hope for people to hear things.

Perry Carpenter:  That's Allie Wong.

Allie Wong:  Something that I've been saying to many of my close friends and family is, I think it's really important for us to understand that we're not all experts in everything. You can have a conversation on something, whether it's the economy or environmentalism or nuclear policy and, oftentimes, we enter into a dialog thinking that, just because we watch the news, we're an expert on something. But that's not true. I don't know a lot about the economy. I don't know a lot about environmentalism. I do know a lot about nuclear policy, and so I'd like to believe that if I enter into a conversation about economics or foreign policy, I can take a back seat and say, "Hey, would you teach me about that? Would you share with me from your expertise? Can you chime in on that, because I don't know?" And I think that we don't really enter into public discourse with humility often, but I think it's so crucial.

Mick West:  I think it's actually really interesting, talking about the Socratic Method, to go back and actually read the original Socratic dialogs. A lot of them are somewhat dated, obviously, but the actual techniques that are used there, even though it's an invented conversation, are this very gentle and polite exchange between two people. And what you want to do, if you're using the Socratic Method of talking to somebody, is to maintain that tone: two professional people politely discussing a topic. And if you ever stray from it, it doesn't matter if you feel like you're getting somewhere with it, you're probably not. If you're straying away from this politeness and you're starting to get a bit angry with each other, then take it back and perhaps go in a different direction. Talk about something else for a while and then guide it back towards the actual topic.

Allie Wong:  And in that, I think that we're seeing so much polarization, and everyone would probably acknowledge that, regardless of what side of the aisle you're on. Hopefully, we can get to a place where we say, okay, we've hit a groundswell and this has to stop.

Mick West:  If they tell you something, you want to ask them why they believe that thing. And that in itself is difficult to do without coming across as confrontational. You can't just say, "Why do you believe that ridiculous idea?" or "That's nonsense, why do you believe that?" which is the temptation. Instead of asking them why they believe it, you can start out by asking them to tell you more about it, to give more details. Lead them towards figuring out the answer for themselves. Rather than telling them the answer, guide them towards it. And you can also move the conversation away from the point that they raised directly, because often, when people raise issues, they're almost not relevant; they think they're making a good argument, but they're not, and it's almost a lost cause to try to keep going down that path. So, you can ask them where they got their information from. You obviously don't say, "Where'd you get your information?" like it's some kind of interrogation. But you ask them, "Where can I learn more about this?"

Mick West:  The idea is that you're asking, what are the sources that you use here, that you consider to be valuable sources? Then you can get to talking about what is a trustworthy source and what isn't, and why they think that, and the conversation can organically evolve from that point.

Perry Carpenter:  And that raises an important question. How do we bring in things like fact-checking? It seems like nobody really enjoys being fact-checked. So, it's probably important for us to explore some tactful ways to do fact-checking, ways to bring in other sources that won't immediately turn someone off or drive them to cling to their current set of sources even tighter. Let's start with Samantha North.

Samantha North:  We don't like to admit that we're wrong, really, do we? If you've wrapped your whole world view and your whole identity up in a certain set of views, and it all falls apart, admitting that it's failed is almost like admitting that you're a failure as well. I believe that can be pretty tough for a lot of people to come to terms with.

Mick West:  Sites like Snopes have become problematic, but they're still very useful sites, because you can use them as a source for references. If you want to check a claim, you don't want to just post a link and say, this has been debunked and here's the debunking, look it up, and post a link to Snopes that says "False," or post a link to PolitiFact or FactCheck.org or whatever it is. All these fact-checking sites are usually very good, and most of them are pretty accurate in their analyses. But they have been portrayed, quite deliberately in many cases, as being some kind of propaganda, and so the fact that something is debunked on one of these sites is evidence, in the mind of the conspiracy theorist, that it's actually true. If Snopes says something is false, that means it's true in their mind. So, it's not helpful to say, "Oh, that's been debunked on Snopes." Instead, you've got to go in and read the actual Snopes article, see what they did to debunk it and what the facts were, and try to get those facts to the person in a more neutral framework, from a more neutral set of sources.

Mick West:  Consider what effect the source is going to have on the individual, on your friend, when you try to give them information, and try to do it in the most neutral way possible. Something I find a lot with these more partisan or slanted sites is that the headline is often the misleading part of the entire article, and the first paragraph follows from that. But if you keep reading, you go down and skip a few ads and pictures and things like that, and often you'll find, near the bottom of the article, that they actually do a reasonable analysis of these things. So, if you dig deep into these articles and read the whole thing, you can find where they say, scientists say this, or other people have a different take on this. In the more extreme things, like the UFO cases, you will see that they will always start with "Amazing flying orb seen over the White House" and then, at the bottom of the article, "Officials say that at the time there was a balloon release over the White House and that was responsible for the sighting."

Mick West:  So, they often bury the lead, bury the debunk, and you've got to dig to find it, but you often can find it. And if you can find it in a source that is more palatable, that's great.

Allie Wong:  I think those sites are great, but I think there needs to be an iteration of those that perhaps hits other subtexts, maybe something that's more acceptable to people. Whether you're a Republican or a Democrat, black, white, male, female, there needs to be something that isn't just homogeneous in the way that it's presented.

Perry Carpenter:  And then, of course, there's the question of where technology comes in. How does technology help us combat this as we go forward as a society?

Samantha North:  That's the big question, isn't it? What can we do to avoid it in the future? The way that social media intersects with these dynamics is really fascinating and very powerful. I think it's risky to over-moderate things because, let's say QAnon is wiped off every mainstream platform, it's going to pop up on less mainstream platforms, right?

Allie Wong:  It's not something that we can really control on our end, especially for places like Russia and China that are unapologetically doing it. There was an exposé in The New York Times, I think, a few years ago, that talked about a troll farm in St. Petersburg: where it was, who was employed there, and the different kinds of media sources that they use all over the world. And it hasn't stopped it. It's not like exposing it changes it. And so, I think that might put the onus more on technology companies to get more insight into the behaviors of these trolls, whether it's something like natural language processing and looking at how they're using Google Translate to try to propagate their message, identifying users that are not really users, or things like IP addresses and tracking them down.
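One of the simpler behavioral signals Allie alludes to can be sketched directly: separate accounts posting near-identical text. Real platform detection fuses many signals (IP ranges, posting times, device fingerprints, translation artifacts); this hypothetical example, run on invented posts, shows only the text-similarity piece.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Invented example posts; in practice this would run over millions of items.
posts = {
    "user_a": "Breaking: officials HIDING the real numbers, share before deleted!",
    "user_b": "breaking: officials hiding the real numbers!! share before it's deleted",
    "user_c": "Lovely weather at the lake this weekend, photos soon.",
}

def flag_near_duplicates(posts: dict[str, str], threshold: float = 0.8):
    """Yield account pairs whose posts are suspiciously similar."""
    for (u1, t1), (u2, t2) in combinations(posts.items(), 2):
        ratio = SequenceMatcher(None, t1.lower(), t2.lower()).ratio()
        if ratio >= threshold:
            yield u1, u2, round(ratio, 2)

for pair in flag_near_duplicates(posts):
    print("possible coordination:", pair)
```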

Samantha North:  The social media platforms obviously have a massive role to play. I think they're the primary actors in this. And it's helpful if certain content just becomes unavailable on the platform, because then, basically, fewer people will see it. Also, no platform really owes anything to disinformation creators, or to any individual who uses it. It's a private company and it doesn't have to give you air time. The problem, though, with the engagement-driven business models that the platforms currently have, is that it's tough to say what the incentive would really be for them to do this on a really huge scale, because they don't want their engagement to drop, do they? Because that means that revenue drops as well.

Perry Carpenter:  And when I asked Bruce Schneier about this and related topics, he kept coming back to regulation as one way to try to leverage economic factors as a partial remedy.

Bruce Schneier:  This gets back to regulation. It seems odd that we would organize our societal political discourse around the short-term financial benefit of a handful of tech billionaires. That seems a really weird way to organize our political system, and to organize politics.

Allie Wong:  I definitely will give credit where credit's due, and I think we are seeing this largely as a reaction to what's been going on in the past few years in big tech, seeing the double-edged sword of having so much information. They've implemented things like intelligence functions within their companies: analysts who are doing content moderation, looking at patterns of radicalization and extremism and the spread of conspiracy theories. In terms of solutions, I don't know what the solution is, but I do know that it's not just big tech. It's not just government, and it's not just civil society. I think it's a blend of all three: cultivating digital literacy within civil society, the government stepping up and putting in more regulations, again, easier said than done, and then technology companies not just having the bottom-line dollar in view, but seeing that, even though it might be more costly, they do have the responsibility to moderate content, to keep people safe, to keep their platforms from becoming agents of radicalization.

Mick West:  I think there's a combination of factors. One factor is monetary. Advertisers do not want to have their ads run on conspiracy theory videos. And you could simply run these videos without ads, but I guess Google decided it was better to not have these videos show up in people's feeds so they could, instead, show something else that does have an ad. If you've got somebody watching YouTube, do you show them a conspiracy theory video that you can't monetize, because the advertisers don't want it? Or do you show them an advert for Viagra, which you can monetize and get money from? And so, they choose to show a video that's safe. So, there's a degree of money involved. I think there are also, obviously, a lot of people within Google and Alphabet who want to do good and want to improve the amount of true information in the world, and that's part of it too. And then, there's also the specter of regulation.

Mick West:  If it became clear that spreading this disinformation was essentially inciting riots, which is what happened, then YouTube could potentially be held liable for allowing misinformation to spread or, at least, regulation could be introduced that could severely curtail YouTube's ability to exist. There's the idea in law now that, if you provide a platform for people to post things, then you are not a publisher, and so you're not responsible for what they are publishing. But there have been moves to try to change that, and then YouTube would have to do much more rigorous policing of their material. So, by doing some policing now, they steer away from the necessity of the government introducing stricter regulation, which would be financially a lot more crippling down the road. So, again, I think there's very much a financial incentive for them.

Bruce Schneier:  Yes. I would like to see government regulation here, because the for-profit model of political speech isn't serving our country very well.

Perry Carpenter:  Okay, as we get ready to wrap up, I'm going to give the last word to Allie Wong. She had an interesting thought: it's actually good that we're wrestling with terms like disinformation, misinformation, and conspiracy, because wrestling with them opens up conversations about the need to know more and do more.

Allie Wong:  I think having these terms enter the mainstream lexicon is important, but also for people to really understand the concepts and understand the need for digital literacy, the need to be critical thinkers. That's something that's going to enrich us as humans, but also enrich our experience as we navigate what we consume.

Perry Carpenter:  I think that sums it up pretty well. Unfortunately, most of us can't really control the technology, and most of us can't stop disinformation and conspiracy theories at their sources. But what we can do is engage in meaningful conversations, and we can focus on relationships. Beyond that, we need to realize that we are living in an attention economy. Social media platforms make money by attracting eyeballs to screens, and the algorithms behind social media engagement aren't restricted to encouraging positive engagement. So, it's our job to change the economic dynamic wherever and in whatever way we can. And then, we also need to be ready to engage in meaningful discussions about when and where regulation and oversight are appropriate. These are, obviously, really complicated problems, and there is no single answer. But open conversation, rather than shouting matches, polarization, and filter bubbles, is probably a good place to start.

Perry Carpenter:  With that, I'd like to thank my guests, Allie Wong, Bruce Schneier, Mick West, and Samantha North, and also thanks to Pete Leyden and Reinvent for allowing me to use audio from Eli Pariser's talk. And thank you for listening. If you'd like to get in touch, please reach out to me on LinkedIn. I also participate in a weekly chat on Clubhouse in the Human Layer Club. Just search Clubhouse for Human Layer, and you'll see it. Be sure to check out the show notes for links to all the relevant information from today's show, additional reference material, and more. Thanks for listening to 8th Layer Insights. If you liked today's show, we'd be really grateful if you'd rate us and leave a review on Apple Podcasts. Doing so helps appease the algorithm gods and also helps other people find our show. You can subscribe for free on Apple Podcasts, Stitcher, Spotify, or wherever you like to get your podcast fix. Until next time, I'm Perry Carpenter, signing off. 8th Layer Insights is a proud member of the CyberWire Network.