2022 Jailbreak Security Summit: Danny Rogers: Disinformation and Its Threat to Democracy.
We are at a watershed moment in the history of information. The internet has given historically underrepresented voices unprecedented access to tools for self-expression, new platforms to build communities, and new capabilities to speak truth to power. But in response, our social internet is now being corrupted, exploited, and weaponized by those with the power to control the flow of information and distort reality. Marshall McLuhan and others predicted the rise of "fifth generation warfare" as far back as the 1970s, and now we are seeing social media provide the "perfect storm" of disinformation, with deadly consequences around the world. In an era where the dominant internet business models reward, to an extreme degree, engagement above all else, and where these engagement-driven algorithmic feeds warp the realities of well over half the world's population, no less than humanity's progress since the Enlightenment itself is threatened. Dr. Rogers will talk about how we arrived at this crisis point, how to define disinformation in the first place, and what can be done at an individual, commercial, and global policymaker level to combat this scourge.
Dr. Daniel J. Rogers is the co-founder and CTO of the Global Disinformation Index, a non-profit focused on catalyzing change within the tech industry to disincentivize the creation and dissemination of disinformation. Prior to founding the GDI, Danny founded and led Terbium Labs, an information security and dark web intelligence startup based in Baltimore, Maryland. He is a computational physicist with experience supporting Defense and Intelligence Community Cyber Operations, as well as startup experience in the defense, energy, and biotech sectors. Danny is an author and expert in the field of quantum cryptography and has published numerous patents and papers on that and other subjects. Prior to co-founding Terbium Labs, Danny managed a portfolio of physics and sensor research projects at the Johns Hopkins University Applied Physics Laboratory. He holds a bachelor’s degree in Math and Physics from Georgetown University and a Doctorate in Chemical Physics from the University of Maryland, and he is an Adjunct Professor at New York University in their program on Cybercrime and Global Security and a Security Fellow at the Truman Project on National Security.
Transcript:
Danny Rogers: Thanks. Hi, everybody. I'm Danny. I have some good news and bad news - no fake news, I promise.
(LAUGHTER)
Danny Rogers: The good - I'll just start with the bad news. So the bad news is we get to be a little bit uncomfortable now. You'll see why. The good news is this is just before lunch. So you know that the discomfort will end at a - very soon, I promise. And I do take to heart the responsibility of being the speaker before lunch. So I promise to be efficient with my discomfort. So really quickly, who I am and why I'm here - so I am actually a physics guy by training, a weird thing to be. I spent about six years just up the street that way, I think, at the Applied Physics Lab, one of the esteemed sponsors and, I'm sure, employers of a number of people in this room. After that, I left and, about 2013, started a dark web security company called Terbium Labs. We're based up in Baltimore. Maybe you heard of us, I don't know. Now, that's part of Deloitte. And then, I went off to start this nonprofit that is why I'm here to talk to you today. And why am I here talking about disinformation at a security conference? So this is where it starts to get a little uncomfortable because disinformation is an information security problem in a way, in that if you think about sort of the fundamentals of information security - right? - the C, I, and A - the confidentiality, integrity and availability - right? - well, it's kind of an integrity problem - right? - an integrity of information problem. But this is where it starts to get uncomfortable. It's different than other security problems. So whether you're talking about, you know, confidentiality or availability, these are binary things. Either the data is secret, or it's not. I have root; I don't have root. The server's up; the server's down. Even integrity - right? - the hash checks out; the hash doesn't check out. And if you think about a lot of the integrity problems that we model in information security, they're these binary models, right? Like, we're going to, you know, change our grade or change the health record or something like that, right? In a way, disinformation is an integrity attack. But in an uncomfortable way, it's not a binary thing. And we'll talk about what it is in a bit. But before we actually talk about what it is, let's kind of try to figure out why we're here and why in the year 2022, we have an information security conference about disinformation. So I actually also teach a class on disinformation and narrative warfare at the NYU Center for Global Affairs. And a question I ask in my class is, why are we - why does the NYU Center for Global Affairs have a class on disinformation? Now, they didn't have one 20 years ago - 'cause people lying to each other is not new. In fact, who here has heard of the story of Antony and Cleopatra? Come on. I hope more of you. Yeah? OK. There we go. All right.
(LAUGHTER)
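To make the binary-versus-graded contrast from the CIA-triad discussion above concrete, here is a minimal sketch in Python. It is illustrative only and not from the talk: the signal names, weights, and scoring scheme are hypothetical assumptions.

```python
import hashlib

# Classic integrity check: binary. The hash either checks out or it doesn't.
def integrity_ok(data: bytes, expected_sha256: str) -> bool:
    return hashlib.sha256(data).hexdigest() == expected_sha256

# Disinformation as an integrity problem: not binary. A hypothetical risk score
# aggregates graded signals (narrative framing, source history, coordination)
# into a value between 0.0 and 1.0 instead of a yes/no verdict.
def disinfo_risk(signals: dict, weights: dict) -> float:
    total_weight = sum(weights.get(name, 0.0) for name in signals) or 1.0
    weighted = sum(weights.get(name, 0.0) * value for name, value in signals.items())
    return max(0.0, min(1.0, weighted / total_weight))

doc = b"quarterly report contents"
print(integrity_ok(doc, hashlib.sha256(doc).hexdigest()))  # True - pass/fail
print(round(disinfo_risk({"adversarial_framing": 0.8, "source_history": 0.4},
                         {"adversarial_framing": 0.6, "source_history": 0.4}), 2))  # 0.64 - a spectrum
```

The point of the sketch is that the second check returns a score on a continuum rather than a pass/fail verdict, which is what makes disinformation an awkward fit for traditional integrity tooling.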
Danny Rogers: We've read a few books in high school. So I hate to break it to you, but that's a disinformation campaign that's about 2,050 years old. So in, like, 25 B.C. or something, Octavian was competing with Mark Antony for the throne, to succeed Julius Caesar. And Mark Antony had this kind of well-known sort of, you know, side affair with Cleopatra. And Octavian started an entire disinformation campaign to discredit Mark Antony and usurp his inheritance of Julius Caesar's throne. And so he printed up a fake will, attributed to Mark Antony, in which Antony left all of his wealth and land, et cetera, to Cleopatra. He printed coins that supposedly had Mark Antony's face on one side and Cleopatra's on the other, and he distributed this. He went to the Senate. He went to the Roman population. And he smeared Antony's name saying, here's this, you know, supposed Roman general who's now selling out your country to this Egyptian person who, by the way, was dark - (inaudible)...
(SOUNDBITE OF MIC TAPPING)
Danny Rogers: ...Was dark-skinned and a woman, which was very resonant within Roman society. And the disinformation was so effective that he not only became the ruler, but we are still telling that story today as if it were true. So it's not new, right? But something is different. There's a reason we're talking about it now versus talking about it 20 years ago. And I think the line in the sand - like, the year, or the month, actually, that kind of everything changed - I put at about May of 2012. So a few things happened in May of 2012. The first thing that happened was the sham reelection of Vladimir Putin in Russia. So I don't know if any of you remember this, but the reelection was very - sorry, I'm trying not to trip over the wire - very obviously staged, right? There was, you know, plenty of video footage of his goons stuffing ballot boxes. It was just generally accepted this was not real. And so what happened was the largest protests in modern Russian times - so, since the revolution - in Bolotnaya Square in Moscow. And unlike a lot of the previous protests, Putin couldn't, like, arrest people and shut them down. He kept trying and trying, but they were actually organized organically on Facebook, much like the Arab Spring protests, much like the things we had been observing in the previous years. And it frankly put the fear of God in him.
Danny Rogers: He decided - because at the same time there were people from the U.S. State Department going around the world talking about the sort of effects of what they were calling connection technology threatening autocrats. And he decided at that point that Facebook must have been a tool of the CIA to overthrow him. And he wasn't alone. A lot of people in power were watching what was happening with the Arab Spring, with other kind of, you know, popular uprisings around the world enabled by this - you know, what they were calling connection technology and kind of decided that it was a major threat and it sort of must be neutralized. And thus sort of entered the era of what, you know, people - thinkers much smarter than I am on this, but called sort of fifth generation warfare, sort of network on network warfare, where if you think of, you know, third generation of army versus army, fourth generation, think the global war on terror, which is army versus network, now think network versus network where we've been fighting this war, and yet we're all soldiers, and we don't even know it.
Danny Rogers: And so that's kind of one big thing that happened, and I put that line at around May of 2012 when sort of the world's established authoritarians and people in sort of entrenched power felt threatened and started to employ social media to fight back. The other thing that happened in May of 2012 is Facebook went public. And if you remember back to 2012, there was actually a lot of skepticism. So here's a headline from 2012 - May 15, 2012, in the Wall Street Journal. Michael Bellinger, a lawyer from Oklahoma City, invested his personal money in the stock market. But he'll be skipping the Facebook IPO because he thinks its valuation is totally, quote, "out of whack." Scott Schermerhorn, chief investment officer of investment management firm Granite Investment, says the hype around Facebook's IPO is going to keep his firm away. It's a cult stock. Just remember this when you trust your financial advisor. Yeah.
Danny Rogers: And so remember, there's a lot of skepticism. And so this company, whose business model - selling literally just people's attention - everyone was scratching their heads at, had a lot to prove. And they set out to show their 3% growth, or whatever percent growth a quarter, at any cost. And the third thing actually happened not quite in May of 2012, but around the same time. And it was the release of the Facebook social contagion experiment. Does anybody remember this? One hand - just one hand - couple hands, awesome. Yeah.
Danny Rogers: So this was a really interesting and disturbing release. Facebook ran an experiment on 689,003 users where it showed they could affect the content shown to those users: more negative news feeds led to more negative status messages, and more positive news feeds led to more positive statuses - meaning they could actually make you feel things. And a lot of the research ethics world was pretty up in arms about this because, you know, here was a private company doing essentially human subject research. And when asked if they did Institutional Review Board review and all of the ethical things required, they said, no, it's our data. We can do what we want. And, you know, they didn't correct for things like, what if you had a user who suffered from depression or had suicidal tendencies, et cetera. But they also learned a bunch of stuff at that time too, right?
Danny Rogers: So when researchers reduced the appearance of either positive or negative sentiments in people's newsfeeds - when the feeds got generically less emotional - those people stopped writing so many words on Facebook. Make people's feeds blander and they stop typing things into Facebook, aka less engagement, aka less money to Facebook. And so they learned a lot too, right around the time they were going public, about what it takes to make money on the internet. And it basically takes emotion, right? That's what gets engagement. And in fact, subsequently, if you have followed the Frances Haugen whistleblower documents, they discovered specifically that negative emotion drives more engagement than positive emotion, to the tune of, like, 5 to 1. And so in the Facebook newsfeed algorithm, when you dislike something or express anger at something, that something gets five points, versus when you like something, in which case it gets one point.
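As a concrete illustration of the reaction weighting Rogers describes from the Haugen reporting, here is a minimal sketch of a feed ranked by weighted engagement. The 5:1 anger-to-like ratio is taken from the talk; the Post structure and ranking function are hypothetical and are not Facebook's actual code.

```python
from dataclasses import dataclass

# Reaction weights as described in the talk (per the Frances Haugen reporting):
# an anger-style reaction counts roughly five times as much as a plain like.
REACTION_WEIGHTS = {"like": 1, "anger": 5}

@dataclass
class Post:
    post_id: str
    likes: int
    angry_reactions: int

def engagement_score(post: Post) -> int:
    return (REACTION_WEIGHTS["like"] * post.likes
            + REACTION_WEIGHTS["anger"] * post.angry_reactions)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Higher weighted engagement floats to the top of the feed.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([Post("calm", likes=100, angry_reactions=0),
                  Post("outrage", likes=10, angry_reactions=30)])
print([p.post_id for p in feed])  # ['outrage', 'calm'] - 160 points vs. 100
```

Even with ten times fewer likes, the angrier post wins the ranking, which is the dynamic the rest of the talk builds on.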
Danny Rogers: So all these things are sort of happening. And now if you fast-forward to 2022, you have well over half the world's population all getting their sort of information about the world from one of a handful of algorithmically mediated social media platforms - from the Facebook newsfeed to, you know, TikTok, YouTube, Twitter, et cetera. All those things are optimized to maximize our negative emotions. All those things are personalized to each individual one of us. We've essentially, at this point, hit the monkeys-and-typewriters phase of the internet, where there's an infinite sea of content for these algorithms to choose from. And it's kind of no wonder that none of us can agree on basic things like the Earth being round, and that we're fighting all the time. Not surprising. So that's why we're here talking about this today. But what is this? What is disinformation? So this is where, again, we get a little bit uncomfortable because in the security community, we're really used to binary problems. And so we like to think about disinformation in a really simple, true-false kind of way, right? So think, people lying to you on the internet, right? That's what a lot of people kind of use as the sort of, you know, stand-in definition for disinformation. But that's way too simple - right? - because if it was really just somebody lying to you on the internet, then we'd be clamoring to moderate every mention of Santa Claus off of Facebook. We're clearly not doing that. Sorry to disappoint anybody who didn't realize that Santa Claus was a lie that we tell on the internet. In fact, every Christmas Eve, the U.S. government puts up the NORAD Santa tracker - again, sorry to disappoint you, but that's not real.
(LAUGHTER)
Danny Rogers: Is that a U.S. government-funded disinformation campaign?
(LAUGHTER)
Danny Rogers: In all seriousness, though, it's not, right? And so that means that we have to think about a much more nuanced and much more uncomfortable definition of disinformation that is not as simple as true or false. So here's a relatively infamous page from a relatively infamous website called Breitbart News. I'm not going to editorialize on Breitbart News, even though it is technically my job to do so. But I want to point out something about this page. So this is a section on their website. You can go look at this right now. Each one of these stories is, you know, fact-checked to be correct. It's a local news story about a crime - you know, some petty crimes, some not-so-petty crimes - committed by an undocumented immigrant. And if you press Breitbart on this page, they'll say, this isn't fake news. Each one of these is true. Now, my nonprofit, the Global Disinformation Index, characterizes this website as having a high risk of disinformation because this page is doing something that we consider to be disinformation. So what is it doing? It's peddling what we call adversarial narrative conflict. And this helps us get to a much more useful definition of disinformation. So anywhere where somebody is intentionally, often implicitly, spreading a narrative that is intentionally misleading, adversarial in nature and creates a risk of harm, that's disinformation. And so what is that Breitbart page trying to convey? Even if every story on that page is technically true, by cherry-picking - you know, cherry-picking stories about individual petty crimes committed by undocumented immigrants - what they're trying to convey is this intentionally misleading narrative that undocumented immigrants disproportionately commit crimes, which is statistically incorrect. And by putting that together in that way, they're trying to walk this line and say, we're not lying to you. We're telling you the truth. But, in fact, they're telling you a misleading narrative and, more importantly, one that's adversarial against immigrants. And most importantly, that sort of adversarial narrative against an at-risk group is creating a risk of harm, because that's the kind of content that radicalizes people online and leads to things like the shooting in El Paso. And so anywhere where there is an expected risk of harm, anywhere where there's an adversarial narrative against an at-risk group - and that at-risk group could be, you know, children. It could be science. It could be democracy. That is disinformation as we define it at our nonprofit. And that is how we track it. We found it to be much more useful than a simple true-false test. And so other examples, right? Technically, they're quoting Rudy Giuliani. And we've actually had this legal argument with Breitbart before - that they would say, that's a technically true headline. But it's actually, again, peddling an anti-democratic narrative when they're not putting it in context and providing the full story, et cetera, et cetera. So now that we have a working definition of disinformation - and by the way, I'm more than happy to argue with you about this. So please...
(LAUGHTER)
Danny Rogers: Please, challenge me. Seriously, it's an informal place. If you have a question, shout it out. It's totally fine. We don't have to wait until the end. But I've stunned you into silence, I can see. So let's keep going. So why do people do this, right? There's actually - I like this sort of two-dimensional plane of what I call the disinformation threat landscape, or the threat spectrum. As I see it, there are sort of four quadrants of why people engage in this sort of adversarial narrative conflict. There are political actors who do this for kind of ideological or geopolitical reasons. And then there are financial actors who do this for purely financial reasons. And, in fact, as we've talked about, the entire internet business model is built on using engagement metrics to build audience and monetize that audience. And it turns out nothing builds audience better than fomenting conflict, driving anger and then soaking up that emotional engagement for either advertising dollars or merchandise or direct donations, however you can monetize that audience. And so there's a whole kind of side of the spectrum of financially motivated actors. Some of them are highly centralized influence operators like Cambridge Analytica. And some are individual entrepreneurs like this guy. So I'm actually going to take a break from talking, and I'm just going to play you this interview with this guy from 2018. My friend Cameron Hickey at PBS NewsHour tracked one of these individual entrepreneurs down and interviewed him about his job. And I'll let his words speak for themselves.
(SOUNDBITE OF ARCHIVED RECORDING)
Judy Woodruff: Now to our deep dive on the continuing problem of false or misleading news or what you might call junk news. Much of the attention recently has centered on Facebook. And yesterday, the company's founder and CEO, Mark Zuckerberg, told Wired magazine that it may take up to three years to fully prevent all kinds of harmful content from affecting people's news feeds. Tonight, Miles O'Brien's latest report profiles a man who has been a leading purveyor of junk news and how he has been exploiting Facebook to reach an audience. It's part of our weekly series on the leading edge of technology.
Unidentified Person #2: There has been a shooting...
Danny Rogers: Hang on one second. Did you catch that, by the way? Zuckerberg said in 2018, it might take three whole years. It's been a long three years.
(SOUNDBITE OF ARCHIVED RECORDING)
Unidentified Person #2: ...At a high school in Parkland.
Cyrus Massoumi: Right now, we have about 5,300 people and change on the website.
Miles O’Brien: It was a busy day at the office when we met one of the internet's most prolific distributors of hyperpartisan fare.
Cyrus Massoumi: Actually, in a story like this, we do actually beat the mainstream media for these types of breaking-news events.
Miles O’Brien: It was the day of the high school shootings in Parkland, Fla. And as the horrific events unfolded, Cyrus Massoumi was spinning facts reported by others to fit the worldview of his audience.
Cyrus Massoumi: You can see that, like, he's wearing a Make America Great Again hat.
Miles O’Brien: Right.
Cyrus Massoumi: And he has lots of photos of guns, so obviously, this is going to be a very controversial issue.
Miles O’Brien: His site is called Truth Examiner, and it caters to liberals with headlines like this designed to entice clicks on stories with little substance. His writers are among the five most successful at luring those clicks on Facebook.
Miles O’Brien: People want to read those headlines to reaffirm their beliefs, right?
Cyrus Massoumi: Correct.
Miles O’Brien: And that's - that is not rocket science, is it?
Cyrus Massoumi: It's not rocket science, but doing it faster and better than your competitors is an art.
Miles O’Brien: Lately, Truth Examiner has added something else to the formula - a steady stream of conspiracy theories, ironically accusing the Trump administration of peddling fake news. Massoumi has thrived in this murky world for eight years, hedging his bets, serving up grist for liberals and conservatives through various Facebook pages.
Cyrus Massoumi: They want, like, 250-word, like, little hit them and go. It's like - basically, like a coke addict. Every hour, he just needs to, like, you know, get that little dopamine rush. Like, a fan on the conservative side or the liberal side needs to take out their phone, look at it - oh, Trump sucks. Trump sucks so bad. All right. All right. I'm done. I'm done. And then - right? - like, that's it. That's it.
Miles O’Brien: People don't care about the facts.
Cyrus Massoumi: Yeah, of course. People don't care about facts. Take it to the bank.
Miles O’Brien: He estimates he has spent over $1 million in ads reaching over 100 million people and has made several million dollars by selling that audience to advertisers on his own site and on Facebook.
Miles O’Brien: Do you create fake news?
Cyrus Massoumi: No. No, I don't.
Miles O’Brien: Tell me what it is then.
Cyrus Massoumi: Always inflammatory, like, excluding facts from the other side but never fake - my team, they don't cover news angles which are favorable to opposition, in the same way that CNN would never cover a favorable angle to Trump, or MSNBC.
Miles O’Brien: He lives in the home where he grew up, on a nine-acre vineyard in Napa, Calif.
Cyrus Massoumi: We grow a brand of cabernet, which is, I'm told, very nice. Although, I'm not a wine person.
Danny Rogers: I fucking hate this guy.
(SOUNDBITE OF ARCHIVED RECORDING)
Miles O’Brien: He is a self-described cultural libertarian, freethinker and lover of politics. For him, it all started in high school. He was selling anti-Obama T-shirts and decided Facebook was a good way to reach more customers. It worked. He learned how to build an audience on Facebook, dropped the T-shirts and created Mr. Conservative, his first hyperpartisan site.
Cyrus Massoumi: So I am a marketer with a love of politics. And, you know, I contend that, like, marketers will be the king of the future of media. I think that the danger is not the Russians or the Macedonians but that the actual danger is when you have a marketer who doesn't love politics.
Miles O’Brien: Producer Cameron Hickey found Cyrus Massoumi during our 16-month investigation of hyperpartisan misinformation on Facebook. Cameron's key reporting tool - software that he wrote that analyzes social media looking for the sources of what we call junk news.
Cameron Hickey: It's clear that a lot of the publishers are domestic. And I think we've given a lot of attention to Russian disinformation or Macedonian teenage profiteers, but both of those groups, I think, learned it from these guys. They learned it from Americans who have been long profiting on partisan information or other kinds of junk.
Miles O’Brien: Social networking allows us all to bypass the traditional arbiters of truth that evolved in the 20th century.
Danah Boyd: Historically, our information landscape has been tribal. We turn to the people that are like us, the people that we know, the people around us, to make sense of what is real and what we believe in.
Miles O’Brien: Computer scientist Danah Boyd is president and founder of Data & Society.
Danah Boyd: And what we're seeing now with a networked media landscape is the ability to move back towards extreme tribalism. And there are a whole variety of actors, state actors, non-state actors, who are happy to move along a path where people are actually not putting their faith in institutions or information intermediaries and are instead turning to their tribes, to their communities.
Miles O’Brien: Cyrus Massoumi's first big jackpot exploiting this trend toward tribalism was linked to yet another mass shooting at a school. This one in Sandy Hook, Conn., in 2012. In the midst of that horror, he bought a Facebook ad that asked the question, do you stand against the assault weapons ban? If so, click like. Those who did became subscribers to his page, ensuring his content would rise to the top of their newsfeeds. He had bought thousands of fans at a very low price.
Cyrus Massoumi: I felt, subsequently, that I built my first business sort of, if you want to call it, on the graves of young children who were killed.
Miles O’Brien: Well, how do you feel about that?
Cyrus Massoumi: I don't know. How do people feel about things that they do badly? I feel bad about it, but, I mean, we do what we do to pay the mortgage, right?
Miles O’Brien: The strategy Massoumi helped pioneer spread like virtual wildfire. By 2016, marketers, political operatives and state actors were all using the same playbook of hyped headlines, political propaganda and outright falsehoods.
Danah Boyd: They were all in an environment together, a melting pot, if you will, and with a whole set of really powerful skills, when they saw a reality TV star start to run for president. And that's pretty funny. That's pretty interesting. And so it was fun to create spectacle.
Miles O’Brien: The stage was set for the 2016 presidential election, and an unprecedented misinformation campaign waged on several fronts. Back in Napa, Cyrus Massoumi was doing well, running a conservative page called Truth Monitor, along with the liberal Truth Examiner. Massoumi says anger is what generates likes, and conservative stories were more lucrative.
Cyrus Massoumi: Conservatives are angrier people.
Miles O’Brien: Tell me about that.
Cyrus Massoumi: You ever see a Trump rally on TV?
Miles O’Brien: Yes.
Cyrus Massoumi: Yeah. It's gold.
Miles O’Brien: But since the election, the conservative side of Massoumi's business has dried up. His site that used to offer that content has moved into feel-good stories. He says competition among conservative hyperpartisan sites created a junk news arms race, making the content too extreme to be ranked favorably by the Facebook newsfeed algorithm.
Cyrus Massoumi: On the conservative side, I think that we were, at one point, publishing low-quality clickbait. That's what the conservative side devolved into.
Miles O’Brien: Is it unpatriotic to do it?
Cyrus Massoumi: To publish low-quality clickbait? I think that people like what they like. And my goal, at one point, was to deliver to them what they like. And unfortunately, the reality of that is, is that people are prone to go for the lowest common denominator.
Miles O’Brien: But for Cyrus Massoumi, the target really doesn't matter so long as he hits the mark. Stirring up anger, no matter on which side, is very good for business.
Danny Rogers: I've watched that so many times, and I still fucking hate that guy.
(LAUGHTER)
Danny Rogers: But think about a few things that he said, right? So, I mean, where do you start? There's so much in there. One is, like, he's making millions of dollars doing this from his parents' basement.
(LAUGHTER)
Danny Rogers: Always, for some reason - in Napa.
(LAUGHTER)
Danny Rogers: His nine-acre fucking vineyard in Napa.
(LAUGHTER)
Danny Rogers: Anyway, I'll get it together, I promise.
(LAUGHTER)
Danny Rogers: He clearly - like, clearly has his own political bias but also, like, will happily feed anybody anything they want, right? And he made this sort of comparison to, you know, coke addicts, but, like, never really made the mental leap that that makes him the coke dealer. It's kind of interesting. But I think - and then another takeaway, obviously, is, like, he lived or died off the Facebook newsfeed algorithm, right? Like, at one point, it stopped doing as well, and there's too much competition, and Facebook tried to change something, and suddenly, he had to completely change his tactics - right? - just to show you sort of how much power Facebook has over the information environment we all experience. And then the last thing I'll point out, and my friend Cameron, I think, you know, expressed this, is that, like, he's one of thousands - thousands upon thousands of people who do this for a living and make a really good living doing it. And in fact, if you followed the Alex Jones lawsuit recently, you know, what was really useful about that lawsuit was that it brought out, you know, through discovery, like, all the details. Like, Alex Jones, probably one of the most successful people at doing this exact thing - I think he was worth, like, $800 million just from selling, like, you know, doomsday butt wipes on his website, for all the people who are being driven to his site through these, you know, algorithmic social media channels.
Danny Rogers: And so, like, this is a business. This is the business of the internet. This is what the internet's for. This is what it's designed to do. And this is what's making guys like Cyrus Massoumi and Alex Jones and Mark Zuckerberg and, you know, everyone else, millions and millions and billions of dollars. The problem is the harm is real. And we sort of laugh at him, but, like, people's lives are destroyed by this stuff - whether it's their families, whether it's, you know, the Sandy Hook - the parents of the Sandy Hook victims, whether it's, you know, kids not getting their polio vaccines. I mean, my God, we have polio back in New York City. Like, what century is it, right? The harm is real. And so while we laugh, like, this - I mean, I really think this has the potential to, like, end the Enlightenment if we don't really think about it seriously. And what Danah Boyd was saying in that interview about, you know, the move back to tribalism - like, that is the dark ages. That is the end of, you know, a shared sense of objective reality. So I don't mean to be hyperbolic - like, I really do believe this is that important, which is why I, you know, changed careers for it, honestly.
Danny Rogers: So what do we do about it? This is kind of where I get to be a little bit optimistic before we all break for lunch. But I will say that, like, one of the things to notice - the last thing to notice about that TV clip is that it was made in 2018 and, like, not much has changed. Maybe it's no longer as much on Facebook. And now, it's more on TikTok - which is, by the way, you know, the fastest-growing source of news for 10- to 14-year-olds, if you want a scary thought. But overall, the business model hasn't changed. The laws haven't changed. The platforms haven't changed. And, you know, Zuckerberg's had his three years. Not much has changed, right? And so, you know, how do we fight it, right? What we've been doing I don't think has been enough, although we're starting to finally make headway, I think, you know, as of maybe this year, last year. But there's a lot of focus on fact-checking - journalists looking at these, you know, specific articles or specific claims and saying, well, there's a lie on the internet. Well, we already talked about why identifying lies on the internet is not enough. It often actually, you know, entrenches people further, or - a lot of the time - when someone calls out a lie, it just drives traffic to the original source, et cetera. So I think there are a lot of limitations to that approach. Plus, it is manual, and it takes forever. And you know, the sort of adage - right? - the lie travels halfway around the world before the truth puts its pants on. And I think that applies to that approach.
Danny Rogers: There's also a giant chasm, I think, of mistrust, right? So every time, you know - and for good reason - every time, you know, Zuckerberg gets up and says, we're going to solve this problem, we don't believe him anymore. And clearly, I think we've all kind of come to the understanding that the business model is the problem. And these companies are kind of in this business. There's a bunch of commercial efforts. I'm not super convinced that the commercial efforts are going to do the job, partly because it's sort of the commercial business models that got us here in the first place. And you really have to break this in a different way. And I'll talk about how we do it - or how I think we should do it - and some things we're working on. And plus, I don't know that there's, like, a commercial market for, you know, countering fake news, which is why when we started the Global Disinformation Index, we did so as a nonprofit five years ago. And then also, you know, there's sort of an ongoing debate of, does it require people? Does it require technology? And I'm firmly in the camp that it requires both. And so our nonprofit uses both people and technology, because it has to scale, but it also has to involve human judgment. And so that's what we do.
Danny Rogers: I'll tell you 30 seconds about my nonprofit, the Global Disinformation Index. We're an independent, transparent, neutral not-for-profit. It's global in nature. We're co-headquartered in New York and London, about 40 people around the world on, I believe now, every continent except for Antarctica - that's coming soon. Just kidding - funded by a variety of both philanthropic and government funds. And we exist to risk rate the open web, basically, for disinformation. So we have kind of three main categories of work that we do. I'll quickly just, you know, touch on this, but I don't want this to be about me or our work. Like I said, we produce open web risk ratings based on our definition of disinformation - although now we're starting to move into apps and YouTube channels and broadcast TV as well. We provide that data to platforms, ad tech companies, et cetera, to give advertisers - monetizers - the tools to choose what content they get to monetize based on disinformation risk. In 2019, we estimated that the open web advertising ecosystem that supported disinformation was about a quarter billion dollars a year. And fast forward to, you know, 2022, and we think it's down by about 70%. So we think it's down to about 100 million a year or less. Other things we do - we do open source intelligence work for government partners and platforms, a very select, small group. And we do a lot of policy work. And I'll talk a little bit about the policy work that we do.
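To illustrate how a monetizer might consume this kind of risk rating, here is a minimal sketch of a pre-bid domain filter. The domains, tiers, and threshold below are hypothetical assumptions, not GDI's actual data format or API.

```python
# A sketch of an advertiser-side filter that consults domain-level
# disinformation risk ratings before bidding on ad inventory.
RISK_RATINGS = {
    "reliable-news.example": "low",
    "hyperpartisan-clickbait.example": "high",
}

TIER_ORDER = {"low": 0, "medium": 1, "high": 2}

def allow_bid(domain: str, max_acceptable: str = "medium") -> bool:
    # Unknown domains default to the most cautious tier.
    tier = RISK_RATINGS.get(domain, "high")
    return TIER_ORDER[tier] <= TIER_ORDER[max_acceptable]

for d in ("reliable-news.example", "hyperpartisan-clickbait.example", "unknown.example"):
    print(d, allow_bid(d))  # True, False, False
```

The design choice worth noting is the default: a domain with no rating is treated as high risk, so advertisers opt in to known inventory rather than unknowingly funding unrated sites.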
Danny Rogers: But overall, how do we combat this? I think it falls into three categories. One is media literacy, as people talk about it. But I just sort of talk about it as giving talks like this, talking about the problem, shining light on the problem, making everybody understand what is shaping the information environment in which we live today. So, how do you find out about the world? It's realizing that everywhere you find out about the world is nudged, is influenced, is pushed around by the current internet business model - even if you're watching television, right? Because, remember, television is competing for the same set of eyeballs that's also looking down at its phone at Breitbart or at Drudge or any of these other websites. And so they have to compete, and they have to do the same thing, even if they're not actually on the internet. And so there are actually interesting efforts, whether it's what First Draft was doing in educating journalists, whether it's the AARP educating retirees, whether it's this other new project - The Disinformation Project - educating high school kids. It's a long-term investment in making sure everybody understands how their information environment is shaped. There's platform-level interventions. That's where we tend to live as an organization. But there are others like us. There's groups that do public advocacy, there's groups that do private advocacy, some of each. And then there's what I call the ground war of disinformation: countermessaging.
Danny Rogers: So taking up the space - you know, if you have to buy space on the news feed, do it. Because while you're waiting for these longer-term investments to play out, you can't just let the other side win. It's very expensive. The algorithms give, you know, the adversary, so to speak, a 10 to 1 advantage - and that's actually a number that has been measured - meaning, if you're putting out, you know, adversarial narrative content, you have a 10x return on investment compared to content that isn't, because that's what these attention-driven algorithms give you. And so it's hard. It's expensive. But it's what we do while we're investing in everything else. And so some examples of, you know, others in the space doing this important work - you know, people putting out Instagram ads, trying to, you know, provide positive messaging; people providing data, providing information to kids through libraries, et cetera; people doing public advocacy, like when some others in the space managed to get the Gateway Pundit kicked off of the Google ad network - that was kind of a big win.
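A quick back-of-the-envelope reading of that 10-to-1 figure: if adversarial narrative content gets roughly ten times the reach per dollar, counter-messaging needs roughly ten times the budget to match it. The multiplier comes from the talk; the spend figures below are illustrative assumptions only.

```python
# Illustrative arithmetic for the "10 to 1 advantage" mentioned above.
ALGO_MULTIPLIER = 10  # reach advantage attention-driven algorithms give adversarial content

def counter_budget_for_parity(adversarial_spend: float,
                              multiplier: int = ALGO_MULTIPLIER) -> float:
    """Counter-messaging spend needed to match the adversarial campaign's reach."""
    return adversarial_spend * multiplier

print(counter_budget_for_parity(50_000))  # 500000.0 - ten dollars for every one the other side spends
```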
Danny Rogers: To me, all of this, though, you know, comes down to this sort of issue that's, I think, actually a crisis of government sovereignty more than anything else. We are in this era where these platforms at this point have, I think, as much, if not more, power than most, you know, most national governments in the world. So you think about, for example - and this is only partially documented because it's been really hard to report on - but there have been some pretty intrepid reporters talking about how, you know, when Facebook moved into India, they had access to, you know, a huge new market. And in order to get access, they promised, you know, the Modi government - or it wasn't the Modi government at the time, but it became the Modi government - help in running Facebook-based political campaigns. And so you think about the power that that platform holds. And you think about - going back to what I was talking about at the beginning - a world in which this connection technology was threatening established power, and how established power has responded by trying to gain control of that information environment after being threatened by it. And it's no surprise that people like Elon Musk have just bought Twitter, because if you want to make sure you control and maintain your power, you have to own the platform. And so whether you're Putin tasking Prigozhin to, you know, fill Facebook full of garbage so nobody trusts it so that they don't organize protests on it, whether you're Elon Musk saying, I want to own this platform to, you know, shape the world to my worldview, that is a huge amount of power. And so I think one of the crises we're in is that the solution to that is the collective power of our own democratically elected government. And it's not in writing fake news laws - right? - none of this sort of stuff that various other authoritarian regimes are doing around the world to kind of turn this issue against itself. It's really in changing the internet business model.
Danny Rogers: So you have, you know, this small number of companies that have a huge amount of power. They have unfettered access to the, you know, information about us that they need to personalize our experiences to this extreme degree. And most importantly, I think they have this complete lack of liability for what they do. I'm sure many of you have heard of Section 230 of the Communications Decency Act. What they often call the 26 words that created the internet basically give internet platforms a complete waiver from liability for whatever happens on their platforms. And so if Facebook recommends a white supremacist group, and 65% of that group's membership comes from the Facebook recommendation engine - which, by the way, is what actually happened - they cannot be sued for that. If they're, you know, implicated by the United Nations in being involved in the genocide of the Rohingya people in Myanmar, they can't be sued for that. And so these sorts of reforms are the things that I think we need to change the internet business model and to honestly protect the Enlightenment itself. And you can feel free to read more at the link above, but this is getting to kind of the policy work that we advocate for at our nonprofit.
Danny Rogers: And there are already some successes in this area brewing. The European Union, for example, just passed the Digital Services Act. This is a very promising new piece of legislation across Europe that, between it and its companion legislation, the Digital Markets Act, implements a lot of these changes. These start to go down that road of liability, of antitrust, et cetera. But it's just a start. And it remains to be seen kind of how strongly they'll be enforced. But governments are starting to push back. It's a challenge because a lot of the companies in this space are American companies. They're very powerful. They - kind of, you know, whether it's through lobbying, whether it's through, you know, branding or whether it's through just, like, the story of American innovation - carry a lot of political weight. And so I think we're kind of in the fight for our lives in the fight for the internet right now in the U.S., because this is where the most important legislative efforts need to happen and aren't happening yet. And then, just to kind of leave you with the most optimistic note: it gets worse from here. Because as bad as it is with these big American platforms, TikTok is taking over. TikTok is more addictive. It's more popular among young people. It takes away all the pretense, right? I mean, who here actually spends time on TikTok? Really? Nobody?
Unidentified Person #3: Wrong crowd.
Danny Rogers: Oh, right, right, right.
(LAUGHTER)
Danny Rogers: I look at TikTok from six months ago, right? But it's addictive as hell. And it doesn't have anything - it's just the videos. It's just 30-second videos, and you can't look away. And there's no transparency. And there's no accountability, because the company is not - you know, there's a subsidiary, but the company is ultimately Chinese-owned. And so we can make all the laws we want and, you know, good luck. And so it's a scary time in this sense. Like I said, there are some signs of hope. But, right now, I'll put it this way: like, I'm optimistic for the long-term future because we're all having conversations like this. I'm 50-50 on whether we make it. And that's my really uplifting talk for you before lunch.
(LAUGHTER)
Danny Rogers: But I wanted to leave a bunch of time to have questions, debate. Please challenge me on this. Please beat me up. And then maybe we can get some lunch a little early. Let's go here first.
Unidentified Person #4: So just, real quick, I applaud you for standing up there and doing this. This is a really interesting topic, something that I spent a couple years studying. And one thing that - I spent time in the '90s doing counterdrug and counterinsurgency work in South America and realized the things we did then did absolutely nothing, from policy to, you know, direct action and things like that. So I see a corollary - this is just me - with what's happening today with disinformation. If we don't - if we don't come - put something in place to demonetize disinformation - do you see what's happening with drugs now? It's - I mean, what we did then did absolutely nothing, absolutely - just pissed off a bunch of people, screwed up Venezuela (inaudible). It's just...
Danny Rogers: Not wrong.
Unidentified Person #4: It's really horrible. And now you sit back, and you look what's happening here in this space. And I don't - do you know Deb Hurley? She's a Harvard professor, also teaches at Brown. She was actually my mentor and my adviser. She spends a lot of time - she lectures on the legislation for that European Union disinformation. So she has some really great - some opinions in this space. But her whole point is - and I believe it as well - it boils down to education, educating our kids, introducing this a lot earlier on. And then, you know, obviously providing the proper educational boundaries so that we're teaching the right things. And that's where, you know, we spend - I believe - we don't spend enough time or effort or money on educating our children because they're the ones that are most influenced by what's happening here. So I just - I'm just curious about your thoughts on that.
Danny Rogers: Yeah. I mean, I completely agree. I think there's probably another level of detail in that. Like, so on the one hand, I kind of used to say, like, the kids are all right, right? The kids grew up in this digital native world. They know what it takes to put something on the internet, so they know, like, to be skeptical, right? But I'm not sure that being skeptical of everything you read is the solution either, right? Because at some point you stop believing everything, which is almost in many ways the goal, right? And so I agree. And in fact, these sort of - what motivates me and a lot of my colleagues is honestly this - specifically, that one right there, the effect of TikTok on kids. It's horrifying, honestly. Like, the mental health crisis that is brewing with - for children right now who have access to TikTok is terrifying. And so I agree.
Danny Rogers: But I also think, like, you know, efforts by the AARP are also equally useful because, you know, it's also, like, the generation that grew up sort of believing everything that came out of the glowing box was true. That's also part of the problem, right? I mean, I'll tell you a story. Like, my mother-in-law came to visit, you know, back when we lived in Baltimore. She came down to visit from New York. And she stayed with us for a few days. And I remember, like, I cooked us all breakfast one morning on my cast iron pan. This is going to be a really random story, but I promise it's going somewhere. So we eat breakfast. And she goes and she takes my cast - my really nicely seasoned cast iron pan - and she goes to the sink and starts scrubbing it with the...
Unidentified Person #4: No.
Danny Rogers: I know, right? My heart broke. It was - I like jumped up. I was like, what are you doing? She says, but diseases. And I was like, what are you talking about? It's been at 300 degrees on the stove, like, there's no diseases. And then I realized, like, when was the last time you all saw a Clorox commercial, right? Like, they zoom in on the little cartoon diseases, and they're like, buy Clorox. They scared the crap out of you. Like, your kids are going to get diseases - like, oh, those diseases, you know? Like, if it came out of the glowing box, it was true. She's from that generation where it came out of the glowing box, therefore it must be true. And now we have this glowing box where any schmuck like me can put something on it, you know, and suddenly, like, they don't know what to think anymore. So I don't think it's, like, one versus the other. I just think, generally speaking, this is a really knotty problem where everyone kind of brings their own - you know, the kids know not to believe what's on Facebook, but then they go believe what's on TikTok 'cause people are kind of dumb, right? And, like, you can still tell them, eat salad, eat salad, and they're still going to go to McDonald's and get french fries. And so I think it's much more about the structural pieces as much as the education, which is why I think it's all of the things I've talked about together.
Danny Rogers: I think we had a question here first. So - yeah, right here.
Unidentified Person #5: Yeah. I was going to say - so, like, in addition to (inaudible) past disinformation, you're going to be, like, immediately accused of peddling that disinformation in some cases. So how do you - I don't know - convince people of your intentions?
Danny Rogers: So we generally don't do a lot of interfacing with the public, in large part because I don't think it's productive for precisely the reason you just outlined. Like, I don't want to be part of the - and, in fact, there is a part of our industry that, like, actually profits from getting into fights with Dan Bongino and Tucker Carlson. And, you know, they're on one side getting death threats, but the secret is those death threats feed the business model, right? And every time they get a death threat, they get a $100 donation, right? And so, like, we've taken ourselves out of that, and we work, like, behind the scenes very much with industry - with the reasonably minded people, who are not screaming on social media. And we generally stay off of social media except for just, like, the generic release - so here we have a report, a really dry, data-driven, boring report, right? And so I don't actually think it's productive to yell and get into, you know, fighting matches about whether Drudge sucks or not, right? That's playing into the business model. So we kind of do much more institutional-type work to try to, again, change the structure of what's shaping our information environment versus, like, participating in it, if that makes sense.
Danny Rogers: All right. I think here, and then we'll go there, and then we'll go here.
Unidentified Person #6: So disinformation is like a drug, right? Outrage, you know, gives you a hit, like cocaine. The war on drugs failed. You know, the war on drugs was about, you know, making sure they can't distribute it, don't get drugs into the United States, whatever - huge failure. Why do you think the war on disinformation is going to be a success if disinformation is like a drug and you're just taking the exact same approach? And also, doesn't it not only fail but make things worse, because when there is a prohibition, the people dealing that prohibited substance become very rich and powerful, like the cartels (inaudible)?
Danny Rogers: So these are really interesting analogies, and we in the information security world live in the land of tortured analogies. So I...
(LAUGHTER)
Danny Rogers: I mean, that's all we do, right? Everything is so abstract now, all we do is torture analogies all day. So I think the drug argument - first of all, like, don't even get me started on, like, the whole war on drugs. Like, that's a whole - we could spend an entire lifetime arguing about that. But I will say, like, I am 100% pro-free speech. Our organization does not go around saying these should not be on the internet. These people should not be able to publish. All we're saying is the people who pay for that content should have a choice about what they pay for and be fully informed. And part of their problem - and this is the reason I always say, like, antitrust is part of the solution - is that if you're a marketer and you want to go advertise on the internet, you just take a third of your budget and you give it to Google, and Google does whatever, and you have no say. And, in fact, a couple of summers ago, you may remember that, you know, 200 of Facebook's largest advertising customers had a revolt. They said - you know, they started this whole #StopHateForProfit campaign. And Zuckerberg was - and this is a Bloomberg headline - like, defiant in the face of a revolt by 200 of his largest advertisers. Like, any company that can be defiant in the face of 200 of their largest clients clearly has too much market power, right? And like - and so to us, it's about providing the data and the structural solutions to, like, let the market start to take care of the problem itself. And it's not...
Unidentified Person #7: (Inaudible).
Danny Rogers: This is the thing that always kills me. Like, I am actually not for, you know, banning Trump from the internet, right? I just don't think Trump should have unfettered access to everyone's megaphones. And, you know, if you remember, he started his own blog after he lost his Twitter account - a blog which he voluntarily took down because no one was going to it. Like, if that doesn't tell you that, like, the problem is the algorithm, not Trump, I don't know what other data you need. Like, you know, I wasn't for him censoring himself. He decided to take his own blog down because it wasn't getting the traction he wanted. So, like, I am not for banning the information. I'm not for, you know, again, these free-speech-infringing fake news laws. Those are almost always - in fact, 100% of the time - tools of authoritarians for political repression. What we focus on is the monetization, the market structure, the business model that creates this problem in the first place. This is a toxic externality of a business model. This is, you know, dumping waste in the river after making, you know, house paint. This is fast food causing an obesity epidemic. This is not - you know, that's why I think the war on drugs analogy ends at a certain point. And we're not talking about bans. Like, we're not talking about, you know, banning this information. We're talking about changing the market structure, providing the data to try to rebalance this unbalanced market and stem these toxic externalities. So I know it's a little nuanced, but that's why I think the analogies only take you so far.
Danny Rogers: See over here. I think right there.
Unidentified Person #8: So kind of going back to your (inaudible) end analogy versus, like, (inaudible) now, have you seen, like, any data on what the market actually targets? Are they targeting the older generations that are more likely to believe, or are they still trying to feed disinformation to the younger generation that are (inaudible) out? And how successful are they in peddling disinformation to each audience?
Danny Rogers: So this is like the Cyrus Massoumis of the world? Who are they targeting?
Unidentified Person #8: Yeah.
Danny Rogers: I mean, I think it's very experimental, and I think they just sort of put stuff out there and let the algorithm - like, the algorithm does that for you, right? Like, you saw the part where he put up this ad. Everyone who clicked like on that ad subscribed to his page. Like, they just self-selected. Who knows what the demographics are? And so I think, again, going back to, like, the core problem: you have an algorithm that gives five points to anger and one point to a like, and when you scale it up to 3 billion people, like, it creates this world in which we all have this personalized, highly polarized reality. So I don't think you can say, like, everyone in that world is doing this or that. They're all just putting stuff out into, you know, the algorithm and letting it do the work. Let's see - I think we have here, and then we'll go there, and then her.
Unidentified Person #2: I have a question about what I feel like is a structural problem. One of the parts I find very interesting (inaudible) it would be an extremely lucrative way to do business. Do you think that it's possible, like - how do you change that? 'Cause I feel like people are going to do whatever they want.
Danny Rogers: How do you go back? How do you go back from this, right? And I don't think, like - I don't think we're going to put the targeted advertising genie back in the bottle, right? Procter and Gamble loves it too much, right? And it works well enough that they're willing to pay all this money for it. That's why I think, like, you regulate it. Like, you use the collective democratic power to put boundaries on it, right? Because you could have made the same argument 100 years ago saying, well, I can't have, you know, my chrome bumper unless I put the hexavalent chromium into Baltimore Harbor. And the fact is, you could - you know, if you took the collective power of our democratic government and asserted it against these corporate interests and said, you're going to learn how - and then they will. I mean, look - like, you think this is a crisis? Like, let me tell you about climate change, right? We're all fucked, right? And it's the same argument, right? Like, it's really, like, how do you get oil companies to stop pumping, you know, fermented dinosaurs out of the ground? Well, we all have to collectively demand that. And whether that's our spending pattern, we don't - here's one thing you all can do. Like, delete your Facebook account today, you know. What? You're laughing because no one here would ever have a Facebook account? Yeah, same. All right. But that's how - I mean, I think that's the answer, right? Let's see. We had over here. OK, then let's go here and then there.
Unidentified Person #9: Do you feel like it's harder to get people to believe what you're saying - that the algorithm is what controls this - with, like, older crowds or younger crowds? Like, do they get that there's actually an algorithm controlling this, and it's not them actually thinking - that it's the actual one point for a like, or whatever, as opposed to one hundred?
Danny Rogers: That's a hard question. I think it depends on who I'm talking to. Like my mother-in-law, I've said it to. And she hears me and then like, just goes right back to reading the Daily Mail. You know, like, what can I do, right? But like, when I teach the class, like, my goal with my class is like by the end, everyone's deleted their Facebook account and thrown their Alexa in the garbage. And I'm usually pretty successful. So let's see. I think we had a question here. Yeah, right there - oh? One more? OK, let's do one more. Then I'll be around for lunch. We can chat more, too.
Unidentified Person #10: So you mentioned your (inaudible) content moderation on social media? (Inaudible).
Danny Rogers: Oh, God. We only had time for one more question. That was - of all the questions? Like, it's a really hard problem. I will tell you this. Like, I will actually go back to, like, the drug analogy, right? Because I think the one strategy that has actually worked is what's called harm reduction. And so I think, like, a harms-based approach to this problem is the way you start when you think about content moderation or any kind of policy question about this. It starts with first acknowledging - right? - there's this thing that kills me, where people talk about online content and real-world harm as if, like, what happens online is not the real world. And this is sort of a mental jump that, like, we're still, as a society, trying to make. It's like, oh, it's on the internet, it's not real. Like, saying "it's on the internet" is like saying, well, it's in the newspaper - it didn't really happen. Like, it makes no sense, right? But we still think that. We still think that the internet's a video game. It's not. It's the real world. These are real people communicating, you know, real stuff. And so I like to talk about offline harm or, you know, even online - I don't think we should be making this distinction between, you know, online and real world.
Danny Rogers: And so I think, like, starting by understanding that expected risk of harm to say, well, this anti-immigrant content has been demonstrated to drive radicalization and ultimately violence - right? - is where these policies need to start. And this is, I think what, you know, I mean, I don't think Elon Musk gets anything. And he's P.T. Barnum who thinks he's Tony Stark. Like, he doesn't understand that. But I think this is where, like, the policy people need to start thinking about it. And there the drug analogy does help, because harm reduction in combating drug use is actually highly effective. And there is a framework for it. So I think that's all the time we have for the big Q and A. But I'll be around for lunch, so I would love to chat with you more about this. I might step outside so I can, like, air out my face, but this has been awesome. This went way better than I thought. I thought I would freak you all out, and you'd hate me. So I'm really glad that it doesn't seem like you all do. And so thank you for your time and enjoy lunch.