Caveat 3.3.21
Ep 67 | 3.3.21

Microtargeting as information warfare.

Transcript

Jess Dawson: When we look at the social media space, there isn't anybody really in charge. There's no regulation over this data in many ways. And so we're really at the whims of the social media companies to do anything in this space.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at efforts to keep the government from tracking your location. I've got a story that wonders if gathering Congress' cellphone records is constitutional. And later in the show, my interview with Major Jess Dawson from the Army Cyber Institute on microtargeting as information warfare. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's dig into some stories this week. What do you have for us? 

Ben Yelin: So my story comes from Recode, part of the Vox family. It's by Sara Morrison. And this is about an inspector general report from the Treasury Department saying that government agencies purchasing data from what this article refers to as data launderers may be unconstitutional under the Fourth Amendment. And that might offer users of various applications some protection from this data laundering. 

Ben Yelin: So just to give a little bit of background - obviously, with all the apps that we use on our smartphones, they are collecting our location data. And that location data is very valuable not only to private companies, but also potentially to the government. It's been used by a bunch of different government agencies, mostly on the law enforcement side, so the Department of Homeland Security, the FBI, but also the IRS in terms of tax enforcement. 

Ben Yelin: The reason they call it laundering is it's not like these companies sell the data directly to the IRS. It's sort of the way, you know, when a company - when you get a mortgage and it's, you know, through a certain bank, they'll package your mortgage with a million other mortgages and sell those... 

Dave Bittner: Right, right. 

Ben Yelin: ...Down the street. 

Dave Bittner: Before the ink is dry, that - your mortgage has been sold to another company, right? 

Ben Yelin: Exactly. 

Dave Bittner: Right. 

Ben Yelin: And that's exactly what's happening here. So you're part of a treasure trove of data. You know, when you logged on to your Panera application to place your lunch order and, you know, you weren't conscious of your location settings when you were doing that, that just starts the long journey of your data being aggregated with millions of other people. You are a data point, and you've been sold and sold and sold again. 

Ben Yelin: So there's this company, X-Mode, which has an SDK that was present in a lot of different applications, ones that I've never used. I don't know if you've used CityMaps2Go. 

Dave Bittner: Nope. 

Ben Yelin: But it was used in a lot of applications. And it turned out that X-Mode was selling a lot of location data, more than your average application, down this stream that ended up in the hands of government agencies. And they were subsequently banned from the Apple App Store and from Google Play. 

Ben Yelin: The problem is you can't stop these companies fast enough before new companies pop up and start selling your data in the same way. So, you know, you're not going to get much protection from the App Store. They're just not capable of doing it. It's not really in their interest to do it because that information is so monetarily valuable. 
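
For a concrete picture of the pipeline Ben describes, here is a minimal, purely illustrative Python sketch of how a third-party SDK bundled into a host app might quietly batch location pings for resale. Every class, field, and identifier below is hypothetical; this is not X-Mode's actual code or API.

```python
# Purely illustrative sketch of the data pipeline described above.
# All names (LocationPing, LocationSDK, the ad_id value) are hypothetical;
# this is not X-Mode's actual code or API.

import json
import random
import time
from dataclasses import dataclass, asdict


@dataclass
class LocationPing:
    """One data point: a device identifier, a position, and a timestamp."""
    ad_id: str  # the device's advertising identifier
    lat: float
    lon: float
    ts: float


class LocationSDK:
    """Stand-in for an analytics SDK embedded in a host app.

    Each time the host app does anything (say, places a lunch order),
    the SDK records a location ping alongside it.
    """

    def __init__(self) -> None:
        self.buffer: list[LocationPing] = []

    def on_app_event(self, ad_id: str, lat: float, lon: float) -> None:
        self.buffer.append(LocationPing(ad_id, lat, lon, time.time()))

    def flush_to_broker(self) -> str:
        """Batch the buffered pings into one payload for resale.

        A real SDK would POST this to a broker's ingestion endpoint;
        here we just serialize it to show what changes hands.
        """
        payload = json.dumps([asdict(p) for p in self.buffer])
        self.buffer.clear()
        return payload


# Simulate one user ordering lunch a few times during the week.
sdk = LocationSDK()
for _ in range(3):
    sdk.on_app_event(
        ad_id="a1b2c3d4",  # hypothetical advertising ID
        lat=39.28 + random.uniform(-0.01, 0.01),
        lon=-76.61 + random.uniform(-0.01, 0.01),
    )

# One batch like this, aggregated with millions of others and resold down
# the chain, is the "treasure trove" Ben describes.
print(sdk.flush_to_broker())
```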

Ben Yelin: But you may be getting protection from the government. So this inspector general report said that if the government is going to purchase this data, they should be required to obtain a judicial warrant. And the reasoning relates back to a Supreme Court case that we've talked about probably a hundred times on this podcast, Carpenter v. United States, which says that... 

Dave Bittner: It's the case that keeps on giving. 

Ben Yelin: It just keeps on giving. 

Dave Bittner: (Laughter). 

Ben Yelin: We'll probably talk about this 200 more times, you know, in the next year. But it was such a groundbreaking case in this field. And that said that the government needed a warrant to collect cell site location information because of the depth and breadth of that information and how so much of that collection is involuntary on the part of the user. And this inspector general report analogized that to data laundering that ends up in the hands of government agents. 

Ben Yelin: So I can already anticipate your questions since I know you so well, Dave, but the... 

Dave Bittner: Go on. 

Ben Yelin: So what happens now, which is always a great question - you know, inspector general reports are sort of advisory in nature. It kind of depends on either the agencies themselves doing something about it or Congress doing something about it. So we could see changes at the agency level. It could be an administration-wide effort to, you know, require a new standard for purchasing this data from these data brokers. 

Ben Yelin: We could also see congressional action. A Senator I think we mention very frequently on this podcast, Ron Wyden - this has been a pet issue of his, and, you know, he's done his own work trying to track location data as it's sold in this marketplace. And he's put together legislation to try and require the government to obtain a warrant before they purchase this data. So certainly not out of the question that we could see legislation. 

Ben Yelin: But I think this inspector general report is the first indication that it's not just the data privacy advocates who have a problem with this process and with the system. It's members of our government and its oversight bodies themselves. 

Dave Bittner: I have to say that I feel a little bit let down by Senator Wyden, who is calling his legislation the Fourth Amendment Is Not For Sale Act. 

Ben Yelin: Where is the acronym, Ron? 

Dave Bittner: It's - that's FAINFSA (ph), which means nothing. 

Ben Yelin: No. 

Dave Bittner: Come on. You got to put the effort in here, right? I mean, he's got to have some staffers who need something to do to come up with some clever acronym, right? 

Ben Yelin: I mean, it's so easy because Fourth Amendment is FA. 

Dave Bittner: Right. 

Ben Yelin: And, you know, you can do so many things with that. And if he's looking for free - some free consulting services on acronyms, I think you and I would certainly be available to him. 

Dave Bittner: Yeah, yeah. Absolutely. 

Dave Bittner: The meat of this issue, just to be sure that I'm understanding this, is that if, for example, the government wanted to get my location data, and they wanted to do it in the old-fashioned, traditional way, they would have to get a warrant, right? 

Ben Yelin: Yes. That's right. 

Dave Bittner: If they wanted to do it directly, they would have to get a warrant. And so by going through these data brokers, it's sort of an end-around. It's a back door. And the notion is, because this data is already out there for sale, anybody can buy it. 

Ben Yelin: Right. It's commercially available, yup. 

Dave Bittner: Right, right. OK. 

Ben Yelin: Yeah. 

Dave Bittner: And presumably, I've given permission to have this data sold because I clicked through a EULA somewhere. 

Ben Yelin: Yeah, permission in quotation marks. 

Dave Bittner: Right, right, right. 

Ben Yelin: Yup. 

Dave Bittner: So I've sacrificed my Fourth Amendment right by clicking through on the EULA. 

Ben Yelin: Yeah, that's technically true. We really want to order our lunch, so we're not going to, you know... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...Read the terms and conditions when we download that new application 'cause I'm thinking about the delicious sandwich I'm about to order. And, you know, so you are agreeing to things. I don't think most people understand the full scope of what they're agreeing to. 

Ben Yelin: And, you know, it's not - for people who've read the EULAs, I don't think it's readily apparent how extensive this data sharing is. You know, you can say, by pressing the I agree button, you're consenting to this application and the app store potentially selling your location data. I don't think that quite captures what's going on here in a way that would be understandable to a layperson. 

Dave Bittner: Yeah. 

Ben Yelin: So I think that's a problem as well. 

Dave Bittner: Right. You're not really allowing them to sell your location data. You're allowing them to sell your first born, and you didn't even know it (laughter). 

Ben Yelin: Right. Yeah. I mean, unless the warning's, like, in all-caps letters... 

Dave Bittner: Right. 

Ben Yelin: ...And, you know, references this article and our podcast and every article that's been warning about this, I just don't think it's something that would jump out to the average consumer, who doesn't read these things anyway. 

Dave Bittner: Yeah. All right. Well, it's an interesting story. We'll have a link to that in the show notes. Again, that's from Recode over at Vox, written by Sara Morrison. 

Dave Bittner: My story this week - this comes from The Intercept, written by Ken Klippenstein and Eric Lichtblau. And it's titled "FBI Seized Congressional Cellphone Records Related to Capitol Attack." This is fascinating to me because there's a lot of stuff in here that I previously did not know about. 

Dave Bittner: So it turns out that after the storming of the Capitol on January 6, the riot and insurrection, the FBI used emergency powers. And I don't know about you, but any time I hear about a government agency using emergency powers, I get a little nervous (laughter). 

Ben Yelin: Yes, always. Your ear should perk up a little bit whenever you hear that. 

Dave Bittner: Right, right, right. The FBI used emergency powers to gather up all kinds of cellphone information - location data, information about who's connecting with whom and all that sort of stuff. Basically, I'm guessing they went to the telecommunications companies and said, give us everything you've got, and the telecommunications companies said, yes, here you go. Here's everything we've got. 

Ben Yelin: Yeah, we don't want - we want to help in any way we can. You know, they don't want to be on the wrong side of this issue. 

Dave Bittner: Right, right. Exactly. 

Dave Bittner: Where this is running into some issues is that that includes the information of members of Congress. And some of the members of Congress are objecting - particularly, I believe, Senator Sheldon Whitehouse, who's a Democrat from Rhode Island. He warned the Justice Department not to get involved in investigating the attack, citing a separation of powers principle - that the executive branch should not be able to investigate the legislative branch. Help me understand this, Ben. What's going on here? 

Ben Yelin: So it's a great question. To be honest, I commend Senator Whitehouse on this because, you know, he's a pretty liberal senator and probably wants to go after some of his Republican colleagues on this, but I think he's adhering to higher principles here. 

Ben Yelin: So those higher principles come from what's called the Speech and Debate Clause in the United States Constitution. For those of you who need a citation, Article I, Section 6, Clause 1, which... 

Dave Bittner: (Laughter) Now you're just showing off, Ben. 

Ben Yelin: I'm showing off. I did not memorize that. I had to look up the section number. 

Dave Bittner: (Laughter). 

Ben Yelin: Basically, to put it in a nutshell, members of Congress, except for a very limited number of delineated crimes, are privileged from arrest during their attendance at their respective houses. And so that gives them basically immunity from federal prosecution in the course of their duties. 

Ben Yelin: The reason for this, and I think it's, you know, very sensible on the part of our Founding Fathers, is you didn't want a situation where a president was about to lose an important vote in Congress and would thus have a bunch of House or Senate members arrested to prevent them from voting the wrong way. So this comes from a common law principle, and it's seen in a lot of our legal ancestors - the United Kingdom has had similar parliamentary immunity laws. 

Ben Yelin: You are also immune from civil suits in the course of your work in Congress, meaning you have a privilege, for example, against defamation if you say something during congressional debates because we want to foster free and open debate. So those are the explicit powers of this provision. Then it's sort of, what does this provision mean more broadly? It's about separation of powers. You know, Congress does not want the executive branch to have policing authority over what happens on Capitol Hill. They want to be able to police their own members. And this is - you know, it comes from that Speech and Debate Clause. But this is also about just the principle of separation of powers. 

Ben Yelin: And this is something that's gotten - you know, there's bipartisan support in Congress for this principle. I remember in my early days of following politics, there was a congressman from Louisiana who was in some legal trouble. And they found - it was something about money laundering, and they found $90,000 in his freezer in his... 

Dave Bittner: Yeah, yeah, yeah, yeah, yeah. 

Ben Yelin: ...Washington, D.C., congressional office. And members of Congress of both parties said, we don't care about the substantive crime, but the fact that the FBI was coming into congressional office buildings to investigate members of Congress goes against the Speech and Debate Clause and goes against this principle of separation of powers. And that's what Senator Whitehouse is getting at here. There are many avenues within Congress to hold members accountable if, you know, the public or members of Congress think that other members have misbehaved in some way - there are censure procedures, ethics committees, et cetera. 

Dave Bittner: And we all know how well that's been working out lately. 

Ben Yelin: Well, yeah, exactly. 

Dave Bittner: But let me dig in on that point, though... 

Ben Yelin: Sure, sure. 

Dave Bittner: ...Because besides the snark, I mean, aren't we kind of putting the fox in charge of the henhouse, then? 

Ben Yelin: Yes. Yeah. I think - the idea is that Congress, in the Constitution, is given the power to make rules for its own proceedings. It's given power to expel its own members. So I think that the principle is that Congress should and does have the constitutional power to police itself. 

Dave Bittner: Yeah. 

Ben Yelin: I think that principle is legitimate, even though, as you say, you know, Congress guarding itself is - perhaps makes you a little suspicious because you have this fox guarding the henhouse thing. 

Dave Bittner: Right. 

Ben Yelin: I think the alternative would be more dangerous, which is the executive branch targeting members of Congress based on their political decisions and prosecuting them in the course of their work as members of Congress. So you kind of have to just balance out those - you know, the harm that would result from those two systems. 

Dave Bittner: It's a lesser of two evils choice (laughter). 

Ben Yelin: Yeah. And sometimes that is the choice we are left with. 

Dave Bittner: So what do we suppose is going to happen here? How will this play out in terms of the FBI having access to this information with the Capitol riot? 

Ben Yelin: Right now, the FBI has this aggregated data, and it's metadata, so it's not the content of any conversations. So because we're at such a preliminary stage, I don't think we'd actually get into conflict until we start to have a more specific investigation. Let's say we discovered that, you know, a member of Congress made a phone call to somebody under federal investigation for fomenting the insurrection. And I'm not alleging that this has happened. This is just a hypothetical... 

Dave Bittner: Right. 

Ben Yelin: ...At this point. You know, then that's where the rubber is really going to start to meet the road, because that member, and perhaps other members, of Congress are going to want to prevent federal law enforcement from abusing its authority and investigating members of Congress for things that they were doing that were conceivably within their duties or responsibilities as members of Congress. So Congress, you know, has limited tools at its disposal to stop this. You know, the emergency tools are using their power of the purse - through the appropriations process - to prohibit the federal government from doing such a thing. 

Ben Yelin: But also - and this may seem old-fashioned - kind of raising hell about the issue. That's what happened with Congressman Jefferson, the $90,000-in-the-freezer guy. When all of Congress stands together and says, I may politically disagree with my fellow members, but I'm going to protect their constitutional rights, oftentimes the FBI has been forced to back off. I mean, they don't want to be hated by members of Congress from both parties because those are the people who are appropriating the money that pays their salaries. So... 

Dave Bittner: Right. 

Ben Yelin: I think that's something that we very well could see happen. Perhaps there's a criminal investigation that reflects poorly on a member of Congress, and we might see pushback on the investigation itself even if people disagree with that particular member. 

Dave Bittner: Wow. All right. It's interesting. It's tricky stuff, huh (ph)? 

Ben Yelin: Yeah, it is. I mean, the Speech and Debate Clause itself is so tricky because so much of it turns on, is this really in the scope of his or her employment? You know, what if you're arrested at a traffic stop in Washington, D.C., you know, by federal law enforcement? Were you driving to work? Like, it gets into these kinds of complicated, archaic details. 

Dave Bittner: Yeah. 

Ben Yelin: But, you know, I still think the principle itself is very important because we don't want a situation where the president is sending, you know, the FBI to the steps of the Capitol to arrest people on spurious charges because they're afraid of losing a vote or something. 

Dave Bittner: But I suppose, I mean, the flipside is, like other things in our legal system, that occasionally that might mean that someone gets away with something in service of the greater principle. 

Ben Yelin: Yeah, yeah. That is, as you say, really a foundational part of our legal system. If we're going to have these rules in place, we have to be willing to accept that certain people we want to see apprehended will not be apprehended. And we see that with things like the exclusionary rule. Like, a lot of guilty people go free because of the exclusionary rule, you know, because there was something wrong with the gathering of evidence in a criminal investigation. 

Dave Bittner: I see. 

Ben Yelin: But, you know, sometimes the values supersede the importance of nabbing, you know, that criminal or any other criminal. 

Dave Bittner: Yeah. All right. Well, again, this article is from The Intercept. It's titled "FBI Seized Congressional Cellphone Records Related to Capitol Attack." And as always, we will have a link to that in the show notes. We would love to hear from you. If you have a question for us, you can call us at 410-618-3720 and leave us a message. You can also email us at caveat@thecyberwire.com. Ben, I recently had the pleasure of speaking with Major Jess Dawson. She's from the Army Cyber Institute. And we discussed microtargeting as information warfare. Here's my conversation with Major Jess Dawson. 

Jess Dawson: So what prompted this paper was I read Chris Wylie's book "Mindf***." And this was the book where he went into how Cambridge Analytica, you know, gathered all of this data and then started using it to influence elections. Now, there are really big questions about the data and whether or not this works as a form of manipulation and mind control. We don't actually know what this stuff does. But it seems that it really is something that we should be looking at and taking seriously. 

Jess Dawson: And the reason why I positioned the paper the way that I did is that when we look at the social media space, there isn't anybody, really, in charge. There's no regulation over this data in many ways. And so we're really at the whims of the social media companies to do anything in this space. 

Jess Dawson: The problem with that is that we are not really recognizing the way that this space can be weaponized - either by normal actors who are just seeking to get a rise out of folks and go viral, by domestic actors who are seeking to use this space for power and, possibly, profit, or by foreign actors who are seeking to erode the United States from within. All of those things happen on these platforms. And there's no regulation on any of them. And so it's a pretty big threat space that I think we're just now starting to wake up to inside of the DOD. 

Dave Bittner: You know, I remember back, you know, to the early days when a lot of these platforms were starting up and this whole notion of microtargeting advertising in particular was beginning. And I think, like a lot of people, I thought, well, this seems like a good idea. You know, if it means that I only see ads that I'm interested in and I don't see ads that I'm not interested in, that seems like that works for everybody. Time has passed. And it seems like we've gone past that. You know, we talk about how, you know, it feels creepy sometimes the way that these ads... 

Jess Dawson: Yeah. 

Dave Bittner: ...Follow us around the internet. To the point of this paper, I think, in the past couple of weeks, I saw a story come by where, you know, Facebook allows you to target soldiers. 

Jess Dawson: Yes. 

Dave Bittner: That seems problematic to me. 

Jess Dawson: So yeah, I mean, if we were just talking about advertising - right? - and you were in the market for diapers and, you know, Pampers had a sale on - right? - nobody would really, probably, freak out about that. 

Dave Bittner: Right. 

Jess Dawson: Like, if we were really just talking about getting the product to the eyes of the people that are likely to buy it, that's, like, the holy grail of advertising. That's the whole industry's, like, reason for existing. That's not what we're talking about anymore. 

Jess Dawson: It's not just interest that gets the ads in front of you; it's the ability to pay, right? So if you've got more money, you can take over more of the advertising space. And foreign actors, for example, are able to advertise their products inside of, you know, the American space. That stuff isn't necessarily regulated. And so when we think about what is being advertised, it's not just diapers or the latest headphones; it's ideology, it's ideas. And a lot of these ideas are meant to evoke very, very strong reactions and get us not thinking about our responses. That's part of the reason why memes work so well - they hit at that truth, and then you immediately respond and spread it. So the meme propagates very, very quickly. 

Jess Dawson: So we're not just talking about advertising in the purest sense. We're talking about political manipulation. We're talking about foreign adversary manipulation. And the targeting of soldiers is just another feature of the platform. It's not seen as a threat in many ways because the platforms are working like they're supposed to. 

Dave Bittner: What are the specific concerns here of the DOD, in your estimation? 

Jess Dawson: So I can't speak to the DOD's concerns. What I can speak to is the issues that I've raised from where I see the concerns there. Where I see the concerns is eroding our cohesion - right? - creating divisions internal to the formation, right? 

Jess Dawson: When we think about COVID and all of the misinformation that's been going on around masking - don't mask - right? - well, every time that, you know, some random soldier decides not to mask and gets exposed, well, now their whole squad has to go into quarantine. And that means that that squad's not training anymore because the vast majority of military jobs are not going to be able to be done at a desk, right? So that's just one example. 

Jess Dawson: When we think about, you know, the messaging around this election, the previous election - right? - if you have a segment of the force that believes that the president is illegitimate, well, that's a problem for following the orders of the office of the president, right? The president is the commander in chief. So that fundamentally undermines chain of command and command authority. 

Jess Dawson: And this is not a new phenomenon, right? We have had - you know, people have been questioning the legitimacy of the president, you know, since Bush v. Gore, right? So this is not something that social media has started. But it has certainly amplified it. And when you're in these spaces that reinforce the perspective that you already have and reinforce these, for lack of a better term, tribal boundaries, it can really cause problems inside the force in terms of unit cohesion and the ability to function as a team. 

Dave Bittner: How much of this is the lack of any sort of oversight from the social media platforms? You know, I'm imagining if I were running an ad in the - even a local newspaper, you know, there would be some editor, there'd be some editorial process, a real human being who would be looking at that ad. And there'd be that ability for oversight to say, who's running this ad, and why are they running it? Is it accurate to say that a lot of that sort of thing just doesn't happen on the big social media platforms? 

Jess Dawson: I think it is. Part of the problem we're dealing with is these analogies - because the editorial analogy is a good one, except that it doesn't scale. You know, I don't know that I've ever seen numbers about how many people can advertise on Facebook, but there is no vetting to get a Facebook advertising account. You open up a business account, you put in a credit card, and you can advertise. 

Jess Dawson: So we're talking about, you know, millions and millions of advertisers. They don't have enough staff to vet these one by one. And to be honest, it's not in their best interest to vet any of these, right? They benefit whether an advertiser is a legitimate advertiser or whether it's fraudulent or whether it's a foreign entity. They have no incentive to vet these things until it starts to affect their legal liability. 

Jess Dawson: So the problem is really a problem of scale. And what we need to really start thinking about is how we are going to regulate who is allowed to advertise inside of the United States' cognitive space - and by that, I mean the social media space. That is a very serious question that we have to answer collectively. We can't just say, you know, to the social media companies, you have to do this, because they're going to go with whatever is not going to get them sued. That doesn't necessarily mean they have the best interests of the American people - or of other countries' citizens around the world - at heart. They're going to operate in their own best interest. 

Dave Bittner: It seems to me like this is one of those rare areas these days that has bipartisan support. I mean, there's recognition from both sides of the aisle that this is an issue that needs to be addressed. Do you see that sort of will to move forward on regulation in this space? 

Jess Dawson: I think so. It is very hopeful that we see bipartisan, you know, work being done on this. The congressional Cyberspace Solarium Commission released its report, I think it was last year, and we at the Army Cyber Institute had one of our teammates working on that. Professor Erica Borghard was providing advice and guidance to that. 

Jess Dawson: So there is bipartisan work being done. There's bipartisan investigations being done on this. And it does give me hope that we will come to a solution that will protect the people, that will protect the democracy that we have, because right now, all of the anger and hate towards the other side - you can't govern that way. 

Dave Bittner: What do you suppose effective regulation would look like? Do you have any suggestions of things that might be put in place? 

Jess Dawson: So one of the things that I've been talking about with a colleague who's in the advertising industry is perhaps having an agency that you have to sign up with - like, with the federal government - in order to be able to advertise in these spaces, right? And not making this, you know, too egregious, so that, you know, regular mom and pop stores can still do this, right? But you have to apply to this agency for an account, an account number. And then the social media companies can grant you access to advertise based on that. 

Jess Dawson: That's going to absolutely cut the amount of advertising dollars that go into these systems. But the amount of profit that these systems make is inversely related to, you know, the damage being done inside of the civic sphere. That's just one idea that we've been kind of tossing around - like, would that be viable? 

Jess Dawson: But this is a wicked problem to solve in a lot of ways - right? - because we want to protect free speech. We don't want to start regulating everything using the artificial intelligence censors that, like, you know, for example, China is rumored to be using. Our freedom of speech is fundamental. So how do we do that? And I think it comes down to we really have to think about what we are going to allow to be advertised. We do this all the time. We've done this with cigarettes. We've done this with alcohol. We do this with other products. So it's time to really have a national discussion about what we are going to allow to be advertised and really come up with how we protect the civic sphere. 
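
As a rough illustration of the registration idea Major Dawson floats above - entirely hypothetical, since no such federal agency, registry, or account format exists - a platform-side check might look something like this:

```python
# Hypothetical sketch of the advertiser-registration proposal above.
# No such federal registry exists; every name and format here is invented.

REGISTERED_ADVERTISERS = {
    # account number issued by the hypothetical federal agency -> registrant
    "US-AD-000123": "Mom & Pop Hardware, Annapolis, MD",
    "US-AD-000456": "Example Beverage Co.",
}


def can_run_ads(account_number: str) -> bool:
    """Platform-side gate: only registered advertisers may buy ad space.

    In the proposal, the platform would query the agency's registry;
    here a dict stands in for that lookup.
    """
    return account_number in REGISTERED_ADVERTISERS


print(can_run_ads("US-AD-000123"))   # True - a vetted small business
print(can_run_ads("XX-UNKNOWN-99"))  # False - an unvetted account is refused
```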

Dave Bittner: What about this notion of having sort of an online version of the FDA applied to social media companies? In other words, in the same way that anyone who wants to put a pharmaceutical medicine out there has to prove that it's not going to do harm - and there are processes they have to go through to do that - similarly, could we have something like that with these algorithms that platforms use? You know, before you're able to put something like this in place, you must demonstrate to us that it's not a harm or a threat to democracy. 

Jess Dawson: I think the idea of an FDA sort of, you know, demonstrating something is safe is a good idea - right? - in principle. But it will fundamentally change how these algorithms are developed, right? It will require that these companies slow down, that they actually do testing on this, that they actually do research to demonstrate that these things don't cause harm. I don't think that that's a bad idea. 

Jess Dawson: I think that these companies will push back drastically on that because it's really been, you know, develop, release. And what we're saying now is, develop, test for harm, secure, then release. That's a drastically different time horizon when it comes to the tech kind of manufacturing cycle. 

Dave Bittner: Yeah. You know, and I keep coming to these analogies, and perhaps, you know, some of them are straining under their own weight. I also think about - you know, back in the '70s and before that, you know, when you had factories that were dumping waste into rivers and oceans and things like that and - they were saying, well, you know, in order to operate at this scale, this is what we have to do. And there came a point where we said, no, you can't just pollute this way. You know, that will not do. We as a society are not going to allow this. It strikes me that we may be in that mode right now with some of these social media platforms, where they're still polluting. And we need to say, OK, yes, this is going to hit your profits to clean this up. But this is a standard that we're going to require that you meet. 

Jess Dawson: There's a lot of good analogies, right? Analogies are never perfect. 

Dave Bittner: Yeah. 

Jess Dawson: But I think when we think about the pollution of the civic sphere, the pollution of the public sphere, that's a very, very good analogy in a lot of ways - right? - because, you know, one of the tactics in disinformation is to flood the zone with BS so that people just get exhausted and they don't know what's true or not anymore. So I really think that that's a critical way of thinking about this, right? Like, social media, in a lot of ways, is a good way to keep in touch with family. It's also a good way to consume a massive diet of junk news and junk food so that you stop really thinking about what's real and you're just angry all the time. That's not healthy for anyone. 

Dave Bittner: Right. Well, you have several recommendations in the paper here. Can you take us through that? What are some of the things that you're proposing? 

Jess Dawson: So one of the things that I think we really need to look at is these algorithms need to be opened up for research. They need to be publicly available to research institutions that can dive in and really understand what's going on there. 

Jess Dawson: One of the most concerning things that I've seen sitting in some of these different working groups with some of these social media companies is they'll sometimes, like, let it slip that they don't actually know what these algorithms are doing. So we've released these tools onto the general population with no controls, with no idea about how they actually work. And then it's only after there's a disaster that people go, oops, maybe we shouldn't have done that. So I think opening up the algorithms to academic research to get some transparency in there - we really, really need to look at that. 

Jess Dawson: Another piece of this is data privacy. Every app on your phone is gathering data and aggregating it and selling it. We have no idea what is out there on us. We can check our credit report annually, and there is a method of redress there. But we have no idea what data are being gathered on us, nor do we know who has it. That needs to stop, because right now the DOD is one of the few agencies that are not allowed to actually look at this data and understand what it is, but our adversaries can use it to target us all day long. 

Jess Dawson: So we really need to look much more deeply at public-private partnerships, where we get some regulation over the space but we also get some transparency, because right now there is no transparency about these algorithms. And that's where think tanks like the ACI can really help, because we've got the skill set to understand the algorithms, to dig into them and help, again, understand what's going on there. And we put in the human protections, from an academic standpoint, that these companies are not considering right now at all. 

Jess Dawson: I really think that, as we think about the social media space, we really need to push back collectively at all of this data-driven emotional manipulation. That's what's going on in these spaces. And it's not enough that I do it or that you do it. We really need to do this collectively. And I think that when we start all kind of standing up and pushing back on this and pushing for accountability by these companies, we'll have a chance to get at this. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: Such an interesting interview, obviously a topic that we've talked about before and we're both so interested in. And I think she made a couple of points that really stuck out to me. One is I think we'd all be willing to accept a certain level of microtargeting, you know, that wouldn't be offensive to us. So people... 

Dave Bittner: Right. 

Ben Yelin: As she said, you know, if people see that I'm buying a bunch of baby products, it's convenient for me and really not that much of an invasion of privacy if Google or Facebook or whomever learns that I am in the market for baby products. That's fine. But when it gets to the point where we're talking about Cambridge Analytica and getting detailed profiles on individual voters and microtargeting people with misinformation, that's when it becomes a broader societal problem. 

Ben Yelin: And I really enjoyed hearing, you know, some of her potential solutions - whether it's a mom and pop shop or some of these big companies, you know, going in front of some governmental body to get authorization to be on these sites. And, you know, I think that's similar to an idea that you've brought up that I know you mentioned in the interview of having sort of an FDA for information, a government regulator. Now, there would be costs to that, both financial costs and - you know, there would be a restriction of content on the internet that some people might not be comfortable with. 

Dave Bittner: Right. 

Ben Yelin: But, again, it's about balancing, you know, whether we're willing to have that type of regulation to get rid of this very pervasive problem, which is the proliferation of false information. 

Dave Bittner: Yeah, absolutely. All right. Well, we want to thank Major Jess Dawson from the Army Cyber Institute for joining us, and all the folks at the Army Cyber Institute for coordinating that call. We appreciate them taking the time and sharing their expertise with us and with you. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.