Caveat 2.2.23
Ep 159 | 2.2.23

Safely and securely moving to the cloud.

Transcript

Willie Hicks: Moving to the cloud is, you know, important. But moving safely and securely to the cloud is probably paramount.

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a new academic paper on the Fourth Amendment implications of terms of service. I have the story of a recent federal court ruling on a Section 230 case. And later in the show, Willie Hicks from Dynatrace on the accelerated adoption of secure cloud infrastructure and services by the federal government. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we've got some good stories to share this week. Why don't you start things off for us here? 

Ben Yelin: So there is a new academic paper out from imaginary friend of the podcast, professor Orin Kerr, from... 

Dave Bittner: (Laughter) Just to be clear - Orin Kerr is not imaginary; just his affection for us is. 

Ben Yelin: Exactly. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: Exactly. I did not make him up. 

Dave Bittner: OK. 

Ben Yelin: He is a real professor... 

Dave Bittner: Right. 

Ben Yelin: ...At the University of California, Berkeley. 

Dave Bittner: Yes. 

Ben Yelin: And he's written a draft of a paper entitled "Terms of Service and Fourth Amendment Rights." And I think it's absolutely a fascinating paper. I went through it, and it certainly changed some of my thinking on this issue. 

Dave Bittner: OK. 

Ben Yelin: So the crux of it is we all sign these EULAs, these terms of service. The way he described them was very funny. It's like a long CVS receipt worth of... 

Dave Bittner: (Laughter). 

Ben Yelin: ...These various provisions. 

Dave Bittner: Right. 

Ben Yelin: And there have been some court cases which have held the proposition that we can forfeit our Fourth Amendment rights in a certain piece of data, piece of information, by signing these terms of service. And that makes instinctive sense to us because the Fourth Amendment only applies when there's been a violation of our reasonable expectation of privacy, and how can we have a reasonable expectation of privacy if we get this long CVS receipt worth of terms of service that explicitly tells us, hey, this isn't private? Like, we reserve the right to give your information to law enforcement. We reserve the right to sell your information. Everything that's included in every single EULA that none of us read. 

Dave Bittner: Right. 

Ben Yelin: So courts have actually come to differing conclusions on this, and he goes through some of the case law. There have been several courts who have taken the view that you forfeit Fourth Amendment rights, your reasonable expectation of privacy in certain communications, by signing that terms of service. And some of the cases he cited we actually have talked about on this podcast. There's one about somebody at a university who was being observed on campus Wi-Fi. They - law enforcement obtained evidence that he had connected to campus Wi-Fi in a school building. And the holding in that case was, basically, he forfeited his reasonable expectation of privacy because in the terms of service, it said, once you are connected to our network, you've essentially forfeited your expectation of privacy on the internet at school. So that's kind of one line of cases. 

Ben Yelin: There's a separate line of cases that have gone the other way, that have basically said terms of service are private agreements between two private parties. It's the user and the owner of the platform that are coming up with an agreement, and that does not implicate Fourth Amendment obligations or Fourth Amendment rights. And professor Kerr comes strongly on the side of that second group of cases, basically arguing that by signing the terms of service, we should not be forfeiting our Fourth Amendment rights. That's just simply the wrong way to think about these issues. So getting a little deeper here, he tries to analogize terms of service to other contracts that we sign in the nondigital world. And all of these contracts have to do with rights in so-called shared spaces. So I think the case law is very obvious when it comes to law enforcement invading the home that we own, for example. 

Dave Bittner: Right. 

Ben Yelin: Where things get a little... 

Dave Bittner: Our castle (laughter). 

Ben Yelin: Exactly. Where things got a little murkier are those in-between areas where we're renters. Maybe we're just staying at somebody's Airbnb, somebody's hotel room. So there's all these doctrines that sort of reveal when we actually forfeit our expectation of privacy in these shared spaces. And he goes through some of them here. One of them is the private search doctrine, where somebody else has access to the space, they see something incriminating, and they report it to law enforcement. You don't have a reasonable expectation of privacy there 'cause you should know that your roommate or whomever has access to that space. 

Dave Bittner: Or someone who comes in to clean the room, like your hotel rooms. 

Ben Yelin: Exactly. 

Dave Bittner: Something like that. Yeah. 

Ben Yelin: The hotel - yeah. 

Dave Bittner: Right. And if I have 20 pounds of cocaine sitting on the desk, they could report that. 

Ben Yelin: Right. Things like direct consent, where you explicitly let in law enforcement to one of your spaces, and you've been authorized to give that consent because you have access to that shared space, that's something where you are very clearly forfeiting your Fourth Amendment rights. Third-party consent - your roommate answers the door, and he or she consents to a search. Generally, you've forfeited your right because you should know that that roommate has just as much of a right to the space as you do. And so they - once they consent to that search, you no longer have that expectation of privacy. And then abandonment - if you no longer wish to use that shared space or you've affirmatively given it up. So you're no longer renting an apartment, or you are no longer using a hotel room. You have checked out. They have limited access to your hotel room. You've lost Fourth Amendment protection in that hotel room. 

Dave Bittner: Isn't that the one that lets them go through your trash, also? 

Ben Yelin: That's - exactly. 

Dave Bittner: Yeah. 

Ben Yelin: Yes. There was a Supreme Court case on that very topic. 

Dave Bittner: OK. 

Ben Yelin: And you don't have any reasonable expectation of privacy in your trash 'cause you've literally put it on the curb... 

Dave Bittner: Right. 

Ben Yelin: ...In a public space. Rats could get into it and start eating it. So that's certainly not private. 

Dave Bittner: (Laughter) Yeah. 

Ben Yelin: He goes over these rights-losing doctrines, and I think really persuasively argues that none of these are analogous to terms of service. Terms of service are - it's an agreement between a user and the owner/operator of the platform. It governs the relationship between those two private parties. And each private party has its own interest in signing the contract. For the user, it's obviously, I want to use this platform, so I'm going to sign it as quickly as possible. But for the owner, it's about their own legal liability. It's about their ability to exert some type of editorial control on their platform, and it's to protect the safety of other users. So by signing that terms of service, you're entering into a contractual agreement with the other party, but that does not implicate your Fourth Amendment reasonable expectation of privacy because your relationship with the government is fundamentally different. 

Ben Yelin: And so that's really the essence of the argument here, is that there's a real distinction between having this agreement between two private parties and implicating the government in some way. And a lot of the case law, where courts have held that terms of service preclude Fourth Amendment challenges, have actually related to governments themselves. In other words, the government is one of the parties. So a federal agency is the one that comes up with the terms of service, and one of their employees violates it, and that implicates Fourth Amendment rights. In that case, it only implicates Fourth Amendment rights because the government is one of the parties in those cases. 

Dave Bittner: Well, let me put you on pause here for a second and get you to... 

Ben Yelin: Please do. I've been talking for way too long. 

Dave Bittner: No, no, no. But I - so help me understand here. So you - we often talk about the First Amendment and, particularly with social media platforms, how common it is for people to misunderstand or misinterpret the First Amendment because the First Amendment applies to the government not being able to restrict your free speech, right? 

Ben Yelin: Right. 

Dave Bittner: And the private platform owners are not the government, so they're allowed to restrict your free speech. How is that compared - how does the Fourth Amendment apply that way? Does - in other words, is the Fourth Amendment specifically targeting the government's ability to invade your privacy versus a private platform owner's ability to invade your privacy? Is that the analogy that we're - does it mirror that, or is it completely different? 

Ben Yelin: It really closely mirrors it. So the First Amendment is more explicit because it says Congress shall make no law abridging freedom of speech. 

Dave Bittner: Right. 

Ben Yelin: Really, in the Fourth Amendment, it's more implied. The Fourth Amendment is a prohibition on unreasonable searches and seizures. But there's all this language in there that clearly indicates that that right protects you from the government. So language about how you have to obtain a warrant based on probable cause - that would have no applicability against private parties. Not to mention, if you look at the history of the Fourth Amendment, it comes from our founding fathers, who had the experience of being in the colonial United States, where the United Kingdom would send these so-called writs of assistance or general warrants where they'd have no particular suspicion that you were doing anything illegal. They'd basically just use these so-called general warrants to harass people. Let's go into people's homes and see what we can find that might be incriminating. 

Dave Bittner: Ransack the place. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: And those were all government searches. When we're talking about our English legal ancestors, it was the king or queen and his or her minions who were conducting these searches. 

Dave Bittner: Yeah. 

Ben Yelin: So there really is that key distinction. There are few constitutional rights that don't implicate the government. One of them is the 13th Amendment, which is a prohibition on slavery. That's certainly a prohibition against private action. But for the vast majority of constitutional amendments, including the Fourth Amendment, it is a right against tyrannical government action. 

Dave Bittner: Yeah. 

Ben Yelin: And even though, in the words of professor Kerr, terms of service can help define the relationship between private parties, these types of private contracts cannot define Fourth Amendment rights because Fourth Amendment rights implicate the government, and the government is not a signatory to these terms of service, and they're really not involved in either party proposing or signing these terms of service. 

Dave Bittner: So what's the implication of professor Kerr's argument here? 

Ben Yelin: The implications are pretty wide-reaching. So he's a pretty influential law professor in Fourth Amendment jurisprudence. He's cited in most Fourth Amendment cases, so it's not like he's just some random academic who's... 

Dave Bittner: Right. 

Ben Yelin: ...Spouting out a theory here that's never going to be adopted. I mean... 

Dave Bittner: Right. 

Ben Yelin: ...I think this is something that really could make its way into our court system. I also think the argument is pretty persuasive. And if his view held, then companies - or then criminal defendants, in particular, would not concede their reasonable expectation of privacy in any discrete form of communication simply by choosing to sign a terms of service. One of the things he says is that's not any type of informed consent anyway, beyond the issue of it being a private contract between private parties. People, as we've talked about a million times, don't know what they're signing. He actually cited an experiment where researchers created a fake social media site, had users sign up for it, and buried within the terms of service was a provision saying, if you sign these terms of service and use our platform, you are granting us the right to your firstborn child. 

Dave Bittner: (Laughter) Of course. 

Ben Yelin: And an extremely small percentage of people actually noticed that provision and raised it with this fake social media company. So the broader implication, if this view were adopted, is terms of service would play little to no role in an analysis of whether a person had a reasonable expectation of privacy and, therefore, whether they had Fourth Amendment rights in a discrete form of communications. 

Dave Bittner: I'm still unclear, though, here. Is this reasonable expectation of privacy against a government search of their social media content, or reasonable expectation of privacy against, for example, the social media platform selling their information to a third or fourth party? 

Ben Yelin: Well, that's a big part of the argument, is how do you define reasonable expectation of privacy. 

Dave Bittner: Yeah. 

Ben Yelin: I think what he is saying here is that it really is a reasonable expectation of privacy that this will make it into the hands of government actors, of law enforcement. That's what he's really trying to clarify. 

Dave Bittner: So he's saying you can't sign that away, that the Fourth Amendment supersedes your ability in this - what he claims is a private contract. You can't sign that away. 

Ben Yelin: Right. The argument against his viewpoint would be, well, some of these terms of service say, specifically, as a private company, we will comply with all law enforcement requests. And therefore, you should have a general idea that you actually are relinquishing your reasonable expectation of privacy as it relates to the government. And professor Kerr doesn't think that that argument holds any water. 

Ben Yelin: There are other instances in the nondigital world where we might sign away something that implicates possible government intrusion. We might consent, theoretically, to a government search on behalf of, say, a leaseholder or, like, a property manager, for example. But the government doesn't actually get involved for Fourth Amendment purposes until they're actually called to execute a search, if that makes sense. So it's all theoretical until then. Saying that we might give information to the government isn't the type of forfeiture of a reasonable expectation of privacy that exists when the government actually is summoned and gets involved. 

Dave Bittner: It's all theoretical till somebody breaks down a door? 

Ben Yelin: Exactly. 

Dave Bittner: (Laughter). 

Ben Yelin: I mean, I think that's what he's saying... 

Dave Bittner: OK. 

Ben Yelin: ...That you can put anything in the terms of service. 

Dave Bittner: Right. 

Ben Yelin: But until the actual search has taken place, until the information has actually been given to the government, then you don't have what constitutional scholars refer to as a Fourth Amendment event. So, you know, I think there are - this is certainly not cut and dry. I think we have evidence of that based on the fact that very smart federal judges have come to different - and state judges have come to differing views on the role that terms of service play in Fourth Amendment jurisprudence. But I just think this paper persuasively argues that you do not relinquish your reasonable expectation of privacy simply by signing this essentially private contract between two private parties. The Fourth Amendment rights are in kind of their own metaphysical realm away from the contracts that you sign. So I just thought it was a very interesting paper in that respect. 

Dave Bittner: And in terms of, like, what happens next, do we wait for a case to come up where someone uses professor Kerr's arguments in their own case, in their own arguments? 

Ben Yelin: Yeah. I mean, I think, in the short term, we're going to see a continuance of what we've seen over the past several years where there's really an inconsistency in jurisprudence on this issue, where some courts look at a terms of service and say that's - somebody signed that; that means they're forfeiting their expectation of privacy for Fourth Amendment purposes; other courts take the professor Kerr view. I think, in writing any law review article, the hope is that you influence enough legal scholars that eventually your view wins out. So we're not going to have a determinative case, I think, anytime in the near future. But sometimes, you know, major judicial doctrines can be traced to smart academic articles. I always think academics overstate their importance, but it's true that having - putting out thought leadership on an issue can pay off, certainly, in the long run. 

Ben Yelin: So I think if the Supreme Court ever does take a view similar to professor Kerr's, here, you might be able to trace it back to this article. Now, this is just a draft of the article. He has opened the floor to comments on this draft, if anybody is interested. So maybe he can be swayed in the other direction, if any of our listeners feel very strongly. But I think this is kind of the - laying the groundwork for where he would like to see jurisprudence go in future cases. 

Dave Bittner: All right. Interesting, for sure. All right. Well, there's one to keep an eye on, right? 

Ben Yelin: Absolutely. 

Dave Bittner: OK. We will have a link to that in the show notes. My story this week comes from the folks over at Techdirt. This is an article written by Tim Cushing, and it's titled "Service Providers Can't Be Sued Over User-Generated Content, No Matter How Creatively The Allegations Are Framed." This is according to a federal court. And this is a Section 230 case that made its way into federal court; this is a federal court decision. 

Dave Bittner: And what this comes down to - it seems like it is a tragic case of a user of a social media platform - a couple social media platforms - who was allegedly harassed on these platforms and ultimately took their own life. And the survivors are making the case that the social media platforms are responsible, that they did not do enough to help prevent this, you know, obviously, horrible outcome here. The folks brought the lawsuits against the social media platforms - one of the platforms is called YOLO, which is not one I'm familiar with, but YOLO... 

Ben Yelin: Me neither. Yeah. I had not heard of that. 

Dave Bittner: YOLO is short for you only live once. And so the folks in this case - the author here refers to it as a cause-of-action grab bag. They brought 12 causes of action under state law against the defendants: strict product liability based on a design defect, strict product liability based on a failure to warn, negligence, fraudulent misrepresentation, negligent misrepresentation, unjust enrichment, violation of the Oregon Unlawful Trade Practices Act, violation of the New York General Business Law, violation of the Colorado Consumer Protection Act, violation of the Pennsylvania Unfair Trade Practices Law, violation of the Minnesota False Statement in Advertising Law and, last but not least, violation of the California Business and Professions Code. 

Ben Yelin: Dave, get the bleep button ready... 

Dave Bittner: (Laughter.) 

Ben Yelin: ...'Cause this is what we call throwing beep at the wall and seeing what sticks. 

Dave Bittner: (Laughter) OK. Well, this is what I was hoping to get from you, which was - which is, can you unpack this? I mean, the court is clear. They said, the court finds that each of these causes of action is predicated on the theory that defendants violated various state laws by failing to adequately regulate end users' abusive messaging and is, therefore, barred by Section 230. Ben, help us understand what's going on here. 

Ben Yelin: So Section 230 of the Communications Decency Act immunizes platforms from lawsuits based on the content posted by users. I think what the plaintiffs were trying to do here is find an avenue around Section 230 protection. And they weren't able to do it because, basically, the plaintiffs are operating under a theory that defendants are violating state statutes and/or provisions at common law. But in reality, as the court says, they've done no such thing because they're immune from liability. They - under federal law, which preempts any of these state statutes, they are precluded from being sued based on content that's been put on their platform. And that's controlling, regardless of what each of these state statutes say, including allegations like false statements in advertising, which - I can't possibly conjure up how that could have been a reasonable cause of action in this case. I mean, one could be creative, but it just seems so disconnected from what actually went on. 

Ben Yelin: You know, the plaintiffs, for example, insisted that allowing users to create anonymous accounts is, quote, "a defective design feature." And as the author of this piece said, that's dumb and disingenuous in two ways; one, that offering anonymity is irresponsible, which clearly, if you look at how many social media platforms work - that's not something that courts would ever define as irresponsible; and, you know, secondly, assuming that people who are inclined to be abusive would be deterred by removing this type of, quote, "defective design feature," to which - there's just no evidence of that being the case. 

Ben Yelin: I think the only hope for the plaintiffs here is a couple of things. One is congressional action that revises Section 230 and extends liability beyond where it currently is, where these types of platforms could be held liable, maybe under even some of these statutes that they cite here, if Section 230 is significantly weakened. But they probably can't rely on those types of changes to Section 230 - obviously, those changes would be vociferously opposed by the industry. 

Dave Bittner: And not retroactive. 

Ben Yelin: And not retroactive. 

Dave Bittner: Yeah. 

Ben Yelin: Law enforcement in this case - or the defendants in this case, rather - were certainly operating in good faith based on what they believed the law to be at the time. The other avenue is to wait to see what happens in Gonzalez v. Google. So this is a case that the Supreme Court is hearing sometime in the next year - sounds like it's going to be in next October's term. And that's the case in which the family of a victim of the 2015 terrorist attack in Paris is suing various social media platforms, alleging that they bear some responsibility for that tragic event by allowing terrorists, essentially, access to those platforms and creating an influence campaign that inspired others to commit acts of violence. Depending on what the justices say in that case, that might implicate what happens in the case cited in this article. And that could be a potential ground for appeal. But until then, I mean, Section 230 is pretty airtight, and it certainly precludes any lawsuits based on these 12 causes of action that are mentioned here. 

Dave Bittner: They point out in this article that the plaintiffs say that they'd received harassing messages, and they also say that the platforms had said that they would reveal who a user was if there was harassment taking place. And that did not happen. And so that's part of, I think, how they're trying to come at the platforms here - the platforms said they would do one thing in response to harassing behavior, and the platforms did not live up to their claims. And they're trying to come at them for that. But the court was not persuaded by that argument. 

Ben Yelin: Yeah. I mean, that's just not in the spirit of Section 230. It is - there's certainly a moral problem with some of these social media companies not targeting abusive users, but this is a question about legal liability. And according to the court, Section 230 precludes that liability. So there's just really - if you understand Section 230 the way the court does here, there's no cause of action that's going to supersede that shield of liability. And I think that's really what the court is saying rather curtly in their opinion here. 

Dave Bittner: What if this does go to the 9th Circuit Appeals Court? This article points out that that particular circuit appeals court has had interesting approaches to Section 230. 

Ben Yelin: The way they put it is the 9th Circuit has said some rather strange things about Section 230. 

Dave Bittner: (Laughter) Yeah. 

Ben Yelin: They link to an article - this is from a couple of years ago - about the 9th Circuit Court of Appeals saying some, quote, "disturbing stuff" about Section 230 while dumping another two sue-Twitter-for-terrorism lawsuits. So obviously, the writers - and these are both tech writers - are skeptical of the 9th Circuit. The 9th Circuit does have a tendency to go a little rogue on some issues, just kind of by nature of chance. First of all, it's the largest judicial circuit. They have a lot of judges, which means it's more of a crapshoot when you get a three-judge panel as to which judges you're going to get. But it's also - it just tends to be more of a liberal circuit, mostly by coincidence, based on the judges that happen to be appointed by which presidents at which moment in time. So I think they are more prone to going out on a limb and protecting particularly plaintiffs in these types of cases. So you never know what's going to happen at the 9th Circuit. 

Dave Bittner: And that's part of what makes it so fun (laughter). 

Ben Yelin: Exactly. I mean, I think if I were to guess, the holding of the district court is likely to carry the day here just because of the strength of Section 230. But if any federal judicial circuit would mess around with Section 230, it would most certainly be the 9th Circuit. I think the best chance for these plaintiffs is a reconsideration of the reach of Section 230 - of the Section 230 shield, and that will only come as part of this Gonzalez v. Google case that the Supreme Court granted certiorari to earlier this year or last year. 

Dave Bittner: All right. Well, that's another one perhaps to keep an eye on. Again, that's from the folks over at Techdirt, and we will have a link to that in our show notes. We would love to hear from you. If you have some thoughts on our discussions here or something you'd like us to cover, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Willie Hicks. He is from an organization called Dynatrace. And our conversation centers on the accelerated adoption of secure cloud infrastructure and services by the federal government. Here's my conversation with Willie Hicks. 

Willie Hicks: So before the cloud era, obviously, the Fed was, you know, very data center driven. They had their - you know, probably every agency had their own data center, massive data centers as they were, to the point that I remember, you know, several years back a lot of - and some still ongoing - initiatives around, you know, data center consolidation and what we call, like, DCOI and some ways to kind of improve efficiencies. But I think the winds started to change, you know, a few years ago where we started to have more and more agencies looking at moving to the cloud - you know, having methods by which to move to the cloud, having, you know, FedRAMP and some other capabilities. At first it started off, I think, like, with FedRAMP low and moderate. So only certain workloads could move to the cloud. But I think that is starting to pick up steam a lot more. I spend, you know, my time in the civilian and in the DOD and IC communities, and I think the civilian side has been probably a little bit ahead of the game here. But we see the DOD and IC side of the house also picking up speed. You might have heard of things like JWCC, the Joint Warfighting Cloud Capability contract, that was just signed. So we see that momentum moving on the DOD side as well as in the IC. 

Dave Bittner: And to what degree were the various agencies on board with this? What are the factors that go into them deciding if cloud is the right choice? 

Willie Hicks: So you know what? I think it's changed over the years, to be honest, now that I think about it. I would say that early on, the federal government was a lot like maybe some industry early on where it was just this idea that we need to move to the cloud. It's going to save us money. I mean, it was literally like, just - you know, the mantra was, you know, cloud first. Now it's cloud smart - it's kind of moved around. But it was just this idea we were going to just start moving workloads to the cloud. 

Willie Hicks: I think, quickly, they started to realize that wasn't the best approach, that, you know, the efficiencies that they were hoping for based on some of the means by which they were moving to the cloud - so just doing, like, lift and shifts and just trying to take applications that really weren't designed for the cloud, that weren't really cloud-ready - moving those to the cloud, they learned some early lessons on that. And so I think now there's a much smarter approach. It's a more data-driven approach that I see from a lot of agencies. I see a lot of agencies that are really starting to look at not just lift and shift, but either re-platforming or, you know, really refactoring the application or maybe creating a new application or going to a SaaS model. 

Willie Hicks: But I think one of the things that I have seen that has been maybe a struggle for agencies as well is not just kind of the planning and making sure we're moving the right things to the cloud, but also, as you mentioned, secure cloud. You know, moving to the cloud is, you know, important, but moving safely and securely to the cloud is probably paramount. And so, you know, just moving to a FedRAMP-authorized environment, be it moderate or high or whatever baseline you need, is kind of not the only thing. Understanding kind of your security posture, understanding the processes are going to change as you move to the cloud - some of those things, those security processes and procedures, have to be modified and tweaked to really account for the cloud. So those are some of the challenges that I think I see with agencies - and then just complexity as you move to the cloud. This is a new environment, you know? And so it becomes much more complex - you know, you might not have the skill sets and so forth. So there are a lot of things that go into that move, if that makes sense. 

Dave Bittner: Yeah, it does. Can we dig into some of the security issues? I mean, are they - zero trust is the hot buzzword these days. How much is that a part of this? 

Willie Hicks: Yes. Zero trust is the buzzword. And, you know, it's interesting because zero trust, I think, has gotten a lot of play today. There's a lot of talk about zero trust in the federal space, but also in industry. You know, there's the kind of the memorandums that have come out from OMB. I'm not sure how familiar your audience will be, but OMB 2216, for example, there are actually multiple memorandums that have come out from OMB that support the administration's push towards improving nation - the nation's cybersecurity. So that's Executive Order 14028 if anyone wants to look it up. But what 2216 really looked at and talked to is what you're talking about, zero trust, and kind of understanding and setting some hard guidelines and some timeframes around zero trust. 

Willie Hicks: And when I look at agencies moving to the cloud, zero trust is not new just because that executive order came out - it's not just something that's new and the administration said, OK, we've come up with this, the zero-trust framework, and everyone's going to do it. Agencies have been thinking about this and working towards this well before that executive order. It didn't get a lot of airplay because you didn't have the SolarWinds attacks and Log4j and all the things that have kind of thrust this into the public spotlight. But I think, you know, these conversations, this move towards zero trust had already been - you know, pieces had already been in the works. There is a lot still to be done. But I think that from a zero-trust standpoint, it is something that, you know, as we move to the cloud, agencies are kind of already and were already well, you know, in position to start to address. But I think, like all of these EOs, prior to this budget cycle, there was no funding for it, so a lot of things weren't in place because there was no money to buy a lot of the hardware or make a lot of the changes that had to be made. So I think that net-net, agencies are prepared and understand the importance of zero trust. The execution and meeting it by the timelines, that's something I'm still kind of watching to see how all that's going to play out. 

Dave Bittner: What are some of the opportunities here in terms of upping their security game by moving to a cloud infrastructure? Are there things that they're able to do that they wouldn't have been able to do before? 

Willie Hicks: I wouldn't say, per se - you know, that there are kind of evident things, like, right in front of your face that they weren't able to do prior to zero trust. I think kind of the whole thrust behind, you know, zero trust and creating a zero-trust architecture or framework is - and this phrase, I think, is a little overplayed - but, you know, they always say, like, trust no one. You know, verify, you know, everything, everyone, every person. I think that one thing that zero trust is going to do that we didn't do before - and I honestly think is extremely critical - is that it's changing a mindset. It's kind of moving away from the idea of perimeter defense, kind of this idea of, you know, having all your firewalls in place and all your access points and everything kind of guarding the outside. But once you're inside, you know, you're trusted. That was always, I think, a dangerous proposition. I think that what is changing is that the perimeter is kind of really shrinking. It's everything. Everyone is being validated and revalidated. The perimeter is just kind of really tightening up. So I think that's one of the bigger changes. You know, how that might impact an agency - you know, I see - at times, you know, I'll talk to agencies and some of the concerns that might come up are around - and I think it's probably maybe a misconception - like, maybe user experience is going to be affected and things like that. 

Willie Hicks: There are people who argue that, actually, user experience and, you know, people's access, they're going to actually see an improvement because they're using more single sign-on. They're using more techniques that make it easier for the systems to be validated and you to be checked and rechecked. But it also makes it easier for you to kind of log in once and not have to really log in to every system. So there are some - you know, there's some argument on why things are easier, but I would say user experience is always something you want to, you know, concern yourself with - user experience and making sure that any changes you make, there is no impact to the user; if anything, it's positive. If it's a negative impact, you're going to have a lot of pushback. So, you know, it's still important to watch those things. But again, I think that from what I've seen and from what I have talked to agencies about, it's more positive impacts than any negative impacts, at least from the outside, from the user. Now, there's a lot of work on the back end for the security teams and all to build these frameworks out, but I don't think I see a lot of negative impact on the customer side, if that resonates. 

Dave Bittner: Yeah. You know, I'm curious. The folks that you've worked with, is there a common thread among those who are being successful here? Are there things that they're doing that seem to make this transition smoother? 

Willie Hicks: I would say - and I don't want to call out an agency name in particular, but I was actually having a conversation a few months back with a CTO of an agency where he seemed - his approach honestly seemed to be, you know, working well, really practical and pragmatic. He took an approach where, you know, obviously he has been working on this for a while. His teams have been working on kind of how they would architect, you know, what changes would have to be made to the environment, what the cost of all of this was going to be. But also, you know, he didn't want to wait just for the, you know, the omnibus bill to be signed and to have, you know, more money. So he started off with smaller projects, like really small projects he could fund internally and made it almost like an agile-type approach to making changes. And he was, you know, taking very small applications, very small subsets, maybe making some in some test environments, looking at how they would do network segmentation and so forth. So he was taking a very, you know, deliberate, iterative approach to building out his environment. And from what I could tell, he was - I think that agency will be well, you know, positioned to roll out and make changes in the environment relatively quickly so they will be able to meet their deadlines. And it's a rather large agency, so they're going to have a lot of exposure. But I think those are the kinds of agencies that are being most effective. 

Willie Hicks: Then you have some agencies who, you know, don't have as much experience, they don't have big budgets, they don't have big security teams, but they're still going to be kind of, maybe, you know, held to the same accountability - GAO is probably going to still look at them, and the inspector general's probably going to come in and make sure that they're complying. But I think there are going to be agencies that are going to have a harder time, that, you know, really don't have the expertise. And really, I think those are the agencies that need to partner with the industry, that really need to partner with, you know, companies to come in and help with that development and the rollout of a good architecture for that agency. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: One thing that really stuck out to me - and I think this is just something that those of us who are not involved in federal agencies, who aren't involved in that space, just don't really think about - is that in the private sector, it's like, let's just get everything into the cloud. That's better for security purposes, that's better for important document retention, to protect us from cyberattacks, etc. It's just not that easy for a number of reasons when we're talking about federal agencies. And it made me think of this scandal that we've all been following over the past several months, where we have a former president and a former vice president who have been caught with physical classified documents. 

Dave Bittner: Right. 

Ben Yelin: And I think the connection that I'm poorly trying to draw here is that there are just very strict document retention policies, particularly when you're dealing with things like classified information, that simply putting something in the cloud the way you would in the private sector is just not as easy in the public sector. 

Dave Bittner: Yeah. 

Ben Yelin: So I thought that was something that was really interesting about the interview. 

Dave Bittner: Yeah, absolutely. All right. Well, again, our thanks to Willie Hicks from Dynatrace for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.