Caveat 9.2.20
Ep 44 | 9.2.20

Understanding the right technical assistance model for your organization.

Transcript

Sean Brooks: Everybody was more concerned and felt, accurately, that it was more likely that their phone was going to get stolen and they were going to lose access to their authenticator app than it was that their email was actually going to be remotely compromised.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. This episode is for September 2, 2020. I'm Dave Bittner, and joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, I review an essay from top U.S. officials on persistent engagement as a U.S. cyber doctrine. Ben describes the potential implications of a ruling on geofencing. And later in the show, my conversation with Sean Brooks. He's director of the Center for Long-Term Cybersecurity's Citizen Clinic program, and they have a recent report, "Digital Safety Technical Assistance at Scale." 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. Let's dig into our stories this week. What do you have for us? 

Ben Yelin: Sure. So I have a new case from a magistrate judge relating to geofencing, and you'd never guess where I found this case. It's from our favorite man crush, Professor Orin Kerr from the University of California, Berkeley. So I will try not to fanboy too much as I read about this case. 

Dave Bittner: All right. 

Ben Yelin: So geofencing, as we know, is a law enforcement tactic. I mean, it can be used as a tactic by any private enterprise. But particularly in the context of a law enforcement tactic, it's trying to find all of the devices that were in a particular area at a particular time so that you can start to narrow down the devices and figure out who might be the suspect in a particular crime. 
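To make the mechanics concrete, here is a minimal sketch of the kind of query a geofence warrant compels a provider to run: a filter over stored location sightings by bounding box and time window. The data model is hypothetical, for illustration only - not Google's actual system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Sighting:
    """One hypothetical location record held by a provider."""
    device_id: str
    lat: float
    lon: float
    seen_at: datetime

def devices_in_fence(sightings, lat_min, lat_max, lon_min, lon_max, start, end):
    """Return the distinct device IDs seen inside the fence during the window."""
    return {
        s.device_id
        for s in sightings
        if lat_min <= s.lat <= lat_max
        and lon_min <= s.lon <= lon_max
        and start <= s.seen_at <= end
    }
```

The detail that matters for the legal discussion below is the return value: the query enumerates every device in the area, not just a suspect's.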

Ben Yelin: So in this case, we're talking about a series of armed robberies that took place, and law enforcement wanted to obtain from Google some geofencing information to figure out which devices might have been within the vicinities of these robberies. And this magistrate judge came up with what I think is the first actual opinion - legal opinion from a court, from a judge related to geofencing in this country. Obviously, it's one magistrate judge. This could change. This could be appealed. We could get different results in different jurisdictions. But as sort of the first bite of the apple, it's really interesting to see how the court came down on this question. 

Dave Bittner: Well, take us through. What'd they come up with? 

Ben Yelin: So the holding of the case is that the geofencing warrants did not satisfy the Fourth Amendment's specificity and particularity requirements. So a couple of important elements here - they started by analyzing this case in the context of the Carpenter case, which is about cell site location information, and basically dismissed that as an issue entirely, saying the parties didn't contest it. I actually don't think that that analysis was necessary or applicable in this case, but it is interesting that they cited back to Carpenter there. 

Ben Yelin: On the actual merits of the geofencing warrant, the court seems to treat - and this is in the words of Orin Kerr - this warrant like a search of a person whose phone's presence is revealed. So there are basically two types of warrants you can obtain here. Let's imagine a non-geofencing situation. 

Dave Bittner: OK. 

Ben Yelin: Just, you know, your standard, let's get into somebody's house warrant. 

Dave Bittner: Right. 

Ben Yelin: There's a warrant to enter a place and see who's there - so see if a certain person was in a particular place at a particular time. Or there's a warrant to actually search an individual - so pick their pockets. What the court seems to be saying here is that geofencing more resembles that more intrusive search where you're rifling through the pockets and possessions of everybody who's in a particular area and not that first type of search where you're just entering a place and seeing who is there. 

Ben Yelin: Professor Kerr - and I think I actually agree with him here - thinks that this warrant is more similar to that first type of inquiry where you're just seeing who is in a given location. You're not actually rifling through their pockets. 

Dave Bittner: Right. 

Ben Yelin: The reason that makes sense to me is that they're not actually looking for any information on this person's device. All they want to know is whether the device was in a particular location at a particular time. And, you know, that would seem to satisfy the particularity requirement. 

Ben Yelin: What the court is saying is, if we're going to treat this like a warrant where we're actually searching every device within a given area, you need to have some particularity. You need to have probable cause that every single device you're searching is going to be relevant for your criminal investigation. That's going to be extremely difficult to obtain, especially if you don't have a lot of information about who the devices belong to in the first place, or how many people were caught by this geofencing warrant. 

Ben Yelin: So if you don't have to use that more intrusive process where you're searching everybody's pockets, you only need probable cause that a given device might be in the location where the crime took place. That's a much lower probable cause standard. Professor Kerr seems to think that that should be the governing standard here, and I think I agree with him. 

Dave Bittner: So let me ask you this. I mean, getting - sort of continuing that comparison with a regular warrant to search someone's home, it strikes me that - would it be a useful comparison to say there's a warrant for searching or entering a specific home, but then there's a warrant for going before a judge and saying, I want to go in every house in this neighborhood? I don't know which house might end up being the interesting one, but I need your permission to go in all of them. 

Ben Yelin: Yeah. I mean, that's an interesting comparison. I think when you're talking about a warrant to go into people's houses, it's much more intrusive and much more, like, actually searching through people's stuff. When you're talking about something like this where it's geofencing, I think you can draw a distinction because, like I said, you're not actually obtaining any personal information about anybody. You're just - on this very narrow point, you're trying to see if an individual device was in a particular place at a particular time. And so I think you need not have probable cause to just perform this rather broad search of whether a single device was in a particular area because if you had to have probable cause for each of the devices, then I think geofencing could potentially be ruled out as a law enforcement tactic. They're never going to be able to obtain probable cause for each of the devices in the area. 

Dave Bittner: Yeah. 

Ben Yelin: So I think, you know, if law enforcement wants to maintain geofencing as a valuable law enforcement technique, which I certainly think it is, then they're going to have to rely on other courts to interpret this differently and say that this is more of the equivalent of, let's enter a place and see who's there, and at that point, you don't have to have probable cause for every person that's searched. 

Dave Bittner: Well, let me continue my comparison here. I guess what I'm wondering is - I can imagine someone going in front of a judge and saying, I want to search every house in the neighborhood, and the judge would say, no, you need to tell me which house you want to search. 

Ben Yelin: Right. 

Dave Bittner: But I can imagine, for geofencing, it would be one thing for law enforcement to say to Google or any of the service providers - hey, give us all the information on every device that's been in this area during this period of time, and they go in front of a judge and say, this is what we want. To me, that's different than going in front of a judge and saying, hey, listen; we think that so and so - this particular individual - was up to no good; we would like the service provider to tell us if this individual was in this area - this individual and only this individual was in this area at this given time, based on geofencing information. 
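In query terms, the narrower request Dave describes flips the sketch above around: instead of enumerating every device in the fence, law enforcement names one device up front and asks a yes-or-no question about it. Again purely hypothetical, this time with sightings as plain (device_id, lat, lon, seen_at) tuples:

```python
def device_in_fence(sightings, device_id, lat_min, lat_max,
                    lon_min, lon_max, start, end) -> bool:
    """Was this one known device inside the fence during the window?"""
    return any(
        d == device_id
        and lat_min <= lat <= lat_max
        and lon_min <= lon <= lon_max
        and start <= ts <= end
        for d, lat, lon, ts in sightings
    )
```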

Ben Yelin: Exactly. So I think that is the distinction here. Now, I should be clear. One of the original warrants was - you know, they did have probable cause about this one person. And according to judicial precedent - there's a case called Ybarra v. Illinois - you can get a warrant to search as to whether an individual was in a particular location at a particular time. That is a permissible warrant, even if you don't have any more particularity beyond that. But you would need separate probable cause to search any person that was in that area. 

Ben Yelin: So what the court seems to be saying here is that this search is equivalent to searching every single person in this geofenced area. And what Professor Kerr is arguing - which I think he's arguing wisely - is it is more like that original warrant, which just allows a search of that place to determine whether that individual, over whom you already have probable cause, is in that area, if that makes sense. 

Dave Bittner: Yeah. OK. Yeah, that does make sense. So where do you think this goes next? I mean, what effect will this ruling have on things? 

Ben Yelin: So it is very limited to, you know, this one individual case and one individual circumstance simply because it's a magistrate judge. It has persuasive authority because it's the first of its kind, most likely in the entire country. But I think we're going to see a split between courts on this. I think other courts will agree that the particularity requirement of the Fourth Amendment is satisfied. If you establish probable cause on one individual and you use geofencing to determine whether that individual was in that particular place at that particular time, that can satisfy the particularity requirement because you're not actually going in and obtaining personal information from all of the devices that are in that geofenced area. 

Ben Yelin: So I would predict that within the next year or so, you know, we're going to get a case where law enforcement used this geofencing as a tactic, and we're going to see a court come out on the other side. And I think it's up to legal scholars to try and figure out which argument is more persuasive, which conforms to our current understanding of Fourth Amendment jurisprudence. When you get these splits, you know, you start to see the academics get involved. They'll write law review articles, and then in turn that starts to influence future judges. And when it gets to higher courts, they're going to rely on that scholarship. 

Ben Yelin: So I kind of imagine it like a long staircase to get to some sort of satisfying conclusion where we know full well whether geofencing is constitutional as a law enforcement tactic, and we're just on the first step of that long staircase. I think there's just a lot of intellectual work on the part of both judges and legal scholars as to how this issue is going to be litigated going forward, especially as it gets into more prominent courts, federal district courts, courts of appeals, etc. So it's one of those things where we're just entering at the ground level here, and we're going to have to follow it very closely going forward. 

Dave Bittner: Do you find that long staircase to be a frustrating feature of our system? 

Ben Yelin: It's extremely frustrating because we all want answers now. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: And here's the thing, Dave. 

Dave Bittner: Yeah? 

Ben Yelin: To a nonlegal person, what they want to know is, can law enforcement do this to me? They don't care about jurisdiction, Fourth Amendment jurisprudence, different types of warrants, the particularity requirement. All of that is the province of nerdy academics like myself. 

Dave Bittner: (Laughter). 

Ben Yelin: And it's very unsatisfying that we're not able to provide a solution here. And, you know, this isn't unique to this particular issue. All Fourth Amendment issues, or really all constitutional issues, go through this sort of evolution. You're presented with a novel circumstance - this is still a relatively new technology - and it takes a while to establish valuable precedent on what courts should do when faced with this circumstance, especially if there are two competing interpretations that are persuasive to different judges. But yes, it's absolutely frustrating. It's frustrating that we can't give people an answer. 

Ben Yelin: And I wouldn't necessarily say a chilling effect, but it could start to have an impact on people's actions. You know, if people are reluctant to go into a specific location because of the threat of geofencing, not knowing whether they have adequate Fourth Amendment protection against that tactic, it could potentially change people's behavior in somewhat disturbing ways. I mean, of course, for the average person out there, it'd be far more valuable to get some finality on this, but I just think we're not there yet. 

Dave Bittner: All right. Well, (laughter) we will watch it play out, right? 

Ben Yelin: Yeah. 

Dave Bittner: I mean, that's all we can do at this point. 

Ben Yelin: Maybe in three years, we'll have a nice, satisfying answer for everybody. 

Dave Bittner: (Laughter). 

Ben Yelin: And we can look back on this first stair of the staircase fondly. 

Dave Bittner: Yeah (laughter). Oh, goody. 

Ben Yelin: Woo-hoo. 

Dave Bittner: All right. Well, interesting stuff, for sure. My story this week comes from Foreign Affairs, and it's actually an essay written by General Paul Nakasone, who is the commander of U.S. Cyber Command. He's also the director of the NSA and chief of the Central Security Service. It's co-written by Michael Sulmeyer, who is a senior adviser to the commander of U.S. Cyber Command. And it's titled "How to Compete in Cyberspace: Cyber Command's New Approach." 

Dave Bittner: It's an interesting essay, and it's also interesting to me just the way that this was put out there - the process of putting this article before the public. This is really U.S. Cyber Command and the powers that be who are in charge of our nation's defense when it comes to cybersecurity laying out what their policies are going to be and how they're looking at this stuff. 

Dave Bittner: Really, to me, what it comes down to is what they're saying: that the - I suppose what is now an old-fashioned - approach of being purely reactive, you know, the old castle wall and moat and drawbridge, where when someone tries to do something to you in the cyber realm, you respond to it - that no longer works. 

Dave Bittner: And so they're using this policy of what they're calling defending forward. Now, Ben, defending forward... 

Ben Yelin: Sounds like a terrible euphemism. 

Dave Bittner: (Laughter) I said - I have a tremendous amount of respect for General Nakasone, but I have to say that just when I saw that, it struck me as sort of classic military doublespeak. 

Ben Yelin: Yes. 

Dave Bittner: Like, what's (laughter) the difference between defending forward and - oh, I don't know - attacking? 

Ben Yelin: Yeah. It also kind of sounds like a bad political campaign slogan. We're going to defend forward... 

Dave Bittner: Right. 

Ben Yelin: ...Where it doesn't actually mean anything, but it sounds like it's strong and cool. 

Dave Bittner: Yeah. 

Ben Yelin: You know, they're actually onto something very interesting here. 

Dave Bittner: Right. 

Ben Yelin: The National Security Agency and the United States Armed Forces are moving toward persistent engagement in cyberspace. And this didn't come out of nowhere. In 2018, as part of the National Defense Authorization Act, Congress clarified that the United States government has statutory authority for military cyber operations, enabling Cyber Command to conduct traditional military activities in addition to the more reactive activities that we were already undertaking. And after that law was passed, the White House under President Trump released its National Cyber Strategy, which sought to do much of the same thing that's outlined in this article - a more persistent engagement in cyberspace, aligning our economic, diplomatic, intelligence and military efforts, a more holistic approach. 

Ben Yelin: So, you know, part of me thinks about - why was this article released? And my instinct is it is a shot across the bow to our adversaries, saying that we're taking cyberwarfare very seriously - this is coming from the most senior official at U.S. Cyber Command - that, you know, if you are going to attack us, we are not going to be reactive; we're going to take it to you, our adversaries, to you, our geopolitical enemies. And so I think that really - as much as it's for the American people, you know, for their information, this is what your government is doing - it's for our adversaries to know that this is a new day in the world of international cyber conflict. 

Dave Bittner: Well, and not coincidentally, I mean, peppered throughout this essay is the mention of elections - our own elections, other nations' elections. So I suppose not a very subtle hint at what they're getting at here in terms of our adversaries - as you say, the shot across the bow. 

Ben Yelin: Yeah, absolutely. You know, and I think it reflects the reality that we're in this international global competition. The country that has the greatest power in cyberspace is going to obtain economic and political advantage and is going to expand its sphere of influence. So, you know, doing things like, as they mention in this article, publicly releasing adversary malware obtained during hunt-forward missions - that will make that malware less effective, and it also might make our adversaries think twice about whether to propagate an attack on our systems. 

Ben Yelin: You know, I think it's sort of analogous to what you see in conventional warfare, where if you maintain a purely defensive posture - if we only had a National Guard, you know, it might please some of our libertarian friends out there, but, you know, it probably would not fulfill our overall long-term goal of protecting the country from foreign threats. Oftentimes, you have to go out and engage. Whether that's diplomacy, whether that's going into countries and nation-building - I think the philosophy of our military, particularly after 9/11, was not to just sit back and take this defensive posture. And I think what this article is getting at - and the 2018 statute and the president's national cyber strategy - is we are taking those principles and applying them to cyberspace. 

Dave Bittner: Yeah, it's also interesting to me that it reflects something that we've seen particularly from NSA, which is an increased level of engagement with the private sector but also the public. They mention there's a new facility that they've spun up not far from us, near Fort Meade. It's called DreamPort. And it's an unclassified location where these folks at Cyber Command can collaborate with folks who aren't within that, you know, security bubble of NSA and Cyber Command itself. And that, to me - I mean, that's a real shift that we've seen from NSA as well. They've been really deliberate about their intentions that they're going to have more engagement with folks outside of their community, for the better of everyone, they say. 

Ben Yelin: Yeah, and I think that is a crucial part of this all-encompassing national cyber strategy. You know, one thing we have in this country is an entrepreneurial spirit, a lot of big tech companies with the most valuable technological tools in the world. And it absolutely makes sense for us to leverage that expertise and develop relationships. DreamPort is a great example. I know you've been there. I've been there. I mean, it's good to actually see that collaboration in practice. 

Ben Yelin: And, you know, I think it is all part of the same effort to elevate cyber conflict to its proper place. We now see, you know, based on our experiences over the past several years - whether it's been high-profile ransomware attacks, phishing scams, election interference - that this can have real, tangible impacts on our society the same way that traditional attacks have. And we need to start treating the threat like any other threat to our national security. And it's taken us a while, but I think we're finally starting to see government policy reflecting this strategy. 

Dave Bittner: All right. Well, we will have a link to the essay in the show notes. It's an interesting read - certainly gives you a good idea of the stated intentions of the folks at the national level who are defending us in the cyber realm, so I highly recommend it - worth your time. 

Ben Yelin: Absolutely. 

Dave Bittner: All right, Ben. It is time to move on to our Listener on the Line. Our Listener on the Line wrote us an email here, and I will read it. It says, a question for the "Caveat" podcast. And forgive me if this has been asked before, but in your legalistic opinion - how legal are school device agreements if the parent doesn't have an option to decline the agreement? And how do Google Translate and Bing Translate factor into this? For example, if I have a parent that speaks Klingon... 

Ben Yelin: Who among us does not, right? 

Dave Bittner: ...But nobody in this school district speaks that language or they don't have the time to do this - so there's a couple of interesting questions here. What do you make of this, Ben? 

Ben Yelin: Great questions. Generally, these device agreements are not only enforceable, but I think the listener really gets to an important point that even if you as a parent can opt out in theory, you can't really opt out in practice. So the Electronic Frontier Foundation has done a lot of work on these device agreements, pointing out their potential flaws and the potential privacy concerns. 

Ben Yelin: And they got into this issue with opting out. Some school districts make it explicit that you can opt out if you don't want your child to be using one of these devices. Some schools don't really make it explicit, and in fact, when the Electronic Frontier Foundation has conducted opinion polling on this, many parents aren't aware that they even have the option of opting out. So if you don't know about it, you certainly won't be able to do it. And then even if you do know about it, I mean, opting out can just be an incredible hassle. 

Dave Bittner: Right. 

Ben Yelin: You can opt out, but then, you know, your child is not going to be at the same level as the other students who are using these technological tools. And the schools themselves, you know, aren't going to be in a hurry to provide you alternative means of learning. 

Dave Bittner: Yeah. 

Ben Yelin: We've read about some cases where parents have tried to opt out, but when the kid goes to school, they have teachers setting up new iCloud accounts for the students, you know, not knowing that the parent refused to sign that permission slip. 

Dave Bittner: Well, I also wonder about the social implications, too. I mean, I don't know if this - you probably had a similar experience. I know for many of us, you know, when it came time every year - I don't know. Late in elementary school and middle school, when it came time for family life, you know, where - right? - they teach you all about, you know, the plumbing of boys and girls... 

Ben Yelin: Yes. 

Dave Bittner: ...You know, there was always one or two kids whose parents - for usually, you know, religious reasons or whatever - opted out and decided this was something they were more comfortable handling at home. 

Ben Yelin: Yeah. Let's just say the other kids noticed, right? 

Dave Bittner: Right. Right. 

Ben Yelin: Yeah. 

Dave Bittner: Those kids - this did not go uncommented on by the other children, you know? So I think that's a very real thing as well. 

Ben Yelin: It absolutely is. And there is a class-based and, you know, demographic-based angle to this as well. If you don't have parents at home that are technologically savvy or would know how to operate this device, you know, you can't expect a first grader to know how a laptop computer works. You're going to need to have semi-computer-literate parents... 

Dave Bittner: Right. 

Ben Yelin: ...To actually allow that student to do the work. And when you are relying on technology for learning, particularly in this time where we are only relying on technology for learning in most... 

Dave Bittner: Right. 

Ben Yelin: ...Jurisdictions across the country, then that can exacerbate existing inequality in our school districts. And that's where that language issue comes up... 

Dave Bittner: Right. 

Ben Yelin: ...From our questioner. The school might say, you speak this particular language at home? We have one tool to offer you, and that's Google Translate, Bing Translate. But here are all the privacy concerns. You know, you've done your research. You see the privacy concerns of using those tools. You can either opt in and subject yourself to those privacy risks, or you can opt out, and your kid's not going to understand what's going on in the classroom. 

Dave Bittner: Right. 

Ben Yelin: And that's really a terrible choice. So I think, you know, if you're looking for an action item here, it's to go to your local school board and try and make opting out a more realistic possibility. Force the administration of your school districts to develop reasonable alternatives so that, you know, you're not giving parents this awful choice of excluding their kids from the learning process or potentially exposing their children to privacy dangers - all the types of threats that exist when we log onto our devices. 

Dave Bittner: Yeah. 

Ben Yelin: So this is just a great question, and I'm very glad our listener asked it. 

Dave Bittner: Yeah. Thanks so much for sending that in. We would love to hear from you. We have a call-in number. It's 410-618-3720. You can call in and leave a message there. We might use it on the show. You can also send us an email. Write to caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Sean Brooks. He's director of the Center for Long-Term Cybersecurity Citizen Clinic. That is over at Berkeley. And they recently published a report titled "Digital Safety Technical Assistance at Scale." Interesting conversation here with Sean Brooks... 

Sean Brooks: The Citizen Clinic program has been running for about three years at UC Berkeley. The program was modeled after clinical programs in the law and medical fields where the model provides an experiential learning opportunity for students. 

Sean Brooks: So if you ask any attorney - right? - they'll probably tell you that law school doesn't teach you what it's like to actually go out and be a lawyer in the world. That doesn't mean that it doesn't prepare you intellectually for the work or teach you a lot about the foundations of law. But in terms of actually practicing law, it's a very different experience. And so what a clinic model allows is for students to get an actual taste of what it's like to do work in the field in a space that allows them to learn and make mistakes and get guidance from, you know, experienced professionals. 

Sean Brooks: At the same time, clinical models in both the medical and the legal fields have enabled academic institutions to provide these high-cost, high-expertise services for populations and communities that wouldn't otherwise have access to those services. And so if you look at public interest law clinics, you'll see that they're focused on things like death penalty cases or environmental law or immigration law. This allows students to do public interest work, to serve populations who are in need of legal assistance but wouldn't otherwise be able to afford it and generally benefits the sector as a whole. 

Sean Brooks: The creation of, again, particularly law clinics helped create the entire field of public interest law. And when we were looking at the cybersecurity field and the sort of digital safety issues that nonprofits and journalists, human rights defenders and other activist organizations were facing, we saw a similar challenge, where there's a high-cost, high-sophistication cybersecurity workforce and set of companies, startups, government agencies - you name it - but not much catering toward high-risk organizations, particularly those being targeted by politically motivated actors. And there's not an opportunity for students to understand that there's a workforce opportunity there. If you want to go into cybersecurity work, most of the traditional career paths are into defense or intelligence or into the private sector - there isn't really a public interest career track for folks who are interested, even though there's a huge amount of need. 

Sean Brooks: So by creating the clinic, we hope to create a model by which more students can understand the breadth of the need for these types of projects in the field and understand that there are a lot of different ways and a lot of different types of expertise to get into the cybersecurity space. So for that reason, we take on students from not just computer science programs but law, public policy, journalism - we had a history student last semester. We've had students from psychology, the School of Information. 

Sean Brooks: And in the last three years, we've worked with 10 different organizations on, I think, four different continents working on issues as diverse as reproductive rights, indigenous land rights, journalism and the public interest, international development accountability efforts, war crimes reporting - a huge diversity of organizations. And our students have been able to work together in interdisciplinary teams to both understand the broader context in which these organizations sit and understand how that context informs their threat model and then propose mitigations to those threats and actually implement some technical and policy solutions to some of those problems to get the actual experience of what it's like to do the work of cybersecurity and digital safety. 

Dave Bittner: Now, what sort of lessons have you learned along the way? You know, I can imagine, spinning up something like this, you certainly had things in mind. But of course, everything's not exactly the way you plan it when you actually go out there and start engaging with the folks out there. Were there any things that were unexpected, where you had to make adjustments along the way? 

Sean Brooks: (Laughter) Yeah, absolutely. You know, we started small. Right? We started with a pilot with four graduate students, one of whom actually ended up graduating from the School of Information and becoming the deputy director here at Citizen Clinic. 

Sean Brooks: And we started first working with a digital rights organization in Latin America. And in that first experiment, we knew that sort of understanding the organization's culture, the broader context of their work - particularly their political activism - was going to be really important in understanding what types of technical mitigations would be necessary and what sort of solutions they could adopt. But we didn't - I don't think we even embraced that mentality far enough initially. 

Sean Brooks: So for example, working with that client organization, one of the things that we wanted to do was get them set up with multifactor authentication on all of their critical accounts - right? - a really simple mechanism by which we could really improve the authentication security of that organization across the board. So we determined what the three most critical accounts were for everybody in that organization. We ran a series of very intimate workshops where we had the employees go through the process of setting up app-based MFA for themselves on all these accounts, really walked them through and made sure they understood how it worked, made sure they understood backup codes and some of the challenges of preserving those. And we felt pretty positive that by the end of the experience we'd gotten everybody in the organization set up with MFA for their work email and social media accounts and some personal accounts as well. So we were feeling really good about that. 
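For listeners who haven't set this up themselves, the app-based MFA Brooks describes is typically TOTP (RFC 6238): the app and the server share a secret, and each derives a short-lived code from the current time. Here is a minimal, standard-library-only sketch of the scheme - illustrative only, not the clinic's actual tooling:

```python
import base64
import hashlib
import hmac
import secrets
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Derive the current TOTP code from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The backup codes Brooks mentions are just random one-time secrets,
# meant to be printed and stored offline in case the phone is lost.
backup_codes = [secrets.token_hex(4) for _ in range(10)]
```

Those offline backup codes turn out to be exactly the piece that mattered in the story that follows: lose the phone and you lose the app, and the codes are the only way back in.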

Sean Brooks: We work with most of our clients for not just one semester but usually about a year and a half on average. So about six months later, we were able to come back and say, hey, you know, we implemented this. We also helped you get set up on a cloud-based email provider. So let's go back and check and make sure that everybody's MFA is still turned on. When we went back and looked, only 30% of the organization still had MFA turned on. 

Dave Bittner: Wow. 

Sean Brooks: And we were like, well, what happened? 

(LAUGHTER) 

Sean Brooks: You know, what did we do? What went wrong? And it turned out that the city in which this organization was based has a high amount of cellphone theft. And so what we didn't anticipate when we recommended, OK, everybody get this authenticator app and use this for your email security, was that everybody was more concerned and felt, accurately, that it was more likely that their phone was going to get stolen and they were going to lose access to their authenticator app than it was that their email was actually going to be remotely compromised. 

Dave Bittner: How interesting. 

Sean Brooks: And so in that moment, despite the fact that we'd done all this work looking at the context of the organization, we missed this really, really simple component of what day-to-day life was like for our client organization. And so that really helped us double down on the importance of that context. 

Sean Brooks: And that has become, I think, one of the linchpins of our pedagogy - really helping our students understand how context informs the threat profile of our partner organizations much more than their technical infrastructure or even, in many cases, the capabilities of their adversary, because when it really comes down to it, the threat profiles of a lot of the organizations that we work with, from a technical perspective, are very much the same because they don't have complex technical infrastructure, right? 

Sean Brooks: They've got an email account. They've probably got some storage needs. There's probably some social media presence. But the technical infrastructure of a lot of these organizations, regardless of whether they have a hundred people in them or five people in them, because they're sort of agile nonprofits, tends to look very similar. But the context and the way that they do the work informs the ways in which they're going to be attacked, both because the attackers aren't necessarily looking to get things like money out of it - right? - they're doing it for political ends, and because they don't necessarily need to use the full complement of their capabilities in order to compromise these organizations. 

Sean Brooks: And so one of our friends at Citizen Lab at the University of Toronto once called our work - and I think he was trying to be positive about it - he said, it's the boring work of cybersecurity. But in many ways, it's a really great experience for our students, where it's like, look - we talk in the abstract about things like multifactor authentication or setting up HTTPS or setting up things like DMARC as relatively simple or straightforward cybersecurity controls. But when the rubber meets the road, this stuff is actually a lot harder than you think it is. And so if we want to make meaningful progress, particularly for low-resource organizations with no technical staff, there's still a lot of work to do to make these types of controls more user-friendly, more accessible and, generally, more adopted. 
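To give a flavor of one of those "boring" controls: a DMARC policy is just a DNS TXT record published at _dmarc.<domain>, a short string of tag=value pairs - though the real deployment work is the SPF and DKIM alignment behind it. A small illustrative sketch (the domain and address here are hypothetical):

```python
def parse_dmarc(txt: str) -> dict:
    """Split a DMARC TXT record into its tag=value pairs."""
    return dict(
        part.strip().split("=", 1)
        for part in txt.split(";")
        if "=" in part
    )

# Hypothetical record: quarantine failing mail, send aggregate reports.
record = "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.org; pct=100"
policy = parse_dmarc(record)
assert policy["v"] == "DMARC1" and policy["p"] == "quarantine"
```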

Dave Bittner: Well, you also recently published a report titled "Digital Safety Technical Assistance at Scale." Can you take us through that? First of all, what prompted the creation of the report? 

Sean Brooks: So one of the things that we were looking at over the last few years is how do we grow our own clinic? And, you know, as I mentioned earlier, we started with four students and one client organization. Now, in an average semester, we have something like 16 to 25 students, and we work with five or six organizations a semester, depending on how many students we have. That's a big increase, and that's great. 

Sean Brooks: You know, law clinics have sort of similar sizes - some are smaller, some are much larger. But the reality is, one public interest cybersecurity clinic at UC Berkeley is not going to solve the world of problems facing civil society, particularly not at this time in the world, when we see rising authoritarianism and clampdowns on media and civil society organizations. There is a lot more need than there is expertise available. 

Sean Brooks: And so if we really think the clinical model is a good contributor to resolving these problems, we have to think a little bit more about the big picture of how do we get this expertise into the hands of activist organizations and those who need it? 

Sean Brooks: And so that paper was informed by what we've learned in the last three years at the clinic about the challenges of bringing this work to scale and what we think some of the opportunities out there in the broader ecosystem look like because, in our mind, it's not just one clinic or even many clinics, but it's going to be a mix of contributors from the private sector, from government, from academia, from activist organizations. Currently, the vast majority of technical assistance for civil society comes from small nonprofits themselves - organizations like Access Now or the Electronic Frontier Foundation, Tactical Tech or Internews. These are organizations that, while well-established, you know, are still fairly lean operations. And they're the main technical assistance providers out there in the world today. 

Sean Brooks: And so, you know, we need to raise the number of bodies working on this issue, but we also need to raise the level of sophistication with which this ecosystem works and collaborates with contributors from other sectors as well, as I mentioned. 

Sean Brooks: So we started with this question of, how is this supposed to get bigger? How are we supposed to make more helpers in this field? And one of the things we realized very quickly was we weren't entirely sure what exactly was supposed to get bigger. We knew that we needed more people. But what exactly were we trying to grow? Are we trying to grow the capacity of NGOs to take care of these issues themselves? Are we trying to make it easier for them to find help? And so what the report talks about is what the components of scale are that we want to emphasize. 

Sean Brooks: As the technical assistance ecosystem grows and matures, what are some of the things that need to be improved in existing practitioners? And what are some of the emergent models that we have seen in the last three years that might give us a sense of how to do this work bigger and better? 

Sean Brooks: So we talked about three models that we've seen emerge - the first being our clinical model. 

Sean Brooks: We've also seen volunteer models, where organizations act as hubs for members of the private sector who are interested in using their skills for good - organizations like I Am The Cavalry, where they get together a large group of volunteers and try to matchmake those volunteers. Particularly in the example of I Am The Cavalry, they're matchmaking those volunteers with organizations in the medical device or the airline industry or the auto industry, where there aren't enough experts to deal with the large amount of technical systems going into these very critical machines and technologies, and helping them harden those systems out of the gate to make sure that they're not going into people's bodies or onto the roads or into the sky in an insecure fashion. 

Sean Brooks: There's another model that we've seen which is also really encouraging and exciting, which is sort of this community hub model, where a couple of cybersecurity practitioners will create a hub for technical assistance directly focused on a specific community of need or on a specific advocacy sector. And we've seen this happen for reproductive rights. We've seen this happen for survivors of intimate partner violence. And these kinds of community hub models can be focused on geographic areas, again, or specific sectors of advocacy or activism or specific issues. 

Sean Brooks: But what that community hub model allows is for that organization to become super embedded within that community and culture. And so all of that contextual awareness that I was talking about earlier, that becomes second nature because they live and breathe it all the time, and they become sort of the center of excellence for that community. And they can do matchmaking. 

Sean Brooks: We've done a lot of work with a couple of community hub organizations at the clinic, where we now have a trusted relationship with that hub. And so they send organizations our way as they come across ones who might be a good fit for the clinic, whether it's that they need help on a longer-term basis or that the urgency of their needs is better addressed over time. Much of the work that we do is, you know, on the semester schedule - right? - so we're not necessarily an emergency response organization. And so those community hubs can help funnel members of their community to organizations like ours or others, depending on their needs, and provide technical assistance for those who don't necessarily have a good fit in the sort of broader ecosystem. 

Sean Brooks: And so through these three models - you know, each of these models has different places where it's more capable. And so what we tried to chart out in this paper was what kind of model fits what type of component of scale. And so the audience for the paper is folks who want to contribute. 

Sean Brooks: So if you're in philanthropy, for example - a lot of philanthropic organizations are coming to understand that, you know, every program needs some sort of capacity building in this space. So if you're looking to get something started because you're funding in the environmental space or in a space that has generally not paid a lot of attention to digital safety issues in the past and you're trying to figure out what the models are, hopefully this paper can help you understand, based on the problems you've observed, what kind of technical assistance model might help the community of interest you're looking to serve. 

Dave Bittner: All right, Ben, what do you make of that? 

Ben Yelin: I think it is a very cool project that they're undertaking at Berkeley to try and bring this important service, cyber knowledge, to members of the community that might not be able to get it. And not to mention my own institution, but I would encourage... 

Dave Bittner: (Laughter) Just toot your own horn. 

Ben Yelin: Right. I would actually encourage all universities - you know, he talked about how this is a multidisciplinary effort. I think particularly in the legal world, it's incumbent upon law schools to have cybersecurity clinics because you're going to have victims of cybercrimes who don't know what their rights are, don't know what their obligations are. And, you know, sometimes in other legal realms, often the cheapest, most effective legal resource is going to be these law school clinics. And this is something that's being offered not through the law school at Berkeley but at an interdisciplinary level. And I think it's a very valuable tool, and I'm glad that he's introduced it. 

Dave Bittner: Yeah. Well, our thanks to Sean Brooks for taking the time for us. A little note - I saw just in the past few days that Sean announced on his Twitter feed that he is actually moving on from the Center for Long-Term Cybersecurity Citizen Clinic. He's going to Facebook. He's going to be working on some stuff there. So congratulations to him. Seems like he's pretty excited about the possibilities there to contribute to some of the things that Facebook is working on. So again, our thanks to him for joining us. 

Dave Bittner: And that is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.