Caveat 6.13.24
Ep 222 | 6.13.24

The end of an era.

Transcript

Eric Goldstein: We have to keep recognizing and focusing on the fact that public service doesn't need to be a lifelong career. And in fact, there is real value in cybersecurity of moving between organizations, between roles and understanding perspectives. And so the view that we need to have is to encourage individuals to come into government, do a few years, help your country, help your communities, and then go back out and learn something new and maybe come back in.

Dave Bittner: Hello everyone, and welcome to "Caveat," N2K CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hey there, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: On today's show, Ben has the story of Google laying off members of the team that reviews law enforcement data requests. I've got the story of teens using AI to undress their peers. And later in the show, my exit interview with Eric Goldstein, outgoing Executive Assistant Director for Cybersecurity at CISA, who shares his thoughts on where cybersecurity has been and where it's going as he departs the agency. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. [ Music ] All right. Ben, let's start things off here. You want to do the honors?

Ben Yelin: Sure. So my story comes from the Washington Post technology page by Gerrit De Vynck, and it's about Google cutting members of its team that vets police requests for user data. So, unfortunately, as you know, there have been a lot of layoffs in Silicon Valley and the tech industry over the past year. That has probably personally affected many of our listeners, and it's just an unfortunate reality of the market right now, which really sucks. But this story is focusing on Google laying off members of a very important team, the team that reviews the 100,000 to 200,000 requests per year for some type of data that would be useful to law enforcement agencies. So location-based data, app-based data, really anything you can think of that Google retains in its cloud or really anywhere else. These are the people who deal with those requests. This is part of a broader trend in the industry. When Elon Musk bought Twitter, he laid off basically the entire team that did policy moderation and regulatory work on Twitter's behalf. Now X.

Dave Bittner: And as a result X, Twitter, has exploded in popularity.

Ben Yelin: Yeah, exactly. If by popularity, you mean responses that say, my nudes in bio, which is basically what Twitter is these days.

Dave Bittner: Right. Right. Exactly. It's what people refer to as the Nazi bar.

Ben Yelin: Yes. Oh, yeah. That is certainly more of a problem than it ever was. But that's a subject for a different day.

Dave Bittner: Okay. Fair enough.

Ben Yelin: So this unit in Google is called the Legal Investigations Support Team. The reason it's important is that, as a matter of course, when law enforcement agencies send these requests to Google, its inclination is to say yes. The reason it has that inclination is that it doesn't want to get on the bad side of law enforcement agencies; it wants to be seen as cooperative, particularly when we're talking about exigent circumstances or emergencies. Let's say there's some type of active assailant, active shooter event, and they need instant location data to figure out if a particular device user is in a particular area, and that can't wait for any type of judicial authorization. Google's inclination is going to be to say yes. The check on that inclination, the balance on that inclination, is to have a robust Legal Investigations Support Team. And this team, which has existed for a number of years at Google, was already struggling. They get a huge number of government requests, they are responsible for developing policies on how to respond to these requests, and they review the individual requests themselves. They have lawyers on staff to make sure that the requests are actually legal. I'm sure some of the more successful members of my law school class are part of that team to ensure that all the requests coming in are legal. And this trend of downsizing these types of units comes at the same time that police, other law enforcement, and even spy agencies are asking Google to give up more and more data on its users. From January 2023 to June 2023, the last six-month period for which we have data, Google was asked to give up data on approximately 110,000 user accounts in the United States. That's according to their own transparency report. And they said yes to those requests about 85% of the time. And that's part of a broader trend of them giving more and more data to these law enforcement agencies. This is a concern to people who care about civil liberties, because the fewer resources you have to vet these requests, the greater the risk that Google goes on autopilot and just starts saying yes, because they don't have the time or the manpower to go in and ask, is this request actually legal? Is this backed up by whatever the standard is? Do they have an actual reasonable suspicion here? The fewer members this investigations team has, the more likely they are to just say, well, you know, we say yes to 90% of these requests, we might as well just say yes to this one. I don't think that risk is immediate. Google says they still have approximately 150 members of this team on staff. But that is the risk if these layoffs continue. And since we're seeing it in other companies, I think it's an industry-wide risk.

Dave Bittner: Is there any statement or speculation that they are taking advantage of, dare I say, AI to streamline any of these processes and then be able to get rid of some actual flesh and blood workers?

Ben Yelin: I think that's a huge part of it. They don't say it explicitly, either in this article or in their statement, but I think you can certainly read between the lines. Their response is the most bureaucratic thing ever. It says, this restructuring simply consolidates the team's work to a few existing locations and streamlines our workflows while maintaining our high standards for protecting our users' privacy and timely responses to law enforcement demands. Any suggestion to the contrary is simply wrong. I don't want to impugn anybody's reputation or motives here, but that certainly doesn't give me a lot of confidence that this isn't a big deal. I think you can kind of read between the lines when you talk about things like consolidation and efficiency. I absolutely think that could include the use of AI instead of human beings, which presents its own risk. We know how the use of AI tools, for example, can be biased in a number of different circumstances, including things like racial and socioeconomic bias. So that would definitely worry me. And I think a major unspoken part of this is the over-reliance on artificial intelligence to perform these same tasks.

Dave Bittner: I wonder, are there any of the big tech companies who take the stance that their default position is going to be no? In other words, you know, it seems to me like with Google here, what we're guessing, what we're speculating is that fewer people on staff will mean less scrutiny over more information being just handed over to law enforcement.

Ben Yelin: Yeah. And to be clear, this is our speculation. I haven't done any sort of market analysis on this, so it's just kind of my inclination on the subject.

Dave Bittner: But the other way that this could go is an organization like Google could say, hey, we've only got this much manpower, and that's going to limit the amount of stuff that we can give to you, because every request has to receive scrutiny. So this is going to slow down. But that doesn't seem to be the approach. I can't think of any of the tech companies who have that as their default approach. I mean, maybe, you know, Apple famously went toe-to-toe with the FBI in that well-known case. But --

Ben Yelin: But that was kind of a one-time thing because of the egregious nature of the request, which was that they break their own encryption. That's a step too far. We're not going to do that. When it came to garden-variety requests for data, I don't think they've had a necessarily adversarial relationship with law enforcement agencies, including the FBI. I think the better track to take for these tech companies is to prevent this data from being available in the first place. Some companies do that through things like end-to-end encryption. Google took a major action last December that I think helps in this regard when it announced that it would stop storing users' location data in its cloud servers, meaning even if the government came to request such data for things like geofence warrants, Google simply wouldn't be able to provide it because it's not retaining that data anymore. So I think that's the promising way to do it. Then you're not getting into a fight with law enforcement, you're not getting into a massive legal battle, you don't have to worry about the bad publicity of, there was an active shooter here, Google held the data and refused to give it to law enforcement, which is a story I'm sure they're very scared to see from various news sources.

Dave Bittner: Right. Right.

Ben Yelin: So to avoid that situation, it's best to head off this problem by just making it impossible for the government to obtain certain data, giving people more control over their own data. I think that's kind of the long-term solution here. But just on the nature of having to run a business and having to analyze data and store a lot of different user information in the cloud, there's still going to be a lot of valuable information that law enforcement is going to want. So until we get to that time where companies are better at things like end-to-end encryption, we do need to have these robust teams that Google seems to be paring down here. And I think that's definitely a cause for concern.

Dave Bittner: Yeah. I mean, could it also be that if Google is doing things like more end-to-end encryption, right? Not storing location data, and that means that they have less to give law enforcement, I mean, that could be a reason for cutting back staff as well.

Ben Yelin: I think that's true. That, combined with something like AI. It is a team of 150 people, so to me, that seems small when you're talking about the volume of requests here. But then you worry about the team getting whittled down further as there are other rounds of tech layoffs, and this becomes less of a priority because it's less involved in making sure that Google's stock price remains stable.

Dave Bittner: Yeah. It seems like weird optics to me. Like, of all the places, when you're a company the scale of Google, of all the places you want to keep robust, right? For the public eye, the folks who are protecting data seems like a place to just kind of, you know, give that team a blank check, right?

Ben Yelin: I think so. But to be a real cynic about it, that's not what makes them money. If they're having to decide which employees to lay off because they're being forced, through the type of market correction we've seen over the past couple of years, to lay people off, they don't want to lay off product developers or network engineers, the type of people who keep their services running, running well, who keep their company innovating and making the gobs of money that they end up making. I do think that a team like this would become less of a priority. That's a cynical view. They would say, you know, this is part of a standard consolidation process, that's not why they're doing this. But I think there's probably a reason that members of this team were on the chopping block. And I think it has to do with the fact that Google doesn't actually make any money from fulfilling law enforcement requests. It's just not part of what goes into strengthening their bottom line, and therefore, it's going to be less of a priority.

Dave Bittner: Yeah, that's interesting. So I mean, again, what we're talking about here is a lot of speculation on our part, but I think it does warrant keeping an eye on.

Ben Yelin: I think so too. Yeah. I think it's important to see whether this becomes a broader trend, or if this is a one-time market correction, because what happened is tech companies hired a lot of people to handle the COVID tech surge from 2020 to 2021, when everybody was, for the first time, conducting all of their meetings over Zoom or other online services, and they needed people to manage that transition. So I think we're not quite sure whether the downsizing we've seen in the past couple of years is just a natural correction to that and it's going to stop, or if this is going to be a long-term problem where AI, for example, is eliminating the need for a lot of positions, there's more consolidation, they're tearing down buildings in Silicon Valley, you know. I would worry that these teams are going to be whittled away even further. And it's not just going to be Google, it's going to be other companies. So I think it's definitely something to keep an eye on.

Dave Bittner: Yeah. It'll be interesting to see if we get any complaints from law enforcement, you know, that they're saying, you know, we're no longer able to get the information we need, or if anything like that would ever bubble to the surface.

Ben Yelin: Yeah. I mean, I think that would be interesting and bad publicity for these companies. It seems, based on what the Alphabet Workers Union, I believe that's what it's called, has said, that this team was already significantly overburdened. So we're at the point where if they start to hemorrhage more staff, law enforcement is certainly going to notice, especially if it's some type of high-profile event where it's an emergency and they need to obtain some data. Yeah.

Dave Bittner: All right. Well, that's interesting. We will have a link to that Washington Post story in the show notes. My story actually comes from Scientific American, which is one of my all-time favorite publications.

Ben Yelin: Not a usual source for us, though. We're dipping into uncharted territory here.

Dave Bittner: It is. And I think Scientific American has also kind of broadened the scope of the kinds of things that they are focused on. I think sort of the necessities of being a modern publisher mean it's harder to be an extremely niche publication, which is what they've been. And you know, they've been around for, what, I think over 150 years. So they're a long, storied publication. But this article was written for Scientific American by Riana Pfefferkorn, who --

Ben Yelin: Friend of pod.

Dave Bittner: Friend of the pod, she's been a guest on our show. She is a research scholar at the Stanford Internet Observatory and a good source for all sorts of digital policy issues. The article she wrote is called "Teens Are Spreading Deepfake Nudes of One Another. It's No Joke." So, Ben, you were probably too young to remember the days of comic books. Did you ever buy comic books? Did you ever read comic books?

Ben Yelin: I don't even think I'm too young. I just think I was never that interested in comic books. I just never really got into them the way that you and a lot of our more nerdy listeners probably did.

Dave Bittner: I'll say I was not a regular comic book reader, but I did occasionally read them. You know, I had a handful of comic books. But one of the standard things in the backs of comic books were the ads. And the one ad that anyone from my generation who read comic books will certainly remember is the ad for X-ray glasses.

Ben Yelin: Yes. I remember the X-ray glasses thing.

Dave Bittner: Right. So the idea was, for, I don't know, you know, $1.29, you could send away and they would send you X-ray glasses, which were, of course, you know, the fantasy of every red-blooded teenage American boy, because I'm going to spend my $1.29 on these X-ray glasses and --

Ben Yelin: And you can see everything.

Dave Bittner: Exactly. Exactly. Well, guess where we are today, Ben? Thanks to the magic of AI and online apps, this is kind of a reality now, and it's pretty scary. There are these apps that can create deepfakes, and one of the things that they're capable of doing is taking a photograph of you or me or our families, our wives, our sons and daughters, and imagining through the use of AI what that person would look like if they were in the nude. And of course, the technology is there to make it completely photorealistic. You can, you know, dial in all sorts of different things depending on, you know, what it is you want to see. The problem here, of course, is that anything dealing with underage people is categorized as CSAM, you know, Child Sexual Abuse Material. And I think, you know, as you and I have talked about, this is an odd area of the law.

Ben Yelin: It is. Yeah.

Dave Bittner: I remember when my oldest son was probably about 16 years old, you know, and he'd started dating and he had a regular girlfriend and they had their phones and they would text each other. And I remember pulling him aside one time and saying, look, here's the deal. Like, I understand the impulse. You know, you guys have got phones on you all the time and you're going to be experimenting with things, and you might want to send spicy photos of each other, but you need to understand that because you are both under the age of 18, this is how this is categorized. And whose name is that phone in? Mine, right?

Ben Yelin: It's also that these images live forever and --

Dave Bittner: Well, yes. An excellent point as well. It's an odd area of the law because, you know, if a couple of teenage kids, you know, 16 years old, are doing this consensually, what does that mean? Where does that fit into this? Do we throw the book at them for, you know, Child Sexual Abuse Material, or do we give them a stern talking to? You know, to me, it's not so cut and dried.

Ben Yelin: Yeah. One of the things that Riana Pfefferkorn talks about in this article is that the response to these crimes, to the distribution of this material, is sometimes so counterproductive that it does more harm than the original distribution of the image. So she gives this example of a couple of boys in Miami, 13 and 14 years old, who were arrested for the distribution of CSAM based on nudes created through artificial intelligence, and they face third-degree felony charges. So these are children being arrested and charged over deepfake nudes. And she notes the research here. When you arrest children, our juvenile justice system does lasting damage to young offenders because it's laden with abuse. And it's especially abhorrent, in her view, and really in my view, to do this to younger kids, 13 and 14 year olds, who may not appreciate the severity of what they're doing. So once you bring young kids into the juvenile justice system, that introduces its own level of significant harm beyond just the distribution of images. So I think the lesson here is to prevent these crimes from taking place in the first place. And that requires a societal change to take this seriously, to not ignore it when it happens, and then to do what both the federal government and many states have done, which is to, in some ways, disincentivize through criminal statutes the distribution of these nude photos. So the federal government has used Title IX of the Education Amendments, which we're familiar with. It's basically a provision that prevents sex-based discrimination at schools that receive any type of federal funding, i.e., most schools.

Dave Bittner: Mostly comes up with athletics.

Ben Yelin: Exactly. Most people know it through athletics. But it's being used in this circumstance to say that the distribution of these types of images constitutes sexual harassment, and that could subject the schools themselves or any individuals involved to some type of punishment. There are also a number of state laws that have been introduced in a variety of states; Florida, Louisiana, South Dakota, and Washington are ones she mentions in this article. And there are similar bills pending in other states that criminalize this activity. Hopefully, this creates the cultural change so that the result of these statutes is not that more 13 and 14 year olds are getting arrested, but that people take the distribution of these images more seriously.

Dave Bittner: Yeah. I mean, I guess on the one hand, you know, look, anybody who has been the parent of a tween-aged child knows that they have lousy impulse control, right? They don't think about consequences, many of them. And so you can see how they can get carried away with these sorts of things.

Ben Yelin: We all did stupid things as teenagers. It was just easier to erase those from our proverbial permanent records, because we didn't grow up in an age where everything is saved in a cloud somewhere or saved on somebody else's device, and we can't escape the consequences of our own actions.

Dave Bittner: But all that being said, what about the young boys and girls? And let's be honest, it's mostly girls who are the victims of this. We don't want to disregard the seriousness of what they've gone through, of being effectively, you know, stripped of their clothing without their consent or against their will. I feel like that's the hard needle to thread: how do we have an appropriate punishment that takes this seriously, that respects the victims, but also doesn't go overboard?

Ben Yelin: Yeah, it is a hard needle to thread. I mean, Riana Pfefferkorn in this article also talks about the effect that these images have on victims, the substantial emotional and reputational harms. One of the solutions she talks about is a type of punishment that goes beyond criminalization. So there are a couple of options here that she mentions. One is a civil lawsuit, which doesn't put a person in the juvenile justice system, but would subject them to civil penalties, which I think is one avenue where a victim can get the relief that he or she, and it's likely a she, would deserve. This happened in New Jersey with a deepfake victim who sued the perpetrator. And then there's expulsion of students from high schools. So that's also a way of avoiding the juvenile justice system, just expelling them from school. That also has its own problems. I mean, generally, school is a stabilizing institution, and once you're expelled from school, that creates a lot of bad second-order effects. So an alternative approach that she proposes, which I'm a deep believer in across really all realms of our criminal justice system, is restorative justice. It emphasizes healing, building mutual respect, and accepting responsibility for one's actions. And she gives some examples here, like a discussion circle, in which the victims and the accused get to be heard. The accused is asked to take accountability for causing harm and to repair the harm that they've caused to the victims. And there's some evidence in the literature that this can succeed, especially among youth who, as we said earlier, might not appreciate the wrongness of their own activities and the effect that it can have on their victims. So I know some people think restorative justice is kind of weird, touchy-feely stuff from academic ivory towers, which I understand. But I believe in it, and I think in these circumstances it's useful, because the perpetrators get to hear directly from the victims how it's harmed them. And they can take responsibility for their own actions without being subjected to the abuse-laden criminal justice system, as she calls it in this article.

Dave Bittner: Yeah. What about the responsibility of the App Stores, right? I mean, obviously, you know, Apple has more control over what goes on your iPhone than you have on an Android device, because you can sideload things on an Android device. And, you know, as my wife and I sort of half joked, we may be able to outsmart either of our children when it comes to parental controls, but there's no way we're going to be able to outsmart our child and all of our child's friends, right? Like, kids are going to crowdsource this and figure out a way to get around whatever you block if there's something that they want. Can we go after the companies who are making these apps? Can we go after the companies who are allowing the distribution of these apps? Or are they, you know, are the Googles and the Apples of the world protected by Section 230 when it comes to this sort of thing?

Ben Yelin: They have some limited Section 230 protection, but I think they are worried about CSAM liability. And that's why I think they're very responsive when there are incidents of bad press where an image like this is spread on one of their platforms. We've seen AI companies, for example, commit to ensuring responsible use of their products by excluding non-consensual nude imagery. So shaming is always a helpful tactic. She also mentions in this article a couple of self-help tools. One is a service called Take It Down, offered by the National Center for Missing and Exploited Children, which allows users to get an image removed from participating social media services. And then the federal government has its own resource for victims of this type of abuse called Know2Protect, which is a campaign to provide additional resources to children and families who have been victims here. So it's less about liability for these companies, because of the strength of Section 230, and more about shaming them to make sure that they're not allowing these types of images to circulate.

Dave Bittner: Yeah. And I guess the, I don't know if consensus is the right word for it, but the efforts to put age limits on being able to access things online, that just seems to not be a viable solution.

Ben Yelin: Yeah. I mean, we've talked about separate stories on age limits, and there are a lot of constitutional concerns there because it does prevent adults from being able to access content that they have a First Amendment right to access. It's a really difficult problem to resolve because those are two very strong interests. I think when it comes to something like CSAM, most of us would agree that the interest in protecting children supersedes the interest in giving people access to that type of material. But when you're restricting people from accessing a website -- I think we just did a story on this a couple of weeks ago, on what I believe the State of Texas has proposed in a state law -- you're not only restricting them from illegal content, but also from a lot of legal content, which is problematic. So I think you have to tread very carefully on these types of bans. I think it's better to work directly with these social media companies, shame them for the distribution, force them to take it down, and have them be a productive partner in making sure that once images do make it onto these services, they're promptly taken down.

Dave Bittner: It's frustrating because it is simultaneously a very simple, black and white issue, but also complex, right? Like, the right and wrong of this is so crystal clear. But, like, the second and third-order effects of coming at a child -- you know, you have a child who has been wronged, you have a child who has done wrong, but at the same time, you don't want to ruin a child's life, and I put that in air quotes, you know, because of an impulsive, stupid decision they may have made as a teenager. But you also have to be empathetic towards the victim, who can probably make the case, well, my life has been ruined, so, you know, they shouldn't get off scot-free from this.

Ben Yelin: Yeah. I mean, this is why I really like this article and the underlying paper from Riana Pfefferkorn: I think it grapples with those conflicting interests. I think the easy solution is to lock everybody up, like, zero tolerance for the distribution of these images. I think that's satisfying, that's a political soundbite that I think people would embrace. But as she mentions here, it is really complicated because frequently the distributors of these images, while we do not want to shield them from accountability, are themselves also children. And so we have to tread more carefully than we would if it were adults distributing these types of deepfake nude images.

Dave Bittner: All right. Well, we will have a link to that story in the show notes, again, article from Riana Pfefferkorn over on Scientific American. It's a nuanced look at this. So definitely worth your time. [ Music ] Ben, I recently had the pleasure of speaking with Eric Goldstein. He is the outgoing Executive Assistant Director for Cybersecurity at the Cybersecurity and Infrastructure Security Agency, CISA. Basically, this is an exit interview as he is on his way out. We're kind of looking at his time with the agency, some of the accomplishments there, the things that he thinks are yet to be done and his thoughts as he heads over into the private sector. Here is my conversation with Eric Goldstein. So let's talk about your pending departure from CISA here. What made you decide it was time to move on?

Eric Goldstein: You know, I am really a believer, in cybersecurity as in all fields, in both taking time to look at the problem from different perspectives and the need for different viewpoints in leading an organization. And so I had the privilege of a lifetime to lead the cybersecurity team within CISA for three and a half years. I was a day one start for the Biden-Harris administration. And it's the right time for me to pursue that next opportunity, maybe have a bit more time with my family, and of critical importance, make sure that we have a refresh of perspectives for the critical cybersecurity mission that our team at CISA is driving every day.

Dave Bittner: Can you give our listeners a little overview of the arc of your journey here at CISA and the time you've been with the organization?

Eric Goldstein: You know, I think the most exciting evolution at CISA has been our focus on driving measurable risk reduction. You know, it gets so easy sometimes in cybersecurity just to observe the problem, to comment upon the problem. And I think we have seen CISA evolve into an agency that says, we have to be accountable. We have to be accountable for driving improvements in American cybersecurity risk and the American people should look to us to drive that kind of change. And I think some of the investments that we've made across the federal government in driving increased visibility, increased remediation, more quickly than ever before, some of our work in Secure by Design to really drive focus on safer and more secure technology products, and some of our work in really raising the baseline for what we call target-rich, resource-poor entities to make it harder for adversaries to achieve their goals, I think all moves us in that direction of being able to show measurable impact for the trust that the American taxpayer places in us every day.

Dave Bittner: When you look at the challenges that the organization faces going forward here, I mean, what are some of the areas that you think your former colleagues are really going to have to focus on?

Eric Goldstein: You know, one of the blessings and the challenges facing CISA is that it remains a voluntary trust-based organization. And so that means that the agency is able to build deep, trusted partnerships based upon principles of reciprocal value. But it also means that the agency can't make anybody do anything. Now, this changes on the margins next year when our incident reporting regulation comes online. But even then, CISA is going to be a voluntary agency at its core. And so continuing to invest in those collaboration models, those planning efforts that get partners at the table voluntarily with shared goals, really is CISA's superpower, but it is a complicated way to drive change.

Dave Bittner: Speaking of collaboration, I mean, when you look back at the working relationship you had with Director Easterly, are there any highlights from the work that the two of you did along the way?

Eric Goldstein: You know, one of the most exciting areas looking at CISA over the past few years is that even three, four, certainly five years ago, CISA not only wasn't a household name, but also wasn't necessarily even an agency that was well known among critical infrastructure owners and operators across the country. And that's fundamentally changed. And I think a lot of the work that Director Easterly has done in driving CISA to be the cyber defense agency of first resort in this country and making sure that our role is well understood and well differentiated has been essential, because at the end of the day, if organizations, if individuals don't know about CISA, then they're not going to be able to benefit from and collaborate with CISA. And I think our Secure by Design effort really is an example of that. You know, Dave, you and I talked, you know, probably well over a year ago, when we put out our first ever Secure by Design white paper. And then fast forward to the recent RSA conference, where we had, at the time, 68 companies, now over 140 companies, signing on to specific Secure by Design commitments. It really is remarkable. And it's remarkable not just on behalf of CISA, but really for every technology customer that depends on their providers to deliver safe and secure technology. And that wouldn't have been achieved without CISA maturing into the agency that it is today.

Dave Bittner: You know, I think something that we hear from folks in the public sector is that, particularly for cybersecurity, it can be challenging to hang on to good talent, you know, for a variety of reasons. There are so many opportunities out there in the private sector for folks like yourself. As you are moving on, what is your perspective on that particular challenge of organizations like CISA hanging on to some of the people who have the skills that they need?

Eric Goldstein: We have to keep recognizing and focusing on the fact that public service doesn't need to be a lifelong career. And in fact, there is real value in cybersecurity, as I mentioned at the outset, in moving between organizations, between roles, and understanding perspectives from different sectors, different enterprises, from the product side versus the organization side. And so the view that we need to have is to encourage individuals to come into government, do a few years, help your country, help your communities, and then go back out and learn something new and maybe come back in. You know, I am concluding my second tour of duty in federal service, interspersed with roles in the private sector. And that's a model that we have to encourage and endorse, because that's the only way that we are going to get the cycle of talent that we need to really do our mission well.

Dave Bittner: I'd be remiss if I did not ask you, can you give us a sneak preview of where you might be headed next?

Eric Goldstein: I am heading back out to the private sector for a leadership role managing cyber risk of a large organization. I'm looking forward to learning and building on my expertise in that capacity.

Dave Bittner: All right. Well, stay tuned for the specifics. But before I let you go, based on, you know, the information that you've gathered, the wisdom that you've achieved over the course of your time there at CISA, what would your advice be for the person coming in to replace you? Any words of wisdom there?

Eric Goldstein: Really two. The first is, you know, we can do better, and we can live in a safer and more secure technology ecosystem. The world we live in today, where ransomware attacks are ubiquitous, APT intrusions are commonplace, and vulnerabilities are disclosed faster than any enterprise can mitigate them, that doesn't need to be the future. And I think a lot, Dave, about, I have three young kids, and I think a lot about, as they begin to learn about and use technology, as they grow older, as they become adults, what is the environment that we want them to live in? And I think we should aspire to an environment where intrusions are a shocking anomaly and where adversaries don't have the kind of success that they achieve today. And the second one is really that the cybersecurity community is an extraordinary one. And there is no place that better represents the best of that community, the passion, the dedication, the expertise, the excellence, than the federal government and CISA in particular. And so I would just encourage, if I may, your listeners to really keep both considering public service and leaning into that partnership and collaboration with government, because it is the only way that we'll stay ahead of our adversaries.

Dave Bittner: Well, Eric Goldstein is the outgoing Executive Assistant Director for Cybersecurity at CISA. Eric, best wishes as you head off into your new adventures. And I do appreciate all the time that you've spent with me and with our listeners.

Eric Goldstein: Thanks so much, Dave. It's been a pleasure and best of luck to you as well. [ Music ]

Dave Bittner: Ben, what do you think?

Ben Yelin: I'm glad we're on this exit interview circuit here. That's two interviews in a row where we've had members of the administration on their way out give their unvarnished view of the inside, right?

Dave Bittner: Right. Right.

Ben Yelin: But it is really interesting to learn about the great work that's being done at CISA to protect us and to protect our data. And I personally know a lot of great individuals who work there and believe in the mission. CISA has grown a lot, especially over the past decade. So it's just encouraging to hear everything that's going on there.

Dave Bittner: Yeah. And I think it's interesting, too, we kind of touched on this idea of the back and forth between the private sector and public service, that a lot of people are finding that they want to spend some time in an organization like CISA. They want to contribute, they feel a patriotic duty to use their talents to help with this issue. And maybe they're not going to stay there for their whole career, but they're finding value in having that experience, providing their expertise, but then also being able to learn from their colleagues there and then take that back to the private sector. So it's an interesting -- I guess what I'm getting at is that cybersecurity is, in my mind, maybe not uniquely, but certainly highly collaborative between the public and private sectors.

Ben Yelin: It really has to be, because it's a problem that often affects private institutions but has downstream effects that hurt the public. Think of the Colonial Pipeline cyber-attack, which seemingly affected a single company, but it's the local emergency manager who has to deal with the fact that there are gas lines. So it's just a unique area for collaboration. And I think the revolving door in this instance -- that's usually a pejorative term, the revolving door between lobbyists and lawmakers, for example, is seen as kind of a slur -- I think the revolving door here between CISA and the private sector actually provides some benefits.

Dave Bittner: Yeah. All right. Well, our thanks to Eric Goldstein. Again, he is the outgoing Executive Assistant Director for Cybersecurity at CISA. We do appreciate him taking the time for us. And, of course, best wishes to him as he heads off to his next grand adventure. [ Music ] And that is "Caveat," brought to you by N2K CyberWire. We'd love to know what you think of this podcast. Your feedback ensures we deliver the insights that keep you a step ahead in the rapidly changing world of cybersecurity. If you like our show, please share a rating and review in your podcast app. Please also fill out the survey in the show notes or send an email to caveat@n2k.com. We're privileged that N2K CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector, from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies. N2K makes it easy for companies to optimize your biggest investment, your people. We make you smarter about your teams while making your teams smarter. Learn how at n2k.com. This episode is produced by Liz Stokes, our Executive Producer is Jennifer Eiben, the show is mixed by Tre Hester, our Executive Editor is Brandon Karpf, and Peter Kilpe is our publisher. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening. [ Music ]