Caveat 12.22.22
Ep 154 | 12.22.22

Diversity, equity, and inclusion in the workplace.

Transcript

Betsy Cooper: So I think there are some trends in the right direction. But I'm not confident yet that we're over the hurdle.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben shares the story of Roombas taking private photos inside people's homes. I look at the possibility of the federal government providing a cyber insurance backstop. And later in the show, my conversation with Betsy Cooper from the Aspen Tech Policy Hub. Our conversation centers around their recent research on diversity, equity and inclusion. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we've got a couple of quick items of follow-up here before we jump into our main stories. I was contacted on Mastodon. I have to say, this is our first listener communication via Mastodon. 

Ben Yelin: It's the future. 

Dave Bittner: (Laughter). 

Ben Yelin: This is where Elon Musk is driving us. 

Dave Bittner: That's right. That's right. So that was kind of exciting. I heard from Eric Wenger, who is Senior Director of Technology Policy and Global Government Affairs at Cisco. Pretty good job, I would say (laughter). 

Ben Yelin: I love when we get contacted by big, important people. 

Dave Bittner: That's right. So... 

Ben Yelin: Not that all of you are not big, important people. 

Dave Bittner: That's right. You're all special in your own way. So Eric wrote in and said, I do have concerns about the exchange with Ben on not being allowed to update OS on regulated medical devices. He says it's just wrong. (Laughter) So... 

Ben Yelin: Fair enough. 

Dave Bittner: This is referring to some feedback we got from a listener on a previous show who said that one of the challenges with medical devices is that the FDA prohibits updating them and that if you update them, you would have to get new certification from the FDA. Eric writes in to say that that is mistaken. And he included a link to the FDA's guidance on this, which we will include in the show notes. I will read just a part of that guidance that Eric highlighted here. It says, (reading) however, the majority of actions taken by manufacturers to address cybersecurity vulnerabilities and exploits - referred to as cybersecurity routine updates and patches - are generally considered to be a type of device enhancement for which the FDA does not require advance notification or reporting under 21 CFR Part 806. So basically, what this is saying is that you can apply patches - specifically security-related patches. And that does not require going back to the FDA for recertification. So... 

Ben Yelin: Good to know. Yeah. Always important that listeners correct us when we're wrong. We were wrong on that one. 

Dave Bittner: (Laughter). 

Ben Yelin: And... 

Dave Bittner: It's perilous when we get to the edges of our own areas of expertise, right (laughter)? 

Ben Yelin: Yes. This is certainly an example of that type of subject. But I also appreciate him giving us the primary source here... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which is the FDA guidance. So I'm glad we're including that in the show notes. Take a look. You'll get more information from that than you would from us. 

Dave Bittner: Yeah. I also want to emphasize that I - this is not to say that our previous listener who wrote in was doing so in bad faith or anything like that. You know, there's - this is complicated stuff, so it's easy to understand how someone could be mistaken or whatever. So we're just happy to get the straight story here and hoping to have Eric Wenger on the show sometime soon to talk about some policy stuff that he's working on with his colleagues there at Cisco. 

Ben Yelin: Fantastic. 

Dave Bittner: Yeah. What else do you have for us, Ben? You had a little quick story before we dive into our main stories? 

Ben Yelin: Yeah. So while we were sleeping last night - as we're recording this show - Congress released its year-end omnibus spending bill. And as Congress generally does, they tucked in a bunch of unrelated non-spending provisions 'cause this is the end of the Congress. You kind of - this is the train that's leaving the station. You want to get your luggage on the train. So one of those items is a ban on federal government employees using TikTok on their government-issued devices. And this is becoming a trend. We've seen it happen at the state level as these security risks have been highlighted in news reports about TikTok's close ties to the Chinese government. And now we're seeing it happen at the federal level. So presumably, this is a careful deal that's been worked out among House - Senate Republicans, Senate Democrats and House Democrats. House Republicans are not happy about this deal. 

Dave Bittner: Is that right? 

Ben Yelin: But it does mean that this is almost certainly going to pass and get signed by President Biden. So starting very shortly, if you are a federal government employee or contractor - I believe it applies to contractors - you will not be able to have TikTok on your government-issued device. So that's pretty big news coming out of Congress. 

Dave Bittner: Is that the kind of thing that could trigger a set of dominoes to fall? 

Ben Yelin: Yeah, I think so. I mean, really, Congress is trying to use the power where they have it. I think it would be overreach, and, frankly, probably not within Congress' powers to do a type of nationwide ban on private individuals putting - getting TikToks - TikTok on their devices. I also think that would be politically perilous. So I think they're kind of going for the more narrow ground of government-issued devices. It's where they have very clear jurisdiction and where they can have the most immediate impact. 

Dave Bittner: All right. Interesting. I will keep an eye on that one for sure. Well, let's jump into our main stories this week. Why don't you start things off for us? 

Ben Yelin: Well, it's the holiday season. And our gift to you today is we're going to talk about Roombas. 

Dave Bittner: (Laughter). 

Ben Yelin: But unfortunately, the story is a little darker than most stories about Roombas. I know Roombas. I don't have one, but... 

Dave Bittner: I do. 

Ben Yelin: All right. So you are a proud Roomba owner. 

Dave Bittner: I have a - it's not actually a Roomba brand robot vacuum cleaner. But I do have - let me preface this by saying I have a shedding dog. So... 

Ben Yelin: Gotcha. 

Dave Bittner: ...That is what motivated us to get our Roomba-esque robot vacuum cleaner. And it does its job diligently and, you know, makes it so that - it spreads out the amount of time in between when we have to do manual vacuuming of the main floor of our house. 

Ben Yelin: Shedding dogs, especially in the spring months when they lose that winter coat, that's going to get you. So I mostly knew Roomba from great YouTube videos of cats riding on top of Roombas, which, you know, that's going back decades now. Which is why when I saw this story come across my timeline, it was so interesting. It's from Technology Review, and it's entitled "A Roomba recorded a woman on the toilet. How did screenshots end up on Facebook?" And this is really a story about the chain of custody of images taken by Roombas inside people's houses. Now, some caveats here, so to speak. For one, these are not ordinary Roomba customers. These are people who specifically agreed to have Roombas with cameras that are taking pictures to help improve artificial intelligence. So you sign a waiver that says you can have this Roomba. It is going to be taking photos inside your house. When that green light is on, that means the camera is up and running. 

Ben Yelin: But I think the terms of service here, the thing that these customers signed, doesn't reveal the full extent of what happens with these images. So these images get sent to a contractor called Scale AI. It is a San Francisco-based company that has employees all over the world. And their job is to scan images, classify images to improve artificial intelligence. And so they send images to a bunch of data centers all around the world. They have some in South America, India, different countries. And these are live human beings who get access to these pictures and make certain determinations about the pictures. So, for instance, this object that you've encountered that we've never seen on previous footage from our Roombas is an object. And so we should train our artificial intelligence to go around that object. 

Dave Bittner: Right. That's a volleyball or a pile of dog poop. 

Ben Yelin: Exactly. Exactly. Let's learn what to do with that. 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: The problem is - I think the implication of the terms of service was that robots or nonhuman beings were going to be the ones surveying these images when they were sent to Scale AI. But it's actually real human beings. And in this case, real human beings leaked or shared these images. And they ended up on Facebook, including a couple of rather intimate ones - the one mentioned in the headline about a woman on the toilet. There's one with a young boy who was sprawled across the floor. They are grainy shots. They're not clear photos. But it certainly raises concerns about the privacy risks involved with Roombas here. 

Ben Yelin: So what are the broader implications? Because obviously, the story itself only affects a small subset of people who are part of this trial run of having cameras on our devices. 

Dave Bittner: Right. 

Ben Yelin: For one, there are a lot of home devices now that have cameras, including things like Ring devices, which is owned by the parent company Amazon, which is also the parent company of iRobot, which makes the Roomba - or at least is in the process of acquiring it. So there's certainly that risk that if we have these practices of misleading customers into collecting data that's going to be sent to actual human beings, sometimes very private images, then we're going to be going down a pretty slippery slope. And there's the fact that the legal system offers very little protection for this. This is where you really start to notice that the federal government doesn't have any comprehensive data privacy laws. We have this patchwork of laws, many of which would not apply to things like a Roomba taking a picture of you on the toilet. It's something that HIPAA wouldn't cover and something that FERPA wouldn't cover. So oftentimes these terms of service are based on state laws like the CCPA or are modeled after laws in Europe like GDPR. And that just doesn't offer the type of full legal protection that people might need to prevent these images from being spread. And then there's just the question of chain of custody - how many of these companies farm out images to contractors around the world who are actual human beings who would have a chance to share these photographs? So I think that's really why this article is potentially concerning, even though, you know, Roombas themselves are kind of silly. And I enjoy talking about them. But I think this certainly was an eye-opening story. 

Dave Bittner: I do remember folks having concerns when Amazon was going to purchase iRobot because the Roombas - some of the more advanced Roombas, they map out your home. And they create a model of your home, which helps them do the cleaning that they do. But the concern was now Amazon has a model of your home. 

Ben Yelin: Right. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: Not only does Amazon have it, but it's like, we think that a crime might have been committed. Where can we go get a good model to figure out, you know - to gather some evidence of what this home actually looks like and where we should, you know, search? 

Dave Bittner: Yeah. 

Ben Yelin: All right. Let's go to Amazon and get some images from the Roomba device. 

Dave Bittner: This also reminds me of - years ago, I recall a story about - I want to say it was one of the companies like Expensify - and I don't know if it was actually Expensify - but one of those companies who helps you keep track of your corporate expenses. And one of the ways you do that is you send them a photo of your receipts, and the receipts get scanned, and they get automatically entered, and it makes everything easier for everybody. And the company, who may or may not have been Expensify - again, I don't remember. 

Ben Yelin: Don't sue us, Expensify. 

Dave Bittner: Exactly. They were sending these images off to the Mechanical Turk service, which - I can't remember who runs Mechanical Turk. But it's a service where, very inexpensively, you can have things sent off to people. 

Ben Yelin: Right. 

Dave Bittner: And the people actually do the work. And what you get back is however you wanted that image or data or whatever processed. And, you know, it gets sent off to folks who are in a part of the world where people are working for very little money. And so you can get the things you need done done very inexpensively. 

Ben Yelin: Right. 

Dave Bittner: But you can see the concern here, that you're sending your receipts off to what you think is an automated scanning device, but it's actually being entered by a real, live human being. 

Ben Yelin: Right. 

Dave Bittner: And just like your Roomba story, there are concerns about the chain of custody and so on and so forth. 

Ben Yelin: And it just - the terms of service, at least in this example, just don't make clear that it's human beings who are going to review the images. The language is something like, it gets sent off to be processed. And so I think most of us would picture that going into some, you know, machine learning algorithm, where a computer collects a bunch of images and tries to make improvements on the product. I just think it becomes a little starker when you realize that these are pictures that actual human beings are processing, and they've come from our houses. The other thing is, you know, iRobot claims they've really tried to anonymize the data. Even in these images, the people who have been identified - their faces are blacked out. 

Dave Bittner: But not their butts on the toilet (laughter). 

Ben Yelin: Not the butts on the toilet, yeah. Good thing we don't yet have butt-scanning technology... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...That can identify us in public. 

Dave Bittner: Well, you could have an identifying tattoo or something like that. 

Ben Yelin: Exactly, exactly. Coming to a "Law & Order" episode near you. 

Dave Bittner: (Laughter) Right. Yes. Right. 

Ben Yelin: So, you know, they do make efforts to preserve privacy. But you are going to end up in some situation where, either due to human error or just the inability to filter all of these images, somebody's face is going to get released. And that's very personal data that can be used for nefarious purposes - facial recognition, etc. And I just don't think that people who agreed to this type of arrangement where Roomba is taking photos - I don't think they agreed - or meaningfully agreed - to that type of collection. So I think Roomba was kind of caught flat-footed here. They have said that they are going to discontinue their relationship with this particular contractor. 

Dave Bittner: You think (laughter)? 

Ben Yelin: It's sort of like - yeah, you know, it's sort of like, you know... 

Dave Bittner: How to get fired fast. 

Ben Yelin: Yeah. And it's kind of like, soda is making me fat. I'm going to quit Coca-Cola. It's like - but I might use Pepsi. It's like... 

Dave Bittner: So the story of this ending up on Facebook - was that basically - was that an employee doing it just for the LOLs? 

Ben Yelin: Yes. 

Dave Bittner: OK. 

Ben Yelin: That seems to be exactly what happened. 

Dave Bittner: OK. 

Ben Yelin: So some human being got ahold of the image. And I believe what happened is it was one of those, hey, look at this. 

Dave Bittner: Right. 

Ben Yelin: Like, look what I came across in my line of work. And then that ended up being leaked to - or somebody took a screenshot on Facebook and sent it to media sources. 

Dave Bittner: I see. 

Ben Yelin: Yup. So, you know, I think they are going to use a different contractor, but even - they interviewed a contractor from a different company, and I think - one of the things that the CEO said is that human beings often share these photos with one another for legitimate purposes. You know, they'll say in their contracts, do not share these photos at all. But, like, hey, Bob, I'm trying to process this photo. I can't figure out what this is. 

Dave Bittner: Oh. 

Ben Yelin: Here's an email. Like, that happens. They - people ask each other for help. And that's - I mean, that's very hard to control internally. So watch out for those Roombas. 

Dave Bittner: (Laughter) Now, now, let's not begrudge the poor Roomba who's just doing their day-to-day work, the unglamorous work of keeping our homes clear of pet hair and dust and things like that. Those poor... 

Ben Yelin: That's true. They're so cute, too. 

Dave Bittner: Those little robot - they are cute. Yes. 

Ben Yelin: I know. 

Dave Bittner: A friend of mine - my - the first friend I had who got a Roomba when they were new - I said, so how's the Roomba? He said, I love it. I said, so does it save you time? He said, absolutely not. I said, why? He said, 'cause I spend all my time watching the Roomba (laughter). 

Ben Yelin: Right. Exactly. At least you don't have to physically vacuum. But, like... 

Dave Bittner: Right. Right. 

Ben Yelin: ...I've seen them in houses. They are mesmerizing. 

Dave Bittner: Yeah. 

Ben Yelin: And, like, their ability to find the mess in a given room is really impressive. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, as somebody who frequently glosses over where all the crumbs are in my own living room, you know, I'm glad we have this level of artificial intelligence. I just think hopefully, from a chain of custody perspective, we don't end up in a situation where these images are being sent to human beings who might share them with one another. 

Dave Bittner: Yeah. We've put googly eyes on our Roomba. 

Ben Yelin: Oh, that's such a good idea. 

Dave Bittner: And we're very - and it's very sad because sometimes the Roomba gets stuck under something and you feel - and I just... 

Ben Yelin: You just see those little fish eyes going (imitating sad sound). 

Dave Bittner: Yeah. Right. Or, like, one eye is poking out saying, why me? And so you bring it out and you say, oh, poor Roomba. Let me take you back to your charger. 

Ben Yelin: It's - a great prank you can play on your significant other is you can buy those googly eyes at a dollar store and just put them on everything... 

Dave Bittner: Oh, yeah. 

Ben Yelin: ...Including in your refrigerator and - low-stakes prank. So just a little advice from the two of us. 

Dave Bittner: Yeah. Yeah. Still on your first marriage, aren't you, Ben? 

Ben Yelin: So far, yes. We'll see what happens after I put googly eyes on everything in my house. 

Dave Bittner: That's right. That's right. All right. Good stuff. And we will have a link to that story in the show notes for sure. My story this week comes from the folks over at Bloomberg Law. And this is about - this is written by Daphne Zhang, and it's titled "As Cyber Insurance Dries Up, Treasury Department Eyes a Backstop." Now, Ben, I'm going to go and say I called it. 

Ben Yelin: Ding, ding, ding, ding, ding, ding, ding - Dave was right. 

Dave Bittner: I have been beating this drum - or I guess I have been asking this question - probably as long as we've been doing this show: are we going to come to the point with cyber insurance where it ends up like flood insurance, where it's a sucker bet for the insurance companies? They get out of the business because nobody can make money doing it, and it becomes only the federal government who provides cyber insurance. 

Ben Yelin: And the answer to your question is yes, according to this article. 

Dave Bittner: There's - well, yes, maybe. 

Ben Yelin: Right. At least they're looking into it. 

Dave Bittner: So - that's right. That's right. So the Treasury Department is asking for - or was asking for public comment on whether the government needs to shore up the insurance industry because we've seen losses, of course, have gone through the roof for these insurance companies. Big insurers like Lloyd's of London, the usual suspects who will insure anything if the price is right are saying, yeah, this cyber... 

Ben Yelin: (Imitating horn) Yeah. 

Dave Bittner: Right. Cyber - the losses are too big. Like, they are catastrophic losses on par with things like natural disasters and fires and floods and those sorts of things. So they're inquiring as to whether uninsurable losses should be backed up by the federal government. And I don't mean to beat my chest here, but it didn't take a genius to see that this could be the way it was going. It's pretty obvious. 

Ben Yelin: Right. I mean, we've been seeing for a while that these insurance companies are losing, you know... 

Dave Bittner: Yeah. 

Ben Yelin: Their losses are reaching 300-, 400%. It's a model that was increasingly unsustainable. Because there are so many cyberattacks, the underwriting process can't keep up with the level of risk that exists in the cyber realm. And then you get into this death spiral where insurance companies pull out of the market because it's too expensive, and then all the risk gets passed on to consumers, and that's not good for anyone. 

Dave Bittner: Yeah. 

Ben Yelin: So I think the idea here, as we've done with flood insurance, is to pass off some of the risk to the government. And it's not the only - flood insurance isn't the only example of this. I mean, even something like the FDIC, it's federal - it's the federal government backstopping our financial system in case... 

Dave Bittner: Right. 

Ben Yelin: ...The private sector fails. But it's not a great solution, as we've seen with flood insurance. 

Dave Bittner: No. Flood insurance is crappy insurance. I mean... 

Ben Yelin: Yeah. It's terrible. 

Dave Bittner: Yeah. Yeah. 

Ben Yelin: The system has - does not work well, and people who've relied on it after major flooding events always end up disappointed. 

Dave Bittner: Yeah. But the flip side of that - does it incentivize people to not build in flood zones, knowing that they can't get good insurance? 

Ben Yelin: I don't know. I mean, do we have any reliable data on that? Not that I've seen. 

Dave Bittner: Yeah. 

Ben Yelin: I think people just don't make decisions with that precise a level of risk calculation. 

Dave Bittner: Yeah. 

Ben Yelin: I mean... 

Dave Bittner: People love the water. 

Ben Yelin: Yeah. Everybody's moving to Florida, so they're - seemingly, that risk is not being priced in. 

Dave Bittner: Yeah. 

Ben Yelin: I think this is sort of a carrot-and-stick approach that the government would be taking, where we'll backstop your insurance policies, but you have to comply with certain federally mandated cybersecurity standards... 

Dave Bittner: Right. 

Ben Yelin: ...So whether that's the NIST framework or whatever they come up with in the years ahead. And I think the private sector is saying, well, you know, that might be overly restrictive. Companies should be given the freedom to offer policies on their own terms. Let us underwrite it. We'll have more innovation in the industry. You know, don't slap a bunch of mandates on it... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which is all well and good until they're losing so much money that they drop out of the market, and consumers are left footing the bill. So there's no easy answer here. I think it's good that the federal government is looking into this. It looks like the comment period has ended. 

Dave Bittner: Yeah. 

Ben Yelin: But I'm very curious to see what comes of this. 

Dave Bittner: And this article points out that if we do get a national cyber insurance policy, that it may very well limit coverage to things that qualify as critical infrastructure... 

Ben Yelin: Right. 

Dave Bittner: ...Which makes sense. But you could see a whole lot of people scrambling to say, we're critical infrastructure. We're critical. 

Ben Yelin: Yeah, it's going to be fun for... 

Dave Bittner: Please, please. 

Ben Yelin: ...The lobbyists. Yeah. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: My toy store in the mall is critical infrastructure, OK? 

Dave Bittner: (Laughter) That's right, especially this time of year. 

Ben Yelin: Exactly. Yeah. That is going to be somewhat limiting. And that also doesn't - I mean, there are companies that clearly don't qualify as critical infrastructure. But they're still things that people rely on. They're still businesses that help power our economy. And we don't want to get into a situation where there is no option for cyber insurance for those types of companies. So that would be my significant concern here. But the current system's unsustainable. The premiums have - one of the stats that jumped out at me here is that cyber premiums jumped 95% in 2021. 

Dave Bittner: Yeah. 

Ben Yelin: That's just commensurate with the increased level of risk. They're just - the underwriters aren't able to process the risk quickly enough. And so even despite these skyrocketing premiums, they're still losing money. So something has to be done. Whether it's this solution or another solution, the status quo is unsustainable. 

Dave Bittner: One - if you'll allow me a bit of speculative snark here, which is one of my specialties, this article points out that more and more cyber insurance policies are excluding attacks from nation-state actors, right? 

Ben Yelin: Right. 

Dave Bittner: And we have certainly witnessed that one of the common responses when an organization is attacked is for them to say we were attacked by sophisticated nation-state actors. What could we do? They throw up their hands. 

Ben Yelin: Right. 

Dave Bittner: You know, there's nothing we could have done here. I wonder how these two things are going to intersect. Are we going - are - so in order to get their insurance coverage, are we going to see organizations saying we were attacked by script kiddies who barely knew what they were doing (laughter). It's clear - this was clearly someone from down the street working in their parents' basement. 

Ben Yelin: Maybe it just depends on who's listening. You go - for insurance purposes, you say, we were attacked by the kid in his basement. But for public relations purposes, you say we were attacked by a sophisticated nation-state. I just hope, you know, people don't see the two statements together at any given time. 

Dave Bittner: Yeah, yeah. All right. Well, we will have a link to this story in our show notes. And, of course, we would love to hear from you. Our email address is caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Betsy Cooper from the Aspen Tech Policy Hub. Our conversation centered around their recent research on diversity, equity and inclusion. Here's my conversation with Betsy Cooper. 

Betsy Cooper: First, as an organization, the Aspen Tech Policy Hub and the umbrella we're a part of - Aspen Digital - care deeply about diversity, equity, and inclusion issues and had begun, of our own accord, incorporating certain techniques to try to encourage that in our own hiring and internal processes. So, for instance, we have rules about how we pay speakers to ensure that everybody is treated equitably and that we're not taking away their time for free. Or we will incorporate certain processes like anonymous review when we're hiring people. And so we got to talking and thinking about, well, there must be other best practices to encourage diversity, equity and inclusion. And we're, you know, just this one little organization. Maybe we should get a bunch of smart people together and see what they think would matter and how they would care about those issues. 

Betsy Cooper: And then the second motivating factor was the Hewlett Foundation. So at the time, the program officer, Eli Sugarman, also cared deeply about these issues. And so we got into a chat with him about this. And he said, you know, it'd be great to do a couple roundtables with really smart people - think, you know, high-level executives who might not be thinking about this, as well as people in the field who come from diverse backgrounds - and talk about their experiences, so we can see if we can come up with some simple tips that people can take forward or some ideas that funders can actually invest in for the future. And we said, sign us up. And so the project was born. 

Dave Bittner: Well, let's go through some of the key findings here. I mean, what are some of the things that rose to the top for you? 

Betsy Cooper: Well, so first, we have a lot of work to do. I think, you know, some of the data is incredibly stark. I mean, that only 4% of workers self-identify as Hispanic, 24% as women - like, these are not good numbers, not representative of our community. And then a lot of the stories that you hear are also people who enter the field and then depart because they don't feel comfortable in that space. So I think, you know, I'd hoped it would be a bit more of a happy ending story. But I think the lesson really was that there's a lot more work to do. 

Betsy Cooper: I think we came up with some interesting ideas for steps - sort of bigger-picture steps that could be taken - so things like organizing a coalition to think about certifications for cybersecurity jobs, collecting and sharing anonymous data among organizations, especially private sector organizations, to share characteristics that can give you an idea about successful hiring for cybersecurity jobs. So, you know, if you're hiring people who don't have a traditional background in cybersecurity, what are you seeing that is leading them to be successful? And can we share that information among ourselves? Another one especially applicable in the federal space - whether the current criminal background check process is working. A lot of candidates may be targeted by police unfairly, may have some form of a record, and that can eliminate them from all sorts of jobs, federal and sometimes private sector as well. So these were the sorts of ideas - trying to develop coalitions to actually get information sharing happening so that we can make progress on these issues. 

Dave Bittner: You know, I hear a lot of folks, when talking about this, point out - and I think, you know, perhaps trying to find some good news - that the numbers may be trending in the right direction, that more women are being hired and that some of these numbers are improving. Does that track with the sort of things that you gathered here or not? 

Betsy Cooper: So we didn't run trend lines. So I can't say for sure. I do know that on women, the numbers are better than they were years ago because I did a report, I think, back in 2017. So the trend line, at least in female hiring, is better. I also am seeing more companies at least giving lip service to the idea that they care about these issues, and that's obviously not enough. But that's at least an improvement from years ago in which there was sort of a coded impression that actually hiring someone who is different from the rest of the team might actually harm the team's chemistry in some way and lead to negative results. So at least we're not hearing that sort of explicit bias come out on a regular basis anymore. 

Betsy Cooper: But I do think that on the other hand, we're seeing that, you know, the cyber talent pipeline is incredibly under-resourced to begin with. Craig Newmark has been doing a lot of interesting work in this space. And then when you add the diversity, equity and inclusion layer on top of it, it gets even harder. If you don't have enough people to begin with, and now you're trying to grow the field to represent more of what America looks like rather than the existing cybersecurity field, it's a real challenge. So I think there are some trends in the right direction, but I'm not confident yet that we're over the hurdle. 

Dave Bittner: I know you're tracking some real-world trends and things that are going on, I believe, in the government space. Can you share with us what you're seeing there? 

Betsy Cooper: Yeah. So I think there have been some interesting innovations happening in government, and then some actual pushback on those, which I think can give you some color as to exactly what we're facing. So the Department of Homeland Security - I believe it was last year - came up with an interesting program. I believe I saw that they invested seven years researching this, so long after I left the department. They invested a ton of time into coming up with a new personnel management system for cybersecurity - essentially new hiring authorities and the ability to pay closer - not up to, but closer - to market rates so that they would be more competitive. And this sounds, on its face, like a huge victory. You know, DHS is an incredibly important cybersecurity partner, and the fact that they can hire more people seems like it should be a good thing. But apparently, the director of the Office of Personnel Management has been getting a lot of pushback from other agencies about this being allowed. The other agencies are saying that now DHS is able to hire away not just private sector people, but also federal employees doing cybersecurity at other agencies, because now DHS can pay more. 

Betsy Cooper: So it's one of those holistic questions. You know, I support DHS's opportunity to do better in this space. And I do think that, given that they house CISA, there should be some priority given to the work that they're doing. But sometimes, because government operates in silos, there's nobody in a position to see the overarching picture and to predict the pushback that might happen, or the effects on other agencies, if the salaries at DHS are disproportionate to others. So that's one key reason why the coalitions I was mentioning earlier are so important: you need both a variety of stakeholders and folks from outside - you know, like some of my Aspen Institute colleagues - to come in, see these opportunities and maybe give you a little bit of red teaming on these ideas, to help identify some of the challenges that might come up before they happen rather than, as in this case with OPM, after the fact. 

Dave Bittner: How about some actionable items? I mean, based on the information that you all have gathered, if I'm part of a hiring team at my organization, how do I do my part to make sure that we're including people who deserve a look? 

Betsy Cooper: Yeah, so some of the things that we do are really important. So first, anonymous review of job applications. Look at the cover letter before you look at the resume. Or sometimes we use tests that candidates complete anonymously, so that you're able to judge the quality of the work itself. We are human. If you see someone with a particular academic background that speaks to you, or a specific set of credentials, you might read the rest of that application unthinkingly, with an expectation of whether that person is strong or not. We do this in our own Aspen Tech Policy Hub fellowships, for instance. And it's amazing how often, in the anonymous review, I will pick the community college grad over the Harvard grad - but it might come out the other way if I had read the resume first. 

Betsy Cooper: Then second, I think it's really important to keep job postings open until a diverse slate of candidates has applied. So don't jump right to the first applicants that you get. You have to look at your marketing strategies. Are you marketing only to elite universities or other places that do not generate diverse talent? If so, you might want to reconsider that. At the interview stage, we incorporate what we call the modified Rooney Rule - sort of borrowing from the NFL. And this has flaws, so I want to state upfront that this is not everything. But we do always ensure that we have at least one person who does not identify as male and one person of color interviewed for every position. And I think the folks that we hire and the folks that are in our fellowship reflect that really well. 

Betsy Cooper: And then finally, you have to be willing to have conversations around these things. One of the easiest places to do that is on referrals. How often do you get a candidate that somebody emails you about, and you give them an exceptional look or maybe bump them through because your friend asked you to look at them carefully? So we don't take referrals. The only time we will look at a referral is as a tiebreaker once that person is already at the final stage of hiring - if we have two candidates and they're otherwise identical, we might look at a referral at that stage. But we don't let referrals affect the early review of our candidates. So having conversations around DEI - around some of the practices you might be deploying that are leading you to get candidates who always look the same - some of these tips hopefully can lead you to do a little bit better, and then to have a conversation around taking even more serious steps to encourage DEI within your company. 

Betsy Cooper: And then the last thing I want to say is that it's not just about the supply of candidates. It's also about the demand - the demand from your company to want that diversity in your organization. So one reason that companies end up losing people is that those people don't feel valued once they're in the organization. They feel like they were hired to tick a box and not actually valued for their skill set. So that really involves having, you know, diversity, equity and inclusion built into all sorts of things. What trainings are you providing? In your all hands, are you talking about these questions? Are you taking on difficult issues when they arise, or are you suppressing them? So creating that environment will lead you to have success in retaining the people you do hire. And then hopefully that will lead to more diverse candidates being interested in joining you. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: This was really music to my ears. I mean, I've tried to make a career out of this... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Where you bring together policy experts and cyber practitioners in the same room so that we can just talk to one another. I mean, I think we're so often blind to the issues on the other side of the ledger. So getting this type of group, where practitioners can learn not only about public policy but also about best practices in private industry, is really going to be beneficial. And I think we should really go the other way, too - there should be some type of group out there for us policy nerds to get a 101 in the technical issues of cybersecurity. You know, at my institution, the University of Maryland School of Law, we've tried to do that by having a cyber boot camp course, where people learn about cybersecurity in a managed one-week course - kind of a crash course in the technological issues. But that's why this was kind of an encouraging interview - to hear that there's a broader effort out there to bring these two communities together. 

Dave Bittner: Yeah. Well, our thanks to Betsy Cooper from the Aspen Tech Policy Hub for taking the time and sharing her expertise with us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.