Caveat 5.4.23
Ep 170 | 5.4.23

Your approach to efficient security compliance.

Transcript

Nick Means: If you put change management in place to check a box on a program, then there's incentive for people to work around it. But if everybody really understands and has bought in that that's the safest way to deliver software that benefits your users, that's a much more effective program. And it's much more palatable to the engineering team because they understand why the controls are in place in the first place.

Dave Bittner: Hello, everyone, and welcome to Caveat, the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hey there, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: Today, Ben has the story of a Pennsylvania case on whether people have a reasonable expectation of privacy in Google searches. I look at the notion of a defensive search exception to the Fourth Amendment. And, later in the show, we're joined by Nick Means from Sym to discuss how building a compliance program doesn't have to tank your engineering velocity. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben. We've got some good stories to share this week. Why don't you start things off for us here.

Ben Yelin: So we're back to the professor Orin Kerr files. I religiously follow his Twitter feed. He alerts me to interesting Fourth Amendment cases, and this one caught my eye.

Dave Bittner: Yeah.

Ben Yelin: It's the Commonwealth of Pennsylvania v. John Edward Kurtz, and it's about whether people have a reasonable expectation of privacy in their Google searches. So the facts of this case are really disturbing. This sounds like an allegedly really, really bad guy. John Edward Kurtz appeals his convictions on a series of rape, kidnapping, attempted rape, and attempted kidnapping charges. And apparently he's a serial kidnapper and rapist. There were five alleged victims that were laid out in the criminal complaint.

Dave Bittner: Wow.

Ben Yelin: In the case that made up this charge, he went to an individual's house while she was sleeping, at a time when he would have known that her husband was working a night shift at a local correctional facility, and sexually assaulted her, dragged her into a field, and attempted to kidnap her. Eventually she got away and was able to call law enforcement.

Dave Bittner: Wow.

Ben Yelin: So the Pennsylvania State Police engaged in a pretty comprehensive investigation. They were able to extract DNA from a sample left in the victim.

Dave Bittner: Yeah.

Ben Yelin: So they had circumstantial evidence that this individual was involved. There was a match for the DNA, but you can't prove that this person committed sexual assault from that alone. So they were looking for other sources of data. So they went to Google and asked if anybody in the immediate vicinity within a certain time period had searched the physical address of this alleged victim. And, sure enough, Mr. Kurtz's IP address was among those who had searched for that location. So they ended up trailing this guy. Turns out he worked at the same correctional facility as this woman's husband. They were able to arrest him. He ended up confessing to the crime because they presented him with all of this very overwhelming evidence. Your classic fruit of the poisonous tree argument here: he wants to suppress the original evidence that led to his conviction, and that is this request from Google, a subpoena to Google for these search records. So the question turns on whether Mr. Kurtz has any Fourth Amendment rights in what he puts into a Google search bar. And that turns on whether he has a reasonable expectation of privacy in his Google searches. Generally, as we've talked about many times, people do not have a reasonable expectation of privacy in information they voluntarily submit to third parties. So when he typed something into Google and pressed that Return button to initiate that search, he was forfeiting his reasonable expectation of privacy because he either knew or should have known that Google was keeping some sort of record of it.

Dave Bittner: Wouldn't that be in the Google EULA as well?

Ben Yelin: It absolutely is. So they went and looked back at the relevant EULA that was in place at the time this crime occurred. I think it was the 2016 edition of the EULA. And it clearly said, in so many words, that you don't have an expectation of privacy vis-à-vis law enforcement, that we reserve the right to provide search requests to law enforcement upon a legal request from a local law enforcement agency.

Dave Bittner: Okay.

Ben Yelin: So, yeah. That was all contained in the EULA, although that presents a bunch of other interesting questions that we can talk about in a moment. So the court held here that you do not have a reasonable expectation of privacy in a Google search. It's just your standard third-party doctrine case. He knowingly volunteered this information to a third party, and Google retains that information. Once it's in Google's hands, you can't have an expectation that the government is not going to get its hands on that data.

Dave Bittner: Right.

Ben Yelin: Now, the argument everyone makes these days is, I'm Carpenter. I'm Mr. Carpenter. Everybody goes back to the Carpenter v. United States case, one instance where the Court said just because information was submitted to a third party doesn't mean that Fourth Amendment protection is canceled in those circumstances. The specific facts of the Carpenter case had to do with historical cell site location information, a guy who committed a bunch of robberies of, ironically, cell phone stores. And it turned out that the government had been tracking his historical cell site location information. In that case, the Court held that, even though he voluntarily submitted that information to the third party, the cell phone company, it wasn't really voluntary because all he did was turn on his cell phone. The cell phone automatically tries to find the nearest tower. Not to mention the fact that the depth and breadth of information you can obtain about somebody given their cell site location information is just so vast that it makes it different than other third-party doctrine cases. So every criminal defendant is going to say, I'm the new Carpenter. This is a case where the depth and breadth of information you collected is so vast that I should be granted some type of Fourth Amendment protection on that information.

Dave Bittner: Okay.

Ben Yelin: And the Court here basically says no. Nice try. You are not Carpenter.

Dave Bittner: Okay.

Ben Yelin: This is one discrete search term. We weren't engaging in any type of historical location tracking. And the voluntariness element is a little clearer here. Instead of just turning on your device, you're actively typing something into the search engine bar and pressing that Return button, meaning you are engaging in that search. So this was only one of the grounds of his appeal, but his conviction was upheld on appeal. So, at least in this Pennsylvania court -- this is the Pennsylvania Supreme Court -- we have a pretty declaratory statement here that, based on this case and other cases, you do not have a reasonable expectation of privacy in what you type into that Google search bar.

Dave Bittner: First of all, I want to have T shirts made up for you that say, You are not Carpenter.

Ben Yelin: I like that as a marketing idea.

Dave Bittner: Sell them in the gift shop at the Center for Health and Homeland Security.

Ben Yelin: I love that idea. And I think I'll just show up in one of those shirts to every single Fourth Amendment case.

Dave Bittner: That's right. That's right.

Ben Yelin: Because I would say in maybe 1 percent of them somebody can make a viable comparison to Carpenter. But, in the rest of them, it's just, I'm Carpenter. Please, I'm Carpenter. No, you're not Carpenter. You're not Carpenter.

Dave Bittner: Was there any -- was there any particular element of this that caught Professor Kerr's eye here that he had insight on?

Ben Yelin: So remember when we talked a few weeks ago about his article on how terms of service shouldn't be a factor in Fourth Amendment investigations? When you are agreeing to that EULA, that has implications for the relationship between you and that private company and your rights to that information vis-à-vis that company, but it shouldn't have any implication for one's Fourth Amendment rights. Part of the justification of the Supreme Court making this decision was that Mr. Kurtz agreed to the EULA. And I think Professor Kerr, though he agrees with the ultimate holding that people do not have a reasonable expectation of privacy in their Google searches, disagrees with that reasoning. He thinks that the fact that you agreed to a EULA is irrelevant to your constitutional expectation of privacy, that it should be an objective test that goes beyond the terms of service.

Dave Bittner: So he's saying the EULA should not be able to waive that fundamental Fourth Amendment right.

Ben Yelin: Right. Exactly. And the inquiry as to whether a person has a reasonable expectation of privacy should not turn on the content of that EULA. It is a separate constitutional inquiry, one that should be based on previous case law, on a court's evaluation of one's subjective expectation of privacy, and on societal standards to determine whether that expectation is reasonable or not. EULAs are just a different ballgame entirely. They define the relationship between the person and the private company, and they should just not be a consideration. And that's generally a perspective I've come to agree with based on Professor Kerr's writing. The other thing that really interested me is I always look at the replies on Twitter.

Dave Bittner: Oh, Ben, Ben, Ben.

Ben Yelin: Somebody asked, would it have mattered if this person engaged in an incognito search, which I think is a really good, natural next question.

Dave Bittner: Yeah.

Ben Yelin: And Professor Kerr said no, it wouldn't matter. I tend to agree with that. Just pulling up an incognito browser window, it says that you can browse privately and that Google will not save your browsing history, cookies and site data, or information entered in forms. But your activity may still be visible to websites you visit, your internet service provider, your employer, or your school.

Dave Bittner: Right.

Ben Yelin: I think that's enough indication that some records are being collected here when you're using an incognito browser. It is not a Get Out of Jail Free card. You know, I think it's useful to try to conceal your search history if you're trying to trick your spouse or your neighbors or, you know, your employer, perhaps, if your employer is not technologically savvy. But it's not something that can be effectively used to evade law enforcement. So you really should not have a reasonable expectation of privacy, even when you use an incognito browser; that's not going to be the determining factor here.

Dave Bittner: Yeah.

Ben Yelin: The fact that you are voluntarily entering something into the search bar, whether it's in a regular Google search or whether you are using an incognito browser, is in and of itself an indication that you have no reasonable expectation of privacy. You're typing it in there. It is on the internet, and that means a lot of people are going to be able to see it. And that's really what this case comes down to.

Dave Bittner: Yeah. There's that old saying, don't put anything on the internet that you wouldn't put on a postcard.

Ben Yelin: Yeah. I mean, I think that's ultimately the determining factor in this case: we know, or at least we should know, that everything we do on the internet creates some type of permanent record, even if we're using an incognito browser. And it's going to some type of third party. Most of these third-party companies are not going to lift a finger to fight for our privacy rights. It's not in their interest. When the Pennsylvania State Police comes to Google and says, Hey, is anybody searching the address of this woman who was brutally raped and kidnapped, they're not going to be like, You know, I'd rather not give that to you.

Dave Bittner: We're going to stand on principle here.

Ben Yelin: Yeah. I'm going to stand on principle and protect this guy who is clearly a serial rapist. Like, that's not something that Google is going to do.

Dave Bittner: Let me ask you this: Did -- do we know, did law enforcement get a warrant for this information, or was it just handed over?

Ben Yelin: So they did end up getting a warrant --

Dave Bittner: Okay.

Ben Yelin: -- which is one of the reasons why the finding in this case that there is no reasonable expectation of privacy in Google searches isn't actually the determining factor. They did end up getting a warrant. It was just one argument that the defendant made. And so the court responded to that argument and said, No, you don't have a reasonable expectation of privacy in your search history. I think there could be other cases where they never obtained a warrant, and then that could be the determining factor.

Dave Bittner: I see.

Ben Yelin: It just happened to be that in this case there were questions about the validity of that warrant. The court separately held that it was a valid warrant supported by probable cause. So this is one of those instances where the defendant on appeal just kind of throws crap at the wall to see what sticks.

Dave Bittner: I was going to say. Right.

Ben Yelin: So you argue the warrant was defective. And maybe, if you don't get lucky on the defective-warrant argument, then you can say, well, the original subpoena to Google violated my Fourth Amendment rights because it was an unreasonable search and seizure; I had a reasonable expectation of privacy in that information. Basically, the Supreme Court of Pennsylvania here is saying no dice on either of those arguments. So, long story short, this person is in prison and is probably going to be in prison for the rest of his life.

Dave Bittner: All right. Well, that's a good ending, I suppose. I'm not going to say happy ending because, obviously, there are all the people that were affected by his horrible actions. But perhaps justice has been done here.

Ben Yelin: I think so. And I think the legal reasoning on it is pretty sound, in my opinion.

Dave Bittner: Yeah, yeah. All right. Well, we will have a link to that story in the show notes. My story this week comes from the Center for Democracy and Technology. It's an article written by Jake Laperruque. I believe I have that right. If I don't, I apologize. And this is about how the FBI is facing some scrutiny regarding how they use FISA to look into some communications of a member of Congress, and they did so without a warrant. So Section 702 of FISA, which comes up here regularly.

Ben Yelin: Sure is. Probably the three-digit number we've said the most.

Dave Bittner: Yeah. I'll let you do the honors here of giving us a little description of what 702 enables.

Ben Yelin: So Section 702 of the FISA Amendments Act of 2008 allows the government to collect information from domestic internet service providers about non-US persons reasonably believed to be outside of the United States. It's a counterterrorism tool. It was first authorized in 2008, reauthorized in 2013, and then again at the beginning of 2018. If you notice a pattern there, it means that we are up for reauthorization at the end of 2023. And there's already a really brutal fight going on about whether that program should be reauthorized.

Dave Bittner: Yeah.

Ben Yelin: On the positive side, it is a very effective counterterrorism tool. Any federal law enforcement official or intelligence official will testify in front of Congress under oath and say that Section 702 has been used to thwart terrorist attacks in the United States. On the negative side, there's this problem of incidental collection. So even though the program is targeted at non-US persons reasonably believed to be outside of the United States, it ends up capturing conversations where one US person is a party in the conversation because they might be talking to an overseas target. And what happens is those conversations go into a database. And, in most circumstances, the government can search that database without a warrant. So the allegation is that this becomes a form of backdoor searches. If you don't have probable cause to search somebody's communications as part of the regular Title III process --

Dave Bittner: Yeah.

Ben Yelin: -- then you would use Section 702 and FISA as a backdoor way of getting that information.

Dave Bittner: Right.

Ben Yelin: So the point of controversy here is that a member of Congress, Darin LaHood of Illinois, was informed that, as part of an investigation as to whether he was the victim of a cyberattack, the National Security Agency and the FBI obtained some of his communications under Section 702. He mentioned this at a hearing where they were talking about the potential for Section 702's renewal this year. And, obviously, this does not reflect well on Section 702's prospects for being renewed in full or even being renewed in any way similar to its current form.

Dave Bittner: Yeah. I want to dig into this notion of defensive searches, which I think is the core of what they're talking about here. I mean, that's -- that's the backdoor, right? That's the end around of the Fourth Amendment.

Ben Yelin: So what the FBI would argue is defensive searches, I don't know that they necessarily use that term.

Dave Bittner: Okay.

Ben Yelin: But these are the types of searches where you're not investigating an American for any type of criminal activity or for involvement in any foreign intelligence operation. Rather, there should be some type of special dispensation when you are trying to defend the interests of an American, and you need to search their communications to defend that interest. So if you suspect that they're a target of foreign espionage and influence operations, or if you suspect that they're going to be the victim of some type of cyberattack, then through a warrantless process the government should be able to obtain those communications. Really, it's not that different than what happens currently under the law. There are a very limited number of cases that require a warrant to search the Section 702 database. Only in cases where there is already a predicated criminal investigation is there a requirement that the government seek a warrant from the FISA court. But I think the FBI's thinking is, through this reauthorization process, Congress is going to try and create some type of rule regulating searches of the Section 702 database. And it might be a broad type of warrant requirement where, for every query of the Section 702 database, you might have to go in front of the FISA court and present a probable cause finding. And what the FBI would say is, well, what if we're trying to protect you? We don't want to have to go through that arduous process.

Dave Bittner: It would slow things down.

Ben Yelin: Yeah. Now, did I roll my eyes a little bit when this article went through the sordid history of FBI surveillance?

Dave Bittner: Well, yeah. That was my next thing here. I mean, the article points out that J. Edgar Hoover monitored Dr. Martin Luther King, Jr. It's been used against the antiwar movement, Black activists, students, left-leaning groups. And after 9/11 they were monitoring Muslim communities. So there's plenty of history here that describes the potential peril.

Ben Yelin: Yeah. There certainly is potential peril. All of that history is very real. I think it's kind of an emotional appeal that's disconnected from the somewhat narrow question about whether there should be different rules for defensive searches of Americans' communications. I think all 702 searches have the potential to lead to abuse. I mean, when you have this database, and you have a relatively large subset of communications that go through this database -- although we recently found out the number of searches of this database has decreased drastically over the past two or three years. Still a lot of communications, but it went from like 3 million down to 120,000.

Dave Bittner: Wow.

Ben Yelin: So, clearly, there's somebody in our federal government who's taking notice of at least the bad optics of the Section 702 searches. But when you have searches like this, there's always going to be the potential for abuse. I think the task for Congress is to figure out a way to make this database useful for foreign intelligence purposes while still protecting the Fourth Amendment rights of Americans. And there are various ways you can strike that balance.

Dave Bittner: Yeah.

Ben Yelin: A member of the Privacy and Civil Liberties Oversight Board has proposed allowing warrantless queries as a general practice but adding heightened protections for queries involving elected officials, members of the media, and religious figures. I think that would be a step in the right direction. But it's kind of a weird, slippery slope where then, you know, why do only those sensitive groups get these heightened protections? Why not political dissidents? You know, why not atheist groups, if we're protecting religious leaders?

Dave Bittner: Yeah, yeah.

Ben Yelin: But I do think Congress, theoretically, at least, has the ability to come up with fair rules here. And they have roughly eight months to try and do it before this law expires. So, yeah.

Dave Bittner: This article, the author of this article, I think their point is that they want a warrant to be required.

Ben Yelin: In all circumstances.

Dave Bittner: Yes. In all circumstances. Is there an in-between here? Is there a possibility for oversight without requiring a warrant, or is that allowing the foxes to guard the henhouse?

Ben Yelin: So one of the reasons that this is a very difficult area of law and policy is that so much of this program is secretive. When we do find out about abuses that exist in this program or overzealous use of the Section 702 database, it comes out in redacted judicial opinions, usually by the FISA court, two or three years after the alleged offense has happened. So it's sort of like when you look at a distant star: you're actually seeing what that star looked like, you know, hundreds or thousands of years ago.

Dave Bittner: Right.

Ben Yelin: When we see these cases, you have a view into Section 702 as it existed several years ago. So it's hard to have that type of real-time oversight. Theoretically, Congress should have the ability to get classified briefings on Section 702 surveillance. That in and of itself is an imperfect system. The best evidence that it's imperfect is that a member of Congress himself has been surveilled as part of a warrantless search. So it's a really difficult problem to solve. I mean, I think the government, to me, has made a compelling case in some of these congressional hearings that a blanket warrant requirement on searches of the Section 702 database would be a major inhibition on intelligence collection and could lead to harms such as terrorist attacks, cyber incidents, espionage, etc. But, you know, these civil liberties groups have been adamant for years that this is a backdoor search, that we need some type of robust warrant protection. There are members of Congress who agree with them, members of both parties who've proposed amendments in the past to try and institute this warrant requirement. So these are two pretty intractable positions. And I know this feels like a cop-out, but I'm curious to see where they land. I don't know what, like, a reasonable middle ground would be.

Dave Bittner: Yeah.

Ben Yelin: So I'm curious, just like everybody else, as to what happens through the legislative process here.

Dave Bittner: Making its way through our dysfunctional Congress.

Ben Yelin: Yeah. I mean, once they figure out how to avoid national default, fund the government, keep the lights on, maybe we can have a nice robust debate into our surveillance practices.

Dave Bittner: All right. We'll look forward to that. All right. Well, we will have a link to that article in the show notes as well. And, of course, we would love to hear from you. If there's something you'd like us to discuss here on the show, you can email us. It's caveat@thecyberwire.com.

Dave Bittner: Ben, I recently had the pleasure of speaking with Nick Means. He's from a company called Sym, and he joins us with insights on how an organization can build a compliance program without tanking their engineering velocity. Here's my conversation with Nick Means.

Nick Means: The only way to be 100 percent compliant, to be 100 percent safe, is to just not ship any software at all. And that's obviously not what anybody wants. And so it's inherently some amount of compromise to put controls in place in a way that still lets your engineering organization ship software at the pace that they want to ship software, to put controls in place that aren't onerous on them and their ability to do their job but still result in the compliance outcomes that you're looking for at the end of the day. And digging into that tension and finding the right way to do that is a really interesting problem that's changed a lot over the years.

Dave Bittner: Is there a most common way that folks have their compliance programs set up these days?

Nick Means: It's becoming more and more common to use an out-of-the-box compliance program, something like Vanta. And there are a few other vendors out there that will sell you this sort of thing, a prepackaged set of controls that walks you through setting up a compliance program, all of the things that you need to put in place for SOC 2 or something like that. That approach is becoming increasingly common for smaller organizations that need to be compliant. And then, for larger organizations, the path still seems to be hire [inaudible] and have them help you put the program in place, design one that fits you and your organization really well.

Dave Bittner: How does the engineering team feel about these sorts of programs being put in place?

Nick Means: I mean, I think it depends on how they're put in place, right? I remember early in my career, the first time I had been in a compliance environment, it was at a health tech startup. We were subject to HIPAA and had a compliance officer who was helping us put a HIPAA program in place and really didn't have a whole lot of empathy for the need to deliver software, was really just focused on the controls. And that's a pretty painful place to be as an engineering team that's trying to deliver software. It seems that approach is less and less common now, which I think is a really good thing. There's a lot more empathy from the folks that are helping engineering teams put these programs in place, a lot more listening, a lot more, Okay, this is the control, this is the objective of the control. How can we put this in place in a way that doesn't slow you down any more than it has to while still giving us the compliance that we're looking for?

Dave Bittner: And how do successful teams go about doing that?

Nick Means: You know, I think the most successful compliance programs I've seen are compliance programs where the engineering team does have a voice in the process. So you have an empathetic person that's in charge of putting compliance in place. They're seeking to involve the engineering team in the process, not come in and layer controls on top or tell them that they have to do things a certain way, but to talk to them and figure out: How do you build software now? What are the controls that we need to put in place? How can we do that in a way that is the least disruptive possible for you as an engineering team? You know, in a lot of cases, that's going to be an automated control of some sort. A good example is change management. That's a component of many compliance programs. You have to manage the change in the software that's going out the door. And most of the time you can achieve that control via your standard software development life cycle, or SDLC, just by doing pull requests, having somebody review the code before it's shipped out the door. Maybe if they're touching a high-risk part of the code base, you need an approval from an executive or somebody higher up to make sure that code is safe. But you put those controls in place in an automated fashion so that there's not somebody running a manual change management program.
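[Editor's note: the automated change-management gate Means describes could be sketched roughly as follows. This is a minimal, hypothetical illustration, not Sym's product or any real CI system; the path prefixes and approver names are invented for the example.]

```python
# Sketch of an automated change-management control: a pull request may
# merge only if someone other than the author approved it, and changes
# touching high-risk paths also need an executive-level approval.
from dataclasses import dataclass, field

HIGH_RISK_PREFIXES = ("payments/", "auth/")  # illustrative high-risk paths
EXEC_APPROVERS = {"cto", "vp-eng"}           # illustrative approver list

@dataclass
class PullRequest:
    author: str
    changed_files: list
    approvals: set = field(default_factory=set)

def may_merge(pr: PullRequest) -> bool:
    # Someone other than the author must have reviewed the change.
    reviewers = pr.approvals - {pr.author}
    if not reviewers:
        return False
    # High-risk changes additionally need an executive-level approval.
    touches_high_risk = any(
        f.startswith(HIGH_RISK_PREFIXES) for f in pr.changed_files
    )
    if touches_high_risk and not (reviewers & EXEC_APPROVERS):
        return False
    return True
```

In practice this kind of rule usually lives in branch-protection settings or a CI check rather than hand-rolled code, but the point stands: the control runs automatically on every change instead of through a manual change-management queue.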

Dave Bittner: Can you give me some insights as to the letter of the law versus the spirit of the law?

Nick Means: You know, I think we hear a lot of complaints from folks that talk about checkbox compliance.

Dave Bittner: And you can understand the need for that, right? Particularly if you have some kind of regulator looking over you, it's great to be able to say, here are the boxes, we checked them all, and so we are in compliance. But I think a lot of folks aspire to more than that. How do you strike that balance?

Nick Means: You know, I think a big part of it is having an understanding that the aim of the compliance program, not the compliance program itself, is your real goal. You want to build secure software. You want to build software that doesn't introduce new vulnerabilities, where risks are managed, where change is controlled. You don't want people SSHing into a production server and changing production code live. You want a change management process of some sort. And getting to the why of that and making sure everybody's bought into the why of the compliance program is the important part. If you put change management in place to check a box on a program, then there's incentive for people to work around it. But if everybody really understands and is bought in that that's the safest way to deliver software that benefits your users, that's a much more effective program. And it's much more palatable to the engineering team because they understand why the controls are in place in the first place.

Dave Bittner: How do organizations make this collaborative and not end up, you know, within an adversarial relationship between the compliance folks and the engineers?

Nick Means: You know, I think it goes back to what I said earlier about giving the folks on the engineering team a voice in the process. Don't just come in and layer controls on top of them. You know, don't tell them, Okay, we're going to put this button here, and somebody has to click this every time you want to deploy software. Come in and ask: Okay, we have to have some affirmative approval that it's safe to deploy this software, that it's been seen and reviewed by more than just the person that wrote it. What is the best way for us to do that for your team in a way that's not disruptive? You know, that's one of the design principles of the product we build at Sym. We've built a product that lets you layer in authorization and approval workflows for accessing production environments and other things. Workflows run in Slack. But one of the key components is that it's all configured in code. So the engineering team that's using these workflows on a day-in, day-out basis to get approval to access production environments or deploy software can go in and look at the code that's setting up these workflows and commit changes to it. So they're able to change the controls that they're subject to.
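[Editor's note: the workflows-as-code pattern Means describes might look something like the sketch below. This is not Sym's actual SDK, just a hypothetical illustration of a peer-approved, time-bounded access grant that engineers could read and modify through normal code review.]

```python
# Sketch of an approval workflow defined in code: access to a sensitive
# target requires a second person's sign-off, and every grant expires so
# engineers don't carry standing access.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class AccessGrant:
    user: str
    target: str
    expires_at: datetime

def request_access(user: str, target: str, approver: str,
                   duration_minutes: int = 60) -> AccessGrant:
    # Self-approval is rejected; someone else must sign off.
    if approver == user:
        raise PermissionError("approver must differ from requester")
    # Grants are time-bounded rather than permanent.
    expiry = datetime.now(timezone.utc) + timedelta(minutes=duration_minutes)
    return AccessGrant(user=user, target=target, expires_at=expiry)
```

Because the rule lives in the repository, tightening it (say, requiring two approvers for production databases) is an ordinary pull request the engineering team can see, discuss, and change.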

Dave Bittner: When it's properly implemented, do the engineering teams see this as a benefit? I mean, can it be anything other than a speed bump for them, or are there positive sides to it?

Nick Means: You know, I think there's a lot of recognition among most engineers nowadays that breaches are more and more common. And one of the most common ways that organizations are getting hacked is by somebody phishing and successfully compromising somebody's user credentials. And so there is some deep incentive there to not carry more access than you need on an ongoing basis. So I think focusing on benefits like that, on being aware of how much access you have and what could happen if your credentials were somehow compromised, is one way that engineers can see this as more than a speed bump. You know, code review is another good example of a thing that checks a bunch of compliance boxes but also helps build better software. And it's a pretty common practice among software engineering teams now. So it's a good example of a thing that's gone from a novel idea 20 years ago to something that most engineering teams do naturally now and see as a beneficial part of their process. So I think it's all about finding ways to achieve compliance in a way that actually benefits the team as often as possible.

Dave Bittner: How about as we move up the organizational chain within an organization, you know, the leadership folks, the folks on the board of directors, what's the most effective way to present this kind of plan to them to let them see the value in it?

Nick Means: So to clarify the question, more on the plan to implement it at the engineering level or just the value of a compliance program overall?

Dave Bittner: Well, I think the compliance program requires an investment of time and money, resources, all those kinds of things. But how do you sell it to your board of directors? We're going to come at it this way; it's not going to be just a checkbox thing, even though that might be cheaper for us, and here's what we're going to get out of coming at it the proper way.

Nick Means: The interesting thing about coming at it from a checkbox approach is that's also often a more painful way to go about it, because you can check those boxes in ways that are pretty obstructive and obtrusive. Whereas if you get to the actual motive behind the compliance program, the reason that the controls are there in the first place, that gives you the opportunity to have those conversations, to take a more empathetic approach with the people that are subject to the controls, and to do it in a way that's not painful to them. And, from that perspective, it becomes a little easier to sell to a board of directors, again, digging into that tension of, you know, we can make ourselves compliant by not shipping any software at all. Or we can ship software very quickly and not worry about compliance. Where we want to land is somewhere in between, and that's going to take some discussion. That's going to take some compromise. We can't just come in and layer a set of checkboxes on top of it.

Dave Bittner: The organizations that you've worked with that are doing this well, who are really hitting it out of the park, are there any common elements, things that they share?

Nick Means: Investment in people, I think, is a big one. Finding the right compliance person, the right security person, who comes in and brings that more empathetic approach, who wants to work alongside the engineering team, who sees themselves as more of a consultant and an ally in helping the organization ship software the right way, versus someone that's just there to come in and achieve compliance aims. That's one of the commonalities I've seen everywhere I thought this was done really well: the engineering team actually wanted the help from this person, because they knew it was helping them produce safer, more secure software, versus sort of reluctantly allowing them to come in and help.

Dave Bittner: Is this an iterative process as well? I mean, as you're going along learning things, is there a feedback loop as to, you know, what worked and what didn't, and how can we be constantly improving?

Nick Means: Yeah, I think there has to be. If every time you go through audit prep there are parts of it that are more painful than others -- when you find a place where it's really difficult to gather the evidence you need to get ready for the audit -- that's a signal that that's a part of the process you should probably spend some time thinking about and iterating on, so that you're not doing as much manual work when the next audit rolls around. Similarly, engineering teams are pretty good at introspecting on a regular basis and figuring out what parts of their process are working and what parts aren't. So the person that's in charge of compliance needs to be ready to take that feedback from the engineering team and help them find a way to fix the parts of the compliance program that are painful, to make it something they're not begrudgingly dealing with.
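[Editor's note: one common way to take the manual pain out of audit-evidence gathering, as described above, is to turn the check into a small script that runs continuously instead of once a year. The sketch below uses made-up record shapes (not any particular audit tool's or code host's format) to flag merged changes that lack an independent review, a typical change-management control.]

```python
# Hypothetical sketch: continuously flag changes that would fail a
# "reviewed by someone other than the author" control, so gaps surface
# long before audit prep. Record shapes are invented for illustration.

def unreviewed_changes(changes):
    """Return IDs of merged changes with no reviewer other than the author."""
    flagged = []
    for change in changes:
        independent = [r for r in change["reviewers"] if r != change["author"]]
        if not independent:
            flagged.append(change["id"])
    return flagged

changes = [
    {"id": 101, "author": "alice", "reviewers": ["bob"]},
    {"id": 102, "author": "bob",   "reviewers": []},          # no review at all
    {"id": 103, "author": "carol", "reviewers": ["carol"]},   # self-review only
]
print(unreviewed_changes(changes))   # [102, 103]
```

In practice the `changes` list would be pulled from the team's code host or deployment system; the point is that the evidence, and any control gaps, are produced automatically rather than assembled by hand before each audit.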

Dave Bittner:  Ben, what do you think?

Ben Yelin: It's a really interesting conversation. I mean, you have to strike a balance between compliance and not trying to stifle innovation in your own company. It's a really difficult balance to strike.

Dave Bittner:  Yeah.

Ben Yelin: So I think, you know, hearing the ways that he works through trying to strike that balance was just really interesting.

Dave Bittner:  Yeah. Again, our thanks to Nick Means from Sym for joining us. We do appreciate him taking the time.

Dave Bittner: That is our show. We want to thank all of you for listening. We'd love to know what you think of this podcast. You can write us an email at cyberwire@n2k.com. Your feedback helps us ensure we're delivering the information and insights that help keep you a step ahead in the rapidly changing world of cybersecurity. We're privileged that N2K and the podcasts like Caveat are part of the daily intelligence routine of many of the most influential leaders and operators in the public and private sector, as well as the critical security teams supporting the Fortune 500 and many of the world's preeminent intelligence and law enforcement agencies. N2K's strategic workforce intelligence optimizes the value of your biggest investment: your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our senior producer is Jennifer Eiben. The show is edited by Elliot Peltzman. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening.