Caveat 11.4.20
Ep 53 | 11.4.20

IoT device risk: can legislation keep up?

Transcript

Curtis Simpson: These things are everywhere. They're connected to everything. We don't understand them well. And in some cases, even manufacturers don't really understand what they're selling to the public in terms of risk.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, I wonder if Facebook's Mark Zuckerberg isn't asking Congress to throw him in the briar patch. Ben looks at the massive amount of data politicians gather on voters. And later in the show, my conversation with Curtis Simpson. He's CISO at Armis, and he shares his thoughts on some recent IoT legislative updates. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, before we kick things off here, just a little reminder to our listeners that timing-wise, it's a little bit odd because we record our shows a few days ahead of time, which means that we're recording this before the election, but it's actually going to come out the day after the election. So if people are wondering, why are they not talking about the election, it's because in this strange little time warp that is podcasting, the election has not happened yet for us, although it has happened for everyone else. 

Ben Yelin: Yeah. So to our future selves, congratulations on your candidate winning, or my condolences on your candidate losing. We should... 

Dave Bittner: Right. 

Ben Yelin: We should fulfill all of our prerecorded messages for... 

Dave Bittner: There you go (laughter). 

Ben Yelin: ...Election-related content. 

Dave Bittner: Yeah, absolutely, absolutely. All right, well, let's get to our stories. Ben, why don't you start things off for us? 

Ben Yelin: So on the subject of elections, we are in the last stages of the election season. And one thing you might notice if you've ever given any information to either a political candidate or a private company is that politicians know how to contact you and, more importantly, they know how to microtarget you. So they know your interests. They know your spending habits. They know your demographic information. 

Ben Yelin: And there's a really interesting article in The Washington Post's Technology section by Geoffrey Fowler, a technology columnist, who tried to dig into exactly how much these political campaigns know and how they get access to that information. And it turns out that they know a lot. 

Ben Yelin: So there are a bunch of different ways that politicians can learn things about you. There is publicly available information in state voter files. So when you register to vote, you put in your address, your phone number, your email address. In most states, you put in your political party affiliation. Those aren't necessarily public, but they're widely available to researchers, political analysts, et cetera. 

Ben Yelin: There are also commercial voter files. So there are, like, professional political data brokers who combine voter file information with various, quote, "enhancements." So... 

Dave Bittner: (Laughter). 

Ben Yelin: ...These are things - yeah. I mean, it's very Orwellian. 

Dave Bittner: (Laughter). 

Ben Yelin: So these are the types of things that you would find in somebody's credit report, their subscriptions, their spending habits, bank accounts, what information they've given to various online applications. And, you know, they can really build quite a detailed profile of you. There are a couple of very large political data firms who are so good at what they do that they can predict with a very high degree of accuracy not only whether you are likely to vote, but whether you are likely to be a political donor. 

Ben Yelin: So the result is political campaigns may know more about us than we know about ourselves. 

Dave Bittner: (Laughter). 

Ben Yelin: According to both political parties, there are about 3,000 data points on every voter. So they know about 3,000 distinct pieces of information about me and you. Now, sometimes they're wrong. This particular author talked about how when he sought information on his own voter profile, they said he was into, like, interior decorating or something. And he's like, I am not into interior decorating. 

Dave Bittner: (Laughter). 

Ben Yelin: But, you know, they do have a lot of information about you. 

Ben Yelin: A very interesting legal angle to the story is that Mr. Fowler used the California Consumer Privacy Act - CCPA - to get access to this information. So, through the CCPA, you can request that any private company reveal what it knows about you. And you can put in these requests and, because of this law, which just went into effect at the beginning of 2020, you can get a little bit of transparency on what information is out there. 

Ben Yelin: But the success of using CCPA is limited. It doesn't apply when you're seeking information from public entities. That's a very big exception. And, you know, sometimes you don't know what you don't know. So you have to request every distinct piece of data from every distinct company that might have information on you. And if you don't know, you know, how those 3,000 data points are collected, it's going to be impossible to get a full picture of what political campaigns know about you. 

Ben Yelin: As he said in this article, politicians tell us that they care about privacy, but certainly, the way their campaigns act, they are anything but interested in privacy. They're interested in building an incredibly detailed profile of us so that they can have accurate information as to how likely we are to vote, how likely we are to vote for their preferred candidate, et cetera, et cetera. So I just thought it was a really interesting window into this world of political data. 

Dave Bittner: Yeah, it really is. And it leaves me scratching my head, you know, 'cause I suppose one person's influence is another person's manipulation. Right? 

Ben Yelin: Yes. 

Dave Bittner: Where's that line? How do you know if you've crossed it? I mean, I guess politics is very much all about getting emotional responses and pressing people's buttons. So I guess - should we - I mean, is it all fair play then? 

Ben Yelin: It is mostly fair play. Now, the parties themselves - I actually interviewed a spokesperson for both the Republican National Committee and the Democratic National Committee. They were both very adamant that all of their practices comply with federal laws and regulations and state laws and regulations. And the representative of the Democratic National Committee actually has said that they've returned data that they believe they have received improperly - that they believe was not obtained using legal channels. 

Ben Yelin: But by and large, all of this collection is legal because it's either public information, which is what you can get through the voter file, or information you, in one way or another, have voluntarily given to third parties. So whether that's a financial institution, whether that's agreeing to cookies on a website, whether that's - you know, you visit a website and log in with Facebook and you're consenting for, you know, Facebook to collect that information and then a group like Cambridge Analytica goes in and finds out what they know about you based on your Facebook profile. All of that is legal. 

Ben Yelin: So there's really nothing - there's not much legal recourse except, if you are a California resident, you do have the CCPA where you can at least get some transparency as to what they know about you. And, you know, if you have the time and the energy to do this, which nobody has, you can go through and request... 

Dave Bittner: (Laughter) It's 'cause we're all too - yeah, we're all too busy reading all those EULAs, right? 

Ben Yelin: Yeah, exactly. 

Dave Bittner: It's taking up all of our time, yeah. 

Ben Yelin: I read all 800 pages, and now I'm going to go through and, you know, using the CCPA, try and have these companies give me back my personal information. Yeah. So nobody's actually going to have the time to do that when, you know, companies have 3,000 data points about you. 

Dave Bittner: It makes me long for a universal opt-out where you could go to a website or an organization or something and say, you know, I, Dave Bittner or I, Ben Yelin, I do not authorize you - all of you data aggregators - to use my data as a private company. Obviously, there are some of - sort of there's some public uses that I suppose you wouldn't be able to opt out of like the credit reporting agencies or things like that. But if we could have a one-stop shop to say, hey, you know, knuckleheads, knock it off, wouldn't that be a nice thing for consumers? 

Ben Yelin: Yes, it would be. 

Dave Bittner: (Laughter). 

Ben Yelin: I would love to have a kill switch to all of the data that's been collected on us online. 

Dave Bittner: Right. 

Ben Yelin: You know, there are some remedial measures you can take. So this article talks about how much information the campaigns can obtain from you just if you visit their website or sign up for their mailing list. And you can always - you know, when they text message you, you can reply stop. And the author of this article said that when he did that, the campaigns did generally stop texting him. You can actually read those EULAs and, you know, decide whether to give your consent to each individual website or application that you visit. These options exist. They're not realistic. And in a world where business is conducted via social media and our social interactions are largely social media-based, especially in the COVID age, it's just not realistic to expect that, you know, people can opt out without also suffering the consequences of going off the grid. 

Dave Bittner: Right. 

Ben Yelin: So I think there's not really a meaningful opportunity to opt out, even if there are measures you can take to limit this type of collection. 

Dave Bittner: Yeah. No, it's an eye-opening story, again, from The Washington Post - Geoffrey Fowler doing great work over there. We'll have a link to that in the show notes. Good story. 

Dave Bittner: My story this week - how do I describe this? As we - as you and I are recording, several of the leaders of some of these social media companies are set to testify before Congress. And Facebook CEO Mark Zuckerberg has released his opening statements, which are getting all kinds of commentary, specifically because he's saying that he wants some changes to be made to Section 230. Ben, Cliff's Notes - Section 230, just to get us up to speed? 

Ben Yelin: Sure. So Section 230 of the Communications Decency Act is the provision that holds that content moderation will not lead to liability for services like Facebook, Google, YouTube, et cetera. So you have a liability shield against any moderation decisions you make as it relates to content on your service. And this is for people who are not publishers of information but merely conduits for other people to post information. 

Dave Bittner: Right. So in his comments, Zuckerberg says - and I quote here - "Section 230 made it possible for every major internet service to be built and ensured important values like free expression and openness were part of how platforms operate. Changing it is a significant decision. However, I believe Congress should update the law to make sure it's working as intended." 

Dave Bittner: Now, there are folks who are saying that basically this is Mark Zuckerberg saying to Congress, please don't throw me in the briar patch... 

Ben Yelin: Yes. 

Dave Bittner: (Laughter) ...'Cause... 

Ben Yelin: Are we allowed to be suspicious of Mark Zuckerberg? 'Cause... 

Dave Bittner: ...You know, I think we're obligated to be suspicious of Mark Zuckerberg (laughter). 

Ben Yelin: I kind of think so, too. Yeah. 

Dave Bittner: In my opinion, yes. Yes. And the criticism that folks are making here - and the article that I'm going to link to is from Techdirt; it's written by Mike Masnick. And people are saying that he's pulling up the ladder behind him. The quote here from the article says, "Make no mistake about it - this is Mark Zuckerberg pulling up the innovation ladder he climbed behind him." In other words, saying that they were able to build these businesses because of Section 230... 

Ben Yelin: Yes. 

Dave Bittner: ...They are now successful. They have the resources to do large-scale content moderation. And if we change Section 230 at this point, it will make it much harder for competitors to enter the space because having an enhanced regulation on content moderation is going to be expensive. 

Ben Yelin: I think that's - that perspective is absolutely right. And I would be willing to suggest - and forgive me if this is controversial - that this is a really questionable statement and action on the part of Mark Zuckerberg. And I certainly question his motives here, as does the author of that Techdirt article. 

Ben Yelin: So he's in a difficult position because politicians of all stripes dislike him for different reasons. Republicans generally think that Facebook and other tech platforms like Twitter are biased against conservative viewpoints, basically that the Twitters and the Facebooks of the world censor conservative-related content. Democrats and liberals come to it from the opposite perspective. They don't think that there's enough content moderation on these platforms; they're not doing enough to root out abuse, misinformation, et cetera. 

Ben Yelin: I think there's a grain of truth to both of these arguments, but I think what Zuckerberg is doing is leveraging the fact that there is this bipartisan opposition to Section 230 from both political parties into an opportunity to kind of clear the field of competitors. As you said and as the Techdirt article mentions, he now has the full market share on whatever it is that Facebook does. He got there by taking advantage of Section 230. You can innovate - you can be creative in how you moderate content if you're not subject to any type of liability. And to come in after you have 100% of the market share, basically - no offense to the MySpaces of the world... 

Dave Bittner: (Laughter). 

Ben Yelin: ...And remove that ladder from under you, I think, is a very cynical political decision for him to make. I think he's trying to ingratiate himself with lawmakers who have been beating the drums about reform to Section 230. But in the end, this is a decision to improve his own bottom line and to continue some of the anti-competitive practices that we talked about last week. So, yeah, I do think it's cynical. 

Ben Yelin: The other thing that sort of bothers me about it is I don't really see any potential solution in Zuckerberg's opening statement. I see a critique of Section 230 - that he thinks it has to be reformed, but I don't see what a viable alternative would be. If we completely repeal Section 230 and there isn't that liability shield, then content-moderating platforms like Facebook or these types of conduits - Twitter, et cetera - will be extremely fearful about lawsuits and therefore will regulate content more significantly. They'll be like broadcast networks where they're bleeping out words so that they don't get fined by the FCC. 

Ben Yelin: They could potentially take some half measures where they're not continuing to have a complete liability shield but maybe a minimized shield of liability, but we don't know exactly what that would look like either. The less protection these companies have, the less leeway they have to be creative in making content moderation decisions. And they have gotten creative. I mean, Twitter in particular has taken some substantial steps to cut against misinformation as it relates to voting and the presidential election. And if they were subject to liability for something like that, they wouldn't be able to take these bold content moderation steps. So without any sort of proposed solution, it's hard for me to look at this as anything except something that's pretty cynical on Zuckerberg's part. 

Dave Bittner: Yeah. Well, as Molly Wood over at Marketplace says, which I love - she says, Facebook's solution to every problem is more Facebook. 

Ben Yelin: Exactly (laughter). Exactly. 

Dave Bittner: (Laughter). 

Ben Yelin: That's very true. And the thing is, Zuckerberg knows that we are all addicted to Facebook. We're not going to leave. There isn't a real alternative platform that does everything that Facebook does. And they would probably benefit from the type of rollback on Section 230. So I think there's a lot to that statement that the solution to everything, in Zuckerberg's mind, is more Facebook. 

Dave Bittner: There's also nothing keeping Facebook from enhancing the amount of moderation that they do. I mean, it seems to me like this is sort of, in a way, passing the buck - saying, you know, well, if the government tells us what to do, then we don't have to take responsibility for our own actions. 

Ben Yelin: I think there's something to that as well. They take a lot of heat because of their content moderation decisions. So there's this New York Post article that came out that alleged a scandal against Hunter Biden, the son of the Democratic presidential candidate. 

Dave Bittner: Right. 

Ben Yelin: And Facebook made a decision to limit sharing of this article until it was independently fact-checked, and conservatives raised holy you-know-what about this, saying that this was evidence of political bias. The tech companies - and Facebook in particular - would love to stay clear of this type of political controversy. They would love it if they didn't have to be yelled at by members of Congress, by people of political influence every time they made a content moderation decision. So, you know, as you say, taking the decisions out of their own hands and trying to put them in the hands of somebody else - i.e. Congress - that's another way they're trying to make their own lives easier, not to mention they wouldn't have to devote the resources that they currently devote to content moderation. So I think that point is very well taken. 

Dave Bittner: Yeah. All right. Well, interesting article. I say - I linked to the one over in Techdirt. There's been a lot of commentary about this. And we'll see - as these testimonies take place, I'm sure there'll be a lot of grandstanding and puffing up of chests and (laughter) monologuing and all that stuff that we can expect from... 

Ben Yelin: Yeah, how dare you, sir? Yeah.  

Dave Bittner: (Laughter) Right. But so it goes. So it goes. We'll have a link to this article in the show notes, of course. Ben, it is time to move on to our Listener on the Line. 

(SOUNDBITE OF PHONE DIALING) 

Dave Bittner: I have sort of an unusual listener on the line this week. I was recording the "Grumpy Old Geeks" podcast with my pal Jason DeFillippo, and we were talking about a story that we'll have a link to here, about a gentleman who's trying to use facial recognition software to recognize police officers. And as we've seen in some of these demonstrations, some civil unrest - so there are police officers who are covering their nametags, covering their badges or not even wearing them at all, and there are some folks who are taking issue with this, I think - personally, I think understandably. So they're trying to use facial recognition to sort of turn the tables on the police to be able to know who these police officers are. And Jason and I had an interesting little conversation about it. Here's a snippet from our conversation over on "Grumpy Old Geeks." 

(SOUNDBITE OF PODCAST, "GRUMPY OLD GEEKS") 

Jason DeFillippo: We've talked about facial recognition so many times on this show it seems like we're beating a pulverized horse, but this one got me excited just from the title alone - "Activists Turn Facial Recognition Tools Against the Police." I'm like, yes. Go get them. Go get them. 

Dave Bittner: (Laughter). 

Jason DeFillippo: And in reading the article, it turns out they're not really yet. It's kind of a side project for one guy up in Seattle, so it hasn't really been released into the wild. But, you know, turnabout's fair play, I think. 

Dave Bittner: Yeah. Do you - you know, this is - I guess this is something I have to - I'll ask Ben about when I talk to him. Is it against the law to out a police officer who's undercover? 

Jason DeFillippo: Undercover is a little different from what they're trying to do, though, right? That's a different thing. These guys are just putting their - putting black tape over their name badges, and they're in public doing police work. This is not like... 

Dave Bittner: Yeah. 

Jason DeFillippo: ...You don't know they're police. 

Dave Bittner: Right. 

Jason DeFillippo: Undercover is like, you know, yeah, you don't want Huggy Bear getting wise to, you know, Joe Law over there. 

Dave Bittner: So, Ben, what is your take on this? 

Ben Yelin: So I would say for the proximate issue, which is can people be held criminally liable for using this facial recognition to identify police officers, I think the answer is no. These officers are in public. They are in uniform. They are not undercover, and they generally should be showing their badges at all times, depending on what their individual department's policy is. So in that sense, no, there is nothing illegal about using this facial recognition software. This is just a way to augment good old-fashioned facial recognition, which is standing in a crowd and seeing the officers for oneself. 

Ben Yelin: It gets a little bit more complicated when we're talking about outing an undercover law enforcement officer. And in that sense, it varies by jurisdiction. In federal law, you can be charged with obstruction of justice if you knowingly expose an undercover federal investigator. And, you know, that's - that can be a pretty serious charge. You could be, depending on the case, potentially looking at jail time. And states - I think states vary on whether there is any punishment for uncovering undercover police officers. In some states, you can be charged with being an accessory to a crime, especially if violence will befall that undercover officer. In some states, there really are no criminal or civil penalties for that type of action. So that would be my stab at answering that question. 

Dave Bittner: Now, that's interesting because it makes me wonder, is it possible to inadvertently out someone who's undercover, right? You're just running facial recognition on everybody walking by, and an undercover police officer walks by. And it pops up and says, oh, Bob Jones from 123 Main Street, you know? 

Ben Yelin: Right. I thought I recognized that name from my police officer files or whatever. Yeah. 

Dave Bittner: Right. Right. Yeah, exactly. I - that's - I mean, I suppose it hasn't come up yet, but it's an interesting thing to ponder - I suppose a potential peril there, right? 

Ben Yelin: Yeah, it absolutely is. Another thing I should mention here is there are First Amendment interests at stake. So generally, a person will have more protection if there's some sort of public purpose for revealing the existence of an undercover officer. So if it was part of, like, a media exposé, you run into some pretty significant First Amendment issues about freedom of the press and freedom of speech. But if you start identifying more personal or private information about them, then you lose that sort of First Amendment protection. 

Dave Bittner: Yeah. 

Ben Yelin: But, you know, that's just another thing to keep in mind when we're talking about this issue. 

Dave Bittner: Yeah - interesting, interesting. All right, well, you can hear the whole conversation over at the "Grumpy Old Geeks" podcast. And if you tune into certain episodes, you might even hear Ben over there. You've been a guest on that show as well, right? 

Ben Yelin: I have. It's me unfiltered. I can, on that podcast, use all types of dirty words that I spare from you, our loyal listeners. 

Dave Bittner: (Laughter) Right. Right. Right. All right. Well, thanks to Jason from "Grumpy Old Geeks" for taking part in our show this week. We would love to hear from you. Our call-in number is 410-618-3720. You can also email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Curtis Simpson. He is the chief information security officer at a company called Armis. And our conversation focused on some recent IoT legislative updates and Curtis' thoughts on that. Here's my conversation with Curtis Simpson. 

Curtis Simpson: I spend a lot of time talking to customers about this overall scenario. But the ratio of devices within our environments has changed, and that's a term I commonly use because it's true, and it applies both to enterprise as well as consumer environments. Even if you look at our homes - right? - we used to have a bunch of benign devices like light switches and things in our houses that have now gone digital, alongside all of the other things that we've commonly used for a while, like laptops and tablets and such like that. 

Curtis Simpson: Well, then you go into the enterprise landscape, including public and private sectors, and you get into a much more complex, sophisticated scenario where, in the environment that I came from, as an example, at Sysco Foods - really, how you could describe that environment was an IT environment and an OT and ICS environment that were totally separate from one another. Well, about four years ago or so, you started to see holes punched into that OT and ICS environment, and more specifically, you started to see IoT sensors and additional IoT capabilities introduced into that landscape to help reduce costs, avoid downtime through predictive maintenance capabilities and things like that, even to the point where we had one facility that was almost entirely automated, similar to an Amazon warehouse. 

Curtis Simpson: So we've gone from a position where we run our businesses and even just connect to the internet at home using traditional PCs and tablets, mobile devices, to the point now where we've got all these interconnected devices that, in many cases, we don't know a lot about. And then there's another really important element to all of this - it's the manufacturing side of this scenario that I'm speaking to. So for a long time - decades now - as long as we've been using these technologies, it hasn't been about an outcome versus security, because we're talking about multiuse platforms, like a PC or a server or even a tablet or a mobile device. They're used for many different purposes, for many different outcomes. And therefore you've got to have some security capabilities that apply to those things, et cetera. 

Curtis Simpson: Well, the market's been demanding rapid outcomes on the IoT side, and the manufacturers have risen to that, so much so that there are large and many small manufacturers that have been able to join the market with little more than a capability to understand the components that are out there, how to stitch them together and then sell an outcome. 

Curtis Simpson: So you're very much seeing us run the gamut of a lack of understanding about these new devices that sit in our environments - how many of them there are, what are they doing? Are they compromised? Was the supply chain compromised from the beginning? - all the way to the fact that we're now seeing these types of devices that may not even be built responsibly, built with security in mind, or built with a technical understanding in general, that can't be patched, that are being actively attacked by bad actors - and that's some of the more recent news. But this is really what's leading up to this right now. It's pretty broad. It's pretty multifaceted. But the gist of it is these things are everywhere. They're connected to everything. We don't understand them well. And in some cases, even manufacturers don't really understand what they're selling to the public in terms of risk. 

Dave Bittner: Can you give us an example? I mean, is there some category of device that, in your mind, is kind of the poster child for this issue? 

Curtis Simpson: It's many - so even, as you note, some of the more recent cybersecurity or IoT-based regulations and legislation - one of the things that they're noting very specifically as one of the main concerns is around these smaller manufacturers. So an example is sensors. Many of the IoT-based sensors - whether it's a temperature sensor or some sort of sensor to look for a specific situation, like humidity, or a specific point of failure - those sensors are usually built very simply without necessarily the need for a lot of experience in building these devices. You can buy the software straight from the market, buy the components from the market and quickly put them together to sell one of these sensors to an enterprise. 

Curtis Simpson: And as you look at things like that, you can even consider other examples as you think about scenarios in the home where, even when you do have IoT devices that may be able to be updated - like, you may be able to get an update for that switch that sits in your wall - would you have any concept of how to actually apply those updates or consider what risks there may be facing your home? And as you think about even some of these - well, actually, one of the more interesting examples I can give you is in terms of one of the recent vulnerabilities dubbed Ripple20, which affected many different forms of devices, ranging from printers to network devices to devices used in health care and manufacturing use cases. One of the interesting things that we found is, once it was disclosed, we started doing our research and using our product to discover other devices that might be impacted, and we found a number of them. 

Curtis Simpson: One of the interesting conversations that came out of that is there was a relatively well-known manufacturer that was actually unclear on the fact that they had this vulnerable software running on their device. And it was because there had been someone within the organization that had made that specific purchase, solved that specific problem, helped them deliver on that specific feature, but there wasn't necessarily the documentation or understanding that that thing was there until someone else was able to point them to it. So in some ways, it is the Wild West in terms of how these devices are being secured, what security capabilities are there or even who's building them and what their capabilities are in understanding how to secure those devices or even respond to someone telling them that they're not secure. 

Dave Bittner: Now, that's fascinating. I mean, I - you know, I think about - something that comes to mind for me that I find useful is thinking about, you know, those old security cameras that are sitting up on the wall or on the corner of a building somewhere. And maybe they've been sitting there for 10 years, and they've been doing all the things that they need to do. They've been functioning fine as reliable security cameras. But perhaps they are full of vulnerabilities. Perhaps they've been taken over by a botnet. And they're still doing all their security camera stuff, but who knows what's in there? You know, I wonder, as we move forward, does there need to be some sort of IoT abatement system, you know, similar to how we pull asbestos out of old buildings? Do people need to take stock and say, listen - you know, before we had - before we settled on what security standards were going to be, we made a lot of mistakes, and you can't just leave those things sitting up there on that wall. 

Curtis Simpson: Yeah. No, and it's a very good analogy and it's a very real one. We're actually seeing - and there's - it's complex to some extent, of course, because one of the challenges we do see is that - you take an OT and ICS environment as an example. A lot of the devices in that environment were built to be used for a decade or longer. But to your point, they weren't targeted by bad actors in the past. They weren't even being targeted by researchers. They weren't seen as an attack surface. Now they very much are - the largest attack surface I've seen through my career. And now, though, there's this concern in that - wait a second. We can't just replace all these things because they are the lifeblood of this service, this organization, this energy company - whatever that is. 

Curtis Simpson: So it's going to be a multifaceted effort. I do think a part of it is exactly what you said. And one of the reasons I do say that is because what's been interesting to me is the response to Section 889 more recently - banning a number of specific foreign manufacturers and their devices from environments that are used to service government clients. That's had one of the largest impacts on that conversation that I've seen. And remember - there's an element of this that we've actually been talking about for a while. We've been talking about how dangerous these Hikvision cameras are, the risks that you're exposed to with those cameras in your environment, so - there's nation-state associations with these devices. We've long talked about them. But many different companies and enterprises have not looked at that as a significant risk until it started having implications on revenue and customer bases. 

Curtis Simpson: Right now you're seeing a very different conversation because it's not just a risk anymore; it's a risk to the business. And as we continue to look at these challenges, I do think we - and we look at regulations and legislation around these things - I think we do have to think about how we got here, what we're facing. And I think it's going to be a combination of what you said - we've got to have these regulations that guide us, that are going to have to have some sort of deadlines as to what we need to do and how we need to do it. And it's not just about government acting; it's about all of us acting. And it does start with the manufacturers themselves and the capabilities built into that. And then it ripples into our organizations in terms of how we're consuming those devices and how we're applying due diligence to actually understanding if these things are acting the way they shouldn't be, et cetera. 

Dave Bittner: Is there a possibility that other elements of the ecosystem could have an effect on this? I'm thinking of specifically insurance companies. You know, is it possible that they could come to the organizations and say, listen - if you want us to cover you or if you want this really nice discount on your insurance, your IoT devices need to meet these standards? 

Curtis Simpson: Absolutely. And I do think what we're facing are a whole number of things coming to an ultimate conclusion in the center. And I think there's - exactly what you said. I think that is an absolute opportunity. And there's also a flip side of that that's going to have an impact. As I think about what the NSA and CISA have recently advised in regards to this OT and ICS attack surface - it being massive, a perfect storm that you need to build a program around now - the other side of this is - imagine you have some of the most informed intelligence agencies telling you to act immediately and you don't. And three or four years from now, you encounter this issue that ultimately comes back to the fact that you've had this massive outage. There's potentially been loss of life. It relates back to - there's been no program built around your OT and ICS environments. 

Curtis Simpson: The other side of that is companies may very well be at risk of being unable to receive any sort of payout from the insurance companies. Those claims will be rejected. And I do envision that we will come closer and closer to that because what you're seeing is this culmination of - we need to do more. Let's start getting those regulations in place so we can start forcing more. And I think we're going to continue to learn from this, and I think what we're also going to see is - you've seen this story more recently talking about how CEOs and leaders may be held accountable in as little as four or so years from now for loss-of-life events in their organizations. This is where I say all of these things are coming together and that there's an elevated level of responsibility being talked about from a number of different perspectives. And I do think we have the opportunity to influence it positively, and I think it's also going to be influenced negatively as events continue to occur. 

Dave Bittner: What are your recommendations for organizations who want to try to get ahead of this? You know, what - where do they begin? 

Curtis Simpson: So I think a good first point and one of the things I often advise folks is don't rely on the regulations to require you to build a program - just start building a program. One of the really important things to understand here is that, one, the technologies exist out there to actually assess these devices, understand the risks associated with these devices, build strategies around managing the risk in terms of those potential impacts and implications to business. 

Curtis Simpson: One of the things, as an example, we help a number of manufacturers with is actually implementing our product and solution in a - in their manufacturing environment such that they can assess all of those components throughout that build process and have a greater level of confidence that they're using an external source of information that understands how to break an IoT device into all of its individual components, assess it for risk, provide you with the readout, and then you're building these more secure capabilities into those devices. 

Curtis Simpson: But as we think about this as enterprises, I think what's most important - particularly as we think from a risk leadership perspective, from a CISO standpoint - is we've got to go back to what's most important about our roles, which is protecting what's most important to our businesses. And I think it is important for us to remember that the ratio of devices has changed. And we need to take a step back, look at our most critical solutions and services that we use to deliver all those capabilities back to our customers, our partners, et cetera - what are the attacks, through which devices, and via the exploitation of which vulnerabilities can these bad things happen? And I think, if we're looking at that with due diligence or the appropriate level of diligence now - starting to build a program around it, being responsible around it - my advice would be - it's been the same as it's always been in cyber, which is if we're doing that, we're ultimately working towards compliance, regardless. 

Curtis Simpson: And as these regulations ebb and flow, which they will - because these are good-start regulations for the most part, as opposed to some sort of end result - we will have already been working towards a place where we've built that ecosystem to be able to achieve compliance because what I also see this burgeoning towards is that there's going to be a mandate at some point in time around some of the things we're talking about today - can't use these unsecured devices or devices that haven't been assessed, must have a program around managing OT and ICS security - doing that overnight is almost impossible. And if we get into a situation where retaining business or gaining business with the government or otherwise is hinging upon that, it's in our best interest now to start looking at risk from the perspective that I just mentioned. 

Curtis Simpson: One of the things that is really important as we consider regulations and as we consider this risk - it's not about breaking the risk down between government or public and private sector and consumer; it's about looking at the device, right? Because these devices are now everywhere. And, yeah, the risk within government can be extreme, obviously, depending on that government's overall mandate and scope of responsibilities, et cetera. But the reality is this - if we look at 2020 and what we're facing, not only are we seeing that much of our workforces have gone remote; we're seeing many companies say that many of these workforces will remain remote or that a larger subset will. As we say those things, it's also important for us to remember that we're telling bad actors that as well. 

Curtis Simpson: And as we think about this larger risk and even some of the warnings that have come from the FBI over the last year or so around televisions being compromised in homes and such - a lot of these botnets that are being formed by APT28 or otherwise are actually being formed of the devices that we all have, including the ones we have in our homes. If we continue to almost disregard the consumer element of this in terms of how we're protecting the citizens from these attacks, that also has a very significant ripple effect into public and private sector because we all go home, and right now we're all going home to devices that don't really have a lot of mandates, regulations, requirements. They're just being sold to do something, and people are buying them because they're the right price or whatever else. That's what we're running towards if we don't consider the risk scenario as all-encompassing versus starting to slice it into different channels of users and companies in it. 

Curtis Simpson: The definition of what devices could have an enterprise impact now and where they must sit and who is purchasing them and how they're being used has gotten very blurry as a result of 2020 in particular. 

Dave Bittner: All right, Ben. What do you think? 

Ben Yelin: So another really interesting conversation. I think what's informative about what he said is that organizations have to be forward-looking and anticipate their liability risks in advance. So it's not good enough to understand the regulatory environment as it exists right now. You have to have somebody, whether it's your CISO or somebody else on staff, who can anticipate where the regulatory world is going because you could get to a situation where you're using a type of technology or IoT that runs afoul of new government regulations or new state policies. So I think that's just a very important and prudent lesson for people who are in the industry. 

Dave Bittner: Yeah. Yeah. So I think it really speaks to that whole notion of being proactive rather than being reactive. 

Ben Yelin: Yeah, and having a full understanding of your risks. That's what risk analysis is all about. It's not just about what your risks are now. It's, yeah, being forward-looking and proactive to try and figure out what your legal risks will be in the future. 

Dave Bittner: Yeah. All right. Well, our thanks to Curtis Simpson for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.