Caveat 8.25.21
Ep 91 | 8.25.21

Apple CSAM: well-intentioned, slippery slope.

Transcript

David Derigiotis: If we don't continue to fight for that, we're going to lose every possibility of living in a more secure and a more protective environment without Big Brother, without Big Technology knowing every single step that we take - when we leave our homes, when we're sleeping, who we're talking to. You know, we're going to get to the point where we're going to be fully exposed, and there will be nothing left that we can do about it.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: We are taking a bit of a departure from our usual format this week, and we're going to take a closer look at Apple's recent announcements that they will be enabling scanning for child sexual abuse materials on iOS devices. We're going to be spending the entire episode on this topic, and joining us is going to be our special guest David Derigiotis. He's corporate senior vice president at Burns & Wilcox. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right. So as I said at the outset here, we're taking a little bit of a different approach to this week's show. Rather than go through our own individual stories, I think this announcement from Apple warrants its own conversation. And before we dig in, Ben, of course, we want to welcome our special guest this week. Happy to have David Derigiotis return. David, welcome. 

David Derigiotis: Thank you, Dave. Thank you, Ben. Great to be here. 

Dave Bittner: David is corporate senior vice president at Burns & Wilcox, and we're happy to have him with us. I want to start the conversation by saying that we're going to do our best this week to focus on the policy side of this issue. I think a lot of other people have addressed the technical side of this - the engineering, the encryption, you know, all of those things that are going on behind the scenes. And there are plenty of places to hear conversations about that side of things. The ATP podcast - the Accidental Tech Podcast - did an excellent discussion, and we'll have links to that and to some other podcasts in the show notes. 

Dave Bittner: Let me start off - let's just sort of do a little taking the temperature here. Ben, why don't I start with you? Can you just give us a little bit of an overview of your understanding of what exactly is going on here? 

Ben Yelin: So Apple introduced two new programs to monitor sexually exploitative material involving children - that is, minors. The first one I have fewer civil liberties issues with - that is an algorithm that recognizes nude images in the iOS messaging application. It requires a parent to opt in, so it's not the default setting. And once a parent opts in, that parent, if the child is under 13, would be notified if the child sends or receives a nude image. That doesn't present civil liberties concerns from my perspective because the parent is opting in and we're talking about very young children here, and it's obviously a very worthy policy goal to prevent child exploitation. 
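
A minimal sketch, in illustrative Python, of the Messages flow Ben describes - the names and the classifier stand-in here are hypothetical, not Apple's actual API:

```python
# Hypothetical sketch of the Messages safety flow as described above:
# a parent must opt in, and only then does a detected nude image trigger
# a warning, with a parental notification for children under 13.

from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    parent_opted_in: bool

def looks_sexually_explicit(image_bytes: bytes) -> bool:
    """Stand-in for an on-device image classifier (hypothetical)."""
    return True  # placeholder so the example below exercises the flow

def handle_image(child: ChildAccount, image_bytes: bytes) -> str:
    if not child.parent_opted_in:
        return "deliver"                      # feature is off by default
    if not looks_sexually_explicit(image_bytes):
        return "deliver"
    if child.age < 13:
        return "warn child and notify parent"
    return "warn child only"                  # simplification for older minors

print(handle_image(ChildAccount(age=10, parent_opted_in=True), b"..."))
```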

Ben Yelin: The other program that was announced certainly does present more significant civil liberties concerns, especially as we get into some of these more slippery-slope arguments about what might happen. As part of that program, Apple is scanning photos that are being uploaded to iCloud for matches against known child sexual abuse images in national databases, like the one maintained by the National Center for Missing and Exploited Children. That program is going to be used at the outset in the United States. It's going to be rolled out over the next several months. And presumably, if Apple finds a photo through their algorithm that matches a photo in that database, then law enforcement is going to be notified. That could potentially lead to criminal prosecution. 
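
As a rough illustration of the matching Ben is describing, here is a minimal Python sketch that checks photo hashes against a small stand-in database - Apple's actual system uses its NeuralHash perceptual hash plus cryptographic blinding and private set intersection, not a plain lookup like this:

```python
# Illustrative only: compare each photo's hash against a set of known
# hashes before upload. The hash value below is a made-up placeholder,
# and a cryptographic hash is used purely for simplicity; a real
# perceptual hash also matches visually similar, re-encoded images.

import hashlib

KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def photo_hash(photo_bytes: bytes) -> str:
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_known_database(photo_bytes: bytes) -> bool:
    return photo_hash(photo_bytes) in KNOWN_BAD_HASHES

print(matches_known_database(b"example photo bytes"))  # False
```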

Ben Yelin: And I think that program is the one that's causing Fourth Amendment advocates and civil liberties groups the most concern. And the concern is that Apple has created a backdoor - if the government seeks to request information, and seeks to do so in a way where they're not asking for a warrant issued by an impartial, you know, judicial magistrate, Apple now has the technology to go into somebody's private photos and extract that information. I think that just strikes people the wrong way. It cuts against our values of digital privacy. 

Dave Bittner: David, when you heard the news about this and saw some of the initial reactions, what was your response? 

David Derigiotis: The concern is more overreach by large technology companies. There's already so much invasive activity - you look at Facebook, you look at Google. And for the last several years, Apple has really built a reputation on privacy. I mean, if you look back to the 2015 San Bernardino shooting and the pressure that they received from the FBI to create that backdoor to get into the phone - to bypass the four-digit passcode at the time and get access to the contents - they remained firm. So my question to Apple is, what's changed since then? You know, you see the advertising, the billboards that they put up - what happens on your phone stays on your phone. This is moving in the complete opposite direction. And while I think the spirit of it is well-intentioned - I think we can all band together and say that, you know, explicit images of minors, of children, that's something that's horrendous, despicable; nobody wants to see that - unfortunately, once you open up the door to this kind of access to private citizens' devices, there's no closing it. And it will only get wider going forward. 

Dave Bittner: Couldn't the argument be made that Apple's response in the San Bernardino case gives some credibility here that they have demonstrated that they will resist government pressure to open things up? Ben, what do you think about that? 

Ben Yelin: I do think it, perhaps, gives them more credibility, because that was the most high-profile case of the government not only exerting pressure on Apple through the courts but also in the court of public opinion, saying they're protecting information that might lead us to prosecute terrorists who killed, you know, 14 individuals in this horrific attack. And Apple did stand strong - although, eventually, the FBI was able to get into the phone anyway, so the issue became moot. But I also think that's one of the reasons the backlash has been so swift, because it is Apple, because it is this company that has presented itself as the foremost protector of private user data. I don't think the reaction would have been the same if it had come from another provider just because Apple carries that sort of cachet as being this company that presents itself as the utmost protector of our private information. So I think it goes both ways. They've earned credibility in the past. But just by making this decision, that cuts against their reputation as a company. And I think it might throw some of their past actions into question as well. Are they really as protective of private information, of end-to-end encryption, as they've claimed to be over the past several years? 

Dave Bittner: You know, David, it seems like part of the outrage here is the fact that Apple has chosen to do this on device rather than scanning images that are in the cloud, which is what many other providers are doing. Google does this. Facebook does this. Dropbox does this. So that is a routine, uncontroversial thing. But it seems like Apple did not expect the backlash from actually doing the scanning on the device. In your mind, what's the difference here? 

David Derigiotis: Well, I think there's a big difference. Apple makes promises that they won't bend the knee to government intervention requesting access to data that's on the phone - photos and other content. But we also have to remember, Apple sells iPhones without FaceTime in Saudi Arabia... 

Ben Yelin: Exactly. 

David Derigiotis: ...Because local regulation prohibits encrypted phone calls. And if you look at one of the most stringent cybersecurity laws in the world - and, really, one of the most invasive - take China, for example: Apple opened a couple of data centers in China because the law requires it. So Apple has been legally required to remove VPNs. They've removed news apps and other apps from the store because in China, the government needs access to that information, and they require audits as well. So I think it's very difficult for Apple to uphold the promise that they will not give up information that's located on the device or in iCloud when we've already seen instances of them following regulations, being compliant with governments all over the world. Who's to say that this won't be next? And who's to say that the government won't ask for a little bit more, as opposed to it just being, right now, sexually explicit material? 

Ben Yelin: I think that's a great point, David. And I know this is cliche. We just had our baseball game in the Iowa cornfields, reflecting "Field Of Dreams." 

(LAUGHTER) 

David Derigiotis: Yep. 

Ben Yelin: And the line from that movie is if you build it, they will come. 

David Derigiotis: (Laughter) Exactly. 

Ben Yelin: And I think that's, unfortunately, applicable here - is that once this technology is built, you can't absolutely assure users across, you know, the hundred some odd countries where Apple sells its products that their information is going to be protected into perpetuity, because once this technology exists, it can be used not just to scan photos, but to scan other applications, to scan social media applications. Although, that's already, of course, public anyway. But to scan messaging applications, to crack down on political dissidents or to foster censorship - and as you've said, we've seen examples where they've bent the knee to more totalitarian countries who have made these requests because they want to continue to sell their devices in those countries. So once this technology exists, there are going to be governments across the world - and not governments that are particularly friendly to civil liberties - who want to exploit this technology for their own purposes. And it's going to be much harder for Apple to go to these authorities and say, we're not going to do that when now, these authorities are fully aware that Apple has the technological capabilities to do so. 

Dave Bittner: You know, child sexual abuse material is a special category, and the National Center for Missing and Exploited Children, who heads up the database for this, they have a special status when it comes to this sort of thing. It's my understanding they are the only organization in the U.S. that is allowed to house these materials - they have to, in order to create the database. How does the fact that this is a special category of crime play into this, if at all, Ben? 

Ben Yelin: So from a legal perspective, the First Amendment does not protect child pornography. And that's something that's very unique for that area of the law. The First Amendment is protective of all types of lewd images - even, you know, obscenities, things that are offensive to most of our eyeballs, are protected under our First Amendment. There is this carve-out that the Supreme Court has reaffirmed over and over again that child pornography does not merit First Amendment protection, partially because it is its own separate category in that it is extremely exploitative of children. It can have real-world effects. It's not theoretical. So I think we have to keep that perspective in mind. 

Ben Yelin: And that's why Apple is rolling out this technology in these circumstances, to apply to these exploitative images - we know that sexually exploitative images of children do carry this sort of extra weight. They fall outside areas of First Amendment protection in a way that all other types of speech, be it political speech or personal expression, do not. So I do think it carries, you know, some sort of extra meaning, not just morally, but within our legal system as well. 

Dave Bittner: David, do you suppose that this adds a layer of anxiety for folks who are using iOS devices? I mean, this is such a horrible crime that I think the possibility of being falsely accused of it could cause anxiety for people. I'm not one who believes in the argument that if you're not doing anything wrong, you have nothing to hide. I think that runs counter to many of the things in our Constitution. But I can see - you know, this is a new level of surveillance, and it's on your device. It's there all the time. If you're using iCloud Photos, it's there. That's sort of the ballgame, yes? 

David Derigiotis: I completely agree, Dave. And look, for years now, Apple has been a company that promotes privacy. They've been a company that's used their edge in this space to gain a competitive advantage over the likes of Google, over the likes of Facebook. So they've been the one leading the privacy charge. And the American public - we've just seen countless data breaches. We've seen constant issues of rogue employees, threat actors that are accessing our personal information. There's not a day that goes by where you don't see a new headline surrounding ransomware, you know, exposing sensitive and confidential medical information, whatever it may be. And Apple has really been the leader. They've been at the forefront of promoting privacy, of promoting keeping your information secure. 

David Derigiotis: So I think that there is a heightened level of anxiety because now you have Big Tech reaching directly onto your device in ways that I think a lot of people don't realize can happen. So what photos you have, the text messages you're sending, the contacts that you have - so many companies constantly scrape. They go after this information. They share it. And now we have another instance of a program - again, it's in the best spirit. The intentions are very favorable in terms of stopping the spread of this type of material, stopping the spread of sexually explicit material as it relates to children. It's, in theory, a good idea, of course. But again, if somebody is wrongfully accused, if there is a threat actor internally, if there's a rogue employee - there are just so many ways for this to go sideways, for it to go wrong. And Apple did say that the number was astronomical - I think it was a one-in-a-trillion chance of falsely flagging an account - so a false positive is very unlikely. 

David Derigiotis: My concern here is, again, they are coming onto your device, onto your personal information. What's going to happen if somebody gets into Apple, if they're able to access information that they wouldn't have been able to access previously had it not been for this program? What if a rogue employee decides to do something that they shouldn't do if they're going to get some type of payoff from a criminal? And then again, what's the next step from here? You know, as Ben mentioned, what does this lead to? Once this door is opened, it can only get wider. You can't go backwards. 

Dave Bittner: Yeah, and as we're recording this, this morning I saw on social media that some folks in cybersecurity have GitHub projects underway where people are starting to experiment with creating adversarial images to counter this - to confuse it, to cloud it, to make it more difficult. So, you know, that's a real issue as well. It's a possibility. All right, we're going to take a quick break here. We're going to pay some bills and let our advertiser share their message with you. We'll be right back in just a moment. 
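
For a sense of what those experiments are probing, here is a much simpler perceptual hash than Apple's NeuralHash, sketched in Python on synthetic pixel grids: visually similar images land on identical or nearby hashes, and that same tolerance is what adversarial collision attempts try to exploit.

```python
# A toy "average hash" (not Apple's NeuralHash): reduce an image to an
# 8x8 grayscale grid, then set each bit to 1 if the pixel is above the
# mean. Small perturbations leave the hash unchanged, which is the
# robustness property that collision experiments push to its limits.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = ''.join('1' if p > mean else '0' for p in flat)
    return int(bits, 2)

def hamming_distance(h1, h2):
    return bin(h1 ^ h2).count('1')

# Two synthetic "images": the second is the first with mild uniform noise.
original = [[(r * 31 + c * 7) % 256 for c in range(8)] for r in range(8)]
perturbed = [[v + 3 for v in row] for row in original]

print(hamming_distance(average_hash(original), average_hash(perturbed)))  # 0
```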

Dave Bittner: And we are back - Ben and I, joined by our special guest, David Derigiotis. He is corporate senior vice president at Burns & Wilcox. Apple has been very specific and clear about the levels of technology they have put in place here to try to keep things on the straight and narrow when it comes to this. Apple's Craig Federighi sat down with Joanna Stern from The Wall Street Journal. They had what I'd describe as a clarifying conversation in response to a lot of the backlash. And one of the points that Craig addressed was that Apple has been accused of being late to the table with this sort of thing - as we mentioned earlier, Facebook, Google and Dropbox have all been doing this sort of scanning for quite a while, and Apple has been lagging. And Apple has gotten pressure from folks in Congress that they needed to step up and do a better job. 

Dave Bittner: So part of this - I think part of the answer to the why-now question might be that this is a response - if Apple didn't do this themselves, they would have been forced to do something through legislation. All that said, the technical steps that Apple has put in place are significant. Craig made the point that, you know, you have to have 30 or so matching images on your device before Apple even gets notified that there's an issue. There has to be this critical mass reached before Apple is even able to access any of the images. And only then does a human intervene and view a low-res version of the images to verify that they are actually what the machines think they are. 
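
A minimal sketch of that threshold gating, assuming a plain counter - Apple's actual design uses threshold secret sharing so even the match count stays cryptographically hidden below the threshold, and the figure of 30 is simply the number cited in the interview:

```python
# Illustrative threshold gate: nothing is surfaced for human review until
# the count of matched images crosses the threshold. This plain counter is
# a simplification of Apple's threshold secret sharing scheme.

MATCH_THRESHOLD = 30  # roughly the figure cited above

def should_flag_for_review(matched_image_count: int) -> bool:
    return matched_image_count >= MATCH_THRESHOLD

print(should_flag_for_review(5))   # False: below threshold, nothing visible
print(should_flag_for_review(31))  # True: low-res copies go to human review
```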

Dave Bittner: So what I'm getting at here is that it seems as though Apple, in good faith, tried to come at this problem from a technological point of view. It seems to me that Apple themselves are surprised at the amount of backlash that they are getting here. Ben, does that surprise you? 

Ben Yelin: It surprises me a little bit just because you'd think that Apple, the most privacy-conscious of these companies, would understand why this would cause a backlash. I'm not surprised by the backlash just because I think it conforms with values that we have. You know, our Fourth Amendment jurisprudence in this country is built around this idea of a reasonable expectation of privacy. If we display a subjective expectation of privacy by trying to keep our information - whether that's photos, messages, et cetera - to ourselves, and if that expectation is reasonable, then the government needs a warrant to obtain that information. I think even though, you know, that's a provision of the Constitution, it's also a fundamental value that if we seek to keep something private, then the government can't get access to it. And so I think, whether it's through this or something like the EARN IT Act - a piece of proposed legislation that would have compelled companies like Apple to take a similar action - the concern is that it would be a very significant violation of that expectation of privacy. And I think that offends us on a pretty deep moral level just because it's such a fundamental value of our country. It has to do with, you know, going back to our English legal ancestors, where you would have these general warrants where minions of, you know, the ruling king or queen would come in and look for incriminating information in people's houses, even if they didn't have any suspicion that somebody had done something wrong. And so I think our ears perk up a little bit when we hear something like this, where we know that Apple is going to be accessing our private digital space. So I just think, based on our own political culture, you can understand why there was such a backlash. 

Dave Bittner: David, what are your thoughts on that? 

David Derigiotis: I agree with Ben there fully. I mean, again, how much information are we willing to give to these large technology companies that already encapsulate so much of the way that we live our lives? They already know our location. They already know the different apps that we're using, when we're using them, how long we're using them. They already know all of the contacts that are in our phone, you know, who we're corresponding with. So we're giving up, inch by inch - sometimes mile by mile, as a matter of fact - bigger pieces of our private life. We're going to get to a bad place if there is no action taken and people aren't mindful - and I think this is why there has been such a backlash, because people have become much more mindful of what it means to be private and to have that expectation of privacy. If we don't continue to fight for that, we're going to lose every possibility of living in a more secure, in a more protective environment without Big Brother, without Big Technology knowing every single step that we take, when we leave our homes, when we're sleeping, who we're talking to. You know, we're going to get to the point where we're going to be fully exposed. And there will be nothing left that we can do about it. 

Ben Yelin: It kind of strikes me that, you know, most of the intrusive technological surveillance that we've been subjected to over the past several years has been more of the metadata variety. You talk about contacts, you know, who we're messaging, what the duration of that message was. Even something like historical cell-site location information doesn't, you know, reveal our most private communications. So I think that's why this is such a prominent step in a new direction - we're talking about content here. We're talking about the actual photos and the actual messages. So I think there really is a distinction between the private information Apple has had access to in the past and what they have access to now - the content of our devices. 

Dave Bittner: What about the fact that this is all opt in? I mean, in other words, Apple has said that if you're not using iCloud Photo services, they're not even going to be scanning your photos. It effectively turns off the switch for any of this scanning to happen. The hashes for the images are going to be in the operating system. You know, that's going to be baked in now. But unless you turn on your iCloud functionality, they're not going to take a look at it. And yet people don't seem to be calmed down by that. Why do you think that is, David? 
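
As a sketch of the point Dave is making here, the hash database ships with the operating system, but the matching itself only runs when iCloud Photos upload is enabled - the function names below are hypothetical, not Apple's API:

```python
# Illustrative gate: the hash list is present on every device, but no
# matching happens unless the photo is actually headed to iCloud Photos.

def matches_known_database(photo_bytes: bytes) -> bool:
    """Stand-in for an on-device hash match (hypothetical)."""
    return False  # placeholder

def count_matches_before_upload(icloud_photos_enabled, photos):
    if not icloud_photos_enabled:
        return 0  # scanning never runs; the hash database just sits unused
    return sum(matches_known_database(p) for p in photos)

print(count_matches_before_upload(False, [b"photo1", b"photo2"]))  # 0
```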

David Derigiotis: Well, again, I think it's the level of awareness that Apple has really created with their marketing strategy and their campaigns, their focus on privacy. I do think that it is a positive and a great step that they're taking in terms of making sure that the consumer has to specifically and explicitly opt in to this service. But again, we've seen a number of cases - not Apple, but cases throughout the years - where companies make retroactive changes to the privacy notice, to the actions that they're taking. And they just automatically start collecting information, or they're automatically opting people in to the various programs or data collection methods that the company employs. So this is where you as a consumer have to pay very close attention to the terms and conditions, to any future changes that could occur. Because right now it may be opt-in, but who's to say that two years from now that doesn't change, and you're automatically opted in when you do a software update for the newest Apple release? Those are things that people need to pay very close attention to, because it can come back to bite you if you're not aware. 

Ben Yelin: You mean, David, you don't read all 200 pages of the terms and conditions... 

(LAUGHTER) 

Ben Yelin: ...When you download that OS update? 

David Derigiotis: I know. You know, being involved in the privacy space, I hate reading all of those terms and conditions and privacy notices. 

Dave Bittner: Yeah, actually, I guess it's part of your job that sometimes you have to, right? Poor guy. 

David Derigiotis: That's right. 

(LAUGHTER) 

David Derigiotis: That's right. 

Dave Bittner: You know, it also strikes me that as much as Apple has built a reputation on supporting privacy, as being, you know, the privacy company among these big tech giants, they also have a reputation for a certain amount of arrogance. You know, Apple knows what's best for you, right? When the iMac initially came out, you don't need that floppy drive anymore. With the recent iPhones, you don't need that headphone jack anymore. And while I think that's largely worked out for Apple - certainly, their products are selling well, and they are, if not the most valuable, among the most valuable companies in the world - I think this is a case where that attitude showed. They didn't float this publicly before they came out and announced it fully baked, right? They came out and said, this is what we're doing. It is a technological marvel. It's going to solve all of these problems, and it's a done deal. I think that attitude from Apple in this case isn't really serving them well. Do you agree with me, David? 

David Derigiotis: I agree. I mean, Apple has over 1 billion iPhones in use today. iCloud has over 850 million users across the world. They've built a reputation around user experience, around privacy. And I think anything that they do, any steps that they take that are contrary to that perception, to that, you know, kind of cachet that they've created, is going to create outrage. And I think that's why we're seeing the backlash here, because they have built a reputation on protecting their consumers. They have built a reputation on going in the opposite direction that other large tech companies have chosen to take. Apple's on a different path. 

David Derigiotis: And again, the question I would ask is: what's changed now from your stance in 2015, 2016, when you were willing to put up that fight, when you were not willing to turn over the contents of the phone and not willing to create that backdoor to break the encryption? Because then it was about privacy. And Apple stated, if we make these changes, if we create that backdoor, privacy will be lost going forward for everyone. 

Dave Bittner: Yeah. 

David Derigiotis: So now that they're taking this new step, the question I ask is, what's changed since 2015? 

Dave Bittner: Yeah. And I think it's worth, you know, just mentioning as a point of clarification that Apple does not end-to-end encrypt your iCloud backups. And that has been a way for law enforcement to, you know, get at information that people had stored on their phones. It's also benefited Apple because it gives them the ability to help people restore their phones. If a phone's been locked or broken or lost, you know, they can get at those backups that they otherwise couldn't. But in the past, Apple has made noises that they were going to start encrypting those backups end to end. They got pushback from folks like the FBI, and they haven't done it so far. 

Dave Bittner: Ben, I'm curious. You know, we hear stories about these tech companies receiving demands from the government to turn over information. And part of that demand includes not being allowed to tell anyone that the demand was made. 

Ben Yelin: Right, a gag order. 

Dave Bittner: Yeah. Is that coming into play here? Is the specter of that hanging over this as well? 

Ben Yelin: Absolutely. In a bunch of different circumstances, when the government requests information, those requests come with a gag order. And depending on the circumstance, especially if it has to do with something national security-related, it might be really hard to even seek judicial review to reveal information that's part of that gag order. That could be very dangerous for the public because Apple is, in a sense, muzzled. They're not able to share the type of requests that they've been given, you know, whether it's been informal or whether it's been an official subpoena. And, you know, that can certainly impact public debate on this. If Apple is receiving a bunch of these requests but they're bound by gag orders, how are we to know, as the public, exactly what's being asked of Apple and exactly what the government is receiving? That really does depend on the context of the information that's being sought. But it's certainly an issue that's in play here. 

Dave Bittner: David, in your mind, where could Apple go from here? Is there a way that they could salvage this situation? 

David Derigiotis: You know, I think for true privacy advocates, there's no turning back for Apple at this point. I don't know if there's any salvaging the messaging that's already gone out, the steps that they're taking. They're not launching this update until the end of this year. And it's going to be in the U.S. - United States - only. So I think it can only snowball and become more of a privacy nightmare going forward as they introduce this in other countries, as they roll it out in other places across the world. You know, my worry is, what's going to happen for the citizens in those countries? What's going to happen for us here in the United States years down the line? 

David Derigiotis: Because again, they are taking, as it appears, very strong security measures - the hashing they're doing on the phone, matching back to the, you know, NCMEC database - the National Center for Missing and Exploited Children. They're doing it in a private manner, in a way that makes sense. My fear is, what's the next version of this going to look like for citizens in China and Saudi Arabia? What's going to happen for the citizens when the government says, well, we now want to see different types of communications, whether it's political, whether it's religious? What's going to happen when they start matching hashes to those types of conversations and those types of images that are being communicated and shared between citizens? I don't think there's any going back from here. It's only going to get broader and wider in the years to come. 

Dave Bittner: A question for either of you - I mean, do the service providers themselves - the Apples, the Googles, the Facebooks of the world - do they have liability for potentially having these images on their servers? Are they protected by, say, you know, Section 230 of the Communications Decency Act? Or are they obligated to look for and scrub and report these sorts of images? 

David Derigiotis: Well, Ben, maybe I can jump in here for part of this. Section 230 is not all-encompassing. There are some exceptions to what's covered - federal criminal liability, electronic privacy and intellectual property claims. So I think with any type of information that an organization collects and stores, no matter how sensitive - you can look at the most sensitive medical information, procedures and billing information that are stored and collected on individuals; you can look at Social Security numbers - if they're choosing to store consumers' very sensitive, critical data, it's their obligation to protect it and to store it securely, because they're the ultimate data owner. They are the ones that have this information within their network. They need to protect that data as well. So I don't see a scenario where they will be protected from any type of liability. The more information they collect, it only increases the liability that they have for the consumers that they serve. 

Ben Yelin: Yeah. That's my read of the situation as well. And it is the Communications Decency Act that we're talking about, and the intention of that act was to limit the exposure of offensive content to children. Section 230 is one part of that law. But, you know, I think while there is a general shield from liability for images or content posted on these third-party networks, as David says, it's not all-encompassing. 

Dave Bittner: Yeah. 

Ben Yelin: And there are ways in which they can be held liable. 

Dave Bittner: All right. So final word - Ben, let me give it to you. I mean, suppose Apple came at this and said, you know what? We're going to reconsider this. We're going to come at this the same way that some of our colleagues have. We're just going to - we're simply going to scan things on our cloud servers. We're not going to do this on your actual device. Do you think that would put them in the clear? Would people be OK with that? 

Ben Yelin: I kind of agree with David that the cat's already out of the bag. You can't, you know, stuff this back in, just because they have already made the decision to go into people's devices. Now we know, A, that they're willing to engage in such a step, even if they were to retract it under severe public backlash, and, B, that the technological capability exists. So even if they were to retract these particular programs, we know that the capability exists and that, if they were ever sufficiently pressured - whether it's by our government or by Saudi Arabia or by China - it could be put to use. That's now all on the record: this is something that they've developed and are willing to use. 

Ben Yelin: You know, I think we can surmise that this effort is the result of political pressure, because members of Congress - as evidenced by proposed legislation like the EARN IT Act - think that tech companies have not done enough to protect against child exploitation. So in a sense, this announcement is a reaction to public pressure, which means what's to stop them from similarly reacting to public pressure in the future, whether it's in this country or a country that cares even less about civil rights and civil liberties? So I think it would be really hard for them to fully undo what they have already done by making this announcement. 

Dave Bittner: David, your final thoughts. 

David Derigiotis: I agree with Ben 100%. And I think this is just a reminder that if we truly want to obtain personal privacy and protection, we cannot give so much power to any one technology organization, any one company in general. It's important to segment your digital life. Having all of your information on one device, like your Apple device - all of your communications, the phone calls that you're making, contacts, photos - puts you at risk. And you're really at the mercy of that company, no matter how strong of a privacy reputation they want to promote. So I think the biggest lesson from all of this is: segment your exposure. Don't put all of your eggs in one basket, and make sure you're mindful of how the company operates, what they're collecting and who they're sharing that information with. 

Dave Bittner: All right. Well, gentlemen - good conversation on a difficult issue. Thanks to you, David Derigiotis, corporate senior vice president at Burns & Wilcox. Thank you so much for taking the time to join us today. 

David Derigiotis: It is my pleasure. Thank you, and thank you, Ben. 

Ben Yelin: Thank you, David. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.