Dave shares a story from Gizmodo about lawsuits aimed at Ring and Amazon. You asked, Ben listened: he shares his take on an op-ed from the New York Times about cell phone tracking. And later in the show, we interview Michelle Dennedy, formerly of Cisco and now CEO of DrumWave, about the future of data value and... elephant masseuses.
Links to stories:
Thanks to our sponsor, KnowBe4
Michelle Dennedy: [00:00:09] People think that technology is going to end it all and be the solution to it all. And the answer is neither one.
Dave Bittner: [00:00:16] Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: [00:00:27] Hi, Dave.
Dave Bittner: [00:01:07] And now some reflections from our sponsors at KnowBe4. What's a policy? We've heard a few definitions. Some say it's an elaborate procedure designed to prevent the recurrence of a single non-repeatable event. Others say it's a way the suits play CYA. Still others say it's whatever happens to reside in those binders the consultants left behind right before they presented their bill. How about a more positive approach? As KnowBe4 can tell you, better policy means better security. And getting the policies right is a big part of security; so is setting them up in ways that people can actually follow. We'll hear later in the show how you might approach policy.
Dave Bittner: [00:01:49] And we are back. Ben, I'm going to kick things off for us this week. This story comes from Gizmodo, written by Dhruv Mehrotra. And the story is about Ring and Amazon being sued - this is a class-action suit - over allegations of negligence, invasion of privacy, breach of contract and unjust enrichment for lax security standards that people claim have left thousands of cameras vulnerable to hacking. Want to help us unpack what they're going for here?
Ben Yelin: [00:02:20] Sure. So this is a lawsuit filed with a U.S. district court, a federal court in California. Just for a brief refresher, a class-action suit means you have a lot of plaintiffs, and they have to be similarly situated, so they have to have suffered some sort of common harm. And there have to be a sufficient number of similarities among all of the members of the class. It's advantageous for individual plaintiffs to join a class action because, you know, me or you suing Ring or Amazon, we're not going to be able to afford good enough lawyers to make that happen. It's going to be too much of our time and money. But pool, you know, hundreds of us together, then, yeah, we can pay the lawyers. They can initiate that class-action lawsuit.
Dave Bittner: [00:03:01] I see.
Ben Yelin: [00:03:01] So just a little background there. The named plaintiff in this lawsuit is a guy from California, John Baker Orange. He alleges that his three children were outside playing basketball when whoever had hacked the Ring device started speaking directly to them - harassing them, commenting on their basketball play, saying some unsettling things. Certainly very creepy.
Dave Bittner: [00:03:27] And we've seen a couple stories like this where people have had Ring cameras inside their house as well, basically using them as baby monitors almost, and people talking to the kids.
Ben Yelin: [00:03:37] Yeah, it's super creepy. The other story they mentioned in this article is from Mississippi, where during a hack, a person impersonating Santa Claus tried to harass an 8-year-old kid by hacking the Ring device.
Dave Bittner: [00:03:50] Blech (ph).
Ben Yelin: [00:03:50] So yeah, I mean, these are certainly pretty harrowing accounts, especially hearing them come from parents. So the lawsuit alleges that this happened because of lax security practices on the part of Ring and its parent company, Amazon. What the plaintiffs are saying is that Ring and Amazon failed either through negligence, which is, you know, sort of their first bite at the apple, or under a theory of what's called strict product liability, which means that companies, even if they're not negligent, can be sued if there's a defect in their product. So based on either one of those two theories, Ring and Amazon did not do enough to protect the security and the integrity of their devices. They did not mandate or require that every user have a system of two-factor authentication. And as a result, this plaintiff suffered obvious damages. Even if you can solve the immediate problem if you're one of these plaintiffs by destroying the Ring, throwing it into the trash, a lot of other damages have been done, both financially, emotionally, et cetera.
Ben Yelin: [00:04:53] So this class-action suit has been filed. There are a relatively large number of parties to this case. The plaintiff's team threw out a bunch of different legal causes of action. I think the two with the best chances of actually succeeding are the negligence claim and this strict products liability claim. On the negligence claim, you have to show that the company breached its duty of reasonable care. And so, you know, to look for evidence of what that duty is, you can look at industry standards. So if they can prove that Ring isn't following proper security protocols, that would be very compelling evidence that they have breached their duty of care. And then on the product liability side, you know, they don't even have to prove - if a court is willing to hear this product liability argument, the plaintiffs wouldn't even have to show that Ring or Amazon had been negligent but simply that the harm had occurred as a result of, basically, defects in the Ring devices. So, you know, Amazon and Ring have not commented on the lawsuit. It's their policy not to comment on pending causes of action.
Dave Bittner: [00:05:57] Yeah.
Ben Yelin: [00:05:57] But it'll certainly be interesting to see how this turns out.
Dave Bittner: [00:06:00] OK, so I have some thoughts on this.
Ben Yelin: [00:06:02] Yes.
Dave Bittner: [00:06:02] Now, before this lawsuit came to pass, we have certainly covered a variety of stories - a number of stories - having to do with this. My recollection is that in the vast majority of the cases here, it was a situation where the parents either had not changed a default password or were reusing passwords. So in other words, the bad guys can go access one of the password data dumps that are readily available on the internet, find a password that is associated with an email address, use that password and gain access to the Ring. So I guess I'm left thinking, if I'm Ring and I go in front of the judge, I'm going to say, hey, these people are reusing passwords or not changing the default password, even though our, you know, instructions say you should - it's the first thing you should do. Is that going to be a reasonable defense in front of a judge?
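The password-reuse mechanism Dave describes here - credential stuffing - is simple enough to sketch in a few lines of Python. The account database and leaked credentials below are invented purely for illustration; this only simulates the pattern, not any real service:

```python
# Credential stuffing: attackers replay (email, password) pairs from public
# breach dumps against another service, betting that users reused passwords
# or never changed a default. All data here is hypothetical.

# A mock camera-service account database.
accounts = {
    "alice@example.com": "correct horse battery staple",  # unique password
    "bob@example.com": "hunter2",                         # reused password
    "carol@example.com": "admin",                         # never changed the default
}

# Credentials scraped from an unrelated, publicly circulating breach dump.
leaked_pairs = [
    ("bob@example.com", "hunter2"),
    ("dave@example.com", "letmein"),
    ("carol@example.com", "admin"),
]

def stuffing_hits(accounts, leaked_pairs):
    """Return the accounts an attacker could open just by replaying leaked creds."""
    return [email for email, pw in leaked_pairs if accounts.get(email) == pw]

print(stuffing_hits(accounts, leaked_pairs))  # Bob and Carol are exposed
```

This is also why mandatory two-factor authentication matters in the lawsuit: it breaks the replay even when the password itself is compromised.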
Ben Yelin: [00:07:06] Oh, yeah. I mean, you've made the argument that the defense attorney is going to make on day one if this ever goes to trial.
Dave Bittner: [00:07:12] Right.
Ben Yelin: [00:07:12] In fact, this will probably be in their answer to this class-action lawsuit. They'll say, we encourage customers and make them very aware that their devices are vulnerable if they either use a default password or recycle a password. So it was actually the user's negligence in not changing the password that caused this injury. What the plaintiff is alleging here is that the problems run deeper than simply not changing the password. Their evidence for that is that there have been a number of high-profile data breaches. So it's not just that, you know, one or two people used a default password, forgot to change their password. It's more systemic.
Dave Bittner: [00:07:52] Yeah.
Ben Yelin: [00:07:52] And in their view - the view of the plaintiffs - Ring can do more to ensure that the users who may be not technologically advanced are taking the necessary steps to protect their devices, so requiring multifactor authentication...
Dave Bittner: [00:08:09] Yeah.
Ben Yelin: [00:08:09] ...In order to use the device.
Dave Bittner: [00:08:10] See, that's the other thing that I wonder about here. Could something like this push the suppliers of devices like this to have multifactor authentication be the standard - to require it?
Ben Yelin: [00:08:23] Sure. I mean, if they either lose this lawsuit or are forced to settle and pay out the members of this class-action lawsuit, then that's going to be the next step they take. They will require users to have multifactor authentication in order to use the devices. Then, the next time they get sued, if there's a breach, they'll point to this multifactor authentication requirement and say, we've taken this proactive step to require people to have multifactor, two-factor authentication. That shows that we're not being negligent, and that shows that there's not a defect in our product, in our process. Whatever happens, it's the user's fault.
Dave Bittner: [00:08:59] Yeah.
Ben Yelin: [00:08:59] And they'd probably have a stronger leg to stand on if they did require that multifactor authentication. Now, whether that argument is going to hold up in a court of law - the case law's sort of mixed on how much you can require of users to ameliorate the potential flaws in a product. And so it's just going to depend on how the court sees it. I really could see it going either way on that question.
Dave Bittner: [00:09:24] Interesting. All right, we'll keep an eye on this one, see how it plays out. What do you have for us this week, Ben?
Ben Yelin: [00:09:30] So the fans of the "Caveat" podcast have demanded that we talk about this...
Dave Bittner: [00:09:35] (Laughter).
Ben Yelin: [00:09:35] ...New York Times article. It's actually an op-ed - came out towards the end of December, called "One Nation, Tracked." And it is a large-scale investigation into smartphone tracking and the industry around smartphone tracking undertaken by The New York Times.
Dave Bittner: [00:09:51] Yeah.
Ben Yelin: [00:09:52] They were able to do this because somebody who worked for one of these tracking companies - and I actually hadn't heard of most of them; Foursquare is one of the high-profile ones I have heard of - leaked a large trove of data to The New York Times. And The New York Times spent six months to a year playing around with this data, seeing how much they could find out. And they talked about exactly the breadth of what they could find out in this op-ed. And it's pretty staggering. They were able to follow celebrities and high-profile politicians by combining the data that they received in this leak with publicly available sources. So...
Dave Bittner: [00:10:31] Right.
Ben Yelin: [00:10:32] ...You know, you figure out which celebrity lives at a particular house in Hollywood. You have all of the cellphone pings that go directly to that house. Figure out which cellphone belongs to the resident of that house. Then, you can follow them as they engage in their personal affairs. So as they said in this op-ed, they weren't looking to uncover evidence of crimes or extramarital affairs, but they almost certainly happened upon it because of the bulk of the data that they collected.
Dave Bittner: [00:11:00] The fascinating thing to me in this story is the bare revelation that if I know where you live and where you work, and I have access to one of these databases, that's the ballgame. I can track everywhere that you go based on the pinging of your cellphone.
Ben Yelin: [00:11:17] Right.
Dave Bittner: [00:11:17] In other words, I can figure out what that pinging is by knowing where you live and you work because chances are you're the only person who lives and works - you know, has that combination of specific data.
Ben Yelin: [00:11:30] Absolutely.
Dave Bittner: [00:11:30] And based on those simple things that are readily available, I have a tracking mechanism for you.
Ben Yelin: [00:11:35] Right. So the companies say the data is anonymized. Sure.
Dave Bittner: [00:11:39] Right (laughter).
Ben Yelin: [00:11:40] But if I were interested in figuring out what Dave Bittner was up to on his free time, I could search property records - you know, if I knew generally where you live, I could search property records, figure out exactly where you owned property.
Dave Bittner: [00:11:54] Yep.
Ben Yelin: [00:12:01] I know your place of work, and you are probably the only person who drives from your home to your office. Once I make that connection and I've figured out that particular device, then as you say, basically, all bets are off. I have a full picture of your personal life because if I had access to this treasure trove of data, I could know basically where you were for every moment for an extended period of time.
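The re-identification trick Ben walks through - finding the one "anonymized" device that pings at both a known home and a known workplace - can be sketched concretely. The device IDs and locations below are invented for illustration:

```python
# "Anonymized" location pings still re-identify people: a home address plus a
# workplace is usually a unique pair, so one device matches both. All data
# here is hypothetical.

pings = [
    ("device-17", "123 Elm St"),    # the target's home
    ("device-17", "CyberWire HQ"),  # the target's office
    ("device-42", "123 Elm St"),    # a neighbor's phone - home block only
    ("device-99", "CyberWire HQ"),  # a coworker's phone - office only
    ("device-17", "Downtown Gym"),  # everything else that device did
]

def reidentify(pings, home, work):
    """Return device IDs observed at BOTH the home and work locations."""
    seen = {}
    for device, place in pings:
        seen.setdefault(device, set()).add(place)
    return [d for d, places in seen.items() if home in places and work in places]

matches = reidentify(pings, home="123 Elm St", work="CyberWire HQ")
print(matches)  # a single device fits both - and every other ping it made is now yours
```

Once that one device ID is isolated, the rest of its ping history - the gym, the doctor's office, the weekend trips - falls out of the same dataset for free, which is exactly the point Dave and Ben are making.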
Dave Bittner: [00:12:23] The cellphone providers, the service providers are selling this information to these data brokers?
Ben Yelin: [00:12:32] Yes. So the way it works is, basically, every application we use, we enable location services. One of the most common ones they talked about in this article is the Weather Channel app. Everybody wants to know their forecast, you know. I want to know if it's going to rain today. I download the Weather Channel app, and they ask if they can use my location. I say sure.
Dave Bittner: [00:12:53] Right. Makes total sense because I want local weather.
Ben Yelin: [00:12:55] Local weather, yeah. Tell me what's going to happen where I live. Once I've shared that with them, I've given them permission to basically collect my location in perpetuity. And we'll talk about some qualifications on that, including one that's come into place with the latest update of iOS, as I found out this week. But basically, that is the case with a whole number of extremely popular applications. And the user has consented to it. I mean, we - generally, on every device, we are asked if we're willing to share our location. Most people don't know the full implications of that. But when you share that with a number of different applications, even if you delete apps or prohibit location sharing on certain applications, if you carry around a smartphone, you are going to be pinging your location multiple times a day, if not multiple times an hour.
Dave Bittner: [00:13:44] Right.
Ben Yelin: [00:13:45] And they will end up selling this very valuable data to these companies in the location data business. Now, the reason this is very valuable - this location data is valuable - is because it's very useful for advertising purposes. If you get a person's personal habits, you can also understand their shopping needs and their consumer desires and which billboards they've passed. And you can do some really cheap, effective market research on the basis of that.
Dave Bittner: [00:14:16] Right. Dave goes to the gym every day. Or more likely, Dave never goes to the gym (laughter).
Ben Yelin: [00:14:20] Does not go to the gym, yes. They know that. They see that I never go to the gym, and that means I'm going to get weight loss advertisements and diet pills.
Dave Bittner: [00:14:30] Right. Right. Right (laughter).
Ben Yelin: [00:14:30] And it's great for the diet pill company because they have a target market, right? So that's basically what's happening here. Now, what I discovered this past week is I downloaded the new version of iOS. And now iOS will inform you via a push alert that your location is being tracked by a particular application, and it'll ask you whether you want that location service to remain on only while you're using the app or whether, as is generally the default, that location collection is going to be on when your phone is on. And this is eye-opening to me because I got those alerts, and they actually show you the map of everywhere they've tracked your location since you've been using the application. And, whoa, that was a little bit scary.
Dave Bittner: [00:15:17] (Laughter) Was there any particular - if you're willing to share, was there any particular app that raised your eyebrows the most where you weren't expecting it?
Ben Yelin: [00:15:24] So the Weather Channel was certainly one of them.
Dave Bittner: [00:15:26] Yeah.
Ben Yelin: [00:15:26] I have that. Obvious ones like Facebook - you've probably shared your location with Facebook, which is why they give you ads for the local car dealer in your - whatever suburb you live in.
Dave Bittner: [00:15:36] Yeah.
Ben Yelin: [00:15:37] So that's one of them. I'm trying to remember if - obviously, you know, the GPS apps, Google Maps, that's certainly...
Dave Bittner: [00:15:42] Right, right.
Ben Yelin: [00:15:43] ...An obvious one. They track your location. The sports applications that I use to get sports scores, that was one that was surprising to me.
Dave Bittner: [00:15:50] Oh, isn't that interesting?
Ben Yelin: [00:15:51] So, you know...
Dave Bittner: [00:15:53] So you could see them making the case that they just want to make sure that they're covering your local teams.
Ben Yelin: [00:15:59] Exactly. Or they perhaps can have a legitimate reason to understand your preference. So for me, I'm a fan of San Francisco Bay Area sports teams. So my app knows that. I've at some point probably enabled location services. It knows that I'm generally around the Baltimore, Md., area most of the time.
Dave Bittner: [00:16:16] Right.
Ben Yelin: [00:16:16] That's very valuable to, say, StubHub, if they want to advertise that my teams are coming to this area and you should buy tickets. And so it happens on all different types of applications. So it is admirable on the part of Apple to at least give the user some notification of which apps are using their location and how closely they're tracking you. And they actually show all of the dots in which you've pinged your location pursuant to that one application.
Dave Bittner: [00:16:44] Yeah.
Ben Yelin: [00:16:44] And it was a lot of dots.
Dave Bittner: [00:16:47] Now, you and I have wondered about would the day come when a legislator had their personal travels revealed publicly in some sort of way that was not to their liking? And would that be the push...
Ben Yelin: [00:17:01] They were, quote, "walking the Appalachian Trail," if you will.
Dave Bittner: [00:17:04] (Laughter) Walking the Appalachian - that's right. That's right. Could that be the catalyst for us seeing some sort of legislation coming to put more restrictions on this kind of thing? Are there any rumblings of that sort of thing? Has this story gotten that sort of attention?
Ben Yelin: [00:17:21] The story has gotten that attention. I think it would probably be better politically for it not to come from a lawmaker who had been subject to this tracking. And, you know, they realized that he or she had had an extramarital affair or, you know, was going into a well-known drug-selling spot or something like that.
Dave Bittner: [00:17:37] Right, right.
Ben Yelin: [00:17:37] Because this was such a high-profile article, I do think we're going to see a renewed effort for federal data privacy laws. And, of course, that effort existed anyway. But this really just sort of put it into focus. And it put it into focus in the political realm. I mean, one of the people they tracked was - they found the device of a Secret Service agent who was accompanying President Trump as he went on a trip to Florida to the Mar-a-Lago resort and on a diplomatic trip with the Japanese prime minister.
Dave Bittner: [00:18:04] Right.
Ben Yelin: [00:18:05] And they followed around the Secret Service agent who was with the president. And they were able to track the president's movement to a square foot or whatever.
Dave Bittner: [00:18:14] Yeah, on a precisely...
Ben Yelin: [00:18:15] Very precisely on the day of travel. And that...
Dave Bittner: [00:18:18] And these security...
Ben Yelin: [00:18:19] Right.
Dave Bittner: [00:18:19] ...Considerations.
Ben Yelin: [00:18:19] That brings up significant national security concerns.
Dave Bittner: [00:18:22] Yeah.
Ben Yelin: [00:18:23] There - as we've talked about many times in this podcast, there are no federal data privacy laws. This practice is largely unregulated - almost completely unregulated, really. I mean, there are some requirements on what companies have to say in their privacy policies. But there's nothing illegal about what they're doing. They're generally bound by their own policies and procedures. And sometimes, the industry will come up with ethical guidelines. But that's really what is protecting us. California has stepped in with the CCPA for Californians that potentially would allow you to request that this data be deleted once it's collected. It would give you that potential recourse.
Dave Bittner: [00:19:01] Right.
Ben Yelin: [00:19:02] For the rest of us who don't live in California, it's going to be very difficult. We're going to have to demand some sort of policy changes if we want to see this practice discontinued. And it's not obvious to me that most people do or should want this type of pervasive smartphone tracking to stop, just because there are a lot of side benefits to it. It's all a matter of convenience. Our smartphone knowing our location and personal preferences really, probably improves our quality of life...
Dave Bittner: [00:19:32] Yeah.
Ben Yelin: [00:19:32] ...In a number of different ways. It's nice that the Weather Channel knows where I am so that it can give me real-time radar and forecasts...
Dave Bittner: [00:19:38] Right.
Ben Yelin: [00:19:38] ...And can send me an alert if there's a tornado warning. Those things are really beneficial to me. And we couldn't do that if we weren't constantly sharing our location.
Dave Bittner: [00:19:46] Yeah.
Ben Yelin: [00:19:47] So while this is eye-opening, I don't think it, per se, means that this practice is going to stop.
Dave Bittner: [00:19:52] Well, it's an excellent piece, not only for the content but for the design. It's an interactive piece. So we'll have a link for it in the show notes - highly recommended. It's one of those pieces that's definitely worth your time. So do check it out.
Ben Yelin: [00:20:05] Absolutely. And another thing I'll mention - we'll have it linked. One of the tabs on that page is basically how to prevent this from happening to your device. So the title of one of the tabs is "Freaked Out? 3 Steps to Protect Your Phone..."
Dave Bittner: [00:20:20] Yeah.
Ben Yelin: [00:20:20] ...Which would limit the collection of this location data.
Dave Bittner: [00:20:23] Worth checking out. It's time to move on to our Listener on the Line.
[00:20:26] (SOUNDBITE OF PHONE RINGING)
Dave Bittner: [00:20:31] Our listener on the line this week comes from not too far from us - Fairfax, Va. And here she is.
Julie: [00:20:37] This is Julie (ph) from Fairfax, Va. I have a question about the sharing of private or intimate photos online. Is there a legal distinction between if I share a photo with someone and they then share that photo versus if they take that photo from me without my knowledge and then share it? Thanks.
Dave Bittner: [00:20:56] All right. Ben, that's an interesting question and some nuance there, I suppose. What do we need to know about this?
Ben Yelin: [00:21:02] So a very good question. Thank you, Julie. There is a legal distinction between you voluntarily sharing the photo and somebody taking a photo without your knowledge or your consent. And this gets back to what we've talked about with the third-party doctrine. If you volunteer anything to somebody else, whether it be a photograph, even, you know, just your own words, if you say something incriminating to somebody else, you don't have a reasonable expectation of privacy in what that person does with that information. We used to see a lot of court cases about this as it related to confidential informants.
Ben Yelin: [00:21:41] So one of the main cases came from the Jimmy Hoffa era - Hoffa v. United States. The government had placed a confidential informant. Hoffa thought he was talking to one of his co-conspirators. It was really somebody working for the FBI. And Hoffa's argument was, I didn't mean to share this with the government; I meant to share it with this one person. So I should have some sort of constitutional protection. And what the court has said is that's not the way it works. If you really wanted to keep something private, you wouldn't tell it to somebody else, you know?
Dave Bittner: [00:22:11] Right.
Ben Yelin: [00:22:12] The court's not going to go through the process of tracking where a certain piece of information came from once you, the holder of that information, you know, the person who owns that photograph, has released it.
Dave Bittner: [00:22:24] I guess the practical implications of this are, say, we have a couple who are dating and one of them shares an intimate photo with the other one and the relationship ends. And then the person who has that photo shares it with friends or shares it online or whatever. If the person who's in the photo calls the police or the sheriff and says, oh, my gosh, you know, this is terrible. This photo of me is out there. Can you do anything? The police really aren't going to be able to do anything because you shared the photo.
Ben Yelin: [00:22:58] That photo was shared voluntarily.
Dave Bittner: [00:22:59] Right.
Ben Yelin: [00:22:59] Yeah. I mean, that's why everybody has to be very careful about what they share online and in all forums. And the practical implication of this legal doctrine is that you shouldn't trust anybody else with your information...
Dave Bittner: [00:23:11] (Laughter) Trust no one.
Ben Yelin: [00:23:11] ...Unless you - yeah, unless you - you know, maybe it's your wife or husband you've been with for a while.
Dave Bittner: [00:23:17] Right, right.
Ben Yelin: [00:23:17] Sure. You know, if you think this relationship has a chance of not lasting, yeah.
Dave Bittner: [00:23:22] Well - yeah. And you don't necessarily have a legal backstop. I guess that's the lesson here, right? That...
Ben Yelin: [00:23:28] No, that's something that's kind of controversial and unfortunate. Once electronic images are shared on the internet, there's very little somebody can do to get them taken down, especially if those have been shared publicly. Now, there are potential causes of action if there's some sort of, like, defamation or something. But beyond that, it's extremely difficult.
Dave Bittner: [00:23:51] Yeah.
Ben Yelin: [00:23:52] And that's why people need to be extra cautious about what they share online.
Dave Bittner: [00:23:57] Right, right. All right. Well, again, thank you, Julie (ph), for sending in the question. It's a good one. We'd love to hear from you. Our "Caveat" call-in number is 410-618-3720. That's 410-618-3720. If you want to call and leave us a message with your question, perhaps we will use it on the air. Also, you can send us an audio file. You can send that to firstname.lastname@example.org. Coming up next, my interview with Michelle Dennedy. She is the CEO of DrumWave. And prior to that, she was chief privacy officer at Cisco - really interesting conversation with her. That's up next.
Dave Bittner: [00:24:35] But first, a word from our sponsors. And now we return to our sponsor's point about policy. KnowBe4 will tell you that where there are humans cooperating to get work done, there you need a common set of ground rules to ensure that the mission is accomplished but in the right way. That's the role of policy. KnowBe4's deep understanding of the human dimension of security can help you develop the right policies and help you train your people to follow them. But there's always a question of showing that your policies are not only sound but that they're also distributed, posted and implemented. That's where the Policy Management module of their KCM platform comes in. It will enable your organization to automate its policy management workflows in a way that's clear, consistent and effective. Not only that, KCM does the job at half the cost in half the time. It's your policy, after all. Implement it in a user-friendly, frictionless way.
Dave Bittner: [00:25:32] Ben, I recently had the pleasure of speaking with Michelle Dennedy. She is the CEO of DrumWave. I have interviewed her several times back when she was the chief privacy officer at Cisco. She's got some really interesting insights. She spent time as a lawyer in a previous career as well, so really up on the legal and policy issues here. Here's my conversation with Michelle Dennedy.
Michelle Dennedy: [00:25:54] So I've kind of done a tour of duty, if you will, around the whole Valley of the Kings here. Sun Microsystems bought by Oracle, McAfee, Intel and Cisco. And I think that all of them are very, very, very similar in culture, in point of view. So that's kind of my insider view. But I'll also say that in the last 25 years, I have worked with literally thousands of customers. And so I think I can divide the world into thirds.
Michelle Dennedy: [00:26:25] One-third is just winging it - stone-cold crazy, doing nothing. I've got nothing to hide. And I'm praying to God I'm not going to get any sort of fine or legislation against me. So those guys I kind of discount because you really can't help them out of the well, right? You can help their customers educate themselves to protect themselves as much as possible. But you really don't know. There are hospitals out there with no privacy program. So it's hard to say who's who in the zoo. So that's the one that's kind of - it doesn't keep me up at night because I would never sleep. But I ignore that bunch.
Michelle Dennedy: [00:26:56] So let's talk about the remaining two-thirds that are at least attempting, in their own language, to make a difference. One of those groups of people really wants to be compliant. I'll put a capital C and a hard C on it - compliant. They look to things like CCPA in California. And they try to figure out, what in the world is a sale? And they try as hard as they can to make that definition that applies to them as tiny as possible so that they can do whatever else they want. Those departments are very typically led out of the general counsel's office. That's not to say that there's anything wrong with privacy in the GC's world.
Michelle Dennedy: [00:27:39] But you typically see a group of rather underfunded folks, probably pretty vocal in the political realm, but internally, they're not really looking and talking about systems. Those voices are the ones that are really trying to do the best they can for their, quote-unquote, "shareholders" to avoid fines. And it's not easy work. It's one way of doing it, right? It's like, do whatever you do, and the, quote-unquote, "market" will solve for X. There's another group that's about the other - the remaining third, which is really taking a proactive stance. As you can imagine - for years we've known each other - I sort of tend to be in that camp of...
Michelle Dennedy: [00:28:19] Maybe more than sort of - where, you know, I'm a believer that - you know, I love the term privacy by design. And Ann Cavoukian and I have been friends for decades. It's a policy term. And so very, very early on, I was very fascinated, as, you know, someone with a legal background myself, how do you convert policy into proactivity? And so that remaining third does things like having their identity management staff working at least at an equal level with their privacy officer. Even the least technical among us in that proactive camp understands, what is the risk posture that we are willing to undertake for security measures? That person has a door that's hopefully open when the typical CISO says, oh, let's do badge sniffing to make sure that no bad guys are here, or the HR person who's like, let's watch all of their video recordings to test for employee sentiment to prevent churn.
Michelle Dennedy: [00:29:30] That person proactively talking to their privacy person will stop and have pause and say, for example, the badge sniffing, that one I can't get in trouble with because every single company I've ever worked for has had some clever bunny that's come in and said, we're going to badge sniff to make sure we're using real estate appropriately, and we want to see when our employees are coming to work or, you know, John Smith is not performing. Let's sniff his badge and see when he turns up. Well, I have never had a conversation with that kind of supposed business need that didn't end in, who's the manager? Does that person have specific outcomes and known requirements, OKRs? Do they know what they're expected to be doing? Do we know what this team is expected to be doing? Do we know whether we're under some sort of economic duress? Is that manager showing up in comfortable shoes, walking around and offering free pizza? If you haven't done any of those things, you don't need to sniff badges. If you've done all of those things and there's a credible threat and you're in some sort of a place where you've got some sort of political activism or some sort of a known threat because there's some sort of an active shooter, then you've had a different conversation.
Michelle Dennedy: [00:30:46] So it's a bit of a drawn-out example, because people think that technology is going to end it all and be the solution to it all, and the answer is neither one. You have to get the human in the middle, and those are the privacy programs that go a little slower, but they actually change things. They engage in privacy engineering of the way systems are organized. I've reported to pretty much everyone except for marketing, right? I've done People and Places under Crawford Beveridge, if you remember him back in the old days at Sun Microsystems. I reported into public policy. I've been under the CFO. I've been under the COO. And I've done a couple tours of duty under the general counsel - all in the chief privacy officer role, all in an effort to experiment with how perching in either of those verticals - any of those verticals - actually makes an impact on the way that employees and customers are interacting with data products, and what is the data that we know that we're doing for the business or about the business?
Michelle Dennedy: [00:31:52] And so this is where I've kind of shifted my thinking from compliance to governance to really having a balance of what I call the magic equation, which is DV greater than DR equals success - or data value should be greater than the data risk you're taking to equal success. And so that leads me to why I started DrumWave - started at DrumWave, I should say. I leapt into an existing startup.
Dave Bittner: [00:32:18] Let's get some details about your efforts there. What are you setting out to do with DrumWave?
Michelle Dennedy: [00:32:23] I want to bridge the gap of what we were just talking about - the compliance shops versus the privacy engineering-based shops within big and small organizations, for the customers that really want to look at their data and treat it as an asset. So back in 2014, we published "The Privacy Engineer's Manifesto," and the thinnest chapter that I thought was most bankrupt of information is Chapter 13, which is why I put it there - my own little tongue-in-cheek joke.
Michelle Dennedy: [00:32:54] Because it was the finance chapter. So back in 1965, Grace Hopper said that one day, information will be on the corporate balance sheet, for in many cases - or most cases - information is more valuable than the hardware that runs it. In 1965, when a server or a mainframe cost millions of dollars, she foresaw that information would one day be more important than that big hunk of metal. And I've been chasing that thought for decades now. And it ties the knot between do nothing, which is the old way we used to deal with currency, right?
Michelle Dennedy: [00:33:33] It's like, you sold milk on the doorstep, and whoever - whichever grocer had more money at the end of the month was the winner. You don't really look at your assets. You look at the result at the end of a year and you say, do I have any left over or don't I? And if I do, then I'm a rich guy. Then we moved into investments and really understanding that if I spent that little left over on a new cow, next year I could actually sell more milk. And so it could be that the person with no currency at the end of the month was actually a better partner and business partner. And then the bottle-makers got into it and said, what if we put this milk in bottles instead of buckets? And so on and so on and so on.
Michelle Dennedy: [00:34:14] And so the currency market grew by simply doing what? Having data about resources - like inventory, like cash on hand, like what do currency markets look like? And then we look at what happened after the Great Depression. You know, we did that sort of investing. And what went wrong in the world? Was it too much gambling? Was it too much risk-taking? Was it too much fraud? Well, the '33 and '34 Acts in the U.S., the SEC laws, said, wait a minute, let's have a law about data. They didn't say that at the time (laughter). I'm putting a modern-lens spin on it. But the 1933 and 1934 Acts talk about what is appropriate for investors - what they must disclose.
Michelle Dennedy: [00:35:02] Now what we do every single quarter for every publicly held company is to disclose some transparency, some data about our business operations, including - I have a chair. I've purchased this chair. And it will eventually become worn and I'll have to replace it. So I'm allowed on my balance sheet to degrade the value in currency terms of that chair and its depreciation. And we all look at these balance sheets and see how healthy a company is, and we were really looking at degrading chairs and buildings and things in addition to inventory, right? We've finally come through the stage where we've been talking about the digital transformation for several years now, if not decades.
Michelle Dennedy: [00:35:47] So what does it mean to be digital? What does it mean to have an interconnection? What does it mean to do business online? I mean, email as a business going concern is something that happened not just in my lifetime, but after I was in graduate school. So we've got that now. We've got the ability to talk to almost anywhere and anyone on the planet. And so in that digital transformation, are we accounting for - do we have transparency rules for - how we deal with and govern data? If we don't, then we're complying backwards with privacy laws that are on the books that are trying to anticipate the future. But we're not really governing our data assets. We're not really building privacy products that the market is really hungry for. We're not even really measuring what we're storing and why we're storing it and whether we're curating it.
Michelle Dennedy: [00:36:39] The future will be accounting for digital transformation. You will have better privacy if you've mapped your data and mapping will look like GPS. It will not look like a Thomas Guide tucked into your trunk that you never look at. So, you know, my engineering team may get angry with me, but we're building the GPS for data. We're building something where any businessperson can visualize, interpret and manipulate data that they are allowed to have access to so that they're, first, driving the business - the data for the business. And then you'll be able to actually curate your data so that you can share it in an exchange with other data partners. So when we teach the world to data, we will get to the digital transformation. So all of that gear, all that hardware that we've put so much money into, all of the cybersecurity in protecting the integrity and confidentiality and making sure that it's available to the right person at the right time, the big question that I've been trying to answer for decades is why? What's the point? And I think the point is - goes back to Grace Hopper. Data is on the balance sheet, and we just haven't acknowledged it.
Dave Bittner: [00:37:53] Is this like that moment where it was available to people to be able to see things at altitude? In other words, you and I take for granted satellite imagery and aerial imagery. If we want to look at a picture of our town, our city, our nation, our hemisphere, that's at our fingertips. But there was a time when someone could only imagine that. Are we at that transitional point with how we're able to visualize something that, to me, seems kind of ethereal, which is having this mental model of all of my data?
Michelle Dennedy: [00:38:27] We're absolutely - and it's a great analogy. So you're exactly, exactly right. We understand now how to exchange, store and manage data and keep the power on, keeping our disks spinning. But have we really taken that satellite view to say, where are our data assets? Where are there storms brewing? What should I avoid? What's the quickest route? And should we fly instead of sail? All of those questions, I think, start to create an even bigger marketplace when you look at the landscape of digital transformation.
Dave Bittner: [00:39:03] How do you suppose our policymakers are positioned these days to be able to handle these issues?
Michelle Dennedy: [00:39:08] I think that they are mostly elephant masseuses (laughter). I think that some of them have a piece of the tail. Some are touching the tusk. Some are trying to figure out the wrinkly skin. And not that many of them are really doing a multistakeholder approach that says, what's possible? What are we really fearful of? What is the data value? I think we're all really hyped up and pretty aware of data risk. I testified before the U.S. Senate a few weeks ago about the question of data ownership. So there's a number of senators out there in the U.S. who have proposed bills about data ownership. And one of the senators sort of made me giggle - or it would have, if it weren't for the fact that he's actually a senator who can actually write a bill about this thing.
Michelle Dennedy: [00:39:57] He said, I can see a cabbage, and I can tax a cabbage, and I can ship a cabbage, and I can make laws of ownership about cabbages, and data is a cabbage. And he was very adamant about it, then sort of stormed off in the middle of the hearing to do whatever harrumphing elsewhere he had to do on the Hill that day. I was sort of sitting there musing, thinking his analogy is so perfect for why data ownership is so tricky because if you think about a cabbage, if I have a little victory garden in my backyard, it's a miracle that anything grows. So if there was a cabbage that was actually large enough to be eaten and ripe at the right time and I actually wasn't traveling and I picked it, that would be the most coveted thing that I ever would have, right? That cabbage is worth, like, whatever $150 it cost me to buy the Miracle-Gro and the dirt and whatever to have my little bougie garden.
Michelle Dennedy: [00:40:50] If instead - so, you know, that cabbage is worth $150 U.S. If instead I'm an industrial farmer, I have millions of cabbages, and I put them in big bins. And the ones that are at the top go to Whole Foods market; the ones at the bottom probably are pig feed, right? So the - literally, the weight of data in the storage container itself causes whatever's at the bottom of it to probably not be the same quality as the top. And so you look at - if we decide to meter and own data like a cabbage, we will simply reflect the widening disparity between haves and have-nots. The people with the most rights can pay the most, and the people with the least get stuck with the pig feed. I don't think that's the answer.
Michelle Dennedy: [00:41:33] And of course, then you've got to look at CCPA, right? So California - I cast such shade, and I'm probably going to draw fire at some point (laughter). But, you know, you can't put something on the ballot and say, oh, my God, like, privacy is so awesome. Do you think so? Check box yes. And of course that thing was going to pass because who would say no to that?
Dave Bittner: [00:41:56] Right.
Michelle Dennedy: [00:41:56] I wouldn't say, no, privacy is bad. So we have this banged-up, sad, little thing in California that's going to cost a lot of money. And unfortunately, the people who are compliance people will get all the budget for a law like California's. It says, put a button on your website. It says, make disclosures about what's sale and not sale, without defining what sale really is. It really takes away the protections of infrastructure that are slower, that are quieter, that are more important to actual consumers, which is why I give it as much snark as I do. I'm not against a bill that is comprehensive, that has teeth. But I am against one that's made without any stakeholder input, like that bill.
Dave Bittner: [00:42:40] All right - interesting conversation with Michelle Dennedy. Ben, what do you think?
Ben Yelin: [00:42:43] Yeah. I mean, she's very interesting, has a very interesting resume and a wealth of experience, also comes up with some of the best metaphors we've ever heard on the "Caveat" podcast, including the elephant masseuse, which I will be using from now on.
Dave Bittner: [00:42:58] (Laughter) There you go.
Ben Yelin: [00:43:00] Yeah, so very useful for that. Yeah. I mean, I think a couple things stuck out to me - the focus on getting more stakeholder involvement, especially as it pertains to legislation. She's very critical of the CCPA for what, in her view, is haphazard drafting in order to please a constituency that isn't as well-versed in data privacy as it could be, and that applies to federal legislators as well. Some of them might be experts on, you know, various component parts of this issue, but there are very few who have the same sort of comprehensive view that somebody like Michelle does, having been in the industry.
Ben Yelin: [00:43:38] The other thing that stuck out to me at the beginning is just her sense of optimism on all of this, which, you know, I think it's something all of us could use more of. I mean, even though this is a time of great challenge, it's also a time of great opportunity because those of us, as she said, who have not been in this world, who are not techie nerds, now are feeling the impact of data and feeling the impact of the lack of data privacy, data breaches, etc. And that creates an opening to actually enact real, constructive change. So I thought that part was really interesting as well.
Dave Bittner: [00:44:15] Yeah. I have to say, I always enjoy my conversations with Michelle and appreciate that she takes the time to share her time with us. So thanks very much to her.
Ben Yelin: [00:44:25] Absolutely. And last thing I'll say is that she talked about this as a multidisciplinary issue. I think that runs through pretty much everything we talk about. I mean, it is not just a political issue; there's social psychology involved. It's obviously a technical issue as well. So our solutions have to be multidisciplinary as well.
Dave Bittner: [00:44:44] Yeah.
Ben Yelin: [00:44:44] But, yeah, I agree. It was very valuable to hear from her.
Dave Bittner: [00:44:47] Well, thanks to Michelle Dennedy for joining us. We want to thank all of you for listening. That is our show.
Dave Bittner: [00:44:52] Of course, we want to thank our sponsors, KnowBe4. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. You can request a demo and see how you can get audits done at half the cost in half the time.
Dave Bittner: [00:45:10] Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com.
Dave Bittner: [00:45:18] The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: [00:45:33] And I'm Ben Yelin.
Dave Bittner: [00:45:34] Thanks for listening.
Copyright © 2020 CyberWire, Inc. All rights reserved. Transcripts are created by the CyberWire Editorial staff. Accuracy may vary. Transcripts can be updated or revised in the future. The authoritative record of this program is the audio record.
KnowBe4 is the world’s largest security awareness training and simulated phishing platform that helps you manage the ongoing problem of social engineering. Their new school security awareness training platform is user-friendly and intuitive. It was built to scale for busy IT pros that have 16 other fires to put out. Learn more at KnowBe4.com.