Making the business case for privacy. — Special Edition
Dave Bittner: [00:00:03] In this CyberWire special edition, my guest is Cisco's chief privacy officer, Michelle Dennedy. We discuss what exactly a chief privacy officer does at a global organization like Cisco, why she thinks we're in the early stages of a privacy revolution, why we all tend to shake our heads cynically when a company claims "your privacy is important to us," and how - maybe, just maybe - respecting the privacy of your users and customers could be a competitive advantage. Stay with us.
Dave Bittner: [00:00:35] Time to take a moment to thank our sponsor, Cylance. Are you looking for something beyond legacy security approaches? Of course you are. So you're probably interested in something that protects you at machine speed and that recognizes malware for what it is - no matter how the bad guys have tweaked the binaries or cloaked their malice in the appearance of innocence. Cylance knows malware by its DNA. Their solution scales easily, and it protects your network with minimal updates, less burden on your system resources and limited impact on your network and your users. Find out how Cylance is revolutionizing security with artificial intelligence and machine learning. It may be artificial intelligence, but it's real protection. Visit cylance.com to learn more about the next generation of anti-malware. Cylance - artificial intelligence, real threat prevention. And we thank Cylance for sponsoring our show.
Dave Bittner: [00:01:27] Michelle Dennedy, it's great to catch up with you again. You are the chief privacy officer at Cisco. Let's just start with that. What does that mean your responsibilities are within the organization?
Michelle Dennedy: [00:01:38] Yeah, it's a great question actually because the profession of privacy is, you know, not quite 20 years old. So most of the people that you meet have invented that position for themselves. And I'm certainly one of those people. And so at Cisco, it really is a strategic position. So I look out at our company internally. I look at customers. I look at public policy. And I'm really focused on two areas. The first is what I call privacy engineering. And we'll get into that a bit more, I'm guessing, while we're chatting along here. That is implementing the aspiration of privacy by design - now a legal requirement in many places. The toolset that you create to do that, and the outcomes, the patterns, the models and the architectures that you build, are what I'm looking at as privacy engineering.
Michelle Dennedy: [00:02:33] And then the other side is a different but very compatible function in my team - that's really data valuation overall. How do we get data on the balance sheet? Are we looking at what it would be to invest in privacy as a positive asset, as well as preventing the harms and risks that come along with compliance, now that there is a panoply of global regulations that really manage and limit the use of information that is associated with people? So it's a small little task, and we love doing it.
Dave Bittner: [00:03:10] Now, can you sort of contrast what your definition of privacy within the organization is versus the layman's idea - what, you know, folks walking around on the street think of it?
Michelle Dennedy: [00:03:22] Yeah. It's fascinating because of something my dear friend Eric Bonabeau - who's, you know, a pioneer in AI - said. He didn't want to write the foreword for our book. He did eventually. But at first, I said, why won't you write the foreword to "The Privacy Engineer's Manifesto"? And he said, privacy has a branding problem. And it's always stuck with me that he said that.
Michelle Dennedy: [00:03:47] And I think the branding problem is this. I think laymen, particularly in the U.S. and other parts of North America, think about privacy as cloaking or secrecy or failing to disclose - that the heart of privacy is non-disclosure. And to me, that's such limited thinking, because the way I consider privacy - and I think the more European and even Eastern way of thinking - is that privacy and data protection are really synonymous. Privacy is the goal, but privacy is the authorized processing of personally identifiable information according to fair, moral, legal and ethical principles.
Michelle Dennedy: [00:04:33] So look at it through that functional lens, rather than the one-trick pony of hiding and scarcity, and instead think about it as, how do we each tell our own stories - when and to whom and why? And when you look at it through that lens, you think about the tools. You know, what's a privacy girl doing in the most important networking company on the planet? Well, we aren't in the business of not connecting people. We are in the business of connecting people. And those people are sometimes human people who want to have conversations over distances.
Michelle Dennedy: [00:05:10] So you and I are not sitting together in the same room, and we're able to clearly and competently communicate. That's a connection. That's not hiding your voice from my voice. We're doing this because our intention is to hopefully incite interesting conversations out into the world. Well, that's not really secrecy or hiding of privacy. Does that mean that you and I, Dave, have given up on our privacy? Hello. It means that we are really interested in having a story told, a conversation told, with integrity, with purpose, over tools that are sustainable and stable. And so that's what privacy really is to me at its heart - the authorized processing of personally identifiable information according to fair, legal, moral and ethical principles.
Dave Bittner: [00:05:57] And where do you think we stand right now when it comes to that?
Michelle Dennedy: [00:06:01] I think we're in the beginning. I think if you want to be engaged in this profession - and I use profession with giant virtual air quotes - there are technical widgets that have yet to be designed to help control and contain and maintain individuals' privacy. I think that on a corporate level, it's very difficult right now to understand what to measure, and how to measure it, to capture the economic might of information that's been well-tamed and well-cared-for.
Michelle Dennedy: [00:06:36] I think that if you're a lawyer, there is a lot of room to interpret and reinterpret the schemas that protect information - some quite old, some brand-new. If you are an ethicist, there's never been a better time to think about what artificial intelligence emerging on the scene and automated decision-making mean for policing. What kinds of governments, at what levels, should know and understand how much about individual citizens? What should parents know about their children? There are ethical and moral, legal, technical and marketing issues to be conquered here. So I think we're looking at one of the greenest of green fields to hit the industry since the internet actually became a functional entity.
Dave Bittner: [00:07:27] Do you think there's a little bit of a natural tension there? Because when I think of privacy in the tech world, it seems to me like it was born from the legal side of the house as opposed to the IT side of the house. And I think about people who sort of have a head start on us - I think about things like HIPAA, which has been around for a while and has had time to develop and mature. How do we get the legal side of the house and the IT side of the house both speaking the same language and coming at this in a compatible way?
Michelle Dennedy: [00:08:02] Yeah. There is a lot of tension there. And it's fascinating when you think about the relationship between technology and law. And law is - it's not synonymous with ethics and morality. Law typically chases, right? Either there's a present harm - people are murdering each other or stealing each other's stuff - so we create laws or guidelines or rules, and we give them to something or someone that can enforce them, and that becomes law. Or it's a technology that gets out of hand first, and we go, oh, my gosh, you know, we've created email, and suddenly we have spam - this new cool thing is being used for denial-of-service attacks. So a law follows the proliferation of what that technology has enabled.
Michelle Dennedy: [00:08:55] So it's kind of a constant seesawing back and forth. And the other leg of this that I think is fascinating is this sense of ethics, if you will - and I don't think that right and wrong are the right flavors, but what are the requirements and specifications, to speak geek speak? What are the requirements for a system that communicates the written word so that letters aren't instantaneously generated and sent? What are the requirements if you say, oops, I sent an email I didn't want to - can I recall it? Is that a thing? Can I run to the virtual mailbox and dig it out again?
Michelle Dennedy: [00:09:36] These are requirements that serve very human needs, and they aren't always reflected in the way we are developing either our laws or our systems. So I think that tension is met - processed - with practice, with transparency and with things like this podcast that we're having. How do we have these discussions so that we take a step back and ask ourselves, are we developing what we actually want?
Dave Bittner: [00:10:03] You know, I think for many consumers, certainly, we associate privacy with being hit by a EULA when we sign up for some new piece of software or some service - these impossibly long pages and pages of legalese. And I think that can generate a natural sort of cynicism: we're just going to click through; there's no possible way I could understand this privacy policy. And yet we have the companies all saying to us, your privacy is important to us.
Michelle Dennedy: [00:10:34] (Laughter) Do we believe that statement anymore?
Dave Bittner: [00:10:36] Exactly. Do we believe it? How do they prove it? You know, I think about something like the UL listing for electrical devices - this has been certified by Underwriters Laboratories, an independent organization, to meet a certain level of safety. Do we need to have some sort of standard, some sort of norm, a shortcut for the consumer that says, yes, you know, we've taken the time to read this EULA, and this is where it stands - here's the short version of what you can expect from this?
Michelle Dennedy: [00:11:10] Yeah. And I think UL has always been one of my favorites because there's so much complexity that went into setting a standard for an electrical system. You know, first there was the war over AC versus DC - which is safer? And there were all these showmanship things, you know, electrocuting elephants and killing prisoners and things - right? - to show the might of the mighty electron. And now we've just come to look at those two little initials on a piece of tape around a cord, and that is enough for all that complexity. So I think in the beginning, there was actually a grade.
Michelle Dennedy: [00:11:50] It brings to mind a sign - if you've ever been to London and not been to the tech museum, you've got to go. You know, make time and go. It's right near Sloane Square. There's a sign from an old-fashioned electrical system - when you walked into a house, there was a warning against lighting a match and trying to illuminate the lamps with it, because people would come in and expect the electric candle to be an actual candle. And they'd be like, where do I put the flame? There's a certain teaching that actually went down before we just naturally walked into a room and flipped the switch.
Dave Bittner: [00:12:30] Right.
Michelle Dennedy: [00:12:30] So I think we are in the earliest days. In an era where technological innovation happens, you know, at hyperloop speed, we are trying to train users using these clunky devices of contracts. It's not, don't light a match - you just touch the switch. Instead, we're saying, set your settings to do not share. And - this is the question of privacy by design - should there be a default that everything is off, so that even if you buy a communicative device, you have to work to get it to turn on, because the quote, unquote, "default" is don't do what you bought the feature to do in the first place?
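A minimal sketch of that default-off idea, as a hypothetical Python settings object - illustrative only, not Cisco code: every data-sharing capability ships disabled, and enabling one requires an explicit, recorded opt-in.

```python
# Hypothetical "privacy by design" defaults: every sharing feature ships off,
# and turning one on requires an explicit, auditable user action.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # All data-sharing capabilities default to off.
    share_location: bool = False
    share_usage_analytics: bool = False
    share_contacts: bool = False
    # Record of explicit opt-ins, so consent can be audited later.
    consent_log: list = field(default_factory=list)

    def opt_in(self, setting: str) -> None:
        """Enable a feature only through an explicit user action."""
        if setting == "consent_log" or not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)
        self.consent_log.append(f"user opted in to {setting}")

settings = PrivacySettings()       # everything off out of the box
settings.opt_in("share_location")  # the user must work to turn it on
```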
Michelle Dennedy: [00:13:17] So I think we're still in that early era. But, you know, I've always got to have an upside. If ever there's a downside, that means commerce to me; that means innovation. And I think right now, the EULA - the end-user license agreement - has to be rethought. The notion of opting in and opting out was something I railed about 20 years ago, because we actually have - in common law in particular, and even in civil law countries - the notion of informed consent.
Michelle Dennedy: [00:13:47] So you go to your doctor, and that transaction happens far before you walk in the door. You look that person up on the web, or your insurance plan has a listing, or there's a series of doctors' offices. There are all these signs and contextual cues that you're not walking into a sandwich shop - you're walking into a surgeon's office. Then there's the accreditation and licensing. Government has come in and forced this individual to take tests and attain a certain level of education and practice. And maybe it's not true - but there are very harsh penalties if someone sets up shop as a surgeon and, indeed, has no training or has lied about their background. There are sanctions and even imprisonment that would face that individual.
Michelle Dennedy: [00:14:37] So we don't have those cues online yet. And so we try to artificially shove them all into a 16-page, single-spaced document. And I've experimented over the years with things like a graphic novel. When I was at Intel, we had a little ninja graphic novel privacy policy for our consumer McAfee business. And it was fantastic. The questions that I got were much more intelligent. You know, where we processed data was listed in the cartoon, and the little ninja showed you where it was. And it was cute, and it was fun.
Michelle Dennedy: [00:15:13] But it was easy enough to do because - although it was complicated to get the security done - we had a very simple business on that side of it. So I think having iconography is important. One of the things that we've done at Cisco that I'm quite proud of, that looks like a small step but actually was a big step, is something that we call the data data sheets. There's a good reason I'm not in marketing, Dave.
(LAUGHTER)
Michelle Dennedy: [00:15:40] So, you know, we do the privacy impact assessment - listing where your assets are and what the controls are. And from that information, which comes directly from the folks developing these products and services, you want to create trust. And so we have a center helpfully called trust.cisco.com. And on that site, you can find what we call the data data sheet. If you want data about what data is in our products and services, you can actually look at that specific data sheet and have a conversation if you want to purchase those products or manage them, or figure out what your controls or risks are, up front.
Michelle Dennedy: [00:16:18] Even before you've walked in our door, we're hanging our credentials out on a sign for you to look at. You know, do we just say that we care about your privacy, or are we actually doing things - organizing and building - so that we can, indeed, respect that promise that we've made? Because we actually do care about your privacy, as it turns out.
Dave Bittner: [00:16:39] Do you think we've reached the point where caring about privacy has competitive advantages?
Michelle Dennedy: [00:16:45] You know, I sure hope so. I think - I'm going to give you kind of a hmm answer.
Dave Bittner: [00:16:53] OK.
Michelle Dennedy: [00:16:53] You know, my inclination is, of course it is, Dave. We've got 4 percent...
Dave Bittner: [00:16:59] (Laughter).
Michelle Dennedy: [00:16:59] ...Of global turnover for GDPR if you mess this up. We've got a $5 billion - with a B - service industry that built up around just getting to the front end of GDPR, the General Data Protection Regulation in the European Union - money that consultants, lawyers and other third-party professionals took out of our pockets. So 5 billion feels like a market to me.
Michelle Dennedy: [00:17:27] And the question is, do you want to spend that on consultants every year, or do you want to start building things that are automated and built into your systems? So I think the market is coming. I'm not sure that people would distinctly say being private or not private is a competitive advantage, because it's required under the law. I will say, for those with greater transparency, working privacy interfaces, and clarity about where and when and how data is processed - and we've actually done a study on this - the cleaner you get, the faster you get to closure.
Michelle Dennedy: [00:18:03] So 65 percent of respondents in our survey - and it's 3,000 people vetted across 20 different countries; we did a benchmark study, double-blind, so they don't know it's us - told us they're experiencing what I call data friction. And data friction, to me, is when somebody wants your stuff, and they say, oh, I have questions or problems or issues with privacy or security. That becomes Day Zero for me.
Michelle Dennedy: [00:18:32] And we found in our study that, on average globally, you can have as much as a 16.8-week delay. And 16 weeks is longer than a quarter. So you've got salespeople who are paid quarterly and goaled quarterly who now have to wait - after they get to yes - some 16 weeks to answer the questions and get through the issues of privacy and security. Where you've invested to at least a mid-level on a five-point privacy maturity scale, what we found was correlated - we don't have causation yet - with an 80 percent improvement on that number.
Michelle Dennedy: [00:19:13] So you can go from an average of 7.8 weeks - 16 is an outlier in certain jurisdictions, but 7.8 weeks is the average length of delay - down to about a 2.8-week delay. And so I call that a marketplace. We did that measurement two years in a row before we finally published the report, and we're working on the third year of data to see if that trend still exists. But I think where people recognize that reducing data friction is a way to accelerate your business - that investing in privacy and governance is not just a way to get out of fines, but a way to accelerate closure of services and products that are increasingly information-based - to me, that smells like a market.
Dave Bittner: [00:20:06] There is much more to my conversation with Michelle Dennedy, and you will find it on her podcast, which is called Privacy Sigma Riders. Search for it. Look it up. Check it out, and subscribe. That's Privacy Sigma Riders, where you can hear the second half of our conversation on privacy and cybersecurity. My thanks to Cisco's Michelle Dennedy for joining us and, of course, to our sponsor Cylance for making this CyberWire special edition possible. Find out how Cylance can help protect your systems through the use of artificial intelligence at cylance.com.
Dave Bittner: [00:20:39] The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our CyberWire editor is John Petrik, social media editor Jennifer Eiben, technical editor Chris Russell, executive editor Peter Kilpe, and I'm Dave Bittner. Thanks for listening.