Security Unlocked 11.10.21
Ep 51 | 11.10.21

When Privacy Meets Security

Transcript

Nic Fillingham: Hello, and welcome to "Security Unlocked," a podcast from Microsoft where we unlock insights from the latest in news and research from across Microsoft's security, engineering and operations teams. I'm Nic Fillingham.

Natalia Godyla: And I'm Natalia Godyla. In each episode, we'll discuss the latest stories from Microsoft Security, deep dive into the newest threat intel, research and data science... 

Nic Fillingham: ...And profile some of the fascinating people working on artificial intelligence in Microsoft Security. 

Natalia Godyla: And now let's unlock the pod. 

Nic Fillingham: Hello, the internet. Hello, listeners. Welcome to Episode 51 of "Security Unlocked." My name is Nic Fillingham. With me, as always, is Natalia Godyla. Hello, Natalia. 

Natalia Godyla: Hello, Nic. Excited to be here, as always, and excited for another special episode. 

Nic Fillingham: This is a special episode, indeed. We are taking the conversation that we had with today's guest, Whitney Merrill, who is the data protection officer and privacy counsel at Asana, as well as the founder of the Crypto and Privacy Village - we are taking the conversation that we had with her over on our YouTube channel, which is youtube.com/microsoftsecurity, as part of "The Security Show," and we are giving you an extended edit of that conversation. In that conversation, we spoke to Whitney for almost an hour, and we had to cut it down to about 15 minutes. But the beauty of this podcast medium is that we can actually bring you much more of the goodness that was recorded there. And it's a fantastic conversation. Whitney is a deep subject matter expert working at the forefront of the intersection of privacy and security. One of the first questions we ask Whitney is, what is the intersection of privacy and security? If I'm a security person listening to "Security Unlocked," why do I care about privacy? And my little tl;dr here is that you can't have privacy without security, and security without privacy is sort of a dystopian nightmare. So what do you think, Natalia? Did I summarize that OK? 

Natalia Godyla: You, in fact, did. And Whitney did a great job carrying that thread throughout the entire conversation. She really puts on her security hat and talks to security professionals about how privacy fits into their overall remit. She dives really deep into the nature of privacy in the cybersecurity space, touching on the different privacy threats and attack types that happen nowadays and how the regulatory environment is shifting to think about and react to privacy attacks. And of course, it wouldn't be a security conversation without touching on threat modeling. We talk about threat modeling privacy situations - you know, having that doomsday conversation as a security team to understand how privacy really impacts your business. So just a huge kudos to Whitney for being able to have such a security-minded conversation around privacy and really bringing it to that level for security professionals. 

Nic Fillingham: Whitney also has a great story to tell of how she fell in love with privacy and her experience, her journey on privacy and security and how that sort of personal experience has sort of shaped her vocation, which is great. It's a great perspective to get. I think everyone's going to really enjoy this episode. On with the pod? 

Natalia Godyla: On with the pod. 

Nic Fillingham: Today on the podcast, we have Whitney Merrill, data protection officer and privacy counsel at Asana. Whitney, thanks so much for joining us, and thanks for your time. 

Whitney Merrill: It's a pleasure to be here. Thank you so much for having me on this. 

Nic Fillingham: I'd love to start with an introduction. Whitney, could you please tell us about yourself? Tell us about your role at Asana. Who are you, and what do you do? 

Whitney Merrill: Hi. I'm Whitney. I am a privacy attorney and data protection officer at Asana. I have been doing privacy for almost 10 years at this point. I actually went to law school to go focus on technology and the law, and very early on in my legal career, I figured out that I absolutely love privacy and data security. I have been basically doing it since before, like, privacy was really a thing for attorneys to do. I mean, there were some people practicing it, but it really has changed a lot in the last few years. But after law school, I stayed and did my master's in computer science with a focus on computer security to kind of focus in and really learn about those issues. And then after that, I served the federal government as an attorney at the Federal Trade Commission, where I basically investigated companies for bad privacy and security practices. And then since then, I've been in-house at various companies - Electronic Arts, as well as a small fintech company. 

Nic Fillingham: Whitney, you say you fell in love with privacy. Tell us. Tell us about privacy. What do you love about privacy? 

Whitney Merrill: I think the thing I love about privacy is that it's always changing. You really have no idea what's going to come next, and you really are living within the gray. I once heard another attorney on a panel say, you will love privacy if you're comfortable not knowing the black-letter law, if you're comfortable not knowing exactly what the boundaries of the law say, you have to kind of navigate within that and really understand what that looks like. And I think that's exactly why I love it. I like that it's not just clearly what is legal and what isn't legal. It's ethics. It's morals. It's cultural values that all kind of come together to form what is privacy. And I like taking that into account in addition to the law. 

Natalia Godyla: You're also the founder of the Crypto and Privacy Village. Can you tell us, what happens at the Crypto and Privacy Village? 

Whitney Merrill: Yeah, great question. So I started the Crypto and Privacy Village as a way to procrastinate studying for the bar. 

(LAUGHTER) 

Whitney Merrill: I was studying for the bar, which for those who are not familiar with it, is an all-day, every-day exercise for about 2 1/2 months until you take a two or three-day exam. And so someone had reached out saying they wanted to do something at DEF CON. Would anyone be interested? And I kind of raised my hand and said, hey, yeah, let's put a community together. And I worked with a few other folks to kind of put together the first Crypto and Privacy Village at DEF CON. And at this point, I don't remember which DEF CON it was - maybe 22. 

Natalia Godyla: Uh-huh. 

Whitney Merrill: We got together and people showed up. Frankly, we thought maybe we'll have a space and no one will come. People came. Volunteers came. People started guarding the doors and, like, doing crowd control without ever being asked to do it. And it kind of just organically grew, 100% organically grew. And that's also why I love the hacker community so much - they just kind of chip in and build stuff. And so the Crypto and Privacy Village focuses on cryptography and privacy topics that touch the security community. We have very, very technical talks, from implementations of applied cryptography to breaking crypto implementations, as well as, you know, higher-level policy conversations about whether or not cryptography should be backdoored. And from there, it's kind of grown. And every year we show up at various conferences, and people also engage and show up. And it's been really fun. We also have a puzzle that we do every year that has been really engaging and really fun and has been given the honor of being a Black Badge contest at DEF CON, which to me is one of the most validating things - seeing something you build recognized by something so weighty as a Black Badge at DEF CON. 

Nic Fillingham: So Whitney, let's kick off some of these more formal questions. And I'd love to start with a big one. I'd love to get your perspective on the intersection of privacy and security. How do they come together and win? 

Whitney Merrill: Yeah, absolutely. The cool thing about privacy and security is you can't have one without the other, because security helps create privacy. Well, you can't have privacy without security. You could, I imagine, have security without privacy - it's just basically a surveillance state. But, like, having security is the only way that you're going to be able to have privacy. And because of that, it's very, very valuable and important. The other thing I'll say about the difference between privacy and security is that you don't necessarily need privacy to do security. I think they go hand in hand and work best if they are kind of marching as one. Security also creates tension with privacy. I think about this a lot when I'm working with security teams, and we talk about what kind of data you can collect and how long you should keep that data when it comes to security logs or audit logs. These things are really, really important to think about when it comes to privacy and security because, yes, I mean, we're seeing this now in the tensions between the EU and the United States. Our surveillance laws in the United States are for security purposes. And regardless of how you may feel about those underlying laws, they are the reason the EU feels very strongly that privacy isn't honored or respected here in the United States, and they've kind of started restricting the transfers of data. 

Natalia Godyla: So Whitney, why has it been so difficult to establish a common framework for privacy in comparison to how frameworks have been established for the cybersecurity industry? And if it's even possible to create one, how do you propose that we start as a community? 

Whitney Merrill: Great questions. So why has a common framework kind of not come about? I think, in part, it's because of different cultural norms and values. Really, how people approach the question comes down to how they view the problem. In the EU, privacy and data protection are a human right. Here in the U.S., I would argue that privacy is in the Bill of Rights through the various amendments that exist that kind of touch upon privacy. However, there are lots of people who argue the opposite and say that because it's not explicitly mentioned, it's not core to our constitutional values and rooted in our laws. And I disagree. But I think that plays in. The fact that it's not a given, that it's not a clear, defined thing that everyone agrees on, also affects the creation of those norms. So in developing a common framework, people are thinking about privacy from a human rights perspective in the EU, whereas in the United States, it's definitely much more reactive. It's something bad has happened, therefore we should fix the problem, not let's guarantee a set of rights that are already provided to individuals. 

Whitney Merrill: So with the current state of laws that we have now, or even pre-California Consumer Privacy Act, everything was very sectorial. We had HIPAA, the health care privacy law, which really only touches upon health care entities like doctors' offices, insurance companies and the service providers for those health care entities and insurers. There aren't general protections for this core set of health data that may be collected by - name a company. You have the Video Privacy Protection Act, which was a result of, like, video rental data being collected about individuals, and it was kind of a reaction. And then you have GLBA, the Gramm-Leach-Bliley Act, which is all about financial privacy protection. 

Whitney Merrill: So I think the fact that we've taken this very sectorial approach means even moving to a more common framework, something that may already exist in the EU, is much more difficult, because we have to decide what to do with the sectorial approach that we started with. Do we preempt it, which means we get rid of it and the new law applies, or do we create a more omnibus privacy law, federal or state, that kind of weaves into everything that isn't already covered? And that can be really, really, really confusing and really frustrating to end consumers, end users, because they don't know which law applies to them. 

Natalia Godyla: Between the two approaches that you've described, there's one that looks at privacy as being vertical specific. There has to be a uniqueness to the way that you shape the framework to adapt to that specific type of data and who's involved in that data. The second approach is more ubiquitous. It applies to many different industries. So from your perspective, do we need to customize the framework? 

Whitney Merrill: I think it makes a lot of sense to not treat all data as the same. I'd be remiss to say that your Social Security number is the same as your first name, or your first name and last name. Obviously, those things hold very different weight and carry different harms if leaked or used in a way that you don't consent to or that is not lawful. However, we also can't assume that the data we presume has value now is going to have the same value or cause the same harm 10 years from now. So I really think frameworks should not be super prescriptive but rather thoughtful about which pieces of data are more sensitive than others. 

Whitney Merrill: And I think there isn't a perfect way to go about this, but a really great example - and I was thinking about this the other day - is financial data. My credit card number, when leaked - there are umpteen data breach protection laws that came out in the United States after a series of credit card breaches. If there's a credit card breach, you have to notify the local regulator. You need to notify your users. You have to rotate the cards, et cetera. There is a procedure and an accepted practice as a result of those breaches, because you could tie very easily and logically the loss of a piece of financial data to actual harm to the user. It's an easy exercise. Credit card data leaked - person's credit card gets charged by a fraudster. Where we're at now is that these immutable characteristics and pieces of personal information - your name, your birthday, your Social Security number, your address - all of this data, some of which historically used to be public. Open the white pages and you could find John Smith's phone number and address, where they lived and potentially other relatives related to them. Those cultural norms have shifted over time. That isn't just assumed to be public by everyone anymore. People keep that information private to them. 

Whitney Merrill: The other thing is that it's not easy to change. If you asked me today whether I'd rather have my name, birthdate, phone number and email address breached or a credit card number, I'd choose the credit card number, because the credit card company will refund me. The number will rotate. It will be a minor inconvenience, but generally, that problem has been solved. If my email address, my phone number, maybe my address and my name are breached, now someone has a lot more information they can use to, like, spam me. They can use that to social engineer me. They can use that to gain a lot more information about me. And, yes, I can change my email address. Yes, I can change my phone number. But that's not easy, and there isn't a good, quick way to do that. So I think the value around a piece of data, and what security we then attach to it, changes over time with norms. So as we assign different values to different pieces of data, we need to think about whether a decision we make now to say something's not important might matter later, or whether it's important to somebody else. 

Nic Fillingham: So, Whitney, how should organizations approach privacy? Does privacy need to be ingrained in culture? And if it isn't ingrained in culture, how do you do that? Is there a different approach, or are there different approaches based on industry, upon organization size? Or is there a one-size-fits-all model that any organization can follow? 

Whitney Merrill: That's a great question. I think privacy is a set of cultural values for a company. There are the things you have to do and must do under the law, but I think that is the floor, not the ceiling. And I think the problem right now is that people see it as the ceiling, or the thing they're trying to achieve, instead of the basic norm. So when thinking about your privacy program, when building privacy at your company, I think it's really important to ask, what do we want to do? Who do we want to be? And you see companies going through this exercise in their products and in their design. But this exercise applies more broadly - not just to the product you're building and your users, but also to your employees, how you interact with them, and what type of balance you build there, because they're all so tightly ingrained together in creating that cultural norm within the company around privacy. 

Whitney Merrill: So if you say our company value is that we will take a global perspective when it comes to privacy, you're not thinking about it just from the lens of an American or someone living in the United States. You have to take into account other people's cultures. And I think if you set that, it also helps you grow faster in new regions. Maybe you're only operating in the United States as a company, but now you're going to go to Europe, where the expectation that a company follows strong privacy practices is not only the law - people won't do business with you; they don't want to do business with you if you're not taking privacy seriously. And so the earlier you start building that into your company culture and into the development of your product, the easier it's going to be in the long run. 

Whitney Merrill: Also, frankly, privacy is the future. This isn't going away. So you can either get on board and, like, embrace it, or it's going to be a lot of regulatory headache and nightmare and painful experiences until, eventually, you get to a point where you've found your equilibrium for where you want to be when it comes to privacy. 

Nic Fillingham: Whitney, how do you recommend organizations think about the triangulation of sorts between empowering users with choice, the goals of the business, and the legal and regulatory requirements? I mean, first of all, are those the three elements? And is it a triangle? Is it a pyramid? What's the relationship, and what guidance do you give to organizations to help them navigate it? 

Whitney Merrill: That's a great question. I think - so, yeah, the business objectives, the users' control over their data and then the law; I think those are the big three pieces. I'm sure there probably is another piece... 

Nic Fillingham: (Laughter). 

Whitney Merrill: ...That I can't think of, but, you know, for the sake of clarity, we'll go with those three. How do you balance them? This is why it is not as clear-cut as security: if you imagine an actual triangle and you move, like, a little marker within that triangle toward what is most important, where it lands is going to differ based on the organization and the business that they're in. 

Whitney Merrill: I can tell you right now, fintech companies, financial companies, they are required by law to collect your Social Security number and to share it in various circumstances in order to prevent money laundering and to prevent terrorism. That means that the law part of it is going to really outweigh any sort of user interest, any sort of business interest. That is going to be really, really important in comparison to maybe enterprise software - right? - B2B software. What the customer wants and the control that they have over their data is really, really going to matter. 

Whitney Merrill: And the more you can kind of facilitate your customers' needs, the more you're going to be able to take into account their privacy values and give them choice. Because in a lot of those contexts, too, in privacy, you're not the controller of that data, you're the processor, which are defined terms under GDPR. But basically, they mean what they sound like. A controller is the person or entity that controls that data and can make decisions over it. And the processor is the one who's doing the processing of the data by the instruction of the controller. So in that particular case, like, that controller, that business that you're contracting with, is really going to push that. 

Whitney Merrill: And obviously, if you have legal obligations or requirements under the law, those are going to outweigh the interests of the business, because you have to comply with tax laws, for example. So even if you want us to delete all your data, we have to keep certain data about you in order for the corporation to pay its taxes or to pay its employees, whatever it may be. 

Whitney Merrill: And then the final thing is, like, if you're in the B2C space, where you have users, this is the scenario where you're going to find the business interest is going to be the most important. At least that's how I'm thinking about it right now because the negotiating power of the end user feels much smaller than in the B2B context. And this is why you're seeing a lot of the harms regulation conversation around the B2C space, the business-to-consumer space, so anyone contracting or interacting directly to provide a consumer-facing product - that's the Facebooks, the Googles, et cetera. And there, you're seeing a lot of conversations about the privacy abuses because you feel like you have no power. You feel like you can't - yeah, I can stop being a user, but then I don't get a service. 

Whitney Merrill: And even if you're paying for it, you're only paying X. And then what's the value of that? And so you're seeing some models where you may have more rights and more control over your data if you pay the business. Now we're moving more towards control, so you might get more control in exchange for money. But from a privacy, like, rights perspective, that starts to feel problematic because that means only people who can pay get privacy rights or get a certain level of privacy rights. And, yeah, capitalism, but that doesn't seem right either. 

Whitney Merrill: And that's where the law has to come in, and that's where regulations have to come in. They have to force-correct that imbalance between the other two, because otherwise it's always going to be a pay-for-privacy scheme, which is going to disenfranchise poorer communities, which are traditionally, like, under-represented in these conversations. And that's wrong. That's wrong. That's why privacy being a human rights issue in the EU also affects that. It's so, so, so important to establish that if it's a human right, everyone gets it - not just whoever has money to pay for it. 

Nic Fillingham: Is there any research or data that shows a correlation between organizations that are navigating this triangle well - where they've established strong privacy values, they're providing choice to their customers and end users, they're staying within the regulatory framework - and where the outcome of this is actually a boon for business, a positive force? If you get privacy right, is it a vector for growth? 

Whitney Merrill: Yes, I absolutely think that privacy, if you really think about it as a brand narrative, is going to be big. And I think we're seeing that shift now. It's also going to really depend on the business model of the company and whether or not they want to make that part of their brand. If you sell targeted advertising as your underlying business model, it's going to be very difficult for you to say - especially as a public company - hey, investors, whose only interest in the company is making more money, we're going to take a more privacy-centric stance and make less money because we think privacy is right for our users. 

Natalia Godyla: So who should be educating the users, those who are reading the privacy policies from various companies and trying to make informed decisions about their data? Should regulators play a role in trying to establish some baseline of privacy awareness? What are your thoughts? 

Whitney Merrill: So I think, you know, whose responsibility should it be to educate about privacy - I really think it's a larger societal problem, but I think schools play a big role. Just like critical thinking, we should be teaching life skills and providing information that isn't just, can you solve this math problem or can you recall this fact? It's, do you understand the larger impact of the decisions that you're making and how they might affect you? Very early on in my life, in undergrad, I actually wrote my thesis on pornography on the internet. And something I've always thought about is this intersection of young individuals with devices - how they use them and how those devices collect information about them. And I think we need something like a D.A.R.E. - I'm sure a lot of people have really bad connotations with that - but, you know, a larger educational program to say, here's a safe way to use this. Here's where you need to be thoughtful. Here's how leaking where you go to school might put you in harm's way. And the younger you can do that, the more important it is going to be. 

Whitney Merrill: Frankly, I think that I am a privacy person because of my mom. She has been very thoughtful and really aware of who has access to data and what they can do with it from as young as I can remember, and so that was always ingrained in me. The first time I really started thinking about privacy was in high school, when I was horribly bullied on AIM Messenger, for any of the folks who remember that. And I wanted to find out who the person was. And I went to a police department and I said, you know, hey, if we have the IP address, can you help me find the person? And they're like, we don't know how to do this. We don't know what to do with this. And I remember thinking, well, when should we reveal it? When can somebody be anonymous online? And where I thought I'd land on it at 16 or 17 was not where I actually land now, which is that I think it should be really hard. I think you should have anonymity online. 

Nic Fillingham: Whitney, let's shift gears a little bit and talk about encryption. What is the role of encryption in privacy? Can or should encryption be a solution to protect against attacks and protect against privacy breaches? 

Whitney Merrill: Yes. 

Nic Fillingham: Great. Next question. 

Natalia Godyla: (Laughter). 

Whitney Merrill: Encryption is, like, so incredibly, fundamentally important to privacy. Like, you cannot have privacy without encryption in our digital age. We are going to move to a place where paper records and documents are just going to be a thing of the past. And even if you have them, them existing in the physical world is almost going to feel safer than them existing in the digital world. So encryption is vital. For me, more encryption is better, but what I think privacy regulators are starting to hit up against is the actual fundamental computer science limitations of encryption. Certain types of business models, or values that can be brought through features - AI, machine learning, search - become very, very, very difficult when you start saying, hey, let's do end-to-end encryption. And so I think it is an answer to a lot of privacy problems, but I don't think it's going to solve all of them, at least not until we really catch up with the science and the technical limitations of encryption. Because it's like, yeah, you could encrypt search, but now search is going to take 30 seconds as opposed to feeling almost instantaneous. And at that point, who's going to want to use it? People are going to feel frustrated. And so I think it's also hard to decide where, when, how, and what kind of encryption you use in what types of scenarios. 

Natalia Godyla: So as we're discussing the different solutions for privacy, both process and technology, I think it's important to pause and talk about the different types of threats that are impacting our privacy. What are privacy attacks? What are a couple of examples of privacy attacks? And what kinds of trends are you seeing in the way that threat actors might be impacting our privacy? 

Whitney Merrill: Yeah, that's a great question. I think privacy attack is a very broad term. I think it's everything from a breach of data to somebody who shouldn't have access to it, to a misuse of that data internally by the company. A common one that's come up in a handful of FTC cases has been, I collect your phone number for two-factor authentication, but now my marketing team is using it to send you advertisements or to text you or to call you. That is definitely, like, a major privacy misuse. And that's a common one, and I think people are starting to be more thoughtful there. And some of these practices, I guess, kind of overlap with dark patterns. 

Whitney Merrill: There's a service that's pretty popular right now that, for a period of time, required you to share your entire contact list in order to invite others to the platform. I see that as an attack on my privacy. I gave my phone number to that user, to a friend or a family member or whoever it may be or acquaintance, and they can't consent to share my data with another company in that way for it to be used for other purposes. So I think a privacy attack from a company perspective is a company collecting data they shouldn't collect in the first place and then doing things with it to create social webs or to connect people and then to leverage that information in a way that was never intended. 

Whitney Merrill: The business that was collecting these contact lists from users in order to facilitate the invites was doing so basically to create a social web. And I requested my data from this company, in part because I had heard that this was the practice and, two, because I wasn't a user. I never signed - or I didn't accept - their privacy policy or their terms. And so when I got my data back, I found out that 84 people had shared my phone number with this company, and I didn't consent to that. And they knew who shared the phone number with them, but then wouldn't share that information with me in order to protect the privacy of those users. And the California Consumer Privacy Act actually carves that out - you cannot share data in order to protect the privacy of other people. That makes a lot of sense. But it sure seems wrong that a company can collect all this data and create a social web about people I've given data to. They've aggregated it, and now I don't get that shared knowledge back either. It feels wrong. 

Whitney Merrill: So, frankly, as a result, I asked them to delete my data. But they knew 84 people had shared my number, and three people who invited me had given them my full name, Whitney Merrill. So they knew that phone number was associated with Whitney Merrill. And yeah, I asked them to delete my data, but the problem is that new users will join the platform, they may share their contact lists, and the cycle will begin again. They have changed that pattern, though, probably because they might have gotten in trouble: when you go to invite individuals now on that platform, it doesn't force you to share your contact list. You can individually enter invites. And so hopefully that will mean less of my data is leaked to that platform. But that, to me, feels like a privacy attack, because I can't stop it, and I can't fix it. 

Nic Fillingham: Whitney, you talked about the decision that was made to collect your name and your phone number. And if we give the benefit of the doubt to the product teams and the engineers that were designing this feature, I can sort of imagine a scenario where they were thinking along the lines of, oh, this is great. When Whitney joins, we already know who she is. We can let her know that 83 of her friends are on our network, and they potentially saw it from that perspective. But as you say, they made a mistake from a privacy point of view. What are some of these common mistakes that you see being made? And how do we turn this into guidance to organizations? What are the common pitfalls? You said there was probably not a privacy person in the room. So can you avoid many of these mistakes by just having a privacy person in the conversation and a part of the decision-making process? Does it go beyond that? What's your guidance here? 

Whitney Merrill: I think it never hurts to think like an attacker, both in the privacy perspective and in the security perspective. And I think privacy people are always naturally thinking, well, how can this be used in a bad way? What are the bad things that can happen? There are a lot of people who do privacy compliance, and they're thinking, how can we get you to achieve the legal standard that you must be at? I think that a lot of companies take that approach. My approach is, how can this be misused? How can this be a problem? How would this hurt somebody? And how can we thread that needle between the three tensions - the users, the company values and the law - and make sure that we're doing the right thing? And so a privacy person may do it. Someone who's privacy-minded may be able to do that, someone who's maybe had previous experience dealing with these things in the past. So I won't even say it necessarily has to be a privacy person, but somebody who's thinking about those things. And I think going through that exercise of, like - well, what are the bad things that can happen in this scenario? - will solve not only privacy problems, but also harm and abuse problems that exist on platforms. 

Nic Fillingham: Whitney, what are some of the new threat types in the privacy space? What's coming that teams should be aware of? And maybe where would you recommend folks go to keep their finger on the pulse on the latest threats in this space so that they're prepared for the future? 

Whitney Merrill: That's a good question. I think there's a lot of privacy cleanup to do, so even if there aren't new threats, necessarily, I think they're going to become more visible. I don't think we totally appreciate the threat that data brokers pose to individuals and society. I don't think we fully appreciate how much information about our lives has actually been collected, because it's still so new. Even though I had my first cellphone when I was in eighth grade, like, kids now are getting an iPad. They're getting an account. Parents are reserving their identities on platforms so that they can have them later. I don't think we even understand the beginning of it. 

Whitney Merrill: From a threat perspective, I think what's going to feel like more of a threat is that what we consider a breach is going to be drastically broadened. The EU - God, who was it? - I can't remember the decision off the top of my head. But in the EU, they fined a company for not notifying the regulator of a breach within the required 72-hour timeline. And what was most shocking about that was that it was a misuse of data that they considered a breach. And for me, I think that's really interesting, because you say, yes, it's a security problem in that data was misused or used by a team that shouldn't have had it - a huge privacy problem - and that was considered a breach. And so I think the threat now is that what was maybe normal, or what was already considered an acceptable practice, is slowly going to become unacceptable and become the threat - misuse of data, collection of data beyond, like, what you actually need. 

Nic Fillingham: Whitney, you've been so generous with your time. Thank you so much. Before we let you go, are there any resources you'd like to let the audience know about? Do you have Twitter? Do you have a blog? Where can folks go to learn more about the intersection of privacy and security? 

Whitney Merrill: That's a great question. My Twitter, I'm @WBM312 on Twitter. That's probably the best way to communicate. I love when people engage with me on Twitter. I try to avoid nuanced conversations, so I don't think it's the best platform for that. But if you want a just, like, brain dump of what I'm thinking about, what scares me, what's interesting to me, my Twitter is definitely that feed. And that's the best way to connect with me, as well. 

Natalia Godyla: Well, Whitney, thank you again for joining us today. It was great to have you on the show. 

Whitney Merrill: Thank you so much for having me. I appreciate it. 

Natalia Godyla: Well, we had a great time unlocking insights into security, from research to artificial intelligence. Keep an eye out for our next episode. 

Nic Fillingham: And don't forget to tweet us @msftsecurity, or email us at securityunlocked@microsoft.com with topics you'd like to hear on a future episode. Until then, stay safe. 

Natalia Godyla: Stay secure.