Research Saturday 11.2.19
Ep 109 | 11.2.19

Usable security is a delicate balance.

Transcript

Dave Bittner: [00:00:03] Hello everyone, and welcome to the CyberWire's Research Saturday, presented by Juniper Networks. I'm Dave Bittner, and this is our weekly conversation with researchers and analysts tracking down threats and vulnerabilities, and solving some of the hard problems of protecting ourselves in a rapidly evolving cyberspace. Thanks for joining us.

Dave Bittner: [00:00:26] And now a word from our sponsor, Juniper Networks. Join Juniper at NXTWORK 2019 to learn, share, and collaborate with game changers from companies across the networking industry. This year's event features keynotes from Juniper executives, as well as special guest speaker Earvin "Magic" Johnson, along with over forty breakouts and master classes led by distinguished engineers, as well as various opportunities for certification testing and training. Visit juniper.net/nxtwork for more information. That's juniper.net/nxtwork. And we thank Juniper for sponsoring our show.

Dave Bittner: [00:01:09] And thanks also to our sponsor, Enveil, whose revolutionary ZeroReveal solution closes the last gap in data security: protecting data in use. It's the industry's first and only scalable commercial solution enabling data to remain encrypted throughout the entire processing lifecycle. Imagine being able to analyze, search, and perform calculations on sensitive data, all without ever decrypting anything – all without the risks of theft or inadvertent exposure. What was once only theoretical is now possible with Enveil. Learn more at enveil.com.

Lorrie Cranor: [00:01:49] I think there was not a lot of focus on usability in security until recently.

Dave Bittner: [00:01:56] That's Lorrie Cranor. She's director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University. The research we're discussing today is titled, "Security and Privacy Need to Be Easy."

Lorrie Cranor: [00:02:09] I started working in this area around the year 2000 and was trying to build a usable privacy tool and went to look at what other people had done. And there wasn't a whole lot. There wasn't much in the research literature. There wasn't much in what companies were doing. And I think there was a group of researchers that started talking around then and that spurred some interest in companies. I ended up starting a conference called "Symposium on Usable Privacy and Security." And that has kind of spurred interest in this. And so now it's becoming much more common for companies to actually have usability teams that are focused on security and privacy.

Dave Bittner: [00:02:55] Now, back when you first came to this realization, what was your conclusion? Why had it not really bubbled up to the top at that point?

Lorrie Cranor: [00:03:03] Well, I think that a lot of the security and privacy researchers and developers were kind of insulated. They were very focused on security and privacy. And it was very technical, very mathematical. And their attitude was, you know, we're not experts in usability. We're trying to get the math right. We're trying to get the technology right. And we'll maybe throw it over the fence to some usability team later. And what often happened is there wasn't time – the product shipped without doing the usability work. Or this was done in a small company that didn't even have a usability team, so there wasn't one to work on it.

Dave Bittner: [00:03:42] I remember – I want to say back in the '90s when PGP first came out, and there was some excitement about that, that we were going to be able to apply encryption to our emails and so forth. And it never caught on, and I think a big part of that was it was just so hard to use.

Lorrie Cranor: [00:03:59] Yeah, I think that that really was a big problem. And, you know, one of the first research papers in this area was called "Why Johnny Can't Encrypt." And it was a user study using PGP, and found that people just couldn't figure out how to use it.

Dave Bittner: [00:04:15] So, I mean, at a basic level, how do you define usability?

Lorrie Cranor: [00:04:20] So, it really depends on the application. But basically, we're looking for tools, systems, whatever, that people can use, that people can figure out how to use, that people can use correctly without making errors, without being annoyed by it, efficiently – that people find a way to use the security and privacy as part of their normal workflow without having to stop doing whatever it is they really wanted to do in order to do it. All of those things go into usability.

Dave Bittner: [00:04:53] Hmm. Can you take us through some of the approaches that you take there at Carnegie Mellon – some of the research that you've done?

Lorrie Cranor: [00:04:59] We've looked at usability in a variety of contexts and we try to do user studies with actual people as much as we can, rather than just having experts look at something and go, oh, this is gonna be easy, this is gonna be hard. And one of the challenges we have that makes usable security different than any old usability testing is that when we're dealing with a security tool, it involves some sort of a risk. When using PGP, for example, it's not enough that I can figure out how to encrypt and decrypt my email, but I need to be able to recognize when someone is trying to send me a fake email – that I need to be able to, you know, check the signature and make sure it's really from who I think it's from.

Lorrie Cranor: [00:05:42] And so in order to do user studies in this space, we need to make the participants in the study feel like there's actually some risk that they're trying to protect against. But we can't actually put them at risk because, you know, ethics. We don't want to hurt our participants, right?

Dave Bittner: [00:05:58] Right, those pesky ethics. (Laughs).

Lorrie Cranor: [00:06:01] Yeah. (Laughs) So we need to design the study in a way that people behave and are motivated to protect themselves realistically – the way they would in real life – without actually putting them at risk. And sometimes we do it by telling them upfront, this is a hypothetical scenario, but giving them such rich detail that they really get involved and get invested in it, even though they know it's fake. Sometimes we do it by giving them payments for being safe and try to simulate the risk, you know, through money. Sometimes we trick them and make them think they actually are at risk through something that has nothing to do with the experiment, that just coincidentally happens. And then at the end we tell them, hey, don't worry, you weren't actually at risk – we faked all that.

Lorrie Cranor: [00:06:48] So, as an example of that, we were testing the phishing warnings that show up in web browsers, and so we brought people to our lab and we told them we were doing an online shopping study and we had them go online and purchase some inexpensive items. And we had them then fill out a survey about their purchase experience. And while they were doing that, we sent them a fake phishing email that looked like it came from the vendor they just made the purchase from. And then we told them to go check their email and get the receipt for the purchase so that we could reimburse them for the purchase. And while they were doing that, they would then see our phishing email, well-timed. And almost all of them would then click on the phishing link, which would then trigger the phishing warning in their web browser. And then we could see what they did at that point.

Dave Bittner: [00:07:41] Hmm.

Lorrie Cranor: [00:07:41] And that's what we were interested in – you know, do they swat away the warning, or do they actually pay attention to it?

Dave Bittner: [00:07:46] Right. That's fascinating. Are there common misperceptions that you find people have when it comes to designing usability into their products? You know, for folks like you who are studying this sort of thing, you roll your eyes, you shake your head and you say, oh, that again?

Lorrie Cranor: [00:08:03] Yeah. So, I think that often developers assume that users know more than they do. You know, the developers are very familiar with the technology and they just sort of assume that the users will understand it, too. It's kind of like, you know, once you know something, it's hard to imagine what it was like before you knew it. And so I think that's common. I think also they forget that often security tasks are not the main task. You know, it's something that users only do because they have to, not because they want to. You know, I'm trying to send this email. I'm not trying to encrypt the email – that's just a side thing.

Dave Bittner: [00:08:37] Right. So this project that you've been working on and find very clever as a developer may be little more than a nuisance to the end user at the end of the day.

Lorrie Cranor: [00:08:47] Yeah.

Dave Bittner: [00:08:47] Yeah, that's fascinating. What about the tension between usability and customization? You know, if you think back to the early days of home computing, there was that common perception that Macs are easy to use and PCs are harder to use, but you can do a lot more on your PC because you can customize it. And, you know, people would say, oh, Macs are easy to use, but they're just toys. And it seems to me like there's a spectrum there, and a balance between usability and not frustrating your users who feel they don't have control over the things they need to control.

Lorrie Cranor: [00:09:24] Yeah, that's a great point. So, yeah, users want to have control, but it turns out good user interfaces for control are pretty hard. You know, the more choices you give users, the more likely that they'll be overwhelmed. They won't know which choice to make. They won't understand how to make the choice. It's going to take them more time to go through and make those decisions. And so I think there's a delicate balance between offering users choices and not overwhelming them with choice, and finding ways to introduce the choices so that those who want to get down into the nitty gritty can, but everybody else can just make a high-level decision.

Lorrie Cranor: [00:10:06] So, you know, one of the ways this is done is to have kind of a basic setup and then an advanced setup, where I can say, choose option A or option B, or say I want to configure option A for myself. And so then I can drill down on option A and set the minute details, rather than just taking the whole option A package.

Dave Bittner: [00:10:27] When it comes to usability, is there an element of fashion associated with this? In other words, if someone comes out with something clever – a clever solution – do you find that that tends to start a trend, for better or worse?

Lorrie Cranor: [00:10:43] Yeah, I think there's definitely a lot of copying of user interfaces. And I think that's actually usually a good thing. One of the things that makes it easier for users is if the actions they need to take are familiar. So take that, you know, hamburger menu thing in the top corner – the first time I saw it, it's just three lines, and I thought, what on earth does that mean? But now that I know that when I click on it, it opens a menu, now it's easy – I know where to find the menu. And if some websites, instead of using those horizontal lines, changed them to vertical lines, then no one would know what that meant, and we'd all have to go start randomly clicking till we figured out, oh, that's a menu, too.

Dave Bittner: [00:11:25] Right.

Lorrie Cranor: [00:11:26] So there's definitely – it's definitely useful to have similar patterns across different products and services.

Dave Bittner: [00:11:34] Now, one of the things that you all have been working on there at Carnegie Mellon is this idea of a privacy and security nutrition label for IoT devices. Can you take us through that effort?

Lorrie Cranor: [00:11:44] Yeah. So, the problem we're trying to solve here is that people hear that security and privacy can be an issue with IoT devices – there's been a lot of that in the news lately. And so, you know, you go to Home Depot to buy your IoT device or you go online, and you want to find out, well, which brands should I buy to avoid these security and privacy problems? And basically, there's no information and it's very difficult to do that. And so what we would like to see is a label, similar to the nutrition labels you find on food products, that would have security and privacy information for your IoT devices in a standard format.

Lorrie Cranor: [00:12:23] So you could take two products – you know, two smart thermostats or whatnot – and look at them side by side and compare their security and privacy features. So we have been working on designing, you know, what are the ingredients that should be in the security and privacy label? We've done user studies to find out what users are interested in, and then we've gone and talked to experts about what they think users need to know. And based on that, we've come up with a proposal. Now we're taking it back to users to see whether they really understand what experts want them to know, and finding better ways of explaining it to them. So, we're slowly converging on what should be in that label.

Dave Bittner: [00:13:02] And I think that leads us to a conversation about public policy. How does usability intersect with public policy? The folks who make those decisions – who are setting regulations and making laws and so forth – how do they have to consider these sorts of elements of security in the work that they do?

Lorrie Cranor: [00:13:21] I think that it used to be that we didn't see much about usability in any of the policies related to security and privacy. But more recently, I think that's been coming up as an issue. You know, the Federal Trade Commission will go after companies for being deceptive in their privacy notices, in the information they provide to consumers about security and privacy. And so companies will settle with the FTC and they will rewrite their privacy policies. They'll change their consent experiences in their products.

Lorrie Cranor: [00:13:56] And then the question comes up, well, have they actually improved things? Have they solved the problem? And there's not kind of a strict test. You know, there's no law that says, like, how do you know that your informed consent truly was informed? You know, there's not a strict measure of that. But I think it's something that the FTC and other agencies that worry about these things are trying to figure out, and they are offering guidance for kind of best practices, and ways that you can provide notices to people that actually are informed and meaningful.

Lorrie Cranor: [00:14:31] When you look at what Congress is doing, there has been a bunch of proposed legislation about IoT device security and privacy. None of it has gotten very far, but one of the things that we're seeing in some of these proposed bills is that you have to inform consumers about security and privacy. So far, they haven't actually explained how. Some of them have even referred to the idea of putting a label on the product, or informing consumers if there is a microphone or a video camera. We don't have too much detail about how to do it, and one of the things we're hoping with our project is that if any of these pieces of legislation actually moves forward, we will be able to say, hey, here is a way to do it. You know, adopt this. So, we'll see what happens.

Dave Bittner: [00:15:20] What is on the leading edge right now? With you and your students and colleagues who are working on this research, what are the things that you're excited about for the future?

Lorrie Cranor: [00:15:29] Oh, so many different things. Right now, we're busy looking at how to improve opt-out and consent on websites and trying to come up with a set of validated best practices that we can put out for companies to use. So, hopefully we'll make some good progress on that. We're also doing work on passwords, and what kind of password policies companies should adopt so that their users create strong passwords – but passwords that they can actually remember and use. Some of the work that we've done on that actually led to improvements in NIST's password guidance a couple of years ago. But there's more that we're working on to have some more actionable guidance for companies.

Dave Bittner: [00:16:14] What are your recommendations for companies who, as they're doing their development, know this is something they want to have as part of the process, and they want to be effective – how do they measure success?

Lorrie Cranor: [00:16:25] Well, I think in order to know how successful you are with usability, you really have to actually do user studies. And, you know, that's something that, as I said earlier, a lot of companies were just not doing for a long time. And I think a lot of companies are still not doing it. We're now seeing that some of the bigger tech companies are doing it, and they have teams. They hire my former students – my graduates are now going to big tech companies and are on some of these usability teams doing security and privacy work. But that's really what we need to do to know if it's usable – actually test it with users.

Dave Bittner: [00:17:04] Our thanks to Lorrie Cranor from Carnegie Mellon University for joining us. The research is titled, "Security and Privacy Need to Be Easy." We'll have a link in the show notes.

Dave Bittner: [00:17:15] Thanks to Juniper Networks for sponsoring our show. You can learn more at juniper.net/security, or connect with them on Twitter or Facebook.

Dave Bittner: [00:17:24] And thanks to Enveil for their sponsorship. You can find out how they're closing the last gap in data security at enveil.com.

Dave Bittner: [00:17:32] The CyberWire Research Saturday is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. The coordinating producer is Jennifer Eiben. Our amazing CyberWire team is Stefan Vaziri, Kelsea Bond, Tim Nodar, Joe Carrigan, Carole Theriault, Nick Veliky, Bennett Moe, Chris Russell, John Petrik, Peter Kilpe, and I'm Dave Bittner. Thanks for listening.