Caveat 4.14.22
Ep 121 | 4.14.22

Do security and identity go hand in hand?

Transcript

Aaron Painter: I believe cybersecurity is having an identity crisis. Today's internet has become a place where almost anyone can claim to be you. And therefore, they can claim ownership over your digital property.

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.

Ben Yelin: Hello, Dave. 

Dave Bittner: Today Ben discusses a segment on data brokers from "Last Week Tonight With John Oliver." I've got the story of the U.S. removing malware from systems around the globe. And later in the show, my conversation with Aaron Painter from Nametag. We're discussing innovations in online identity technology. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. Let's dig into our stories this week. Why don't you start things off for us here? 

Ben Yelin: I was a little tired of reading, so this week's story comes from YouTube. It is rare that the issues we cover on this podcast really get into the cultural zeitgeist. 

Dave Bittner: Yeah. 

Ben Yelin: But here we are. So this week's segment on "Last Week Tonight With John Oliver" was about data brokers. So the first 20 minutes or so of the segment was an overview of how data brokers operate, where the data comes from originally - so a little history of cookies and all of the breadcrumbs we leave in our online interactions - the purpose of data brokers, who they are, what their intention is, the extent to which they can collect our data and put together a pretty clear picture of our lives and sell them to the highest bidder and how their promises to anonymize the data, as we've talked about a million times, don't actually mean much when it's very easy with minimal research to figure out which data belongs to which person. 

Dave Bittner: Right. 

Ben Yelin: He mentioned a couple of things that we haven't talked about as much on our show. One of the later parts of the segment is about why our political system hasn't solved this problem. You'd think that people would be relatively outraged at their data, sometimes very personal data, being sold. 

Dave Bittner: Right. 

Ben Yelin: And, you know, Europe has taken action with GDPR to an extent, and several states within the United States have taken action as well. He questions why the federal government hasn't taken action, and he posits a couple of reasons. The first is that data brokers are very powerful politically. They have a lot of money. I'm always unsure about whether we overstate or understate the influence of money in politics, but that's his hypothesis. And the second - and this is a really good point - is political campaigns make use of data brokers. They buy this data themselves to do what's called micro-targeting where, in order to pick up votes, you get a bunch of information - a dossier, if you will - on an individual voter. So Dave in Maryland is - I'm not going to say a real age, but this age. 

Dave Bittner: (Laughter) A man of a certain age. 

Ben Yelin: A man of a certain age. These are his interests. 

Dave Bittner: Enjoys musical theater, cybersecurity and "The Muppets." 

Ben Yelin: Exactly. 

Dave Bittner: (Laughter). 

Ben Yelin: That describes you purposely - perfectly. Here are all the beers he drinks. Here are his purchasing habits. 

Dave Bittner: Right. 

Ben Yelin: And that's very useful information to political campaigns because they can target you specifically with online advertising. 

Dave Bittner: Right. Right. 

Ben Yelin: So they know which buttons to push to get certain demographics to come out to vote. And the way they decide what ad to put on your Facebook page is going to depend on what information they can glean from data brokers. 

Dave Bittner: Right. 

Ben Yelin: So that was really interesting. I mean, I think that's a point that is - that we haven't really talked about too much - is why there is maybe some reticence to do this at the federal level. And... 

Dave Bittner: Yeah. 

Ben Yelin: His hypothesis is - yeah. 

Dave Bittner: Well, I mean, I have to say it is an ongoing frustration of mine and, I suspect, most thinking people that - how often legislators exclude themselves from the laws that they pass, right? 

Ben Yelin: Yeah, and how many times they've been called out on it. 

Dave Bittner: Right. 

Ben Yelin: I mean... 

Dave Bittner: Right. 

Ben Yelin: Sometimes the attacks on it are in bad faith. Like, they - there was a whole thing about how they excluded themselves from the Affordable Care Act when really, that was about them being on their own health care system. And... 

Dave Bittner: Yeah. 

Ben Yelin: It got very complicated. But sometimes, like this, there are pretty good reasons to criticize them... 

Dave Bittner: Right. 

Ben Yelin: ...When they are quite dependent on data brokers to run their political campaigns. 

Dave Bittner: Yeah. 

Ben Yelin: The other point he brought up, which I think is always worth emphasizing, is there's an attitude among many of us that we are an open book. I like the convenience of going to a web page. They know everything about me. They can tailor their ads towards me and my interests. When I go to a website that I like, my credentials are already there. They know what my favorite sports teams are. I can get the latest news articles on those sports teams. 

Dave Bittner: Right. 

Ben Yelin: That's great for certain people. 

Dave Bittner: Yeah. 

Ben Yelin: The point that John Oliver makes is even if it doesn't affect you, you're not thinking about the people that it does affect - victims of domestic violence, for example, where somebody is in hiding or trying to protect themselves from somebody dangerous, sometimes it's through a data broker where stalkers can get information on an individual. Or when we're talking about the federal government or state and local governments purchasing data, which they can do from these brokers without a warrant, that means they can target people for deportation or charging them with crimes without any individualized suspicion. And I think that's a point that's so important. I think so many people have the attitude, I'm an open book; I don't care what people know about me. 

Dave Bittner: Right. 

Ben Yelin: I care primarily about convenience. 

Dave Bittner: I saw one this week that's a lighter example of this. But it was a gentleman who was about to ask his girlfriend to marry him, and so he had gone out and purchased the engagement ring, right? And a few days later in the mail - got an envelope in the mail that said, congratulations on your recent engagement. 

Ben Yelin: Ooh. Yeah. 

Dave Bittner: And he was living with his girlfriend. She saw the envelope and ruined the surprise (laughter). 

Ben Yelin: Yep. 

Dave Bittner: So there you go. 

Ben Yelin: There are so many funny anecdotes about that. The one that John Oliver has in the story is a single guy who went to purchase baby wipes because his company bought a new office. Now, I don't know why he's cleaning his office with baby wipes. But he is. 

Dave Bittner: (Laughter). 

Ben Yelin: I would have gotten Lysol wipes myself. But - and all of a sudden, he started getting all these advertisements for baby products. And, of course... 

Dave Bittner: Right. 

Ben Yelin: ...He had no interest in them. 

Dave Bittner: Right. 

Ben Yelin: Yeah, they know more about us than we know about ourselves. And I just think this is a problem that isn't frequently discussed in such a - we've talked about it. And as much as I like to think that we can control the conversation around data privacy issues, I think sometimes you need something like a comedian doing a 30-minute segment to really get this into, like I said, the cultural zeitgeist. And I saw it happen with Edward Snowden and national security surveillance. He did a segment in 2015. I show it the first day every time I teach a class on this. It's informed. It's interesting. It's hilarious. It's vulgar. This is the type of way that you can start discussions about these serious issues. 

Ben Yelin: And then there's the kicker to all of this, which I think we'd want to allude to by playing a quick segment. 

Dave Bittner: All right. So here's a clip. This is from "Last Week Tonight With John Oliver." Let's play the clip. 

(SOUNDBITE OF TV SHOW, "LAST WEEK TONIGHT WITH JOHN OLIVER") 

John Oliver: Interestingly, the one time that Congress has acted quickly to safeguard people's privacy was in the 1980s when Robert Bork was nominated to the Supreme Court, and a reporter walked into a local video store and asked the manager whether he could have a peek at Bork's video rental history, and he got it. As soon as Congress realized there was nothing stopping anyone from retrieving their video rental records, too, they freaked the [expletive] out. And lo and behold, the Video Privacy Protection Act was passed with quite deliberate speed. So it seems when Congress' own privacy is at risk, they somehow find a way to act. And it also seems like they're not entirely aware just how easy it is for anyone - and I do mean anyone - to get their personal information, which brings me to me because in researching this story, we realized there was any number of perfectly legal bits of [expletive] that we could engage in. 

John Oliver: We could, for example, use data brokers to go phishing for members of Congress by creating a demographic group consisting of men age 45 and up in a five-mile radius of the U.S. Capitol who had previously visited sites regarding or searched for terms including divorce, massage, hair loss and midlife crisis. 

(LAUGHTER) 

John Oliver: We could call that group Congress and Cabernet, and then target that list with ads that might attract those men to click, like marriage shouldn't be a prison, or can you vote twice? We could also throw in, do you want to read Ted Cruz erotic fan fiction, just to see what would happen. 

(LAUGHTER) 

John Oliver: And if anyone clicked, we'd be able to harvest even more data from them, which we could then theoretically take steps to de-anonymize. Now, am I saying that we're actually going to do that - collect all that raw information and store it in, let's say, a manila envelope somewhere? Well, I am sorry to disappoint you. We are not going to do that. Why would we when we have already done it? Because... 

(LAUGHTER, APPLAUSE) 

John Oliver: ...All that raw data is currently right in here. And honestly, this whole exercise was [expletive] creepy. And if you're thinking, how on Earth is any of this legal? I totally agree with you. It shouldn't be. And if you happen to be a legislator who is feeling a little nervous right now about whether your information is in this envelope and you are terrified about what I might do with it, you might want to channel that worry into making sure that I can't do anything. Anyway, sleep well. That's our show. Thank you so much for watching. We'll see you next week. Good night. 

(LAUGHTER, APPLAUSE) 

Dave Bittner: Oh, so good (laughter). 

Ben Yelin: It's so good. You'd love to think that people who work at the Capitol are terrified and that at least one person at the U.S. Capitol, whether that is a staff person or a member of Congress, was actually baited to click on Ted Cruz erotica fan fiction. 

Dave Bittner: (Laughter) That's right. So let me ask - well, a couple of things. First of all, Ben, is blackmail illegal (laughter)? 

Ben Yelin: That's actually a harder question to answer than you think. It kind of depends on the context. 

Dave Bittner: Right. 

Ben Yelin: He also wasn't directly blackmailing them. 

Dave Bittner: Yeah. 

Ben Yelin: There's no specific ask here. 

Dave Bittner: Yeah. 

Ben Yelin: That's what makes this so hilarious is, huh, maybe you should think about changing the policy because now we have all of this potentially incriminating information on you in our manila folder. 

Dave Bittner: Right. 

Ben Yelin: Now you can at least temporarily realize what everyone else thinks and feels when their data is collected. So it's kind of a taste of their own medicine. It's not a direct threat, but it's one of those, in a very comedic way, saying the tables have turned. 

Dave Bittner: Yeah. I also think it's interesting how coming at something like this comedically helps break down those barriers that people have. It's hard to be defensive when you're laughing. 

Ben Yelin: Right. I mean, I - it's also just hard to listen to any 30-minute segment about anything if it's not mildly entertaining. As much as I think our podcast is extremely entertaining... 

Dave Bittner: (Laughter) That's right. 

Ben Yelin: And it is. 

Dave Bittner: According to our moms (laughter). 

Ben Yelin: Yeah, exactly. We don't have the same type of comedic writers - and maybe this is our problem - that he does. 

Dave Bittner: Right. 

Ben Yelin: It's a way where you can illustrate the issue that makes it relatable to people who didn't know anything about data brokers. I mean, I think one thing that comes out of this is even though 60% of people in the poll that John Oliver cites on his show claim to know that their data is being collected and they're kind of OK with that risk, there's still 40% of people who don't. And I still think that among those 60%, I don't think people realize the extent of it, how much is out there and who can gain access to it. It's not just private companies, although the private companies do have a lot of information on what we do, what we like to do, what we purchase... 

Dave Bittner: Yeah. 

Ben Yelin: ...But also the fact that we now have documented examples of government agencies buying up this data, things that are very personal like location data, and using it against us in a legal proceeding. And I just don't think people are properly aware of this practice. You can write all the articles in Vice - Motherboard by Vice - as you want... 

Dave Bittner: Yeah. 

Ben Yelin: ...But there's something about it being in a 30-minute comedy segment on HBO that I think is really going to do wonders and bring this to the forefront. 

Dave Bittner: Yeah. Well, we can hope, right? (Laughter). 

Ben Yelin: We sure can. 

Dave Bittner: Yeah. All right. Well, we will have a link to that clip on YouTube. As Ben says, it's just under a half an hour long. And I know John Oliver's not for everybody, but I think this one's worth checking out. Makes a lot of good points in here. 

Dave Bittner: All right. My story this week - this comes from The New York Times. This is an article by Kate Conger and David E. Sanger. And it's titled "U.S. Says It Secretly Removed Malware Worldwide, Pre-empting Russian Cyberattacks." So basically, what this comes down to is that sort of in the run-up to the Russian invasion of Ukraine, the U.S. got judicial permission... 

Ben Yelin: A secret judicial order. Yep. 

Dave Bittner: Secret judicial approval to go into computers around the world, working with international partners, and remove malware that would enable botnets - which would enable these computers to do all sorts of things - install other malware, run distributed denial-of-service attacks and those sorts of things. And so the folks on the cyber side of the U.S. federal government went around and removed this, shut down these botnets, basically disrupted their ability to communicate with their motherships, their command-and-control servers, the things that give them the instructions of what to do. 

Ben Yelin: I'm picturing the poor botnet trying to communicate with his mother ship and just being so devastated. 

Dave Bittner: (Laughter) Sad and lonely, wondering where did everybody go? 

Ben Yelin: Mom. Yeah. 

Dave Bittner: So - and by all accounts, this was successful and is certainly a stain on Russia's capabilities in the cyber realm. And I think it's fair to say so far, a lot of people are still scratching their heads over the fact that we haven't seen more cyber operations from Russia. 

Ben Yelin: Right. That could change as we're recording this, of course. 

Dave Bittner: Could change at any moment. But it's something that was, I think - a lot of people expected them to lead with, and they have not. 

Ben Yelin: Right. 

Dave Bittner: So isn't that interesting? But let's talk to the issue at hand for us here, which is that the U.S. government went into other people's systems, private companies around the world, let themselves in, looked around, did the things they needed to do, left, most likely without letting anybody know they were even there. And we're OK with this? (Laughter). 

Ben Yelin: I'm so conflicted on this, Dave. 

Dave Bittner: OK. 

Ben Yelin: I care deeply about digital privacy. 

Dave Bittner: Right. 

Ben Yelin: I think it's very important. We are talking about potential attacks on our critical infrastructure. 

Dave Bittner: Yeah. 

Ben Yelin: And Russia was willing, is willing and is able to wreak destruction on our water systems, our power plants, our electronic grids. There's a price we have to pay to protect those systems. And it depends on what price you're willing to pay. To me, going through a court order to go onto devices to delete botnets is a small enough price to pay in order to stop these cyberattacks in a time of war, especially after our diplomatic relationship with Russia broke down in the weeks preceding the armed conflict in Ukraine. They were already - the Russian intelligence GRU was already trying to use malware against their Ukrainian adversaries. So we know to a certain extent that they have that capability, and we know that they've used weapons, cyber weapons, in the past. They've deployed them here. They've deployed them in Syria and other places. If this had been done without a court order - and we don't have access to the actual court order, so I don't know exactly what's contained in there - I would be more concerned. I mean, it is an intrusive thing to do. But at least it should be some comfort that there was a judge or a magistrate who looked at this, looked at the legal justification for doing this, balanced the potential invasion of privacy with the amount of security that this would bring. 

Dave Bittner: What would be the legal justification for doing this? What - if you were saying I need legal - I need a legal backstop here, how would you come at this? 

Ben Yelin: So there are a bunch of ways. I mean, there are emergency provisions in various laws, including the Electronic Communications Privacy Act, that in certain circumstances would allow us to go on to people's devices and networks. There are also other causes of action. I mean, you can get creative in using things like the All Writs Act to compel telecommunications companies to, you know, in the name of national security, take some sort of measure to protect computers and networks. So there are certainly laws that allow you to do this. Again, without actually having seen the court order, I don't know what their legal justification was, but there certainly are tools available to them. And there's a lot you can do when you get a court order. I mean, we talk a lot about warrantless searches of people's devices. Those are problematic because they're warrantless. If you get judicial approval, I mean, we allow all different types of intrusive searches and seizures of our data, of our digital data. So if there is some sort of law enforcement or national security interest, depending on how strong the security needs are and what our cybersecurity demands are, you can justify it if you can show that you are minimizing the effect on people's digital privacy. 

Dave Bittner: Right. 

Ben Yelin: It's part of a balancing test. And that's why the word reasonable is very important when we're talking about the Fourth Amendment. Courts do this reasonableness analysis sometimes in the absence of a warrant, but even with a warrant. To determine whether a program like this is reasonable, you do have to engage in this kind of balancing test. And when we're talking about things as dangerous as shutting down our critical infrastructure, I think it's a - it's reasonable for a court to conclude that under that balancing test, this can be justified. 

Dave Bittner: So what, if any, types of guardrails are put on something like this? In other words, let's say the government in good faith goes in to remove some malware. While they're in there, they happen upon something that attracts their attention, something that perhaps other parts of government law enforcement would be very interested in knowing about. 

Ben Yelin: I feel like the plain view doctrine comes in there as long as the original search is legal. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: I mean, if I got a warrant to search your house for evidence of tax fraud and there was a crack pipe on your kitchen table, I could prosecute you for seeing that crack pipe 'cause it was out in the open. It was in there, even though it wasn't the object of the original search. So as long as the search itself is legal and you're - you are engaging in surveillance pursuant to the original authorization, then if you find evidence of illegal behavior, that's certainly fair game for some sort of judicial proceeding. 

Dave Bittner: How are the folks - you know, the usual suspects in terms of civil libertarians responding to this as they've - because we got - because we convinced the judges, is - are they, you know, staying on the sidelines on this one, or where do we stand? 

Ben Yelin: I have not seen the type of pushback that I have seen in somewhat similar circumstances, and I think that's because of the particular factors at play here. You know, when you have the vice president for intelligence at CrowdStrike analyzing the malware, linking the malware to Russia and saying that Russian organizations were intending to cause damage to our infrastructure and aid Russian military objectives. And you go and get a court order. I don't think that's going to raise the ire that much of the Electronic Frontier Foundation... 

Dave Bittner: Yeah. 

Ben Yelin: ...Or the ACLU. 

Dave Bittner: It's a strong case (laughter). 

Ben Yelin: It is. 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: They - I think there is a more fertile ground for them to fight on. Now, they - I'm sure if they got access to the judicial order, there could be holes that they could poke through. And I'm sure they'll be doing FOIA requests to get access to anything they can. But I'm not seeing the type of pushback on this that I've seen in similar circumstances. I think it's just when you take it in the context of what's happening around the world, I think it takes on a different meaning, and I think it's less offensive than it otherwise would be to these types of organizations. 

Dave Bittner: All right. Well, again, that story is from The New York Times, written by Kate Conger and David Sanger. We will have a link to that in our show notes. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Aaron Painter. He's from an organization called Nametag. And they are in the digital identity business, as their name implies. And our discussion centers on some of the developments that are going on when it comes to online identity technology. Here's my conversation with Aaron Painter. 

Aaron Painter: I believe cybersecurity is having an identity crisis. Today's internet has become a place where almost anyone can claim to be you, and therefore they can claim ownership over your digital property. The way I see the perspective of historical context is that in the early days of networked computing, usernames and passwords became the way that people identified themselves. And passwords were the way that you protected that access to your identity. But passwords are not passports. Perhaps it was once a smaller group of people back then. But despite billions of people now using the internet and all the other aspects of technology having evolved in the last 50 or 60 years, online authentication has not kept pace. 

Aaron Painter: Security is a much bigger challenge, as many of your listeners know. And passwords can take many forms. And while they're still convenient, they're not keeping people safe. SMS messages, email links, authenticator apps and the like all are trying to make the humble password more secure. But none of them prove the real owner of the account. 

Aaron Painter: And efforts to check identity are often one-time - scan an ID because I'm opening a bank account, for something like KYC. But even if I've done that, when I go to access that bank account or maybe do a transfer or some other important transaction, I'm often stuck back to knowledge-based questions or asking for a password. And they're limited in what they can do. The best is often someone showing up in person still or asking to fax a copy of an identity document with a signature on it. They're not keeping people safe because while you might know that someone holds the device, you don't often know who the owner of that device is. 

Dave Bittner: You know, I think - we see studies that have shown that, you know, compared to a regular username and password combination, that adding that additional factor really does make a difference. And we've certainly seen a shift to prioritize that or emphasize that, you know, throughout the industry. So in what way is that coming up short? You know, if I'm having something like a YubiKey, or I have, you know, a verification app on my mobile device, does that get me most of the way there? Or is there still more to go? 

Aaron Painter: You know, one of the biggest challenges or limitations of that we're often finding is ownership matters, particularly when a user gets locked out of that 2FA or MFA approach. Typically, it's a lost phone or an email account. And when that happens, it's nearly impossible to ensure the - only the account's rightful owner gets back in. And that creates a massive security vulnerability because fraudsters can claim they are the ones locked out, and they are in need of a password reset. There's often not a simple password-reset button in a world when MFA has been enabled. And the burden of proof then often falls on that rightful account holder to convince the company that it's really their account, again, often using primitive methods and tools, sometimes resulting in a person needing to even show up in person to prove it's really them. 

Dave Bittner: What about, you know, things like Face ID or Touch ID? I think a lot of us have a positive experience with those systems. So where do they - what are the pros and cons there? 

Aaron Painter: They can be really effective. And Face ID and password managers have certainly made our lives easier in a world when we just have so many accounts that we all work with. The limitation, though, is that even Face ID most often is used essentially as a keychain, is used as a way to enter in someone's password that's been saved. Very few accounts are able to rely only on that encryption key that's sent from that iOS device over to the provider as a way to authenticate the person. 

Aaron Painter: They need to still have some other form of password backup because you're accessing maybe from an app on your mobile phone one day. You're accessing a site another day on your desktop browser. And then things happen, like getting locked out of that account. Maybe you've lost the phone. If a company has begun to only trust the encryption key that's sent from Face ID, then if you've lost that phone, it's incredibly difficult to actually recover you, the authentic owner of that account. And again, it falls back to needing to prove someone's identity. 

Dave Bittner: So where do we go then? I know you all are active participators in this game. Where do you suppose the future lies? 

Aaron Painter: We've created something we feel makes accounts more secure. And we call it, in simple terms for an end user, sign in with ID, as opposed to simply signing in with a username and password. The technology underlying that we think is a level beyond MFA, or multi-factor authentication, such that we call it multi-factor identity. And it allows a user to create a reusable sense of identity linked to the government ID, a selfie. We match the photo of your selfie to your government ID, and we keep that reusable and stored inside the secure enclave on your phone so that you can use it in an ongoing way when you're accessing an account or when you're prompted maybe by various companies to validate the actual ownership of your account. 

Dave Bittner: Is this the sort of fantasy of, you know, one login to rule them all? 

Aaron Painter: We believe that each login is still unique, and you should almost have the equivalent of a receipt for it, or the ability to control - hey, I've given this bit of data or this login authentication to a particular site. But part of that is involving consent from the user as part of the security transaction. Today often the best case is a company trying to guess maybe that you're the user who has called in or is trying to access an online site. Maybe they reference back-end sources. Maybe they try and look up the phone number you're calling from. But the end user is often not a part of that consent path. 

Aaron Painter: So we've created a way for the company to essentially prompt the user to say, hey, I want to make sure you're a real person. And I want to make sure you're the real account owner. But yeah, that should be reusable when you put the consumer in control of that information. 

Dave Bittner: And so how would something like this work, you know, from a user's point of view, if I wanted to use a system like yours to access something online? Can you walk us through the process? 

Aaron Painter: Yeah. It's a really simple flow. Essentially, a company - you would be clicking on a button on a website or maybe on your mobile phone or inside an app. If it's actually on a desktop browser, we've implemented QR code-based login. So instead of a username and password, you scan a QR code. Regardless, if it's a tap on the mobile app or scanning a QR code, the next thing that happens is a box pops up using this really neat technology from Apple called App Clips - Android's companion one with Instant Apps. So essentially, a mini authenticator app-like instrument is downloaded over the air into the user's phone, so it feels like it's a native part of the OS, and pops up, asks the user to scan the government-issued ID, to do a selfie check and then provides real clarity, asking the user for consent to share pieces of the information from their government ID with the company. The button illuminates once they put in the information, and they simply click log in, and the user's authenticated into that site or app experience. It minimizes, and the user continues on with whatever they were hoping to achieve. 
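The flow Aaron describes can be sketched as the check a relying site would run at the end: one session ties the QR code or app tap to the browser, and login succeeds only if the document checks, the selfie match, and the user's consent all line up. This is an illustrative sketch, not Nametag's actual API; every name and field here is an assumption.

```python
# Illustrative sketch of a "sign in with ID" relying-site check.
# All class, function, and field names are assumptions for illustration,
# not Nametag's real interfaces.
from dataclasses import dataclass, field

@dataclass
class VerificationRequest:
    session_id: str                 # ties the QR code / app tap to one browser session
    requested_fields: list = field(default_factory=list)  # e.g. ["full_name", "over_21"]

@dataclass
class VerificationResult:
    session_id: str
    id_document_valid: bool         # document checks (format, security features) passed
    selfie_matches_id: bool         # live selfie matched the ID portrait
    consented_fields: dict = field(default_factory=dict)  # only fields the user agreed to share

def authenticate(result: VerificationResult, request: VerificationRequest) -> bool:
    """Log the user in only if every check passed, the result belongs to this
    session, and the shared fields cover everything the site asked for."""
    return (
        result.session_id == request.session_id
        and result.id_document_valid
        and result.selfie_matches_id
        and all(f in result.consented_fields for f in request.requested_fields)
    )
```

The key design point mirrored from the interview is that consent is part of the transaction: a result with valid document and selfie checks still fails if the user declined to share a field the site requires.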

Dave Bittner: So I hold up my ID. Let's say in this case, you know, my driver's license. So that gets scanned. And then it asks me to take a selfie. And it compares those two photos? 

Aaron Painter: That's exactly right - and among some other steps, including validating, you know, is that ID valid and accurate - a few steps in that regard. And then we look at other parameters. But from the end-user perspective, they feel like they're really using primarily the government-issued ID and their selfie to match. 

Dave Bittner: And is it - when you do the selfie part, is it checking to make sure that this is a real live person here and I didn't, you know, scrape a photo of me off the internet? 

Aaron Painter: It is - and a variety of other increasingly really cool kind of anti-fraud techniques. And that's part of the reason why we bounce the user to their mobile phone, as opposed to being able to use a desktop camera. There's so much advanced technology in today's modern mobile phones - from depth mapping, the types of cameras, the secure enclave - that make it a much more robust experience, all data that we can leverage to help authenticate the identity of the person and make sure they match to that ID. 

Dave Bittner: You know, something that my co-host Ben and I have talked about in terms of, you know, policy and privacy is, you know, let's say I go to a bar, someplace like that. And I'm there, and I want to get in, and I hand the person my driver's license. Well, they have access to all the information on that driver's license. It intrigues me, what you're talking about here, that this puts me in control of which information I want to share. 

Aaron Painter: I completely agree with you. And we love that analogy. We talk about it a lot. You know, sometimes those bouncers - while they might be well intended, they're not always people that you want necessarily to know your home address or other things when the only question they're trying to ask is, are you of age, are you 21-plus to come into that bar? And we believe firmly, whether it's in the bar scenario or online platforms, privacy does not need to mean anonymity. There are use cases where it's great to be anonymous, and you might want to go spin up a new Gmail address and log in as a new person. But there are so many important transactions that occur today online. And you need a sense of actually knowing who the real owner of an account is in many of those scenarios. 

Aaron Painter: We call this technology a privacy mask. So even though you've scanned or uploaded your ID and we're doing that matching, it doesn't mean you need to share all the elements on the ID. In fact, you might not even need to share your birthdate in that bar example. You might only need to share that you're 21-plus. So the user gets a chance to review what's been asked by the company and then consent specifically to sharing that information.
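The privacy-mask idea - answering "is this person 21-plus?" without disclosing the birthdate itself - can be sketched as derived-claim selection. This is an illustrative sketch only; the field names and claim names are assumptions, not Nametag's schema.

```python
from datetime import date

def age_on(birthdate, today):
    """Whole years elapsed between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1  # birthday hasn't happened yet this year
    return years

def masked_claims(id_record, requested, today=None):
    """Return only the claims the relying party asked for.
    'age_over_21' is derived from the birthdate, so the birthdate
    itself never appears in the shared result."""
    today = today or date.today()
    available = {
        "full_name": lambda: id_record["full_name"],
        "age_over_21": lambda: age_on(id_record["birthdate"], today) >= 21,
    }
    return {k: available[k]() for k in requested if k in available}
```

The bar only ever sees `{"age_over_21": True}`; the name and birthdate stay behind the mask unless explicitly requested and consented to.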

Dave Bittner: What about for folks who might be hesitant to scan in their official government ID? What's going on behind the scenes to protect that? 

Aaron Painter: We have a really robust privacy policy that we're really proud of. We summarize it in ten words, essentially that we only share information when you specifically ask us to. And we've assembled a really strong team of cybersecurity professionals who have made their lives and careers out of this. Part of it, too, is the benefit of being able to build a new product and infrastructure in today's day and age and use all the latest bells and whistles of cloud computing and the secure enclave and a variety of other tools. We have a lot of steps to secure the information, and then a really proactive way that we're obtaining consent from the user and allowing them to protect the privacy of what they're sharing.

Dave Bittner: Now, with a system like yours, what about for the folks on the other side? If I'm someone who wants to accept your, you know, system as a form of ID, what do I have to do on my website? 

Aaron Painter: Incredibly easy to implement. We've essentially built on standards like OAuth 2.0 and OpenID Connect. It's essentially as easy as a Login with Google button to integrate into your infrastructure. But then there's some other really fun things in how we designed this that we're really proud of, in that a company doesn't necessarily need to store the PII that they're asking for from that user. In fact, we have the ability for a company to choose to store no PII. We act as the guarantor of that. The user has given consent for what elements they share. But a company can essentially just store an encrypted key in their CRM or other system that doesn't have any underlying PII data in it but has the ability to call on that data from Nametag whenever they need it.
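The "store no PII" pattern he describes - the relying party keeps only an opaque reference key and resolves it through the identity provider on demand - can be sketched as follows. All class and method names here are illustrative assumptions, not Nametag's actual API.

```python
import secrets

class IdentityProvider:
    """Holds consented attributes and hands out opaque reference keys
    that are safe to store in a CRM, since the key itself carries no PII."""

    def __init__(self):
        self._vault = {}  # reference_key -> consented attributes

    def issue_reference(self, consented_attributes):
        key = secrets.token_urlsafe(24)  # random, reveals nothing about the data
        self._vault[key] = dict(consented_attributes)
        return key

    def resolve(self, key):
        """Called by the relying party only at the moment the data
        is actually needed; returns None for unknown keys."""
        return self._vault.get(key)

# The relying party's CRM row would then hold just the key, e.g.:
# crm_record = {"customer_id": 123, "identity_ref": key}
```

A breach of the relying party's database then exposes only random keys, and access to the underlying attributes can still be gated, logged, or revoked at the provider.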

Dave Bittner: And where are we headed in terms of adoption of this sort of thing? I mean, you all have this system. Is this - is it exclusive to you? Are there standards in this space? Where do we sit with that? 

Aaron Painter: We've built on as many industry standards as we can find, but we've combined things in a way that hasn't been previously combined. So we're really excited about the use case implementations for this. The use cases end up being quite consistent. It's often continuous account access in a more secure way - let's say every time you're logging into maybe a dating site or an account of value to you - and then also account recovery, where there are organizations that have implemented maybe MFA or a higher level of security in the password. But then again, that user gets locked out, and they need to do essentially an ID validation to help reset that account or access to that account. So account recovery and continuous account access seem to be the most consistent use cases we hear from companies and organizations. 

Dave Bittner: Yeah. It really seems to take care of that issue where, if you lose your mobile device, you know - the nightmare that that can be of having to reset all of those authentication apps and (laughter), you know, that list of 10 codes that you printed out and stored somewhere, right? 

Aaron Painter: That's exactly right. And that is a nightmare, unfortunately, for both sides because, again, the burden often falls on the user to prove it's really them. And the companies are struggling - we hear recovery can take seven, sometimes 14 days. You know, you're locked out of an account, and the company just kind of throws their hands in the air like that emoji, my favorite. I don't know what to do. How do we really make sure this is you? And that's... 

Dave Bittner: Right. 

Aaron Painter: ...A complex process that today just doesn't have an easy way. We built an easy way. But then you get into, well, can't I just use that same level of assurance every time I'm logging in and provide a greater level of confidence to both sides that the authentic owner is the one getting into that account? And that's what we've tried to build. 

Dave Bittner: Where do you suppose we're ultimately headed with this? I mean, you have, you know, the system that you all have developed here. Can you envision a time in the not-too-distant future when usernames and passwords are a thing of the past? 

Aaron Painter: I think that future would allow a greater sense of security on all sides. And access is really one of the greatest threat vectors into a system or a network or, you know, an account if I'm an end user. The wrong person getting in has just detrimental consequences for everyone involved - the cost of fraud for the company and then that risk of identity theft. 

Aaron Painter: In fact, this is what sort of led me in the first place to get this team together and build Nametag. I had friends and family members who had become victims of identity theft early in the pandemic, and it led to some really dire consequences. And when I tried to help them recover and I went to check my own accounts, it was the start of the pandemic. I couldn't go into a branch. But it felt like the Dark Ages. I could do almost everything from my mobile phone except actually prove who I really was. And so, of course, my accounts were insecure. And unfortunately, my friends and family were far from alone. By some accounts, almost half the country has had their identity stolen in the U.S., for example. That's just not a safe infrastructure for any system. 

Dave Bittner: All right, Ben. What do you think? 

Ben Yelin: Really interesting. I love how he called it an identity crisis in the online identity world. But it's a really interesting history. I mean, I think for the last 20 years or so, our assumptions about the safety and security of usernames and passwords have slowly eroded as the bad guys have gotten more sophisticated. And the things that have protected us in the past, like security questions, aren't as secure as we originally thought. So the field is rapidly expanding. And I think we are getting better. I mean, multifactor authentication is a real game-changer. 

Dave Bittner: Yeah. 

Ben Yelin: So I found the interview kind of encouraging in that regard. 

Dave Bittner: Yeah, absolutely. All right. Well, our thanks to Aaron Painter from Nametag for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.