Caveat | Ep 190 | 10.12.23

Issues in cybersecurity policy.


 Caleb Barlow: There are very clear requirements for security that critical infrastructure in China needs to have. We don't have that here in the U.S. It's anything but clear, so we kind of have to give them some points for that. On the other hand, the big "yeah, but" with that is they can go so far as to access the network and can do it, you know, under authorities where they don't even need to tell you why. So, you know, that sets up a scenario we obviously have to be worried about -- privacy laws and intellectual property laws.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the Cyberwire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Ben, hello!

Ben Yelin: Hello, Dave.

Dave Bittner: Today, Ben has the story of Amazon Alexa devices spreading election misinformation. I've got the story of the legal pushback against searches at the U.S. border. And, later in the show, Ben and I welcome back Caleb Barlow who has some challenging global policy issues for us to ponder. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben. We've got some good stories to share. You want to kick things off for us here?

Ben Yelin: So mine is kind of a funny one that also has some serious elements. It's a story that was published in the Washington Post by Cat Zakrzewski -- I believe that's how her name is pronounced. I'm probably completely butchering that. And it is entitled, "Amazon's Alexa has been claiming the 2020 election was stolen."

Dave Bittner: Mm.

Ben Yelin: So I believe it was a reporter from the Huffington Post who first kind of brought this to everybody's attention on the "artist formerly known as Twitter."

Dave Bittner: Mm-hmm.

Ben Yelin: And noticed that if you asked -- at least as of a week or so ago -- if you asked an Alexa device, "Was the 2020 election stolen?", it would respond, "Yes." And basically would cite a couple of Substack articles by sources that aren't exactly reputable. So this is a pretty big problem because, despite the fact that all of us should be aware that Alexa devices are imperfect, they don't always tell us the truth, they are not actually intelligent human beings, I think most people see them as some type of authority. I think when people ask Alexa questions, they expect the answers to be factually correct.

Dave Bittner: Mm-hmm.

Ben Yelin: So Amazon was alerted to this and they said they went through proper protocols. Now this type of misinformation has been removed and when you -- supposedly, when you ask Alexa now if the 2020 election was stolen, it points you to more reputable sources like the Associated Press. So their -- their response was, "These responses were errors that were delivered a small number of times and quickly fixed when brought to our attention. We continually audit and improve the systems we have in place for detecting and blocking inaccurate content." They said they work with credible sources like Reuters, Ballotpedia, and RealClearPolitics to provide real-time information. There's one problem with that. There was an audio clip of somebody asking that question to their Alexa device. I played it on my computer. It triggered my Alexa device and I got an answer two days ago about how the election was allegedly stolen --

Dave Bittner: Really!

Ben Yelin: -- in 2020.

Dave Bittner: Huh!

Ben Yelin: Citing some type of article about voter irregularities in Pennsylvania. So the real serious problem here is people see their Alexa devices as authoritative sources for information.

Dave Bittner: Well, sure. It has all the world's information at -- at its command.

Ben Yelin: Yeah, exactly! And Amazon represents it as we're bringing together the best sources of information. We use reputable sources. We're going to make sure that the information that you receive from our devices is accurate. Here are all the controls that we have in place, and they talk about artificial intelligence tools they use, but also human controls to weed out misinformation. But this is like a pretty big failure on -- on their part if such a broad, important question that's been a major public policy concern for the past several years has ended up devolving into this incorrect answer. And it raises concern for the 2024 election that these devices could be manipulated to spew more false information. Something like what happened in the 2020 election is serious, but you could envision a scenario where an Alexa device learns information from disreputable sources about how an election is run, or where people's polling places are, or what the voting hours are. And if that's incorrect, that could have a major effect on our elections. And it seems like the controls that are already in place are just not adequately -- are -- are not working to the extent that we would expect them to work. So it's certainly a -- a concern. I think Amazon crowdsources its answers from its customers, so the danger of crowdsourcing is when you put a bunch of information into -- if you input that into some type of machine -- garbage in, garbage out. The output is going to reflect what's been put in. Amazon says that it moderates these responses with automation, trained moderators, and customer feedback, but clearly it's not working 100% of the time, and that's certainly a public policy concern.

Dave Bittner: I have so many things I -- this makes me wonder. First of all, how could something like this bubble up to the top of an Amazon answer? How could a low-quality news source end up as the answer? And I wonder, is it volume? Is it that there's so much misinformation being spread around that -- that that's how it ends up being weighted more heavily than the legit news sources like Reuters, like the Associated Press? You know? Those -- those kinds of places. I wonder how this could happen?

Ben Yelin: Yeah. So that's the hypothesis of a professor at New York University who's quoted in this article -- Meredith Broussard. She is the author of More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech. And her view is that these developers, including Amazon, think they have to give a balanced viewpoint, so they alternate sources between the right and the left, thinking that this is going to give some sort of balance. But sources categorized as right-wing or left-wing vary in their quality and reliability. So if you have X number of right-wing sources and X number of left-wing sources, but the sources on one side of the equation are frequently wrong, misinformed, giving out false information, you actually haven't really achieved anything. You may have achieved balance, but it's not balance in search of the objective truth. We've seen that happen, not just with Amazon but with other companies as well. When Facebook, as it was known back then, in 2019 unveiled its specialized news tab, there were a bunch of people who follow media sources for a living criticizing the fact that they included sources like Breitbart News which was -- is a news source led by Donald Trump advisor, Steve Bannon. And this was put next to more traditional outlets. Now there's no good way of doing this because Amazon and Alexa and all these other companies are going to get yelled at by conservatives if their sources are considered to be too liberal.

Dave Bittner: Right.

Ben Yelin: And conservatives have successfully argued for a long time that so-called mainstream news sources like the Associated Press and Reuters and CNN and whomever -- they themselves are -- have some sort of liberal bias or slant. So they worked the refs enough to the extent that these companies really have to bend over backwards to include more right-wing sources, but that's going to lead to -- to outcomes like this. I don't know that there's a right answer because you don't want to get to a point where companies are -- are being too guarded with information and overly reliant on a limited number of so-called "reliable sources" because there are occasions where these reliable sources get things wrong. I mean, we saw that with the censoring of the Hunter Biden laptop story where it turns out that at least elements of the story were true.

Dave Bittner: Mm.

Ben Yelin: So that's -- that's the danger on the other side. But I -- I think the equilibrium we have now which is Amazon Alexa's answers being based on crowdsourced information with proper human and nonhuman controls, that's clearly not working as well as it should. And I think this is a problem that the industry needs to figure out a way to solve.

Dave Bittner: Yeah. I remember a couple years ago I -- I stumbled across a bit of misinformation or -- or -- I don't know if it's fair to say that -- an error when I was asking the automated device in iOS devices whose name I'm not going to mention --

Ben Yelin: Yup. It will trigger -- it will trigger something.

Dave Bittner: -- summoning them. So I asked that device for something that was just factual. It was a President's birthday. What was the Pres -- you know, what -- what is so -- you know, George Washington's birthday. And it got it wrong. It was just wrong. Like, it was -- you know, it was, like, a hundred years off -- wrong. And so that's one thing when you get a fact wrong. But what is an organization like Amazon to do in response to a good-faith query like, you know, was Donald Trump the greatest President in history? Or was Obama the greatest President in history? Right? Like, that's going to elicit -- that -- that -- that's an opinion --

Ben Yelin: Right.

Dave Bittner: -- answer. Right? But -- so what should the device do? Should it say -- like the way I think we see some of these large language models responding where they'll say, you know, "Opinions differ. Some folks say this and here's the evidence for it. And some folks say this and here's the evidence for it." That's not very --

Ben Yelin: Yeah. I think that's the right way to address opinion questions.

Dave Bittner: Right.

Ben Yelin: The gray area is things that are fact-based but could be argued as opinion-based. I mean, there's kind of a meta question. What do you mean by "the election was stolen?" So, like --

Dave Bittner: Right.

Ben Yelin: -- Donald Trump might think that literally hundreds of thousands of illegal immigrants, or whatever, voted in Detroit and that swayed the outcome, and that's obvious B.S. --

Dave Bittner: Yeah.

Ben Yelin: -- been debunked a million times. But there are some people on the conservative side of the country who think that big tech's involvement in suppressing the Hunter Biden story, that that was election interference. In their minds, that means the election was stolen.

Dave Bittner: Mm. Mm-hmm.

Ben Yelin: So --

Dave Bittner: So there's nuance here.

Ben Yelin: There is nuance there.

Dave Bittner: Yeah.

Ben Yelin: It's just a very, very difficult question to resolve. I think there -- I dare to say that there's a role for policy makers to try and get involved here, but I also just don't think that would go over well.

Dave Bittner: Yeah. Well -- and you've got a device whose job is to provide soundbite answers. Right? You don't want your Alexa to give you four or five nuanced paragraphs.

Ben Yelin: Right.

Dave Bittner: You don't have time for that. That's not what you're after.

Ben Yelin: Exactly. When I ask my Alexa device for the weather, I don't want to hear about the jetstream and --

Dave Bittner: Right.

Ben Yelin: -- yeah, the barometric pressure. I just want you to tell me what the temperature is going to be.

Dave Bittner: Mm-hmm.

Ben Yelin: So, yeah, I -- I think it's -- I -- I hesitate to bring a story to our attention where there's no good resolution, and I'm just bringing up a pretty serious problem. But I thought this one was just too good to pass up.

Dave Bittner: Yeah. Makes me wonder, too, if -- like, what happens if over time your device figures out what your particular leanings are -- and provides you the answers it knows you want to hear. Right?

Ben Yelin: It reconfirms your biases. Yeah. I mean --

Dave Bittner: I mean, think about -- like, I'm sure either of us could go to -- let's just use Amazon. Right? Amazon, I am certain, knows what your political leanings are, what my political leanings are, my parents, my spouse, my kids. You know, it knows that. It has somewhere in its database --

Ben Yelin: Yup.

Dave Bittner: -- it knows that. What are the ramifications of it presenting me answers that it suspects will please me in a fuzzy situation like this?

Ben Yelin: First, I would say you've got to trick it, by the way. This is what I do with YouTube. So -- I'll search both liberal and conservative content. I've watched a lot of Jordan Peterson videos.

Dave Bittner: Uh-huh.

Ben Yelin: I'm kind of fascinated by him. I wouldn't even -- I wouldn't say I like him. I just find some of what he says to be entertaining in kind of a weird way.

Dave Bittner: Uh-huh.

Ben Yelin: But it means that I get a nice proper balance of political content on my YouTube feed. So try and trick it that way.

Dave Bittner: So you're trying to break your own bubble.

Ben Yelin: Exactly.

Dave Bittner: Uh-huh.

Ben Yelin: Which I actually think is a healthy media consumption habit for everybody: figure out what the outlets you think are extreme on the other side of the political spectrum are saying. Be ready to contend with those points. But, yeah, I mean, I think you absolutely could get this confirmation bias. We see it in all different types of algorithms for social media sites: they give us content that we want to hear, that we engage with, and that ends up confirming our biases. You'd think something like Amazon wouldn't be like your Facebook page, but would be more like something like Wikipedia where there's at least an effort to strive for the truth. Now don't get me started on whether Wikipedia actually represents the truth, but that's at least the ostensible goal. And I think it should be Amazon's goal -- at least with Alexa devices -- not to give you content that confirms your biases but to give you, at least to the best of its ability, true information based on objective facts. That would be -- I mean, that's something that I think they should prioritize, at least.

Dave Bittner: Mm-hmm. Do you suppose that these devices should opt out of giving answers to some things?

Ben Yelin: No, I don't. I mean, there are some very philosophical questions where I think it can tell you that it doesn't have a good answer -- like, is there life after death? I don't think your Amazon device can give you a satisfying answer. I don't think there is a right or wrong answer that's going to satisfy everybody. But I also don't think that they should try to suppress factual information just because they might get it wrong. I think it's more important in the long term to make sure that you are a valuable source of information, that people can properly rely on you to learn about what's going on in the world.

Dave Bittner: Yeah. I guess it's particularly hard when you have actively hot-button issues like this. Right? People are, for better, for worse --

Ben Yelin: Maybe we can just get rid of active hot-button issues and everything will be resolved.

Dave Bittner: Great idea, Ben. Why hadn't I thought about that? That's -- that's easy. Consider it done, my friend --

Ben Yelin: Easy solution.

Dave Bittner: -- yeah. All right. Well, we will have a link to that story in the show notes. My story this week actually comes -- this is from the Knight First Amendment Institute at Columbia University. They recently filed an amicus brief in a case that is challenging electronic device searches at the border. This is something you and I have talked about many times here on the show, and it's actually coming up. There's a case coming up about whether these warrantless device searches violate the First and Fourth Amendments and the Knight Institute has weighed in on this. Their opinion is that these things do violate our -- our rights. One of the reasons I wanted to bring this up, Ben, is just that I think it's great that this is being looked at and being considered. I certainly think that the -- the -- the reach is overly broad when it comes to border searches of our mobile devices. What's your take here?

Ben Yelin: Yeah. So most circuits have said that these types of border -- forensic border searches are constitutional under a reasonableness framework. The 2nd Circuit, which is where this court decision will take place, hasn't weighed in yet. And I think this is certainly a live controversy. Really, you're balancing competing values and competing legal principles. One value is that there is a lesser standard as it applies to the Fourth Amendment when it comes to border searches because we have a special societal need at the border to make sure that we're not letting in contraband, that we're not letting in people that might do us harm. The government's interest is higher for those types of searches than in your garden-variety criminal investigation, where the stakes are much lower. But the other value here is the value of the information that we store on our devices now. We've seen the Supreme Court in several cases re-emphasize that cell phones are basically part of our bodies. They contain such a multitude of information. They mention this in Riley v. California and, to a slightly different and lesser extent, in the Carpenter v. United States case: we should have an increased expectation of privacy in the contents of our digital devices, especially for U.S. persons coming back across the border from overseas. So I think that this really is something of a competing values question. Luckily, the Fourth Amendment, at least the way jurisprudence has worked, has a pretty good way of resolving these disputes. It's not the most objective way to resolve them, but it is a framework that courts have used, and that's this reasonableness framework. And to use this reasonableness framework, you have to weigh the invasion of privacy or the inhibition on personal privacy against the security interest -- the societal interest here. And I think that's what the 2nd Circuit is going to have to do.
And I think what the Knight Institute has done here is put a giant bag of pennies on the scale of the invasion of privacy side.

Dave Bittner: Mm.

Ben Yelin: If you imagine the sort of balancing test --

Dave Bittner: Mm-hmm.

Ben Yelin: -- what they're saying is it's not just Fourth Amendment rights that are at stake, but First Amendment rights -- the rights of attorneys who represent disfavored clients who may be located overseas. And then, most importantly to them as an institute concerned with the First Amendment and journalism, there are the rights of journalists to protect private information that they're using as part of their investigations. So I think those are things the 2nd Circuit is going to have to take into consideration. I think this brief is persuasive. I think you could certainly make an argument that the invasion of privacy here outweighs the potential benefits to our -- our national security. But that's going to be their task, and I think the Knight Institute, from what I've read, focuses on that narrower question of the inhibition of privacy. And I think their attorneys, when they're in court, are also going to have to argue that you're not sacrificing much in terms of the security of the United States. And that's going to be a much more difficult argument to make.

Dave Bittner: What happens if they convince this circuit court to take their side and that's contrary to some of the other federal circuit courts? Does that put them on a pathway to the Supreme Court?

Ben Yelin: Probably, yeah. So we'd have a circuit split. In the 2nd Circuit, there would be a warrant requirement for border searches for devices, and that wouldn't be the case in some of these other circuits. That means, usually with a circuit split on a high-profile issue, those are the types of cases that the Supreme Court loves to take. They only take a hundred some odd cases every year, so there's certainly no guarantee that this would be resolved in short order. But I would expect this type of thing to come to the Supreme Court if we see that circuit split.

Dave Bittner: Mm. All right. Well, we will have a link to that from the Knight Institute in the show notes. And, of course, we would love to hear from you. If there is something you would like us to consider for the show, you can email us. [ Music ] Ben, it is my pleasure to welcome back to the show Caleb Barlow. He is the CEO at Cylete and a regular guest on our show here. Caleb, welcome back.

Caleb Barlow: Hey, guys. How are you doing today?

Dave Bittner: Doing good -- doing good. Ben, you want to say hi to Caleb?

Ben Yelin: Good to talk to you again, Caleb. Welcome back.

Caleb Barlow: All right.

Dave Bittner: So we -- we have been brought together here for a rare dual interview at Caleb's invite. And he has a fun game that he's going to play with us today -- a little what-if mental exercise. Caleb, you want to describe wha -- what you're up to today?

Caleb Barlow: Okay. Well, first off -- you -- you guys need to close your eyes for a second and I want you to imagine a world -- a world where we don't have 52 different [inaudible 00:21:04] disclosure laws, a world where we have a federal privacy standard in place that governs privacy across the land.

Ben Yelin: I mean, I have an imagination, but I don't know if my imagination can extend that far.

Dave Bittner: Just crazy talk, Caleb.

Caleb Barlow: We said this was a game, Ben. You've got to play along.

Ben Yelin: All right. All right. Won't fight the scenario.

Caleb Barlow: Imagine that the government has clear standards for security that everyone needs to follow and that critical infrastructure and government actually have a tight working relationship. And -- and even that vulnerabilities, when found in software, are reported immediately to the government, in fact, before they're even publicly disclosed. Can you imagine such a world, Ben?

Ben Yelin: I don't know that I can, to be honest. I -- I'm having to extend my imagination the way I do when my daughter talks about unicorns.

Dave Bittner: Yes. There you go.

Caleb Barlow: Well, there is a place in the world, and it's called China.

Ben Yelin: I bet people didn't think you were going to say that, which is what makes this segment so fun. Go on.

Dave Bittner: Yeah.

Caleb Barlow: So -- so, you know, there is one caveat, all pun intended, and it's China. Right? And we -- you know, but I think what's really interesting and what we'd like to do today, Ben, is have you sit in the seat, not of the SCOTUS nerd that you are all the time, that we all love, but sit in the seat of a China lawyer and, you know, let's talk about some of these laws and the implications because what I think is really interesting, actually, is that in a lot of cases, in different scenarios, these laws are way ahead of the U.S. because they can get agreement on things. Right? I mean, it's --

Ben Yelin: Is -- is agreement in air quotes?

Caleb Barlow: Yes, agreement is in air quotes.

Ben Yelin: If -- if China needs to build a road, they build a road. Right?

Caleb Barlow: Yeah. I mean --

Ben Yelin: Or a dam or whatever. I think that truth kind of underlies everything here, as one of the themes of our show is that paralysis in Congress means that getting anything passed is just exceedingly difficult and you don't have that problem in totalitarian countries --

Dave Bittner: Right.

Ben Yelin: -- for better or worse.

Caleb Barlow: But -- but here is the reality. Right? For U.S. companies is -- this is soon to be the largest economy in the world, so we do need to figure this out. And I think a lot of what U.S. companies need to be able to do is really evaluate the risk because, although -- you know, and this part is like the U.S. -- although the laws are on the books, it doesn't necessarily mean they've been used yet or litigated, so they're really scary but we don't really know what they mean. So how about we start with a scenario and let's start with something pretty simple and common. Let's say we're in a software business. You know, we develop some software. We sell maybe a popular business productivity application and we've developed it in the U.S. but it's sold in China and, you know, although we don't do software development in China, we do, like, a lot of localization. We've got Chinese employees and, of course, the Chinese audience is just a massive opportunity for our company. So, of course, we want to sell there. Now let's say we found a vulnerability in one of our software applications during our software development that nobody knows about. Can we just patch and fix this? Or is this a little bit different in China?

Dave Bittner: I -- I will jump in here and say it is my understanding that I am obligated, indeed perhaps compelled, to reveal this vulnerability to the Chinese government before I reveal it to the general public. Is that right?

Caleb Barlow: I believe that is correct. Ben?

Ben Yelin: I believe that is the nature of the Chinese privacy law --

Dave Bittner: All right.

Ben Yelin: -- which is not the nature -- at least at the federal level -- of our privacy apparatus.

Caleb Barlow: Now imagine being the software development company. Like, what's the difference between a vulnerability and a bug? Like, I mean, you know, your typical software product has hundreds, if not thousands, of known bugs and vulnerabilities and, you know, oftentimes you're kind of working through a stack ranking of what can be publicly exploited. What's publicly known? You know, what do we know about this vulnerability that's behind the scenes that nobody is ever going to find? So that -- that's going to be the thing we fix next quarter. And, you know, this really changes the way a company has got to think about their vulnerability and bug stack because everybody that builds software releases software with known bugs in it all the time.

Ben Yelin: That's part of an iterative process of just being a technology company. I mean, that's why they're successful is you don't wait until all of the bugs are fixed before you bring a product to market, otherwise you'd never bring a product to market.

Dave Bittner: Right.

Ben Yelin: Yeah.

Caleb Barlow: But -- but, you know, how does -- but think about this for a second. Like, if -- if you're sitting there kind of as legal counsel now and you've got this company and they're doing work in China, Ben, like, how does your advice change on what you're going to tell that company to do with these -- you know, you come up time for the release and, hey, there's fifty known vulnerabilities in the stack we're about to release and it's going to be sold in China. What do you do?

Ben Yelin: I think you have to err on the side of caution. First of all, I -- I should caveat -- also pun intended -- that this is not legal advice. I know we say that at the beginning of our show, but I don't want anybody bringing this up in court. Right? When -- when you're in front of some proceeding in -- in Chinese civil court. So I will -- I will caveat it with that. But I think you have to err on the side of caution, both because we don't know how these laws are going to be applied and because you don't want to lose your ability to sell your product in China, which I think is going to be the ultimate consequence of a compliance failure. In a way, that's just not as much of a risk in the United States. Our enforcement agencies generally, for a bunch of reasons, aren't strong enough to just ban you from selling your product, except under very limited circumstances usually relating to something like national security. So I think as a -- a counsel for one of these companies, I would be extremely cautious and I would be kind of overeager to reveal all known vulnerabilities. Basically, in the event of a tie, if there's any sort of question, I think you have to reveal them in order to ensure compliance, just because we don't know how this law is going to be enforced.

Dave Bittner: Can I jump in here and just say, I mean, what -- what -- don't you think it would be similar to the kind of thing that happens here where a conversation with the regulator is in order to say -- so what do you really -- like, how -- how much do you want from me? Right? Like, you know, do you want bugs? Do you want -- or do -- at what level does it rise to --

Ben Yelin: From bug to vulnerability.

Dave Bittner: Right.

Ben Yelin: Yeah.

Dave Bittner: Where on the spectrum do you want it to be where you want me to reach out to you so I'm not filling your inbox with -- with noise?

Caleb Barlow: Well -- and now, let me throw another question in here, though. Okay. Wherever that answer lies -- and I think we kind of heard Ben's opinion from the lawyer's perspective -- now let's say you're the VP of Development, and you're sitting here and Ben just said, well, you know, we've got to err on the side of caution, and that VP of Development is looking at us and going, well, yeah -- I've got to get the next release out a year and a half from now. Right? I mean, this is going to be the really interesting internal debate that I think could dramatically change, if China flexes its muscles on this, how we develop and release software, because we're going to be paranoid about pushing anything out that's got a known vulnerability in it that might be exploited. On the other hand, we've got the issue of -- I only have so many developers and so much time to fix all this stuff.

Ben Yelin: Can I give a super, super hot take here, questioning a key premise of everything that you're saying?

Caleb Barlow: Yeah! Please! That's the idea.

Ben Yelin: What if China is not as economically strong as we think they are? We've projected them to be the strongest worldwide market, but actually their economic vulnerabilities are worse than we could have anticipated. They're no longer the most populous country. That is now India. And due to a variety of factors, economic growth has significantly slowed. I mean, it still exceeds what we have here in the United States. But from where we were even fifteen, twenty years ago -- I honestly think that's going to be a consideration if I'm the VP of Development. I mean, if you're going to start making the assumption that compliance is going to be very difficult and very costly, I would at least question the premise that this is a market we want to be selling in.

Caleb Barlow: I think -- I think that is the key point, Ben. I really think that is a very key point in all of this is evaluating that risk and the implications it could have to your broader business because, you know, just like GDPR affects how we treat privacy here in the U.S., even though it's not a U.S. law, I think this is a really great example of where a Chinese law could affect our development schedules for our products around the world, and we may decide, hey, we're just not going to do this.

Dave Bittner: What if you're, like, the eight-hundred-pound gorilla here. Say you're Apple and you're doing the vast majority of your manufacturing in China so you're dependent on them for that. You can't just switch to -- there aren't other providers who can make the things you need in the volume that you need them.

Ben Yelin: Maybe we just loosen our child labor laws, Dave. That was a joke, by the way. Sorry.

Caleb Barlow: Well, okay. So this is actually an interesting --

Dave Bittner: People are advocating for that!

Caleb Barlow: -- this is actually an interesting lead-in, though, to --

Dave Bittner: Yeah.

Caleb Barlow: -- another scenario. Right? So let -- let's say we're Apple or a -- a hardware manufacturer with a new product in China that, you know, we've, like -- you know, take an iPhone. Right? Huge amounts of innovation. All kinds of patent pending features and security protocols. Maybe it's an IoT device. Maybe it's a mobile phone. You know, in the U.S., government inspections are really limited to, you know, kind of product safety, lifecycle management, you know, recycle -- you know, your ability to recycle the product. Things like that. I mean, that's where we see lots of regulations around hardware products. How is that going to differ if we're Apple making something like an iPhone in China? I mean, what -- what do you think are the implications for the government's access to my IP, my network, and things like that?

Dave Bittner: Well, I mean, I immediately think of things like the secure enclave, you know, hardware on Apple devices, on the laptops, on the phones and things like that. And -- and I will be first to admit that I am speaking from a position of ignorance here when it comes to the under the hood encryption and technology and all that sort of thing, but what if the government requires that you reveal everything that goes into making a part of your device that is dedicated to security, and in doing so undermines that security. Do you then -- do you have to be clever to make a type of security that cannot be unwound? Or how do you proceed? Do -- do you -- how do you lift the hood for China and refuse it to your home nation, the United States? Ben?

Ben Yelin: That's a really good question. I mean, we've now actually seen tech companies do that and they've gotten some pretty bad publicity for it.

Dave Bittner: Mm.

Ben Yelin: I think of X and Elon Musk, who has taken this very hard line for free speech absolutism where we're not going to be restricting any sort of content in the United States. These are, you know, principles that we hold dear.

Dave Bittner: Mm-hmm.

Ben Yelin: But then when some type of foreign government -- it's happened in China -- it's also happened in countries like Turkey and other Middle Eastern countries -- asks X to take down content, they've been very compliant. The same thing has happened with Modi's government in India. I think there's a cost to that type of compliance -- in terms of publicity, but it's also -- it hasn't brought down the company. So maybe you can -- maybe you do have a little bit of latitude to treat your own country a little more poorly, so to speak, than -- than you treat other authoritarian countries.

Dave Bittner: Hmm.

Caleb Barlow: Well, there -- there's another aspect to this, too, which is, depending on your classification of critical infrastructure in China, which it doesn't seem like -- at least I couldn't find any real details on how you determine whether you're critical infrastructure or not. But let's say you, you know, fall into that category. They've got rights to access your network, and, you know, the point here is, if designs, whether that's a circuit board or software, are on that network, they can get access to that if -- if they deem it's required. So, you know, I think this also sets up a really interesting scenario. Right? On one hand, you know, the positive side of this -- right? -- I mean if we try to look at this with a glass half full lens, there are very clear requirements for security that critical infrastructure in China needs to have. We don't have that here in the U.S. It's anything but clear, so we kind of have to give them some points for that. On the other hand, the big "yeah, but" with that is they can go so far as to access the network and can do it, you know, under authorities where they don't even need to tell you why. So, you know, that sets up a scenario we obviously have to be worried about -- privacy laws and intellectual property laws. And if -- if you're developing some high-tech product in China, I think that becomes a really challenging risk to manage.

Ben Yelin: Yeah, I mean, I think it's almost an impossible risk to manage, and then you go back to that evaluation of -- is it worth selling your product there? If you're going to be exposing those kind of vulnerabilities and showing your customers -- your Western customers, your U.S. customers -- that you're willing to have your hardware, software, whatever, snooped on by the Chinese Communist Party. I mean, certainly a company like Apple is going to be conscious of something like that. The smaller companies are not and, for the most part, if your product is good enough, you can get away with it. The kids still love to use TikTok, even though we know that ByteDance is basically an agent of the Chinese government, and TikTok is still one of the most popular applications in the United States. Despite complaints in Congress, we have not yet banned TikTok. We've banned them in a couple of states. I know Montana is one -- one of the states that's tried to ban them. Utah, I believe, is another. And we've seen state governments prohibit the use of TikTok on government employee devices. But we haven't actually been able to get rid of it. So -- yeah, I mean, it is a balance. I think if your product is -- is good enough, if enough people want to use it, then people are willing to look the other way on how you handle privacy threats vis-a-vis China. I think that's the reality of it.

Caleb Barlow: What -- what about intellectual property theft? Like, how -- you know, the -- I think you've got -- you had a very interesting kind of demarcation on the privacy front. Right? Which -- I don't like that answer, but I would agree with it. Right? How would you do the same evaluation -- so let's say you're not an Apple. You're some mid-level manufacturer that has a product and you're worried about the intellectual property theft side of this. Right? How do you advise someone in making that kind of determination of, hey, I really want to go manufacture this in China because I'm going to be able to sell my product in the U.S. for half what it costs me if I was going to build it here. So I might not even be trying to market my product in China. I might just be trying to build it there. How do I manage and mitigate that risk?

Ben Yelin: It's -- it's a very difficult risk to try and manage. I mean, in the last ten years, we've seen an explosion of intellectual property theft, to the point that it's become a major public policy concern at the federal level. You know, the policy guy in me thinks there are enough proposals in Congress to cut down on intellectual property theft on the -- on the part of the Chinese government that maybe we can use the full force of our federal government and our laws to change policies so that it becomes -- it can continue to be cost effective to manufacture products in China. The other option is that we become more protectionist. But, again, that's going to increase prices for consumers which has its own drawbacks from both the micro level as an individual company, and the macro level. We've seen over the past couple of years people, no matter what the other benefits are -- low unemployment -- people don't like things being more expensive.

Dave Bittner: Mm. Right. Right. So it's the --

Ben Yelin: People have freaked out about that.

Dave Bittner: Yeah. I used to make a sort of a dark joke about how we hate the idea that -- of -- of child labor, but we sure do love $25 DVD players. You know? Like --

Caleb Barlow: Isn't it true?

Dave Bittner: Yeah. I'll bring up another quick point here. You know, back in -- in my days as -- in the television world, I had a friend who was doing a production in China and visited a Chinese television studio. And he looked up in the lighting grid and he saw that the grid was full of these beautiful ARRI lights. And ARRI is a German brand. It's A-R-R-I. And they're one of the top providers of lighting equipment in the film and television production world. But on closer inspection, he found that those -- every single one of those lights was an exact knock-off of an ARRI light. And they were perfect.

Ben Yelin: Including, probably, the logo, too. Right?

Dave Bittner: Yeah. There -- but there was no quality hit. There was -- like, they went and they made their own. And so I wonder -- is that something you just accept because -- if the -- the Chinese want to make something, they're going to make it. And what can you do about it? Right? If you sell a product and someone can buy it and make a copy of it, or get the plans for it, or -- do you see where I'm going with this? Like, if -- if you're someone who's trying to sell into China, is part of your equation that there are going to be knock-offs. And how do you fight that?

Caleb Barlow: I think it depends a lot on what you're making. Right? If you're making handbags that can be sold out of the back of a truck on a New York City street, then there's a real risk, both in your primary market as well as the secondary market, of selling into China. You know, on the other hand, like, a buddy of mine manufactures these really large, steel basketball hoops that you -- you know, if you want, like, an NBA-quality hoop at your house, you can buy them from him. Well, he gets them made in China because it's so much less expensive. But, okay, let's say there's a knock-off in China. That's not his primary market. And it's not like you can sell one of these things, because they weigh over a thousand pounds, out of the back of a truck in the U.S. So, you know, his risk there is really low versus, like, if you were a handbag manufacturer.

Dave Bittner: But what -- I guess what I'm getting at is that with that -- would the equation for him be that I'm not even going to try to sell in the Chinese market because I know there's no way I can beat the knock-offs?

Caleb Barlow: 100%. And I'm sure he's got knock-offs there. All right. Let me give you one last one. So, okay, Ben, let's -- let's say -- and, you know, if you're a social media company, if you're a carrier, you know, you have lawful requests from government coming in all the time. You know, maybe they're tracking down an online stalking situation and they come to you with a lawful subpoena requesting information on one of your clients. You know, so in the U.S., we know how companies deal with this. And in some cases, we even see companies like Google and Apple push back sometimes on these things. You know, so let's take that same example. Let's say we're operating in China with our social media company. We have a lawful request that comes in on an online stalking situation, or at least what allegedly is one, requesting information on, you know, a hundred of our clients. What do we do? How do we handle that situation maybe differently than we would the same request if it came in in the U.S.?

Ben Yelin: [sighing] That was a loud sigh.

Caleb Barlow: Oh, see -- I'm calling that a victory. I got a sigh.

Ben Yelin: It's a --

Caleb Barlow: I got a sigh out of Ben Yelin. I've done my job today.

Ben Yelin: All right. I -- I will quote you on that. I mean, there are -- there are several kind of angles you could use to approach this question. There's kind of the ethical angle and then there's the practical angle. If -- if my role is with a company that has manufactured hardware, software, or whatever, I think you have to take the -- kind of the same mode of caution that you would with every other privacy regulation that we discussed earlier. I think if you're really concerned about your market share there, you do not want to get on the bad side of the Chinese government. So you basically can't push back at all against these requests, no matter how egregious they are. It's not easy to push back against these requests in the United States. That's why very few companies actually end up doing it. I mean, we always talk about these high profile examples, but just in the course of business I would estimate, what, 95, 99% of the time they just comply with the subpoena and nobody ever hears about it.

Caleb Barlow: And what happens if that was -- that request was for a geotag location around a government protest?

Ben Yelin: [sighing]

Caleb Barlow: I got a second sigh!! Did you hear that? I mean, I got a second sigh!!

Ben Yelin: That's really bad, morally -- it's really bad. I don't -- I would not want to be associated with that as a company. I think if you want to sell products in China and if you're more concerned about your bottom line, I think -- I think you say yes. Now I personally would not, because I like to feel that I would have the type of corporate values that would be willing to forego a little bit of revenue in service of the human rights we refer to in the United States as "free speech rights" -- "First Amendment rights," etc. I think there is some level of corporate responsibility that goes on there. I also don't manage or have any affiliation with a software company with very, very thin margins, where I have a niche product and, if my market in China fails, I'm out of business. You know, I don't have access to -- to some venture capitalist that's going to bail me out. I understand that that's a completely different equation. I consider myself a moral person, and I -- I think if I were working for one of those companies, you have to think about corporate responsibility and avoiding the situation where you're fostering further totalitarianism. But I also know that there are representatives of these companies who will say, yeah, but I need to sell my product there. I spent -- I've spent my life developing this. This is the only way my product is going to survive. I -- I just have to do it.

Caleb Barlow: Well, and I -- and I think this is the, you know, the overall point of this little fun exercise, right, is that at the end of the day, if you're going to do business in China, what you're going to need to do is really build those crisis and incident response plans for what happens when you get into these situations, but also kind of tabletop them ahead of time. You know, what are the -- what is the parade of horribles that we might run into? How are we going to think about that? And just like we talk all the time about cybersecurity incident response, can we lay out our response ahead of time, so that we can think about it ahead of time, debate it philosophically ahead of time, so that when it actually does happen we already know what decision we're going to make and what the risks of that decision are.

Ben Yelin: I think that is such a great point. As somebody who's in the emergency management field, I will never argue against tabletop exercises. I think that's exactly what has to happen here. I think you need to play out these scenarios the way that we're doing right now. I mean, these are ultimately value judgments. I don't think there is a correct answer to the questions that you're asking. It depends on risks and benefits, and also how strong your values are relative to how much money you want to make. So I think you do need to workshop this, and I think the C-suite needs to be there when you do.

Caleb Barlow: Yeah. The only -- the only wrong answer is not having an answer and not making a decision because that -- that usually is a decision and it's usually a bad one.

Ben Yelin: Exactly.

Caleb Barlow: And if you go into it thinking, "This is not going to happen to me; I can get all the carrots without facing any of the sticks," you're wrong. You are going to be faced with some type of dilemma that comes up if you are selling your product in China. So you better be ready for it. I think, based on the discussion we've had here, I -- I think hopefully people will now know that you're going to have to make these difficult choices, and get the people in the room who are going to make those choices and -- and have them talk it out. Weigh the risks and benefits.

Dave Bittner: All right. Well, we're going to have to wrap it up there. Again, our thanks to Caleb Barlow for joining us. He is the CEO at Cylete. Caleb, thank you so much for bringing this to us. This was great fun. [ Music ] That is our show. We want to thank all of you for listening. N2K Strategic Workforce Intelligence optimizes the value of your biggest investment -- your people. We make you smarter about your team while making your team smarter. Learn more at Our senior producer is Jennifer Eiben. This show is edited by Tre Hester. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening. [ Music ]