Caveat 11.3.22
Ep 148 | 11.3.22

Privacy Compliance and Culture.


Christina Montgomery: My concern is that with the regulatory focus, with the splinternet, those real, necessary, real-world uses of data are going to get caught up in a regulation that's going to hamper the economy.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a possible new right-to-repair law in New York State. I want to talk about Twitter, who you may have noticed has a new boss. And later in the show, Ben's conversation with Christina Montgomery, vice president and chief privacy officer at IBM. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right. Ben, lots to talk about this week. Why don't you start things off for us here? 

Ben Yelin: So I have a story from our friends - or I guess our unrequited friends - at Motherboard by Vice, written by Matthew Gault. 

Dave Bittner: OK. 

Ben Yelin: And this is about a statute that passed the state legislature in New York and is currently awaiting the governor's signature. So the statute is called the Digital Fair Repair Act. It would be the first state law of its kind that would institute a so-called right to repair. So there's been this problem of people wanting to have access to some of the tools, instruction manuals to fix their own devices. But obviously, the manufacturers want to prevent people from being able to do that because they want the revenue from fixing the devices themselves or forcing people to purchase new devices. 

Dave Bittner: Right. 

Ben Yelin: So what this bill would do is it would force manufacturers to offer documentation, tools and parts to customers and independent repair stores. The bill has been whittled down. We had talked in previous stories about kind of non-big-tech industry companies and the right to repair, so things like John Deere tractors. 

Dave Bittner: Right. 

Ben Yelin: And again, these have digital parts. We're not just talking about the wheels and the engines or whatever. 

Dave Bittner: Yeah, everything's a computer now (laughter). 

Ben Yelin: Exactly. Exactly. 

Dave Bittner: Right. 

Ben Yelin: Those interests are apparently quite powerful in the New York state legislature. You don't want to mess with the John Deere people. 

Dave Bittner: Who knew (laughter)? 

Ben Yelin: Yeah. Yeah. I mean, look, if you have a giant state where a good portion of the legislators, especially in the state Senate, are from upstate New York, where - I don't know if you've been there, but there is a lot of farmland. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah, you... 

Dave Bittner: It's gorgeous, yeah. 

Ben Yelin: It is great country up there. 

Dave Bittner: Yeah. 

Ben Yelin: But you do not want to mess with the John Deere people. So they were actually excluded from the bill. Then there's also medical device companies. They were included in the original version of the bill, and they were able to eke out their own exemption - another very powerful lobby. But the law still would cover electronics like iPhones. So I think for most people who look at a right to repair law, they're thinking about their personal devices. And if anything goes wrong, if you have an iPhone, you have to go to the Apple Store. You know, you might not be able to get an appointment at the Genius Bar for six weeks. 

Dave Bittner: Right. 

Ben Yelin: It might be more convenient to just buy a new device, even if, you know, there's the possibility that if you went to an independent dealer, they would be able to fix your device. They'd have the parts, the tools, the know-how. 

Dave Bittner: Right. 

Ben Yelin: So the only question right now is whether the current governor of New York, Kathy Hochul, will sign the bill. Her staff has said that she is reviewing it. It's certainly been a long time. The bill passed with pretty large majorities - bipartisan majorities in both houses of the state legislature back in June. So I don't really know exactly what she's waiting for. The other complicating factor is she is in a surprisingly close race... 

Dave Bittner: Oh. 

Ben Yelin: ...For reelection. 

Dave Bittner: I see. 

Ben Yelin: So it could be that she just doesn't want to upset the wrong people... 

Dave Bittner: Right. 

Ben Yelin: ...Prior to November. It's also very possible that she loses, which would mean a new Republican governor comes in, in January, and that Republican governor may not be as amenable to this type of law, even though it's popular in both parties. So I guess we'll just have to hold our breath on whether this gets signed into law, but if it does, I mean, this could be a model for the other 49 states to adopt this type of right-to-repair legislation. 

Dave Bittner: And do we suspect this is the kind of thing that would go state by state instead of a - some sort of federal effort? 

Ben Yelin: Yeah. So I do think this would be more of a state-by-state effort. President Biden was able to institute an executive order that put into place a general right-to-repair policy. But its applicability is limited. It's only in terms of organizations, businesses that contract with the federal government, which is pretty much the only power that the federal government has unless it kind of occupies the field and comes up with a federal statute and regulation for this topic. So I think this is going to be more of a state-by-state issue. And obviously, that depends on the politics of the state, the individual players. I don't think there's a lot of sympathy for the Apples of the world. 

Dave Bittner: Yeah. 

Ben Yelin: You know, I don't think there's a Republican legislature in whatever - Oklahoma, Nebraska, South Dakota are going to be swayed by Apple coming in and saying, you know, for a variety of reasons - most notably our bottom line - we don't want to have this right to repair law. That's why I think it's something that could really spread, even though there are powerful interests aligned against it. And certainly, I understand the reasons for that. But you just - you know, I just think there's enough antipathy to these big tech companies, even among more conservative states, that I wouldn't be surprised to see lots of these laws passed in very different states in terms of their political geography. 

Dave Bittner: Yeah. You know, I always think of the automotive industry when I think about this because of - we have this tradition of, if you want to change your own oil, you can do that. If you want to change the filters - but also, if you want to go to a third-party shop, you can have your work done there. Or you can go to the dealer, right? 

Ben Yelin: Right. 

Dave Bittner: The dealer - you know, the dealer's going to have probably the most training on your specific vehicle, and as the vehicles become more and more computerized, you can understand how that might become important, and the dealers might have specific equipment that can interface with the - you know, I always think of - and I'm going - I realize I'm going down a metaphorical rabbit hole here - but I think about R2-D2 plugging into the Millennium Falcon. You know, like, the dealer has the tools that can interpret what the car is saying. But I'm also reminded of a story I saw earlier this week, which was - there was a gentleman who had had a trailer hitch attached to his Tesla. And just a - you know, a third-party trailer hitch. There's nothing exotic about a trailer hitch, right? You buy a trailer hitch. 

Ben Yelin: It is what it is, yeah. 

Dave Bittner: It bolts on to the frame of the car, and now you can tow things. Not so fast, Ben. Not so fast. Evidently, on a Tesla, the car requires a software update to be able to tow things. 

Ben Yelin: Of course it does. 

Dave Bittner: Now... 

Ben Yelin: Feels like a preview of our next story, by the way. 

Dave Bittner: Now, Teslas have tons of power, tons of torque - electric motors have lots of torque, and torque is what you need for towing. And so evidently, this software update alters some of the software torque curves in the Tesla and how it regenerates power when it's towing something and all that kind of thing. So there's the case to be made that perhaps there is a safety issue here. But Tesla is also saying that they're not going to sell this guy the upgrade because he has a third-party tow hitch. 

Ben Yelin: Yeah, I mean, it all kind of feels like a scam. 

Dave Bittner: Doesn't it, though? 

Ben Yelin: And I think the automotive comparison is a good one. You know, if you go to the dealer for a repair, things might happen a little more quickly. They have access to the parts. They are all experts on your particular vehicle, your make and model. 

Dave Bittner: Right. 

Ben Yelin: So, you know, that's just probably an easier process. But you can go to Bob's mechanic shop down the street. 

Dave Bittner: Yeah. 

Ben Yelin: It might take him a while to get an order in for the parts if there are supply chain shortages, but he can do it. It's much cheaper. 

Dave Bittner: Right. 

Ben Yelin: It's much better for the consumer. If the consumer has a choice between going to the dealer and maybe getting the work done a little more quickly, more access to the parts, institutional expertise versus maybe wait a couple weeks and go to Bob's shop down the street, it's great for the consumer to have that option. And I just don't think we should be depriving them that option in the technological world. 

Dave Bittner: Yeah, and also, to go to your local shade tree mechanic - you know, your buddy who has an awesome set of tools, right? 

Ben Yelin: Yeah. 

Dave Bittner: Who could do the work in the garage these days if they have access to the proper - what is it called? - the OBD tools that interface with the car and a laptop - you know, they should be able to have - do the diagnostics as well - to have the car tell the system what's wrong. So why not? 
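[For readers curious what those OBD tools actually decode, here is a minimal sketch of turning a raw diagnostic trouble code into the familiar "P0xxx" string, assuming the standard SAE J1979 two-byte encoding. The helper name and example bytes are illustrative, not from the episode.]

```python
def decode_dtc(a: int, b: int) -> str:
    """Decode a two-byte OBD-II diagnostic trouble code (SAE J1979, mode 03)."""
    letter = "PCBU"[(a >> 6) & 0x3]  # powertrain / chassis / body / network
    d1 = (a >> 4) & 0x3              # first digit is only 0-3
    d2 = a & 0xF                     # remaining digits are hex nibbles
    d3 = (b >> 4) & 0xF
    d4 = b & 0xF
    return f"{letter}{d1}{d2:X}{d3:X}{d4:X}"

# e.g. the byte pair 0x01 0x33 decodes to "P0133"
print(decode_dtc(0x01, 0x33))
```

[A scan tool reads these byte pairs over the OBD-II port; the decoding itself is just this bit-twiddling, which is part of why independent shops can do the same diagnostics as a dealer when they have access to the interface.]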

Ben Yelin: Right. And we do have authorized independent dealers for things like iPhones. They can repair your... 

Dave Bittner: Yeah. 

Ben Yelin: ...Broken glass screen. So it's not like we... 

Dave Bittner: The mall kiosk. 

Ben Yelin: Right. It's not like we're going to be creating an entire new industry. It's just going to be something that's beneficial for consumers from a cost-saving perspective. 

Dave Bittner: Yeah, yeah. Well, I think it's interesting. Certainly, we'll keep an eye on it. I just think - sounds like you and I are both pro right to repair (laughter). 

Ben Yelin: Yes. 

Dave Bittner: And I guess I just haven't - I haven't really heard a compelling case for the other side. I know people like to raise safety issues, and I think, perhaps in - particularly in the medical device world, I could see that being an issue. 

Ben Yelin: Right. The stakes are too high to trust independent mechanics. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: Yeah. You don't want some third-party firmware being installed on your... 

Ben Yelin: Your pacemaker. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: Or your infusion pump. 

Ben Yelin: (Imitating buzzing sound). 

Dave Bittner: But still, there should be ways around that. If someone's willing to take on the potential liability of that - I don't know. Like, I - I'm on the right-to-repair side, so. 

Ben Yelin: I am too. 

Dave Bittner: (Laughter). 

Ben Yelin: You know, I don't want to seem biased, but look, if John Deere came to me and offered me a few tractors and, you know... 

Dave Bittner: A few tractors... 

Ben Yelin: ...In exchange... 

Dave Bittner: For your... 

Ben Yelin: ...For me changing my mind on this issue, would I be opposed? 

Dave Bittner: I see. 

Ben Yelin: Although I don't really know what I would do with two tractors in... 

Dave Bittner: Right, exactly. 

Ben Yelin: ...Maryland's suburbs. So... 

Dave Bittner: Yeah, your suburban home. What you need are a couple of combines and... 

Ben Yelin: Yes. Exactly, exactly. 

Dave Bittner: ...Threshing machines. Yeah, absolutely. All right. Well, interesting. We'll keep an eye on that. I suspect nothing's going to happen till after the election, but we'll see (laughter). 

Ben Yelin: I would agree with you on that. Yep. 

Dave Bittner: Yeah, yeah. 

Dave Bittner: All right. Well, my story this week is the big one. And we're going to link to an article here in The Verge that's more of an editorial than anything else. But you may not have heard, Ben - but I'm guessing you have heard because I know you're very active on Twitter - that Twitter has a new boss - dare I say overlord (laughter). 

Ben Yelin: If there was ever an appropriate use of the term overlord, I think this is it. Yes. He came in with a literal kitchen sink as some sort of gag to walk into Twitter headquarters. 

Dave Bittner: Hilarious. 

Ben Yelin: Yeah, last... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Week, and it's only gotten more hilarious in the... 

Dave Bittner: Right. 

Ben Yelin: ...Succeeding days. 

Dave Bittner: Yeah, so Elon Musk owns Twitter now. You know, I have to say my take on this is much in the same way that I believe that the Donald Trump presidency was largely the result of a PR stunt that spun horribly out of control. Like, I don't believe Donald Trump was really interested in the hard work that comes with being the president of the United States. 

Ben Yelin: Yep. 

Dave Bittner: I think we're dealing with a similar thing here with Elon Musk buying Twitter. I think he came in with a bunch of bravado, which is part of his MO, and said, oh, I could do a better job at running Twitter - I think I'll buy it - and then found himself backed into a whole lot of SEC rules and couldn't back out. 

Ben Yelin: I think that's exactly what happened. I think he's an impulsive person. I think he has an inflated sense of his own ability to fix everything because he has been successful in business and has run a couple of successful companies. 

Dave Bittner: Right. 

Ben Yelin: I think he was sort of loosely affiliated with Twitter in the sense that, like many other celebrities, he was a frequent user of it. And it seems like - and I can hardly believe I'm saying this, but it seems like he was very offended by the fact that a conservative satirical website, the Babylon Bee, got suspended for misgendering somebody in the Biden administration who's transgender. That seems to have really set him off. Like, within a couple of days of that happening, he was like, you know what? I'm going to buy Twitter. And that caused him to spend... 

Dave Bittner: (Laughter). 

Ben Yelin: ...$44 billion for a company whose maximum valuation, in estimates I've seen, is something like $12 billion... 

Dave Bittner: Yeah. 

Ben Yelin: ...And it hasn't been profitable in years. 

Dave Bittner: Right. 

Ben Yelin: So you're literally just setting money on fire because you are offended about a content moderation decision, and then you could tell that he had his regrets. He saw that the stocks for his other companies were going into the toilet, and he did everything he could to try and back out. He said, I don't want any part of this deal because Twitter misled me on the number of bots they have, which seemed kind of disingenuous. And then Twitter called his bluff and took him to court, and he realized he had really been backed into it. And now, they've given him the keys. And it just seems like an absolute disaster waiting to happen - just a slow-motion trainwreck. 

Dave Bittner: Yeah. 

Ben Yelin: For one, he fired a lot of the C-suite at Twitter, which is understandable for a new person coming in. But he brought in a lot of his own people who, quote, "know how to code," but... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Seem to not really know how Twitter works. 

Dave Bittner: I'd say anybody who's struggled with the interface inside of a Tesla vehicle might think otherwise, but go on. 

Ben Yelin: Right, exactly. 

Dave Bittner: (Laughter). 

Ben Yelin: And they certainly aren't experts on how to run Twitter. 

Dave Bittner: Right. 

Ben Yelin: And then, just in the several days that he's owned the company now, he seems to make impulsive, rash decisions based on reply tweets that he gets. And then, the most recent one is this idea to charge verified accounts $20 a month to keep their accounts verified... 

Dave Bittner: Right. 

Ben Yelin: ...Which he said in a tweet to somebody who complained about this, well, you know, we have to make money somehow. Advertising isn't going to do it. 

Dave Bittner: Yeah, I saw somebody did the math, actually, this morning. Before we recorded, somebody did the math. And I want to say that, if they did that, that would raise something like $80 million, which ain't a lot... 

Ben Yelin: That's really not... 

Dave Bittner: ...For Twitter. 

Ben Yelin: ...That much. 

Dave Bittner: No, no. It's not significant. It's not a game changer at all. 
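[The arithmetic behind that ballpark figure is easy to reproduce. A quick sketch, assuming roughly 400,000 verified accounts - a commonly cited estimate at the time; the account count is an assumption for illustration, not a number from the episode:]

```python
# Rough annual revenue from charging every verified account for the checkmark.
# The account count below is an assumed figure, not one quoted on the show.
verified_accounts = 400_000
monthly_fee = 20  # dollars per month, the figure floated at the time

annual_revenue = verified_accounts * monthly_fee * 12
print(f"${annual_revenue / 1e6:.0f}M per year")  # prints "$96M per year"
```

[That lands on the order of $100 million a year - the same ballpark as the roughly $80 million figure quoted above, and a rounding error against a $44 billion purchase price.]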

Ben Yelin: It seems like a terrible idea to me. 

Dave Bittner: Yeah. 

Ben Yelin: First of all, the reason people are verified - at least originally, it wasn't for any sort of status symbol, even though that kind of - it seems like that's what Elon Musk thinks it is. 

Dave Bittner: Right. 

Ben Yelin: But it's because you want to know that you're receiving a tweet from the legitimate - the person who's actually named in that account, so... 

Dave Bittner: Right. 

Ben Yelin: I'm a big football fan. The ESPN reporter Adam Schefter is the first one who breaks breaking news on football trades, free-agency signings. And every year, somebody comes up with imitation Adam Schefter Twitter accounts. And you have to be very careful not to retweet those 'cause it's always false information. 

Dave Bittner: Right. 

Ben Yelin: That's a very low-stakes example, but it's why we need to know who is the real Adam Schefter, and that's why he has the blue checkmark. 

Dave Bittner: Right. 

Ben Yelin: Now, ESPN would probably pay the $20 a month to keep Adam Schefter's blue checkmark. But in other circumstances - maybe for local public officials - it's not worth paying the $20. And if there's a very essential, real-time emergency message coming from a mayor or a city attorney or even a governor, and that's from a non-verified account, it's going to be impossible to know whether that message is legitimate. And, you know, that's pretty inexcusable when it's not even going to make up for the revenue. 

Dave Bittner: Well, let's talk moderation policy 'cause I think, if that is the origin of Elon Musk's discontent here, we have seen, since he took over, there has been an avalanche of people sort of stress-testing the system, I guess, is a fair way to say it. 

Ben Yelin: Sure, yep. 

Dave Bittner: People just putting out horrible, horrible things, using horrible terms - objectively horrible things - to see, will this get them banned? And, surprise, surprise, right now, it appears as though the content moderation folks have been put to the side. Nothing's happening. 

Ben Yelin: Yeah. Now, he says - he insists that he's not going to change any content moderation policies until he's convened this council that he's formed with many diverse viewpoints. 

Dave Bittner: He got rid of the board, by the way - made himself the sole member. 

Ben Yelin: Right. And as the sole member of this board, I think that's one of the many reasons why a lot of these very offensive tweets haven't been deleted over the weekend - 'cause they've lost that content moderation. The other thing is very prominent people are now tweeting @elonmusk, saying, hey, I got banned by the algorithm for tweeting this out. Can you reinstate me? And he's actually reinstating individuals using his power, even if they've tweeted things that are patently offensive. 

Dave Bittner: Yeah. 

Ben Yelin: He seems to think that he can solve all of these content moderation problems by himself. You know, for instance, a couple weeks ago, with those controversial Kanye West tweets, he just tweeted @kanyewest, saying, hey, let's talk. They had a talk, and he said, you know, I think Kanye gets it, so - all... 

Dave Bittner: Twitter should not be a billionaire's plaything. I'm sorry (laughter). 

Ben Yelin: Yeah. I mean, so that's ultimately what's scary about this. 

Dave Bittner: Right. 

Ben Yelin: It is - say what you will about Twitter. It is a hellscape. I admit it. I spend a lot of my time on it. 

Dave Bittner: (Laughter). 

Ben Yelin: There were a lot of problems with Twitter pre Elon Musk. But it is an important tool, particularly for journalists, political journalists, in just being able to get a message out that's curated. And if it becomes a cesspool of hate and unmoderated content that most people find obnoxious, people are going to leave the platform. There aren't great alternatives right now, but I think there certainly might be a market for it. 

Dave Bittner: Right. So there's a market opportunity here, perhaps. 

Ben Yelin: There is. If anybody wants to get in on the ground floor, I think there are plenty of companies that are trying to take off right now. What I'm concerned about is - so you'd think, OK, well, the advertisers will solve that problem because... 

Dave Bittner: Right. 

Ben Yelin: ...If they see that the problem has been overrun - if the platform has been overrun by Nazis, they'll say, get your you-know-what together or we're going to cut our advertising... 

Dave Bittner: Right. 

Ben Yelin: 'Cause we don't want to advertise... 

Dave Bittner: Right. 

Ben Yelin: ...Next to them. 

Dave Bittner: Or we find Twitter solely funded by the MyPillow guy. 

Ben Yelin: Right. Mike Lindell... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Has... 

Dave Bittner: Right, right. 

Ben Yelin: ...Taken out his sixth mortgage to... 

Dave Bittner: Right (laughter). 

Ben Yelin: The problem with that is I don't know that Elon Musk, in this instance, is entirely motivated by money, which, in some sense, is scarier. I think he thinks of himself as, like, a free speech warrior. He's made enough money through Tesla, through SpaceX, that this really is his plaything. And I think he's - he might be willing to just, like, watch the entire thing come crashing down. We have seen that in the past. I mean, Peter Thiel bought the website Gawker because he didn't like Gawker. 

Dave Bittner: Right, right. 

Ben Yelin: And he just shut it down. I mean, he just destroyed it. 

Dave Bittner: Yeah. 

Ben Yelin: Do I blame the Twitter shareholders for accepting $44 billion? No, I do not. I don't think any of us who were in that position would reject that kind of money. Look, I want a second house with a pool, too. 

Dave Bittner: (Laughter). 

Ben Yelin: I'm not turning that down. But I think we have to hold out for the possibility that, because of his views on content moderation, I mean, this could just be a spiral into a pretty unpleasant, unregulated platform, where you don't know if the people you're following are actually the people they say they are. And even me, a devoted Twitter user, might have to look elsewhere. So it's been a wild ride these few days. It's not going well. The only good thing we've gotten out of this, Dave - and you found one of these articles - is there are so many - how do you say it? - schadenfreude... 

Dave Bittner: Schadenfreude, yeah. 

Ben Yelin: ...Schadenfreude... 

Dave Bittner: Schadenfreude, yeah. 

Ben Yelin: ...Of individuals across the internet just enjoying the spectacle and watching him really trip all over himself in a matter of days. 

Dave Bittner: Yeah, he's the dog who caught the car. I think that's what it is. 

Ben Yelin: That's exactly it, yeah. 

Dave Bittner: Be careful what you ask for. And unfortunately, those of us who enjoy Twitter - who find value in it - you know, for me, Twitter is quite often my - kind of my cybersecurity canary in a coal mine, you know? When a big breach happens, a lot of times that's the first place I'll hear about it. You'll see researchers saying, uh, folks, something's going on. We're getting - we're hearing chatter about this, that or the other thing, and then it'll bubble up and become real news. So I would hate to lose it. I don't know what the alternative is, but I'll tell you, people are looking around. 

Ben Yelin: Yeah. Yeah. I mean, I feel the same way. It makes me legitimately sad because Twitter has been very important to me. I've made real-life friends on Twitter. 

Dave Bittner: Yeah. 

Ben Yelin: And to see just a pretty reckless person purchase it on a whim, and now being in the position where he's the dog who caught the car, kind of ruining something that, you know - it's a community that a lot of us have spent time building up. Yeah, it just - it kind of sucks. 

Dave Bittner: We need to stop idolizing billionaires. Just because they have a lot of money doesn't mean that they're smart. 

Ben Yelin: I think that's been... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Proven over and over again. I saw you tweeted that, and it's just... 

Dave Bittner: Which is not to say that Elon Musk is without skills or anything. He's not a moron. I mean, he has had success in many areas. And I think his particular - I don't know - is it fair to say - can I say neurodiversity in an observational way and not judgmentally... 

Ben Yelin: Sure. 

Dave Bittner: ...That he has a certain way of approaching things? And I think his mindset has served him very well in some of the businesses that he has grown and taken over and that sort of thing. But my sense is that this ain't one of them. 

Ben Yelin: Yeah, I think it's an important lesson that just because you're a genius at some things does not mean you are a genius at everything. We've seen that proven over and over again with these eccentric millionaires and billionaires... 

Dave Bittner: Right. 

Ben Yelin: ...And it's an important lesson for all of us to remember. 

Dave Bittner: Yeah. All right. Well, we got no choice but to keep an eye on that one, right (laughter)? 

Ben Yelin: Ugh, unfortunately. Just sit back and watch the trainwreck. 

Dave Bittner: That's right. 

Dave Bittner: Ben, you recently had the pleasure of speaking with Christina Montgomery. She is the vice president and chief privacy officer at IBM. That's a big job at a big company and a big deal. And what an interesting conversation. So sit back, everybody, and enjoy Ben's conversation with Christina Montgomery. 

Christina Montgomery: IBM is the oldest technology company in the world. We've been around for over a century. And for the course of that century, we have been responsibly managing our clients' most important data and releasing new technologies into the world responsibly and with clear purpose. And we are not - we are an enterprise technology company. We're not a platform company. We're not in the space of social media. That being said, we power - our technology and our services power, essentially, the backbone of the economy. So we support 90% of the world's largest banks, I think 90% of all retail transactions. We support telcos, airline reservations, governments. We're essentially running the backbone of the economy. 

Christina Montgomery: And we were the first company in our industry to appoint a chief privacy officer as well. So we've had a chief privacy officer for 22 years now. So when I say what worries me now - we've been at this for a long time. Data is essential, and the free flow of data is essential to power the digital economy. It's essential to our clients. And I feel the discussions around personal information and privacy and data localization, for example - the splinternet that's coming - are focused on real-world concerns that people have about the growth of a data monetization economy - essentially, businesses making money off of data - which has eroded trust in the economy. And we're rightly looking at how to regulate that. Governments are rightly looking at how to have more control over the data of their citizens. 

Christina Montgomery: But my concern, as an enterprise company that's essentially powering the global economy in many respects, is that those uses that people expect from companies like IBM and like the companies they use on a daily basis when they do things like make an airline reservation or engage in a banking transaction - my concern is that, with the regulatory focus, with the splinternet, those real, necessary, real-world uses of data are going to get caught up in a regulation that's going to hamper the economy. 

Ben Yelin: So how do you strike that balance between a regulatory scheme that does not hamper the economy, but one that takes personal information seriously - where people can have some level of confidence that, when they use a third-party vendor, their information is going to be secure? How do you at IBM, and I guess broadly in the industry, try to strike that balance? 

Christina Montgomery: It all comes down in part to trust and to having privacy and security practices, frankly, in place that protect data in real-world ways and real-world uses. So privacy and security by design, data minimization practices, transparent terms and conditions when using client data, contractual terms in place with not only the clients that IBM supports from a technology perspective, but the suppliers that we use in our own data ecosystem, and a culture of privacy compliance. So the way we are structured at IBM - from a privacy perspective, we've taken this very integrated view of data - that there is risk associated with data use, but there's also obviously great potential associated with data use. 

Christina Montgomery: And so, in the chief privacy office, I work very closely with our chief data office, with our policy teams, with our enterprise and technology security teams, our CISO and the team supporting security on a daily basis. All of us working together underpin that privacy program to give us a holistic view of the data that we use across the IBM enterprise and to build those protections and that privacy and security by design in place across the enterprise. We also have a program - so we take sort of a top-down and bottom-up approach here, recognizing, again, the potential for the use of data while at the same time addressing the risk. We have what we call a privacy advisory committee. That's our senior vice presidents. They sit on top, help us set the risk profile and the like. And what we do in the chief privacy office is essentially give the business units within IBM - the users of data in the company - the guidance that they need, the tooling that they need, the visibility that they need in order to comply with privacy regulations. 

Christina Montgomery: And now, we've gone from the GDPR to over 130 comprehensive privacy regulations around the world. Now, we're a global company. We operate in approximately 170 countries around the world. That's a lot of laws and regulations that we have to comply with. And at the same time, even in jurisdictions where there is no comprehensive data protection regime in place, we always want to ensure that we are maintaining the trust of our clients, right? So it's important to us to have practices in place to protect confidential information. So again, we facilitate compliance by the business units. But we need to give them the ability to ensure that their data is protected, as I said, through tooling, through visibility, through guidance, through education - supporting that culture of privacy compliance. So that's how we execute it here at IBM - a global company that's operating in pretty much every jurisdiction around the globe. 

Ben Yelin: I guess, from a policy perspective, most companies are not like IBM. Some companies don't have chief privacy officers. I think, especially some of the startups don't have the same type of culture of privacy. So from that perspective, how do you see the potential of federal data privacy legislation? Do you think, for your company, it would be beneficial, or do you think that your culture of privacy has advanced past the point to which federal legislation would make a difference? And then how do you see that broadly across the industry? 

Christina Montgomery: Yeah. I mean, we've long been advocates for national privacy legislation. You know, we believe that consumers should have consistent rights across the United States - rights to understand what data is being collected about them, to have visibility into that data, to be able to correct it, to be able to delete it. And so we've been, again, advocating for consistent federal protection. I think a national law is something we absolutely support. Right now, in the absence of a national law, we're going to be faced with five different state laws and rolling those out across our enterprise. 

Ben Yelin: Right. 

Christina Montgomery: So... 

Ben Yelin: And the number is growing of these states that have - yeah. 

Christina Montgomery: And it's growing, and it will continue to grow. So I think national privacy legislation is critically important. That being said, I mean, the two points that have been a bone of contention in the whole dialogue and debate over national privacy legislation - preemption and private rights of action - we don't feel have been addressed sufficiently in the ADPPA. So what we've been doing is sharing our perspective, either through trade associations or directly as IBM. I went in, I think it was July - this summer's sort of a blur - and met with, you know, some of the members of the Hill who've been involved in privacy legislation to share IBM's perspective. I mentioned at the onset of our conversation this concern that I have about overregulating the economy that exists for the benefit of all of us today - that we just kind of take for granted the fact that, when we make an airline reservation, sure, our personal information is collected, but when we make that reservation online and we show up at the airport, it'd better be there. There's necessary information to process a credit card transaction, for example. 

Christina Montgomery: And so I've tried to share IBM's point of view around data and what is necessary versus where we see the real concerns with respect to trust in, essentially, an economy where, you know, data is collected multiple times a day. From the time you wake up in the morning till the time you go to bed at night, you have probably shared personal information with multiple companies. Some of that you do because you want to. Even when you think of something like a restaurant reservation - you know, OpenTable. I used it this past weekend. I was in an area that I wasn't familiar with, and I wanted to know - is there a good restaurant nearby? I share my location, but I do that with my consent, and I get value back as a result. So that's a whole other set of discussions - you know, what should the guardrails be around that? - just sharing our perspective and having those conversations and making sure that we're not overregulating, so that, when a law gets enacted, we're not essentially taking away the ability to use data in a way that consumers have expected companies to use it today. And in large part, I think, it's that expectation. 

Ben Yelin: Yeah, I mean, I do think we take it for granted 'cause we don't think about the implications of all of those transactions. I mean, I want to order a sandwich from Jimmy John's - that - they have saved my credit card information. They've saved my previous orders. They can tell me where my nearest sandwich shop is. All of those things are extremely convenient. So I see that risk of overregulation - that people might conceptually support greater privacy protections. But if those benefits are taken away, I think people haven't really considered the implications of that. Is that how you see it as well? 

Christina Montgomery: In part, but also - so that example is more in line with the data economy. When we think about it - you know, OpenTable exists because people share data with it. It's an app, right? But I'm thinking also in terms of - what do businesses need? What do service providers to the banking industry need in order to ensure that, you know, you can deposit checks, that your deposits are aligned with your account, that I can go into my app and check and make sure my paycheck was deposited on time? I can check my balance on a regular basis. All of that requires personal information. Supporting banking transactions requires personal information. 

Christina Montgomery: And, you know, with the ADPPA, for example, there's a very prescriptive list of how data can be used. So on this point of the prescriptive nature of the ADPPA, there are some uses that companies make of data today that are necessary to power the economy - uses that consumers expect companies to make of data today - that may not be captured when you've got a delineated list of, for example, 17 things that you can use data for. So I think we need to be very careful about addressing the real concerns consumers have while recognizing the fact that data is used to power the economy and to do things you do on a daily basis - things that don't necessarily involve the pure monetization or sharing of your data but that are needed to power the global economy. 

Ben Yelin: Right. Right. Again, it's an interesting balance that Congress is going to have to strike, and I think we're on a pretty compressed timeline on this piece of legislation. I mean, I think it's do or die in the next four months, while this Congress is still in session. 

Ben Yelin: Before I get to AI, do you have thoughts on the prospect of this national data privacy law being enacted this year? 

Christina Montgomery: I can't speculate (laughter). I mean, you've seen the conversations recently around - is it even going to come to the floor for a vote in the House because of the preemption concerns regarding California? 

Ben Yelin: Right. 

Christina Montgomery: So, you know, I think it's been a valuable dialogue and discussion that we've seen Congress engaged in. And if anything, even if it doesn't move forward this session, we've got - we've made progress as a country, I think, in the conversation. 

Ben Yelin: Right - laid some of the groundwork for sure. 

Ben Yelin: To switch gears a little bit, I want to talk about your work on AI ethics. How do you see AI ethics integrating into your work as chief privacy officer and the culture of privacy? What's the connective tissue there? 

Christina Montgomery: You know, the privacy teams have the track record in place. We understand the risk as well as the potential. And we understand governance - I think, in part, that's why. You know, prior to the role that I'm in today, I was the corporate secretary of the company. So I spent, essentially, almost five years speaking with the board of directors, sitting in on board meetings, speaking with investors, thinking about and executing on governance - whether it's from an ESG perspective or the financial reporting that we do. And so I built a governance program formally around privacy, as well as operationalizing the AI ethics board. 

Christina Montgomery: And the way we've established the board - because the issues with AI - again, they don't all have to do with personal information, but they do have to do with data, and they do have to do with that integrated view of data that I mentioned earlier. And they also require a cross-disciplinary, diverse and inclusive approach. And so the board that we built at IBM has representatives from across the company - every business function. It has data science backgrounds, legal backgrounds, communications, product expertise, sales expertise, research expertise, etc. Our chief diversity and inclusion officer sits on the board. So we built a board that brings perspectives from all across the company. 

Christina Montgomery: And we built a program, essentially, that not only has that tone from the top - so it's the same privacy advisory committee that has that oversight that we use in our privacy governance - but also, at the same time, a very robust network of AI focal points. From a formal roles perspective, there are focal points in each business that are sort of our eyes and ears and consultants to the business on the ways that IBM would like to see trust and transparency - our principles - built into the ethical use of data. But there's also a network of volunteers, because we've got a lot of people in IBM who are very passionate about the topic of AI ethics and are doing some great work individually, in their spare time or in their day jobs - right? - around the ethical governance of AI, ethical technology. So we've got researchers who built some of the very first tools to detect bias, to impart explainability in AI, to quantify the accuracy of AI models and the like. So we wanted to tap into that network. So people can volunteer to participate in projects that the board runs as well. 

Ben Yelin: So given your experience and your leadership here, if you were speaking to the C-suite of either a Fortune 500 company or perhaps even a smaller company, what would be your advice - kind of your tangible advice as they build out their AI? What are some of the pitfalls of AI that they should be cognizant of, and how would you recommend that they approach those pitfalls? 

Christina Montgomery: So I think every user, every developer, every deployer who's using AI needs to have principles in place that underpin the decisions that you make - not just with respect to the technology that you're deploying or using, but whether you are going to deploy and use technology to solve for something in the first place. So at IBM, we have what we call principles of trust and transparency, and they're fundamentally that AI should augment, not replace, human intelligence - that data belongs to its creator, which is essentially our clients. That goes back to the business model point. We don't take client data and populate AI models without their consent, without their visibility. It's their data. And that new technologies, including AI, should be transparent and explainable. And we've built pillars around that. They're very consistent: transparent, explainable, fair, secure and robust, and privacy-preserving. 

Christina Montgomery: So I think, if you start from that - if you start with a set of values - then you can operationally build out a culture of ethical compliance with respect to AI. So we make decisions as a board and as a company - COVID-19 is probably a good example. Lots of people throughout the world - lots of people within our own company - were thinking about how technology could be used to help address the pandemic. And this is going back, you know - I can't believe it's been almost three years... 

Ben Yelin: Two and a half years, yeah. 

Christina Montgomery: (Laughter) And we had to make some decisions at the time about not just what we were capable of doing from a technology perspective, but whether we would deploy technology and whether we would use data to - maybe it could address some problems. But what were we willing to sacrifice as a company, including things like - would you build something for a client that could enforce quarantine restrictions by tracking whether somebody is in their home? We looked at vaccine passports and what we were willing to do in that space and not willing to do in that space, all with a lens toward privacy and security, but also toward those principles of trust and transparency that I mentioned earlier. 

Christina Montgomery: So we think every company needs a set of values and a set of principles. And what we try to do at IBM is take those principles, make decisions based on them, and operationalize them across our company in a way that works for our company and the way it's structured. It may not work the same way for every company. We're large and multinational, with a very diverse business - but an enterprise business. So how we operationalize ethics may be very different from how, you know, a smaller company based in just one country may operationalize ethics. 

Ben Yelin: Sure. There's been talk, I know at the federal level, about AI or algorithmic transparency, where regulators - whether it's the FTC or somebody else - would get a look under the hood to make sure that algorithms and AI technology is compliant within whatever parameters Congress sets up. Do you have thoughts on that? Do you think that would be an overregulation or something that's a good idea? 

Christina Montgomery: So we've actually been advocating for the precision regulation of artificial intelligence for, again, almost three years now. We have a policy lab at IBM, and we try to offer up concrete and actionable policy recommendations - so not just talking about what our principles are, but demonstrating how you can translate those into practices and how government should potentially look at them. So we believe that regulation of AI ought to be risk- and context-based. And I do think that to say you're going to open up every algorithm - regardless of the implications, or the use in a very low-risk use case, for example - that is overregulation. 

Ben Yelin: Right, right. 

Christina Montgomery: So, you know, we've been pointing to - is regulation needed? Yes, in high-risk cases, on a context basis. The EU AI Act, for example, is consistent with our approach - our recommended approach in terms of precision regulation of AI. 

Dave Bittner: All right. Interesting stuff, Ben. I mean, for me - first of all, great interview. 

Ben Yelin: Thank you. 

Dave Bittner: And Christina is really a great guest. A couple of takeaways for me - one is that this really helped me put into perspective the scale at which a company like IBM operates when it comes to things like privacy and security. I think it's very easy to think about the small- to medium-sized businesses, and we should think about those businesses. But when you're a company like IBM, and you are truly global, and you have a long, storied history, the scale of the problem is different than it is for a lot of those smaller companies. And also, your organization's ability to have influence on a global level, on a policy level, you know, is immense. 

Ben Yelin: Right. The changes you make at IBM are not going to be limited to IBM because other companies are looking at you as a role model. And thankfully, they have somebody like Christina Montgomery. I mean, it was great talking with her because she clearly knows what she's talking about and has, I think, a really interesting vision on the future of privacy, the future of AI ethics. So... 

Dave Bittner: Yeah. Really good explainer, too, so that - you know, I appreciate that. 

Ben Yelin: Yes, absolutely. 

Dave Bittner: Yeah. The other point that was particularly interesting to me was just something we've touched on here before many times, which is trying to thread that needle between necessary regulation, but also not undue shackles on organizations trying to do business. And that's hard. 

Ben Yelin: It's a hard balance to strike. 

Dave Bittner: Yeah. 

Ben Yelin: I think we're 20, 30 years into this grand internet experiment, and I don't think we've necessarily been able to strike that balance perfectly. So I just give kudos to anybody who's tried. 

Dave Bittner: Yeah, absolutely. 

Dave Bittner: All right. Well, our thanks to Christina Montgomery for joining us. Again, she's vice president and chief privacy officer at IBM. We do appreciate her taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.