Caveat 4.7.21
Ep 72 | 4.7.21

Regulation of the Internet is evolving.

Transcript

Konstantinos Komaitis: We are seeing an increase in regulation of the internet, and some of that regulation is justified; some of it is not.

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben describes a class-action suit against Google alleging the sale of personal data. I've got the story of a prison surveillance company keeping tabs on people outside the prison walls. And later in the show, Konstantinos Komaitis from the nonprofit Internet Society on the unintended consequences of uninformed online regulatory policies. 

Dave Bittner: While the show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. We've got some good stories to share this week. Why don't you kick things off for us? 

Ben Yelin: Yes. So, of course, both of our stories come from the same source - Motherboard from Vice. 

Dave Bittner: (Laughter). 

Ben Yelin: Mine is by long-lost friend of the pod Joseph Cox. 

Dave Bittner: Right. 

Ben Yelin: Another one-way friendship. 

Dave Bittner: Yeah, we really need to send him a fruit basket. 

Ben Yelin: I know. 

Dave Bittner: (Laughter). 

Ben Yelin: Christmas-slash-Hanukkah purchases next year for Mr. Cox... 

Dave Bittner: There you go. 

Ben Yelin: ...And professor Kerr. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: Going to run up quite a tab. But this article is about a class-action lawsuit filed by a couple of academic types on behalf of themselves and millions of other people. And the lawsuit alleges that Google is selling users' data contrary to the way that they present themselves in public. So the lawsuit centers around what is known as real-time bidding data. So companies place bids to win advertising space on both, you know, google.com and every website that uses Google's advertising services. So it's expensive and valuable to get that space during people's web browsing sessions. There is sort of an auction format. There's this bidding process where hundreds of companies, if not more, will bid to get that advertising space. 

Ben Yelin: There are a couple of issues with this as it relates to personal data. Even if you don't win the bid, you're still going to get access to what they refer to as a dossier of data on the individual that you're trying to gain access to. So you can join the bidding process without the intention of actually winning just to obtain extremely valuable and potentially personal information on people so that you can target them for future advertising. 

Dave Bittner: Right. Let me just pause for a second, back up and sort of explain it the way I understand it, just for point of clarity, that basically, Google has this service where they reach out to these advertisers, and they say, hey, I got a slot on this website, and the person viewing this website has all of these attributes. 

Ben Yelin: Right. 

Dave Bittner: They live in this country. They make this amount of money. They have these religious affiliations. They have these medical conditions. They may be married. They may be divorced. There are thousands of - I believe Google calls them verticals - that they slot you into. And they send all that information about the potential person to put the ad in front of to the person who's bidding, to the organization that's bidding, and they say, this is what we got; are you interested? And the bidders respond and say, ooh, yeah, that's exactly who I want to put my message in front of. And they make a bid. And all this happens in milliseconds. It's all automated. 

Ben Yelin: Right. There's that little guy behind the screen constantly who's like... 

Dave Bittner: (Laughter) Right. Yeah. 

Ben Yelin: ...Ooh, I really want this Dave Bittner guy. Yeah. 

Dave Bittner: No, right. It's not like the stock market or something, you know, I mean, where you see a bunch of people on the floor (laughter), even though, I mean, that's mostly automated these days, too. So it happens in the blink of an eye, literally. But the point that they're making here is that I think the direction of the flow of information - it's not like the people who are looking to place the ads are saying, hey, Google, I would really like to put this in front of a person with these attributes, and then Google says, oh, yeah, I got one of those, and then that's how it happens. No, Google goes out with the attributes and says, hey, I've got a person, and they've got all these attributes, including their IP address, right? 

Ben Yelin: Yeah. 

Dave Bittner: Location information. And that's how it works. So continue, Ben (laughter). 

Ben Yelin: So, yeah, that gives you an idea of how all of these companies are collecting what they call bidstream data. And the number of companies collecting this data, it's potentially hundreds or thousands because, like I said, there are companies that join these bidding wars with no intention of winning; they just want to get a bite out of this apple. 

Dave Bittner: Right. They're vacuuming up that user information, that surveillance information. 

Ben Yelin: Exactly. So maybe one of the big companies is going to win the actual advertising slot, but another company might enter and then obtain your data, your IP address, your location information, and maybe sell that to, you know, a local law enforcement department in your area or the federal government, the Department of Defense. So there's certainly the potential here for abuse, and that's the basis of this lawsuit. It's being filed in the Northern District of California, so it's a federal court. And it's a class-action lawsuit, meaning you're going to have to have a class of plaintiffs that's similarly situated. 
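
To make that flow concrete, here is a minimal sketch, in Python, of the general real-time bidding pattern described above: the exchange broadcasts a profile of the user to every bidder, and every bidder receives that profile whether or not it wins. The names, fields and bidder strategies here are illustrative assumptions for the sketch, not Google's actual API or data format.

```python
from dataclasses import dataclass

# Hypothetical, simplified model of the real-time bidding flow described above.
# Names and fields are illustrative only; this is not Google's actual API.

@dataclass
class BidRequest:
    """The 'dossier' the exchange broadcasts to every bidder."""
    ad_slot: str          # the impression up for auction on a publisher page
    ip_address: str       # identity/location signal included in the request
    location: str
    verticals: list       # interest categories the user has been slotted into

@dataclass
class Bid:
    bidder: str
    amount: float         # price offered for the impression


def run_auction(request: BidRequest, bidders: dict) -> Bid:
    """Broadcast the request to all bidders and pick the highest bid.

    The privacy point from the article: every bidder receives the full
    BidRequest, whether or not it ends up winning the auction.
    """
    bids = []
    for name, strategy in bidders.items():
        # Each bidder sees the user's attributes before deciding what to bid.
        # A bidder can store `request` here even with no intention of winning.
        bids.append(Bid(bidder=name, amount=strategy(request)))
    return max(bids, key=lambda b: b.amount)


# Example bidders: one genuinely competing, one only harvesting bidstream data.
bidders = {
    "brand_advertiser": lambda req: 2.50 if "outdoor gear" in req.verticals else 0.10,
    "data_harvester": lambda req: 0.0,  # never wins, but still received the dossier
}

request = BidRequest(
    ad_slot="example-publisher.com/banner-1",
    ip_address="203.0.113.7",
    location="Baltimore, MD",
    verticals=["outdoor gear", "podcast listener"],
)

winner = run_auction(request, bidders)
print(f"Winning bid: {winner.bidder} at ${winner.amount:.2f}")
```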

Ben Yelin: And this is sort of your perfect class-action case because we know this is automated and we know that we're talking about a universe of millions of people who have these little dossiers compiled about them and sent to these companies, so the facts are really identical as it relates to all plaintiffs. Usually when you're talking about class-action suits, the big first hurdle is, can you find enough plaintiffs that are similarly situated? 'Cause companies have been successful in lobbying Congress and the courts to make it very hard to file class-action suits. So this case has an advantage, I think, over other class-action suits. 

Ben Yelin: In terms of the merits, they are alleging violations of a bunch of different statutes, including California's unfair commercial practices statute, the implied covenant of good faith and fair dealing, a potential invasion of privacy under the California Constitution, breach of confidence - all of these things that imply, basically, fraud on the part of Google because they present themselves as we do not sell your data. 

Dave Bittner: Right. 

Ben Yelin: And if this - if the facts in this article are to be believed, that is a misrepresentation. Another word for it would be lying. 

Dave Bittner: (Laughter) I was going to say, dial down your lawyerly words, Ben. It's a lie. It's a lie. 

Ben Yelin: Yes. Luckily - you know, you have to be very careful using that word in a court of law, but I think on a podcast... 

Dave Bittner: Right (laughter). 

Ben Yelin: ...We don't have to bleep that out. 

Dave Bittner: (Laughter) OK. I don't know - gobsmacking, maybe, is the word for it - if what they're alleging is true. I mean, Google makes the point many, many times throughout the documents that you are supposed to read - that, of course, nobody does... 

Ben Yelin: Yeah. 

Dave Bittner: ...But they say front and center, we do not sell your personal information to anyone. And according to this lawsuit, that's just not true. 

Ben Yelin: Yeah. And I should also mention, this is not the first time we've seen these allegations. The senator we bring up most on this podcast, Mr. Ron Wyden of Oregon, sent a letter to the FTC last summer basically alleging the same thing. And I'm quoting here - "few Americans realize that companies are siphoning off and storing bidstream data to compile exhaustive dossiers about them, including web browsing, location and other data, which are then sold by data brokers to hedge funds, political campaigns and even to the government without court orders." So that's sort of the breadth of the potential danger here. 

Ben Yelin: But it's one thing for a senator to send a letter to the FTC - you know, obviously, that can have a pretty profound effect. When you start talking about a class-action lawsuit, though, that might actually be something that would get Google to change its behavior. And this article notes that the law firm representing the plaintiffs here has actually had success in winning settlements from big companies. So it's possible all of us are going to get, you know, $3 in our bank account... 

Dave Bittner: (Laughter). 

Ben Yelin: ...In three or four years... 

Dave Bittner: Well, yeah. 

Ben Yelin: ...For the data that's been collected. 

Dave Bittner: Oh, goody. And the lawyers will buy a new vacation home (laughter). 

Ben Yelin: So, I mean, those vacation homes aren't going to purchase themselves. 

Dave Bittner: (Laughter) Yeah, that's right. That's right. Well, explain some of the details here. Because as a class action suit, it is not a criminal allegation, right? 

Ben Yelin: No, it is a civil case. Yup. 

Dave Bittner: OK. So what they're after here is both money and could they also be after an agreement from Google to change their behavior? 

Ben Yelin: Yeah. So they're looking for both potential money - damages - and also perhaps an injunction, which would force Google to stop engaging in this behavior. The original brief that was posted as part of this article is actually more of a preliminary brief that's simply asking for a jury trial. And not to get too much into the weeds here, the Seventh Amendment of our Constitution guarantees a jury trial in civil cases, but that has not been incorporated against the states. So there's no guarantee of a jury trial in most states on civil matters. 

Ben Yelin: And, you know, I think the reason they're looking for a jury trial here is this is something that could be very sympathetic to the six laypeople who would be considering a civil case on this matter... 

Dave Bittner: Yeah. 

Ben Yelin: ...Because those six individuals very likely are Google users, and they themselves have probably been a victim of this data being siphoned off. Now, knowing that, that could potentially - you know, if the court grants this motion for a jury trial, that might motivate Google to say, you know, our chances aren't very good in the courtroom. Let's settle this. 

Ben Yelin: Usually, I would say the settlement would come with an injunction just because damages might not be adequate relief for consumers when you divide it - when you divide the damages by the millions of people who use Google. So I think an injunction forcing Google to stop engaging in this behavior might be the end goal here on behalf of the plaintiffs. 

Dave Bittner: I guess big picture, too, I mean, this points to this larger trend that we're seeing where folks are going after these large companies that are vacuuming up all this information and trying to get some relief here. 

Ben Yelin: Yeah. I mean, it is kind of interesting. We've talked about a number of these cases, but even in the last, like, three or four months, we've seen a lot of lawsuits filed against our biggest tech companies for privacy-related issues. And I'm just kind of curious how many of these cases are going to break through because certainly it won't be all of them. So, you know, I'm just curious. And this seems like one of the more compelling cases to me, as you said. So I think this is certainly one that we're going to be watching out for. 

Dave Bittner: Yeah. All right. Well, my story this week also comes from Motherboard over on the Vice website. This is written by Aaron Gordon, and it's titled "Prison Mail Surveillance Company Keeps Tabs On Those On The Outside, Too." Let's say you have a friend or a loved one who is behind bars, who's doing time. And one of the ways you keep in touch with that person is writing letters. And this is a method, I suppose, as old as people being in prison. 

Ben Yelin: Yup. 

Dave Bittner: It's sort of a standard, understood, accepted sort of process that those letters get read by the folks who are guarding the prisoners to make sure that you're not planning a prison break or (laughter) doing something, you know... 

Ben Yelin: Yup. 

Dave Bittner: ...Doing something you shouldn't be doing. So that's an... 

Ben Yelin: You got to speak in code when you write those letters. 

Dave Bittner: Right. It's an accepted thing. 

Ben Yelin: Yup. 

Dave Bittner: What this article is about is a company called Smart Communications that's been offering up a service for prisons where they will ingest all of the mail coming to prisoners, they will scan it and then they will make those scans available to the prisoners to read. And there are a number of benefits to this. Among them, they say this could cut down on contraband making it to prisoners. Evidently - it's sort of funny - they mention in this article a story about folks trying to, I guess, embed the paper with cannabis... 

Ben Yelin: Yeah. 

Dave Bittner: ...You know, like THC and things like that. It seems like maybe that story has been debunked, so there might not be something to that. But it's the kind of thing that you could see - if someone were able to do that, OK, not having the actual paper get to them might have some value. But the other thing that this company is claiming to do - and I should mention in this article, they got their hands on a proposal from Smart Communications to - I believe it was the Virginia Department of Corrections. 

Ben Yelin: Yes. 

Dave Bittner: And they would scan all of this information but also analyze it, do, you know, OCR on it and put everything in a database. So for every letter that gets written - you know, who wrote it, where it came from, how frequently it was sent, the contents of it - all of that is now searchable. And Smart Communications makes the case that this allows for cross-referencing of things. You can establish associations, perhaps gang activity, things like that through the use of this database. 
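
As a rough illustration of the kind of pipeline being described - scan the letter, run OCR on it, store the text in a searchable database, then cross-reference senders - here is a hedged sketch in Python using an in-memory SQLite table. The schema, function names and example data are assumptions made for the illustration; the article does not describe Smart Communications' actual implementation.

```python
import sqlite3

# Hypothetical sketch of a mail-scanning pipeline: OCR'd letters go into a
# searchable table keyed by sender, recipient, and date, so correspondence can
# be cross-referenced later. This is NOT Smart Communications' system; the
# schema and logic are illustrative assumptions.

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE letters (
        sender     TEXT,   -- person outside the prison
        recipient  TEXT,   -- incarcerated person
        sent_date  TEXT,
        body_text  TEXT    -- output of an OCR step on the scanned page
    )
""")

def ingest_letter(sender: str, recipient: str, sent_date: str, ocr_text: str):
    """Store one scanned, OCR'd letter."""
    conn.execute(
        "INSERT INTO letters VALUES (?, ?, ?, ?)",
        (sender, recipient, sent_date, ocr_text),
    )

def senders_in_common(inmate_a: str, inmate_b: str):
    """Cross-reference: outside correspondents who wrote to both inmates."""
    rows = conn.execute("""
        SELECT DISTINCT a.sender FROM letters a
        JOIN letters b ON a.sender = b.sender
        WHERE a.recipient = ? AND b.recipient = ?
    """, (inmate_a, inmate_b))
    return [r[0] for r in rows]

# Example usage with made-up data:
ingest_letter("Alex Doe", "Inmate 1001", "2021-03-01", "Hope you are well...")
ingest_letter("Alex Doe", "Inmate 2002", "2021-03-05", "Thinking of you...")
print(senders_in_common("Inmate 1001", "Inmate 2002"))  # -> ['Alex Doe']
```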

Dave Bittner: Now, on the flipside of this, folks are pushing back and saying, you know, if you're someone who has committed a crime, you're in prison and you are in good faith trying to pay back your debt to society, you know, one of the things that could be a strong encouragement for you to stay on the right path might be these interactions with your loved ones. You know, imagine a crayon drawing from a child... 

Ben Yelin: Right. 

Dave Bittner: ...You know, something like that. To actually have that in your hands - artwork made by your child's own hands - that could have a lot of meaning to somebody... 

Ben Yelin: Right. 

Dave Bittner: ...Much more than viewing a scan of it on a computer screen. And so the case is being made that denying prisoners that might not be fair to them, to their rights as a prisoner. But beyond that, I think the bigger issue here is building this big database of who's communicating with whom and of people who are not in prison - right? - so people who are on the outside merely sending a letter through the U.S. Postal Service. 

Ben Yelin: Yeah. We expect that people who are incarcerated who have been sentenced are going to lose some of their rights. 

Dave Bittner: Right. 

Ben Yelin: That's kind of the definition of being put behind bars. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: But we don't expect that innocent people who just happen to be communicating with people who are incarcerated will be deprived of those rights. And I think that's the concern here. I mean, it is, as you say, really two separate issues. You have what I think is the more sensitive and compelling issue to me - the lack of autonomy, privacy, expression and associational rights - and those are the words of a representative of the Electronic Frontier Foundation - among people who want to communicate with loved ones. They are losing that privilege. And it could be for things that are sentimental, like that crayon drawing sent from a child. You do really... 

Dave Bittner: Yeah. 

Ben Yelin: ...Lose that connection. And then there's the fact that innocent people are being caught up in this new dragnet surveillance tool, which is potentially collecting, you know, rather personal information about people who don't think that their personal information is being collected. They think they're sending a letter. 

Ben Yelin: So you really have those two separate issues here. I think one of the problems in this case and in many other cases is the organizations themselves, Smart Communications in this instance, are so tight-lipped about how this actually works. They will refuse to comment for articles like this. They'll note, you know, this is a proprietary arrangement with the Department of Corrections. You know, and I think that creates more questions. When you seem like you're trying to conceal so much about how a program works, I think that naturally leads people to become a little bit more suspicious. You know, saying, this is confidential; this is a trade secret - that almost casts more of a shadow on why and how this technology is being used. 

Dave Bittner: Yeah. The article points out that the Virginia Department of Corrections opted not to go with this company, but they did track down some other correctional facilities, organizations and counties who had gone with them, with mixed results. Some of them, I think, expressed frustration that it just didn't go very well. I'm sure they have other clients who are probably quite pleased with what they offer. I don't know what to make of this. On the one hand, I can see the value in it. But like you say, it's - I guess it's that slippery slope of yet another place for data to be, you know, gathered up and monetized. 

Ben Yelin: Right. This is not any sort of legal analysis. It just kind of strikes me the wrong way. 

Dave Bittner: Yeah. 

Ben Yelin: You know, you have people who are trying to have intimate communications. Then you have this middleman coming in, introducing a management console and uploading files as a .PDF that people can view on devices. I mean, I almost feel like that defeats the purpose of a lot of mail in the first place, that sort of personalized aspect of it that really was written by the person who signed the letter. So it just - it strikes me... 

Dave Bittner: Yeah. 

Ben Yelin: ...The wrong way in that regard. And, you know, this is a company, at least according to the filings that have been shared with Motherboard here, that's going to make millions of dollars based on this type of arrangement. I mean, this is a profit-seeking opportunity. Of course, obviously it is. Otherwise they wouldn't do it. But that's just another reason... 

Dave Bittner: Yeah. 

Ben Yelin: ...Why it kind of rubs me the wrong way. 

Dave Bittner: Yeah. Well, and it's - they do note that they charge the people who are sending the mail. They say they charge 50 cents per message and a dollar per photo for the people who are sending it. So the prisoners don't have to pay that, but the loved ones who are sending it do. And, you know, that could be an undue financial burden on people who may not be in a position to pay that. You know, the price of a stamp is one thing, but to limit the number of photos you send along because they're a buck a pop is just another - I guess another little bit of insult to injury, if you will. 

Ben Yelin: Yeah. I mean, it also kind of seems like an equity problem to me. There shouldn't be an added cost on what I think should be a basic right for people outside the prison walls to communicate with inmates. 

Dave Bittner: Right. 

Ben Yelin: And you are just adding this extra 50 cents to a dollar on top of what you already have to pay for a stamp. 

Dave Bittner: Yeah. All right. Well, again, it's from Motherboard, over on the Vice site there. And we'll have links to both of those stories in the show notes. 

Dave Bittner: We would love to hear from you. If you have a question for us, you can call in and leave a message. The number is 410-618-3720. You can also send us email. It's caveat@thecyberwire.com. 

Dave Bittner: And I recently had the pleasure of speaking with Konstantinos Komaitis. He's from a nonprofit called The Internet Society. And our discussion focused on unintended consequences of online regulatory policies. Here's my conversation with Konstantinos Komaitis. 

Konstantinos Komaitis: I think the conversation regarding regulation of the internet has changed dramatically over the past few years, and I would predict that it's going to continue changing over the next few years. What I mean by that is that, you know, 25, 30 years ago when the internet emerged as a technology with great potential, if you remember, regulation or any discussion of regulation was a little bit of an anathema. We have John Perry Barlow's "A Declaration of the Independence of Cyberspace," of course, because the internet was seen as a tool that is somewhat untouchable by all the bad and evils of the world. 

Konstantinos Komaitis: Over the years, because the internet evolved, societies evolved, the nation-state has changed and a lot of geopolitical shifts have also taken place, we are seeing an increase in regulation of the internet. And some of that regulation is justified; some of it is not. But it all comes down to this whole idea that the nation-state actually starts getting more involved in the internet. I think that is the main difference in comparison to how things used to be. In the beginning, if you want, governments were not really understanding a lot about the internet, but there have been some crucial events - the Arab Spring being one of them, the Snowden revelations being another one - where it has been quite a shock to the system for many of the governments because they were able to understand the potential and the power that the internet holds. 

Konstantinos Komaitis: So of course, as it always happens - and you know, it's pretty standard procedure, if you wish - they said, I want a piece of that pie as well. 

Dave Bittner: I hear people talk about this notion of a splinternet, you know, where - talk about things like the Great Firewall of China and some of the restrictions Russia places on the internet. And indeed, even - as we're seeing right now, the restrictions in Myanmar where access is limited, does that threaten the global nature of the internet itself? I mean - that countries are taking these sorts of actions? 

Konstantinos Komaitis: Absolutely. And I think what's interesting to note here is that we have different levels of splinternet, if you wish. In the beginning, again, we were focusing on or we were thinking that the Great Firewall of China is a way to splinter the internet. Then came Russia, which introduced somewhat more draconian measures when it came to the way information was supposed to flow. I'm not sure whether you remember, a few years back, this whole LinkedIn saga and then the saga with Telegram - you know, the data localization practices, or better yet policies, that Russia wanted to impose. 

Konstantinos Komaitis: But recently, I think that, again, the splinternet is taking on a completely different dimension. And even though it's not necessarily intentional, at least the way we see it coming out of China or Russia, it still has the same effect. Regulation that is coming out of Europe could potentially become part of the problem vis-a-vis splintering the internet. The GDPR, for instance, was a very good and very well-intentioned regulation. The spirit of the law was absolutely great. It talked about privacy, talked about the ability of users to control their data. Who doesn't want that? We have been having this conversation for many years. However, when we went into the implementation and enforcement details, we saw some issues emerging. And one of those issues was the potential fragmentation that it caused in the internet simply because some websites decided not to comply with the GDPR, which meant that they're no longer accessible in Europe. 

Konstantinos Komaitis: So what I'm trying to say is that we should start thinking of the potential of a much less global internet, which is actually the case not only at the extremes when a government shuts down the internet. You've mentioned Myanmar. In India, we had exactly the same. In parts of Africa, we are having exactly the same - or when China or Russia are doing something that is contrary to the way we understand government structures. But we also need to start thinking of those well-intended regulations that fail, perhaps, to understand the technology and the architecture of the internet and unintentionally create those consequences that could lead to a much less global, interoperable and open internet. 

Dave Bittner: You make a fascinating point, which is this notion that the internet is not a business sector - that it's closer to an ecosystem or even a built environment. Can you take us through that line of thinking? 

Konstantinos Komaitis: Sure. So the internet - we keep on saying and we keep on repeating that the internet is a complex system, and we don't go through the process of trying to explain exactly what that means. When we talk about an ecosystem, we talk about a system that is constantly evolving. And the internet, in many, many ways, is very similar. So it is difficult to understand the challenges of the internet unless you really take a step back and look at the internet in a much more holistic way and you stop treating it as a monolith. And I think that this is where the disconnect, if you want, with the nation-state or government sort of occurs. 

Konstantinos Komaitis: I read this fascinating book by James C. Scott, "Seeing Like a State." And there he argues that the key role of states is to make more of life legible, essentially to better record and measure human affairs - like, for instance, taxation - which then makes it easier to manage those affairs, right? But he goes on to say that we strive for legible or readable structures that can easily be understood, oftentimes with one fatal flaw. And that is, in the top-down drive to simplify and formalize our understanding of complex systems, we sometimes disregard the local and practical knowledge that is critical to managing this complexity. 

Konstantinos Komaitis: And he offers this great example of scientific forestry - right? - where essentially he says that in the state's effort to change the structure of forests in order to get the best out of them, it essentially destroyed the forest. But the implications and the consequences were only visible many, many, many years later. 

Konstantinos Komaitis: So now think of the internet. It is a decentralized network of networks. It is more complex and fast-moving than we can fully appreciate. So any attempts to engineer it by applying notions of sovereignty that we see popping up a lot or regulation - right? - are bound to cause unintended consequences, and they risk killing the very things that make the internet valuable. So all in all to say that there needs to be a much better understanding of what we mean with the term internet because one of the things that really frustrates me is that we all use the term, but I'm not sure we all have the same understanding. 

Dave Bittner: It's a fascinating idea, and I'm thinking about something in terms of, like, climate science. If I am a nation who's doing my best to reduce my impact on the climate but the nation next door is, you know, opening a new coal-fired power plant every day - well, we share the same climate, you know? Like, we - and the borders are kind of meaningless if that, you know, soot comes over the border via the air. And I wonder if that metaphor applies to what we're talking about here with the internet. 

Konstantinos Komaitis: It's funny you've mentioned the climate because for the past couple of years, I myself believe that there are some lessons to learn from environmental governance. Again, if you think about the environment 25, 30 years ago, nobody was really caring - right? - caring to the point that we see now organizations and businesses and civil society really caring and taking a stance on the environment. And suddenly we came face to face with a situation that was almost irreversible. 

Konstantinos Komaitis: So we all came together. Collaboration became key. A clear understanding of what climate change and the environment mean also became key. And then regulations became a reality, and we actually started conducting impact assessments on how each specific regulation, at any given time, would affect the environment. So in this context, I think that environmental governance offers us a cautionary tale. Again, we are at a stage where there is ample regulation, and we know that it's slowly chipping away at the values and the benefits of the internet. 

Konstantinos Komaitis: One of the things that I was told by some engineers - and it has really, really stuck with me - was that if the internet is going to die, it's not going to die of one cut. It's going to die of a thousand cuts. And this is where we are headed in some ways, right? There is this wave of regulation that is constantly happening around the world. And in many - in some ways, there is even a regulatory competition amongst governments. If this trend continues without us reaching consensus - right? - on how best to collaborate and under what principles and under what values we want to collaborate, things might be irreversible. 

Konstantinos Komaitis: So it is important as we move forward - because this regulatory activity is not going to end, of course. As we move forward, I think it becomes more and more crucial to start calling for and demanding that those impact assessments take place within internet regulation because I really do not want us to find ourselves in the position where we are told, you know what? The situation is a little bit irreversible, and you might need to live with an internet that is less global, less open, less interoperable and where less innovation is happening. 

Dave Bittner: I mean, do you suppose we need some sort of international effort, you know, a Geneva Convention for the internet? 

Konstantinos Komaitis: That's a very interesting question, and there has been quite a bit of discussion about that over the years. I think that one problem with moving too fast with any sort of international agreement is that we are not, amongst ourselves, very much aware of what we want to protect - what the principles and the core properties, if you want, of the internet are that we want to protect. Right? That is the first thing. The second thing is that international agreements have the tendency of being extremely bureaucratic and extremely slow. And by the time we get to that agreement, we don't know what the internet might look like. And then there is also a third point, which I think matters most for a technology like the internet: it was not the creation of one person or one entity or one country. The internet is an outcome of collaboration. 

Konstantinos Komaitis: And unfortunately, if we start looking at international law and the places where international law is happening, those are very much reserved for one group, which is governments. And that's OK. But it really raises the question of whether those are the appropriate fora - whether decisions about a technology where knowledge and know-how are spread across a diverse set of stakeholders should happen within monolithic institutions that only reserve places for governments. 

Dave Bittner: What would your message be to legislators here in the United States who are trying to take on some of these issues? 

Konstantinos Komaitis: Understand the internet. And if you don't, ask around. There are a lot of people that know and would be able to explain it to you. I think it's very important for governments to find their starting point. Who do they want to be when it comes to the internet? Do they want to be a government that upholds an open, interoperable global internet? If that is the case, then they really need to understand that those things are not a given, and they should not be taken for granted. And therefore, a very good starting point, in order to be able to demonstrate your commitment to those things, would be to actually do an impact assessment and see, A, how your regulation does or does not affect the architecture of the internet and, B, whether it is as focused as you think it is. And does it address the problem that you want it to address? Because one of the things that we are, again, seeing over and over again with recent regulation is that we are trying to address societal problems through technical fixes. And that rarely works, as we all know. 

Konstantinos Komaitis: So we need to understand that the internet is a very particular and fascinating technology that is based on some very, very robust and fundamental properties. And once you start understanding that and you take steps to protect them, then regulation even becomes easier because you are able to focus. And you are able to say, OK, I still have the technology, so now this is my problem. And I can squarely focus on that - on solving that because my regulation has not affected the technology by creating unintended consequences. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: So very interesting conversation. I think there are unintended consequences of well-intentioned regulation. You know, one thing that came across in the conversation is that the internet is still relatively new, and it kind of attained this mystical status for the first 20 or so years of its existence - that we're dealing with such a vast amount of information that can travel instantly. You know, we're granting access to information for the first time to billions of people. This is awesome. Like, why would we do anything to try and curtail this in any way? 

Dave Bittner: Right. Right. 

Ben Yelin: And now, over the last five to 10 years, we're seeing, OK, now we're starting to discover the excesses of this, whether it's consequences to people's personal privacy or what we've seen with Russian disinformation campaigns. 

Dave Bittner: Right (laughter). It could be the end of democracy. 

Ben Yelin: Yeah - China stealing trade secrets. 

Dave Bittner: Right. 

Ben Yelin: So I think what he was getting at is we just have to strike a balance - not regulation for regulation's sake, still trying to preserve that original free spirit of the internet, but doing so in a way that helps us avoid some of these more negative consequences. 

Dave Bittner: I mean, it's such a tough balance to strike, though. There's a natural tension there that I suppose, in the end, is probably a good thing. 

Ben Yelin: Yeah. I mean, I think that tension is a good thing. I don't think we should just be rubber-stamping any proposed regulation of the internet. And I think that's why we have these institutional bodies. It was the European Union and its institutions that developed GDPR, and our Congress that has been slow but has eventually come up with ways of regulating the excesses of the internet. And so - yeah, it is always about striking those balances. 

Dave Bittner: Well, again, our thanks to Konstantinos Komaitis from the Internet Society for joining us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.