CyberWire Live - Q2 2020 Cybersecurity Analyst Call
There is so much cyber news that, once in a while, all cybersecurity leaders and network defenders should stop, take a deep breath and consider exactly which developments were the most important. Join Rick Howard, the CyberWire’s Chief Analyst, and our team of experts for an insightful discussion about the events of the last 90 days that will materially impact your career, the organizations you’re responsible for, and the daily lives of people all over the world.
Transcript:
Rick Howard: All right, boys and girls. I think we can get this thing started. Welcome, everybody, to the CyberWire's quarterly analyst call. My name is Rick Howard. I'm the chief security officer here at the CyberWire. I also host a podcast called "CSO Perspectives," and I'm also going to facilitate this discussion for this webinar. One programming note, though - we were originally slated to have Dave Bittner, one of our producers and host of, jeez, a hundred podcasts for the CyberWire. But he got a sports injury this weekend, so he's laid up in bed. Now, don't worry - nothing serious. He's going to be fine. He's just happy that he can claim a sports injury for missing work.
Rick Howard: So I had to reach out to get a replacement. So I found my best friend and colleague Steve Winterfeld. Steve's over on my right. He is the advisory CISO for Akamai Technologies. He's also a member of the CyberWire's hash table expert members. All right. So welcome, Steve, to the program. Thanks for doing this.
Steve Winterfeld: Yeah, good to be here. Thanks.
Rick Howard: And the other one is Ben Yelin. Ben's on my - top of my screen. That's how I'm doing it. Ben, I have to read your title 'cause there's so many words in it.
Ben Yelin: Unfortunately, yeah.
Rick Howard: You are the program director for the Public Policy and External Affairs at the University of Maryland Center for Health and Homeland Security. Does that all fit on a business card?
Ben Yelin: Barely, yeah. It has to be very small fonts.
Rick Howard: (Laughter) All right. And you're also a co-host with Dave for one of our other podcasts called "Caveat." This is our second show in the series where we go back 90 days and try to pick out the two or three stories that had the most impact or were the most interesting. And oh, my - in these last 90 days there have been some really interesting things.
Rick Howard: And Steve pointed out to me that we would be remiss if we don't highlight the fact that on 4 May was the 20th anniversary of the ILOVEYOU bug, all right? And according to Time magazine at the time, that bug infected some 50 million users in about 10 days. And that was a big deal back then. And I know, Steve, that when I received my email from you, then you told me that I love - that you love me. All right. I totally opened that email. What about you?
Steve Winterfeld: So I'm going to pretend that I wasn't taken over and that I did send that and meant it heartfelt. But more importantly, it was kind of the first indicator that I had long-term job security in information security.
Rick Howard: Oh, yeah. Very good.
Steve Winterfeld: And so I was still blown away by the realization that, once they found the people that wrote it, there were no laws against what they did. And so it was basically just a resume builder for the two people that created that.
Rick Howard: (Laughter) I would call it a success, then, for the ILOVEYOU bug. Some of the stories we considered for this program were around COVID, like the improvisation of the Zoom CEO. His name is Eric Yuan. I think that's how you say it. He suddenly discovers that his SaaS application for web video is the thing that everybody on the planet wants. And then immediately after, the network defender community decides that it's wanting in the security department. But by all accounts, some of the moves he's made to fix all those situations have been positive, so we're going to cut him some slack.
Rick Howard: We also considered the contact tracing apps that government agencies and some commercial organizations have been developing to track the pandemic. And the issue there is, you know, privacy and security for us - maybe not during the pandemic, 'cause we kind of need it, but we're kind of worried about what happens afterwards. So we'll keep our eye on that. All right. And the third thing we were looking at was how cyber criminals came out and said, we're not going to take advantage of this pandemic thing to make money. And then that lasted about 10 seconds before they went after hospitals and the like. So that didn't happen.
Rick Howard: But for this discussion, we decided that we didn't want to talk about COVID-19 because everything that needed to be said about how to do security in the pandemic was pretty much covered in the early months of the grind. So we decided we wanted to get away from that. We know that network defenders are just kind of grinding through all that stuff. But we're going to choose topics not particularly centered, you know, around the COVID topic.
Steve Winterfeld: But I don't think it could be a podcast right now if we don't at least say the new normal once.
Rick Howard: OK.
Steve Winterfeld: So some of the people out there playing that game can have a drink.
Rick Howard: So that's the right - the top-right square on your bingo cards. OK. So - all right. So before we get started, though, if the audience has any questions about anything we were saying, just go up there into the webinar question panel and type. And we'll try to answer them as they come through. And we'll try to get answers to all of them at the end if we don't get to yours in particular. All right. So let's begin. Steve, let's go with you. Your big topic for this webinar comes out of Australia. So tell us about it.
Steve Winterfeld: Yeah. I find it interesting. There have been some great technical challenges, but I think some of the policy challenges are more far-reaching. And so this was the Australian Prime Minister Scott Morrison and his defense minister, Linda Reynolds, coming out and announcing that the country's government, critical information infrastructure and commercial industries were being targeted by major cyberattacks from a nation-state - though they stopped short of attribution.
Rick Howard: They kind of waved their hand at China, though, a little bit. He didn't really say it.
Steve Winterfeld: And there were some off-script comments about China but just no attribution there. You know, they stated that the need for talking about this right now was the increase in frequency they're seeing, the scale and sophistication. And they're worried about those impacts and, in the interests of everybody, wanted to bring this up to raise awareness. It was interesting. They talked about, you know, their 2016 strategy, the over $200 million in investment they've done to become a regional cybersecurity leader. It got some - a little international press. It got probably a week of high press coverage in Australia. They did put out a technical advisory called Copy and Paste Compromise - a name derived from the attackers' use of open source tools. It's about 48 pages long and kind of followed the kill chain in the thought process of how they broke out the different challenges.
Rick Howard: Yeah.
Steve Winterfeld: I thought that was interesting. The one thing that that was not in here but in separate stories was the - just like with the U.S., you know, our GAO organization, the Australian equivalent of their audit agency has been listing all of the challenges that they have around cyber and being cyber secure. So what do you guys think as far as why they were talking about this and why now?
Rick Howard: Well, I'd like to jump in here first because when I listen to the prime minister say, hey, we're, you know, we're shocked that nation-states are using cyber to do espionage and things, it reminded me...
Ben Yelin: Really?
Rick Howard: ...Of the old "Casablanca" movie - right? - when the police captain comes out of the gambling room and says, I'm shocked, shocked to find that gambling is going on in here - right as the croupier hands him his winnings. So the question I have is, why is he announcing this now? See, that's my question. Ben, you got any ideas about that?
Ben Yelin: I mean, I think there is some important political context here. The Australian government had been particularly critical of the Chinese government's response to the initiation of the COVID epidemic. And this is sort of - at least the context of some of the articles I've read is this has been interpreted as a bit of a retaliatory attack for that. And so, you know, it's kind of a new venue for political warfare between countries. You see that in the physical world. And I think, Steve, you made this point in some of our discussion offline, you know, we killed Soleimani, for example, in Iran. And they're compelled to respond by bombing one of our military bases. It might be that a country like China feels compelled to respond, and these are the tools that they have. So when they're criticized, when, you know, they feel threatened geopolitically, as a state actor, you know, this is their tool of soft warfare, if you will.
Rick Howard: Well, I think it's interesting that - oh, go ahead, Steve. I'll let you go. Go ahead.
Steve Winterfeld: Well, I think my point will be much more interesting, so I appreciate you letting me go first.
Rick Howard: That's why I pay you the big bucks.
Steve Winterfeld: I think one of the things that's interesting is originally, you know, cyber was more espionage and spy. Over time, I think it has become an official, you know, one of the five domains of warfare. And I see this as we're transitioning from espionage to reconnaissance. And I think what this indicates here is, while they stated there was no harm done, this is more actual reconnaissance against both critical infrastructure, government capabilities and commercial intellectual property. And so, you know, if we transition from reconnaissance - and looking around, you know, we saw Iran and Israel in this last quarter, cyberattacks against both a port and water and sewage. Please don't attack my water and sewage. You know, so we've seen this go from, you know, actual reconnaissance into raids. Rick, I know you were reading "Sandworm." What are you seeing out of that book?
Rick Howard: Well, I mean, I love "Sandworm." And it really paints a picture of this continuous low-level cyber conflict that many of the nation-states that we all know and love have been involved in for the past 10 years, right? "Sandworm" is mostly about the Russians and how they've gone after Ukraine and their critical infrastructure, all right. But peppered through that book are all the interesting big breaches, big cyber operations that we've seen over the last 10 years, like Stuxnet and OPM and things like that. And it just describes in detail, OK, that this kind of thing is going on, that this short-of-war activity - OK? - is going on between nation-states, and they can get away with a lot of stuff just because it's in cyber - all right? - so that we don't have to actually do some sort of physical war. I think - and I totally recommend this book. What's it called again? You just told me, and I lost it.
Steve Winterfeld: "Sandworm."
Rick Howard: "Sandworm." Right. I've totally just - I need you to put me out to pasture. I can't remember anything.
Steve Winterfeld: (Laughter)
Rick Howard: I would say that book and David Sanger's "Perfect Weapon" - OK? - which takes a broader view of the world. He covers all the nation-states, as opposed to just Russia. That really gives you a picture of what is really going on here. And, Ben, let me go to you. This is a big change legally, all right? What are the lawyers saying about this kind of activity that is just short of war?
Ben Yelin: Well, I mean, in some ways, it is just short of war, but in some ways, it's not, because when we're talking about something like critical infrastructure, that can have real-world physical impacts. In terms of the law around cyberwar, cyberwarfare and, you know, terms of engagement, it's really a developing field - you know, as an international community, we're behind where we are in terms of all other types of warfare. And I think it's going to take more incidents like this until, you know, there's some sort of consensus developed on rules of engagement. And...
Rick Howard: Hey, can we throw up the - we prepared a poll question for this topic. Can you throw that up there, Stefan? And then - I'm sorry I interrupted you, Ben.
Ben Yelin: No, no, go ahead. Ah, OK.
Rick Howard: So what do you think? You know, from the panelists' viewpoint, what do you guys think is the answer here? Steve, what do you think?
Steve Winterfeld: So most concerned, most active - I mean, most concerned I think transitions over time.
Rick Howard: That's true.
Steve Winterfeld: And during the elections, one of these would be higher. You know, during times of natural disasters or international disasters, like the COVID, you see some of these that focus more on phishing emails as their methodology would become a little higher. And then as some of these, as they want to be - to gain international attention, will become more active. And so it's hard to say one of these is steady state. What do you think?
Rick Howard: Ben, I'll go to you first.
Ben Yelin: Yeah, I think China has the greatest technological capabilities. I think Russia has the most human resources - the, you know, institutional intelligence agencies that are willing to carry out cyberattacks as a political weapon. Obviously, China is not far behind there as well. That's sort of where I would lean on that. I mean, Iran - while it probably lacks in infrastructure and technological expertise, it has gotten increasingly bold. And I think that's an area - that's certainly a threat that we have to watch out for going forward.
Rick Howard: So let's close the poll and get the answers. And while we do that - the one I'm more afraid of than all of these is Russia, OK? They seem to be not only trying to steal stuff and having capability later to do, you know, leverage operations on it, but they seem to be going after the fabric of our society. Those are the ones I worry about. Go.
Steve Winterfeld: So it's interesting you say that. Ben, I want to go back to something you said earlier. You know, this could be, to some degree, retaliatory on the larger political scale. The other I've seen - other article in the last quarter that I thought was interesting is, out of the U.K., where they put in a mandate that only 35% of the 5G infrastructure could come out of China. And so there are some real supply chain issues that are coming up in national debates. Have you seen much of this and have thoughts on that?
Ben Yelin: I did just see today that the FCC designated Huawei as an adversarial organization, which could have consequences in terms of their ability to contract in terms of building out a 5G network in the United States. Didn't get a chance to read that carefully. It literally came on my screen an hour before we got on this call. But yeah, I mean, I think that's certainly something to look at.
Rick Howard: So the poll question came back with 59% thought China was the biggest threat, 32% Russia, 9% Iran. So that's very interesting. We've got a question from Cody - I can't even say your name, Cody. Sorry. Here's the question - is open cyber conflict between nations an established norm? He says no country, including the U.S., seems to want to take it off the table as an option for themselves.
Steve Winterfeld: I think norm is interesting, you know. It took hundreds of years to understand what the norm for land battle was, the norm for a war at sea. You know, the concept of privateers - nation-states hiring pirates to act on their behalf - at the time wasn't legal, then became legal. And so I think that one of the challenges - when you say norm - is, legally, it's hard to keep up with what's going on. And as far as principles of practice, again, it is happening much faster than diplomats can come up with the norm. So I would say the speed is preventing a norm from being established.
Ben Yelin: There is an effort. I think the United Nations, towards the end of 2019, put together a preliminary framework on what sort of acceptable rules of engagement would be for cyberwarfare. You know, how that can actually be enforced against nation states is one question. Then when you talk about non-state actors, it becomes even harder to enforce. And it's just frankly much easier for non-state actors to engage in cyberwarfare than it is for them to engage in traditional guerrilla warfare or, you know, literal boots on the ground. And so that's, I think, what makes this threat so particularly unique.
Steve Winterfeld: Yeah - lower cost of entry, a lot more people can play. My bigger concern from a warfare point of view is collateral damage. How many commercial companies out there are collateral damage, either from their infrastructure being used or their technology being stolen or any one of many ways?
Rick Howard: So we got a comment from James Dawson (ph). He says he thinks that China has the best ad (ph) attacks and APTs. I'm not sure I - I'm not sure I go along with that. Of course, I'm buried nose deep in the "Sandworm" book. And the Russia attacks on Ukraine - two different ones, by the way - to actually turn the power off in that country seems pretty badass to me. So I'm not going to give China the championship round just out of - just because they've been active for so long, I don't think.
Ben Yelin: Going against 59% of the popular will, Rick - that is a...
Rick Howard: I'm a naysayer. What can I say?
Ben Yelin: ...Bold and adversarial.
Rick Howard: (Laughter).
Steve Winterfeld: I think some of it's regional. I mean, if you want to know who the cyber power is in South America, it's Brazil. You know? So some of it's regional, and some of it's around, you know, techniques. I think, politically, Russia's more cyber-active than others. Intellectual property, I don't know that Russia would beat out China.
Rick Howard: Yeah, that's a good point. We could talk about this subject for the next 20 hours, but we need to move on to the next one. OK. Ben, let's swing it over to you, OK. You are - the story you picked out comes out of Section 320 - ah, I can't even say it - the 230 of the community - I - just go ahead because I can't remember what it is about.
Ben Yelin: Sure. I - happy to take over here. So Section 230 of the Communications Decency Act, probably something that many people in this country have been hearing about for the first time over the past several months. This comes from a 1996 piece of legislation. The legislation was actually enacted to regulate content on the internet. But there's this carve-out, Section 230, which protects content platforms on the internet from liability for their content management practices.
Ben Yelin: And companies like Facebook and Twitter and Google rely on Section 230 as a shield from legal liability. There are some exceptions, you know, as it relates to things like sexual exploitation, child pornography. But for the most part, these companies are able to make content management decisions rather freely without the threat of endless litigation. And so this has fostered, really, the free and open internet as we know it. Without the constant threat of litigation, these companies are able to regulate what type of content can be posted, exercise their First Amendment free speech rights.
Ben Yelin: Recently, this provision of that act, Section 230, has come under increased scrutiny from both the political left and the political right in this country. The criticism from the political right is that these social media companies are biased against political conservatives. And this shield of liability is unduly protecting these companies when they are not adequately allowing for broad viewpoints on their network. Basically, the accusation is that these content platforms are censoring conservative viewpoints, labeling conservative content as against the terms of service.
Ben Yelin: That kind of went to a head last month when Twitter reacted to a couple of President Trump's tweets by putting little warnings under them saying, you know, this tweet is particularly inflammatory, but we're keeping it up because it has public significance. He is a public figure. I think they did that for one of the tweets he sent about the George Floyd protests. And then they've labeled a couple of his other tweets as potentially misleading as it relates to mail-in voting.
Ben Yelin: In response to this, the president issued an executive order. It was largely a toothless executive order. But it called on the relevant federal agencies to pass regulations to start to chip away at Section 230, saying that in order to maintain this shield of liability, these companies had to make more of a conscious effort to be free from political bias in terms of their content moderation. And we've had proposals from Republican members of Congress saying that there should be some sort of bipartisan commission oversight group making sure that the content management decisions on these platforms aren't coming from a place of political bias.
Ben Yelin: And then we're also seeing attacks from the political left, including from the presumed Democratic nominee Joe Biden, basically saying that Section 230 has allowed - you know, has given Facebook and Twitter free rein to post inappropriate or potentially damaging content, including, particularly in the context of Facebook, fake news. And because they have that shield of liability, it keeps them from moderating content, keeping dangerous information off the internet, and, you know, potentially poisoning our democracy.
Ben Yelin: So we have this executive order. We've had a couple of competing proposals in Congress to further regulate or cut away at Section 230 of the Communications Decency Act. I think it remains to be seen whether we're actually going to see some legislative action. There are certainly limits as to what can be done purely at the executive level. But it's just kind of been a fascinating view into this 20-year-old law which has really allowed the internet to flourish but potentially comes with these downsides. And I'd be happy to kick over the discussion to my colleagues here and take your questions on it.
Rick Howard: I'm kind of confused about this. I think I understand it, but it feels like there's three pieces of this and not just two. Right? And what I mean by two is there's the social media companies that don't want any kind of regulation. They want to be considered as a platform and have no responsibility of the content.
Ben Yelin: Right.
Rick Howard: And so they have this liability protection because of that, right? But on the other side, people are saying that they're not letting us do the - for what the small number of things they do check against, they're not letting us say all the things we want to say. So we're going to take away your liability limitations. I'm not sure how that fits.
Ben Yelin: Yeah. I mean, it's sort of the stick in the carrot-and-stick approach. It's a way to get, in the views of some on the political right, these companies to pay attention to conservative complaints that content is being unfairly moderated - that, you know, certain individuals, certain accounts, have been de-platformed, even if they have not explicitly violated the terms of service on this website. And the only stick they really have in the bundle - the only way they can actually have an influence on these companies, is to say, hey, how about we look at that law that shields you from legal liability? How would you like to be sued, you know, for every content management decision that you make? It would obviously hurt your bottom line. So unless you want that to happen, you should listen to us.
Rick Howard: Well, see, that's the third - that's the third part that confuses me. Right? So if you take that away, the liability piece - right? - doesn't that mean they have to be more aggressive in checking that kind of speech - right? - or speech that some people would consider offensive and things? Or am I wrong about that?
Ben Yelin: No, you're absolutely right about it. I mean, taking away the liability would have the practical effect of removing a lot of content from these platforms just because, with that fear of liability, you wouldn't want to post anything that could expose you in a legal sense. So anything that could be considered defamation, anytime that there is, you know, a threat against a public official, any time there's another actionable piece of political speech, you know, you'd be so cautious about putting that on your platform that you would not - you would just decide not to. And that would be a content management decision in and of itself.
Ben Yelin: So yeah, there really is kind of a disconnect between the complaint, which is that, you know, you are censoring a certain type of political content, versus the remedy, which is, you know, we're going to expose you to legal liability. You know, I think the idea is, by threatening them with taking away this liability shield, these companies will start to be more conscious of potential political bias in their content management decisions.
Rick Howard: So Steve, I'll come to you in a...
Rick Howard: Yeah, go ahead.
Rick Howard: Let me come to you in a second. But let's put the poll up to poll the audience about what they think about this. And go ahead, Steve. You were going to say.
Steve Winterfeld: You can go ahead and talk about the poll real quick.
Rick Howard: Just that you guys can read that, just make your choice. Do you think social media platforms should be protected from liability for their content management practices? And we'll let the audience answer. But Steve, go ahead. Make your point.
Steve Winterfeld: Oh, so first, thanks for giving me another opportunity to go read U.S. legal code - always exciting.
Ben Yelin: Thrilling, thrilling. Yeah.
(LAUGHTER)
Steve Winterfeld: The section was surprisingly short, you know, talking about Good Samaritan protections, talking about hate speech and some safety issues. So the intent's there, but the intent is not able to link up with the forum that we're talking about. And I love Bruce Schneier's, you know, point where he talks about: there are people that know how to write laws; there are people that know how to make, you know, technology work and what's practical. The intersection of those is just minute. And so when you...
Ben Yelin: Extremely, yes.
Rick Howard: (Laughter).
Steve Winterfeld: And so when you have someone write a broad law, I'm really worried about the law of unintended consequences. And here's a prime example. If you make me liable, then it's going to change my business model. There are not a lot of automated ways to do this. So now do I have to have a person in the loop? Now what is my criteria? What is my legal criteria for bias? It just gets very complicated when you make this an issue that I have to change my entire revenue model. And you're going to see a lot of people walk away from things that give a voice to people that didn't have a voice before.
Rick Howard: So OK, let's take the poll down. And I can tell you the answers. Fifty-six percent of the audience said that it should be regulated, and 44% said no. All right. That's really interesting. And I got a question from James Dawson from the attendees. He suggests that since they're making money, they have the responsibility to self-regulate for those kinds of things that we don't want to hear on those platforms. I don't know if I agree with that.
Ben Yelin: I don't - yeah, I don't necessarily agree with that either. I mean, I think the only way they're ever going to self-regulate - I just - maybe I just don't trust companies to self-regulate in that manner if they don't think it's going to be a threat to their bottom line. I think the only way that they would engage in self-regulation is if they were facing legal liability.
Rick Howard: So to avoid some threat that they don't want to have to manage through the government, they'll do it themselves. Is that right?
Steve Winterfeld: And they will regulate towards audience desires.
Ben Yelin: Absolutely. That's a very important point, yep.
Steve Winterfeld: You know, so I'm going to put out content that sells. And I'm going to, you know, respond to what my customers are clicking on. And so you talk about self-regulation, and yet the audience is driving some of the views.
Ben Yelin: Not just the audience but the advertisers, as well. I mean, we've seen...
Steve Winterfeld: Yes.
Ben Yelin: In the past week, there have been a number of advertisers who are saying, we're going to pause our advertising on social media because we're not sure, for example, that Facebook is doing an effective job moderating its content. There is hateful content on that platform. You know, we don't want to be associated with it. That's a form of regulation that comes from the private sector and not from the public sector. I mean, that can really, you know, encourage companies like Facebook to change their behavior. But certainly, you know, granting people the ability to sue these companies would be the ultimate way we could coerce these companies into action.
Rick Howard: So I'm - I struggle with this one because I understand the viewpoint from the platform maker - right? - that they have - their job is not to regulate speech, OK? That's not what they're in it for. But we can point to - geez - where it may have contributed to some really hurtful things, OK? People even getting...
Ben Yelin: Doxxing, yep.
Rick Howard: Or even people getting, you know, physical violence against them because people work themselves up into a furor, right? So I don't think that social media people can - or the platform leaders can, you know, absolve themselves of any kind of ownership of this. They have to take some stand here, right? And I think it's easier when you have ideas that we can all agree are onerous - right? - when a political leader comes out and just outright lies about something, right? It gets harder to do things when it's grayer, you know, when it's - when you have to, you know, pick apart the material.
Steve Winterfeld: So what's the legal...
Ben Yelin: Yeah, and I think...
Steve Winterfeld: I'm sorry.
Ben Yelin: No, go ahead.
Steve Winterfeld: What's the legal difference between, you know, a web platform and a newspaper, then?
Ben Yelin: So a newspaper is considered a publisher. They do not have that liability shield. And everything that's published on a news site - even an editorial from The New York Times - they're treated as publishers, and that means they're not shielded from liability. So if one of their op-eds defamed somebody, you know, they could be subject to a lawsuit for libel or defamation, whereas a content platform like Facebook or Twitter would not be legally liable, because it's the user who would've posted that content, not the content platform. And so that's the legal distinction.
Steve Winterfeld: Does that include the comments I put in under a story on a news website?
Ben Yelin: Yeah. So, you know, that's actually a point that's brought up frequently. It's still - the news site is still making content management decisions. They are still considered a publisher, since the original content that you're commenting on is something that they've published on their site. Meanwhile, the other thing that's important to mention is Twitter or Facebook or whomever can act at - even though they're platforms and that's protected by Section 230, could act as a publisher in certain contexts.
Ben Yelin: So, you know, if they were to write a long diatribe on their own platform railing against a president's political decision - if there was a valid cause of action, they could be sued for that because they are, in a sense, acting as the publisher. But when they are just moderating, when they are just the platform for the content, I think the public policy decision we've made is we want as much content out there as possible. Let free speech flourish. All Twitter is doing, all Facebook is doing is, you know, opening up this public square for public debate.
(CROSSTALK)
Rick Howard: We're getting a follow-up question from James Dawson. His counterpoint to what you and I were saying is, if an organization makes money on the views - the content makes them money - they must self-regulate their users. There's no question on that. No suit liability release. Is that making sense?
Ben Yelin: Yeah. I mean, that's certainly a legitimate argument. I mean, I think there are a lot of companies who make money on content that all of us would probably find very objectionable. And unless they're liable for that, I mean, there's really no reason why that entity would take down that content. So, you know, the reason these companies are making content management decisions now is because they want to protect the integrity of their platform. As you said, Steve, they want to be loyal to their readers and their users and the values of those users. And they want to be loyal to their advertisers.
Rick Howard: So I got a question from a user who calls himself Duncan Idaho. So nice "Dune" reference there. OK. So, Ben, do you foresee congressional action on modifying or overturning Section 230 this year or in the near future? And why and why not?
Ben Yelin: So I think it is unlikely because the parties are - even though both parties are - political parties are skeptical of Section 230 and, you know, might think that it is ripe for reconsideration, they're coming at it from such different angles. And the problems that they have with the statute as it exists are so distinct that I don't see, especially in a divided Congress in an election year - I don't see them coming to some sort of consensus. Obviously, the political situation could change next year. We have no idea what that's going to look like. But in terms of what we can expect in the very near future, I think Section 230 is here to stay, even despite the president's executive order.
Rick Howard: So we need to change subjects again. That was a good one, another one that we could probably spend the next couple hours talking about. The next topic is my topic. And my jaw literally dropped when I read it. It's coming out of Germany, right? The German Constitutional Court said that intelligence gathered on foreigners coming across their networks should be treated the same way as traffic from their own citizens. Let that sink in for a second. That's the first country I've ever seen take that perspective on collecting intelligence on foreign bodies. All right? So it's amazing to me.
Rick Howard: And the story about why that's come to fruition is kind of interesting, right? Back after 9/11, the NSA organized information sharing arrangements with lots of European countries, to include Germany. They made an arrangement with - and I can't - how do you say this, Ben? We've never talked about this before - the Bundesnachrichtendienst. Did I get that close?
Ben Yelin: I think you nailed it, yeah. I have a ringer who is a colleague of mine who's actually from Germany who sent me a pronunciation. But I can't recreate his pronunciation myself, so yours will have to do.
Rick Howard: OK. So I'm going to be the expert. Right? So we can refer to them as the BND. They're basically the NSA of Germany. Right? And they were discovered - they were collecting more intelligence than the 20% allowed by law. All right? And so their legislature just fixed that by saying they can collect everything. And they also gave them permission to collect intelligence on journalists, which - that's kind of startling. Right? Well, the German Constitutional Court took a look at that and said, that's not the way it should be. There should not be a double standard. If we're going to have privacy rights for our own citizens, they should apply to everybody that traverses our networks. And that is - that is a new thing. Right? That's why I wanted to put it out there. And I wanted to get your guys' take to see if you think this will hold. They have agreed to fix the situation by 2021. So Steve, I'll come to you. What do you think about this? You're an old intel guy. How do you feel about that?
Steve Winterfeld: And so I mean, it's - why would you put handcuffs on...
Rick Howard: Oh, here we go.
Steve Winterfeld: ...What intelligence you collect?
Ben Yelin: Here come the angry emails, yeah.
Steve Winterfeld: And so I'm publicly taking the devil's advocate viewpoint here so we can have a discussion. And I think that devil's advocate viewpoint is important. So why would you handcuff yourself in what you collect? Why don't you put your regulations around how it's used? And so, you know, it is interesting that, culturally, the most protective stance on privacy is coming out of Europe right now. They're setting a lot of standards. You're seeing a lot of countries pick up that banner. And most of what's been visible is in industry - you know, regulation on what a corporation can collect. And the other half of that goes to, you know - how many times have we seen organizations find a way, a loophole, around those laws?
Rick Howard: Well, I'm an old intel guy, too - all right? - so - old Army intel. And I totally get why we want to collect intelligence on, let's say, bad guys and maybe bad guys that might be bad guys in the future. All right? So I understand that. But I've always had the issue with this kind of double standard where we think that laws like GDPR, where citizens have - can accept or expect rights for their privacy but it only applies to people within this really arbitrary physical boundary - all right? - we don't apply that to all humans on the planet. It only applies to our citizens. So I have issues with that, and I have not come to a decision about that. Ben, you've got to help me out here.
Ben Yelin: Yeah. So you're in the wrong country with that viewpoint. The relevant case law in the United States is United States v. Verdugo-Urquidez - I think it was a 1989 case - which held that the Fourth Amendment does not apply to non-U.S. persons outside the United States when the United States is the party doing the search. And the rationale is basically that when the Fourth Amendment was drafted, it was intended to apply to our national community. And that community was U.S. citizens, of course, and also U.S. residents or anything that happens on U.S. soil. And so all of our surveillance programs are centered around that ideal - that there are really two separate standards for U.S. persons and non-U.S. persons.
Ben Yelin: I'm reminded of a - I show my students this clip from John Oliver back in 2015 when he was talking about NSA surveillance. And he was interviewing Edward Snowden in Russia, and Snowden was going on about all of the ways that our government surveils foreign governments. And John Oliver was like, no one cares - I don't care; no one cares.
(LAUGHTER)
Ben Yelin: Unless you're talking about surveillance on Americans, you're not going to get any person in this country interested in, you know, pervasive foreign surveillance. It's part of what our intelligence community does. And you know, I think that's sort of the value that we've espoused in this country. It's certainly different and interesting to see Germany go in such a different direction.
Rick Howard: Well, let's pull up the poll question and see what the audience feels about this. Can you pull it up for us, Stefan? Right. So in terms of privacy, do you think all governments should treat citizens' and foreigners' data with the same privacy laws? So have a hack at that if you will.
Rick Howard: I just find that - I understand what you're saying, and I've been brought up in the United States, too. And I under - I've always had dual minds where intelligence collection is different than, you know, spying on U.S. citizens. But if you raise it up a level, all right, why is it different? If we think that humans have basic rights for their data, why is it only applied to the citizens in our country? And Dawson - you know, I think James is our - he's on our guest list - I think he's either a wannabe lawyer or a lawyer. All right? So he said but they're all required - even if they do what they're saying, all those countries, they're still required to meet all 99 articles of GDPR. That's an interesting point.
Ben Yelin: Right. So they're still bound by that. You know, even though this was a decision by the highest court in the country of Germany, they're, of course, part of the European Union. So they're still bound by GDPR. And another thing that's interesting from - I know that I'm making this very U.S.-centric, and you can criticize me for that - but our surveillance regimes are completely different, whether that surveillance is taking place within the United States or whether it's taking place overseas.
Ben Yelin: So before, I talked about the difference between collecting info on U.S. persons and non-U.S. persons. There are a lot of regulations, even if we're trying to get info on foreigners, but, you know, those distinct communications are passing through our internet infrastructure. There are pretty specific, delineated rules. And we could go into those. If we're going into another country and kind of sharing data with a foreign government or getting into their data centers, you're bound by a general executive order governing surveillance but not much more than that. So it kind of is the Wild West in terms of surveillance.
Ben Yelin: So I guess I'm just so used to that value that we've placed on foreign surveillance. It's just surprising to see another Western democracy take that approach. I think the question you ask is an important and philosophical one. You could really apply that to, you know, any area of the law that you want. Why are we protecting U.S. persons and not protecting everybody across the world? That could apply to what we do to prisoners at Guantanamo Bay. That could apply to...
Rick Howard: Don't get me started.
Ben Yelin: ...You know, social welfare benefits. I know. Yeah, it's just - I think you're opening a can of worms there, Rick.
Rick Howard: Well, it was interesting to watch the poll results come in because it was 50-50 for a long time for everything. But at the end of the poll, 53% said all data collected should be the same. And 47% said governments should make exceptions. So I think I'm - I slightly win the argument a little bit, all right? So...
Steve Winterfeld: So the other thing...
Rick Howard: Yeah, go ahead.
Steve Winterfeld: The other thing I find interesting here is - and you touched on this, Ben - is, you know, the Five Eyes agreement - so those five countries that agree to share intelligence. You know, how do you - when you put this law in of what they can collect, then does that apply to what they can access through the Five Eyes? Does that affect the Five Eyes agreement and what's expected in partner countries to be able to leverage? So it'll - I find it interesting the impact on the broader intelligence community, as well.
Rick Howard: Well, that's interesting you say that - and I'd like to answer, but we've got this question from a user who calls himself 007, OK? And - it's a great name, too, right? So is there a distinction here theoretically between allies and enemies? I mean, I don't know how you would do it, but if you could pick out NATO allies on one hand and evil empire countries on the other, is this German law about spying on allies or just spying in general?
Ben Yelin: I mean, the way I...
Rick Howard: Yeah, go ahead.
Ben Yelin: You can go ahead if you want. The way I read the decision is that it applies to foreigners in all other countries, whether they're part of the Five Eyes or not. Maybe I'm misreading the decision. I think we're also not going to know about the full contours of how this is going to work in practice until 2021, when the Bundestag, the German legislature, has the opportunity to revise the law. And my guess is they'll probably try to come up with something similar, but it'll be written in a way that comports with these new constitutional privacy requirements.
Steve Winterfeld: Yeah, knowing the nationality of somebody from their IP address is a neat trick.
Rick Howard: Well, yeah, we didn't realize - it didn't say how it would be done. Just assuming that you could figure out what was going on, that would be interesting. So we're not in consensus here, right? I think we had Ben the liberal lawyer saying that the U.S. should have the ability to collect on foreigners. And we have the old Army guy, me, saying maybe we shouldn't do it that way. Steve, can you be the tiebreaker here?
(LAUGHTER)
Steve Winterfeld: So I think the tiebreaker is already - I mean, it's already broken. I mean, they've made their legal decision. I will be surprised if other countries follow suit.
Rick Howard: Yeah. I'd be surprised...
Steve Winterfeld: And I think as far as our general opinion goes, I'm very situationally dependent. So I would tend to want a resource and try to govern how that resource is utilized, being fully cognizant of the number of abuses that have happened in the past.
Rick Howard: So I'd like to peel back that a little bit, Steve. You mentioned this before, and I meant to ask you about it. You were saying we shouldn't put laws on collecting things because that's kind of hard. It's hard to do technically. Your point is that you want to do it on how we actually use the data later. Sift through it and make decisions there. Can you expand on that? How would that work?
Steve Winterfeld: So, I mean, if you've collected data, then just like any other investigation, you would need to justify why you're accessing that data. And you would need to set a scope of what you're going to do with that data. And so the - you know, as an intelligence officer, I would love to do, you know, digital archaeology. Just go dig around and see what I can find. As...
Rick Howard: We're fronting on that a little bit, right?
(LAUGHTER)
Steve Winterfeld: As a person who was being investigated, I would prefer that they have to have a reason and a scope. And so, you know, that point of view is big. Ben, I think you were going to say something?
Ben Yelin: Yeah. I mean, so much of the controversy surrounding Snowden and his disclosures was the sort of indiscriminate collection, where there was no suspicion. So until a few years ago, the NSA was collecting the metadata from all domestic phone calls in the United States, and they had it at their disposal. Now, you know, I think what you're saying is we should have rules about whether a database like that could be searched, how it could be searched, how we could minimize that data and do individualized targeting. I think what critics would say is that you already have the data. So if there's one enterprising analyst there who wants to perform that search, they are able to do so. And I think that's why, due to this sort of Snowden backlash that we've faced in the past several years, our government and our legal system have tried to set up systems where, before you even collect data, you have to have some sort of reasonable suspicion that you're going to find useful intelligence information in there.
Rick Howard: Yeah, because once the, you know, the genie is out of the bottle, it's hard to put it back in. I mean, the reason the Germans got caught collecting data they shouldn't have been collecting - this came out of the WikiLeaks dumps - was they were searching documents and seeing things from our allies - right? - and from organizations within the country that you would think, hey, these are on our side. Why are we watching them? So - but if you're an intel analyst, once you have that, it's kind of hard not to see it anymore and, you know, act on it, I think is the problem.
Ben Yelin: It's like when my wife tells me I just bought 10 bars of delicious chocolate, and they're in the cabinet. I'm going to look if they're there, you know?
Rick Howard: I can't keep it out of my head (laughter).
Ben Yelin: Yeah, exactly.
Steve Winterfeld: And so - and that comes to the challenge, though. So now the flip side of that is, according to what you guys have said, we should do no collection until we have a reason to start the collection. At which point, as an intelligence officer, I'm going to tell you that you are now a reactive organization, not a proactive organization. So, you know, how is that helpful?
Rick Howard: I will just push back and say that when you hear security experts talk like that, they assume that the intelligence collection is useful and that you need it regardless of anything else, all right? And I'm thinking that if privacy rights or just basic data rights are fundamental rights, then maybe they should apply to everybody, and not just, you know, local people. I could be wrong here, all right?
Rick Howard: So we need to transition to the next thing. We want to do some general purpose questions and answers, folks, because some of the attendees gave us some questions they wanted the panel to ask. So let me take the first question from - I love this name - Wubba Lubba Dub Dub (ph). OK, so, Steve, that's your - my new nickname for you, OK? James Nolan (ph), what levels have you seen organizations embrace remote work, and have there been any inventive methods or implementations?
Steve Winterfeld: So I'll start. My general sense is, you know, most companies went from around 10% of their employees being remote to probably 90% if they were anywhere in the technology field. The Economist said that before this, 1 out of 50 was a remote worker. Now it's 2 out of 3. So, I mean, both my experience and The Economist are saying the vast majority are doing that. I think some of the innovation has been around staples, just getting basic access to resources. Some has been cultural. How do you maintain a corporate culture? We've seen people encouraged to, you know, be on video where bandwidth allows it. If your kids are upstairs playing Fortnite - that's where one of my colleagues today said, I can't be on video. My kids are playing Fortnite, and I don't have enough bandwidth. And then the last thing is, I know a lot of companies are doing things to help people build out their home office.
Rick Howard: Well, I've talked to a lot of CISOs over the last couple of months, and the transition has been a lot smoother than I think we all would have anticipated. They basically just extended what we normally do for remote workers to the entire workforce. Some of the problems we've run into have not been that difficult, mostly for the older folks who aren't used to doing this kind of thing - all right? - and just making sure the folks at home had the right, like you said, Steve, the right horsepower to run this stuff, because if they're still running their old Windows 95 machine at home, maybe that's not going to work so much.
Steve Winterfeld: I talked to one CISO whose workers took their desktops home, so that didn't work out as well.
Rick Howard: (Laughter).
Steve Winterfeld: And so part of this was we did move to this because of risk, not because of the lack of technology. So our risk hand was forced.
Ben Yelin: Yeah. I've also just been struck at how agile the companies have been that have been involved in remote work. I mean, there was a pretty large security flaw exposed with Zoom, for example, during the first month of our never-ending stay-at-home adventure, and they worked pretty quickly at coming up with a patch. And I think all of us have had to be agile - finding the right platforms that fit, you know, what we need to be doing at home and making the best of this pretty difficult situation.
Rick Howard: All right, gentlemen, we are at the end. We need to say goodbye for now, OK? So I want to give thanks to Steve for coming on and filling in for Dave. Thank you, sir. Ben, it's always a pleasure. Thank you all for participating as attendees. And we'll see you at the next CyberWire quarterly analyst call. Thanks, guys. We appreciate it.
Steve Winterfeld: Thanks. Have a great one.
Ben Yelin: Thank you, everybody. Take care.