Bing backend exposed, for a bit. CIA thinks Russian influence ops are top-directed. TikTok Global spin-off may not be enough. Destination automation. Hacks that weren’t, and one big guilty plea.
Dave Bittner: In an unusual lapse, Microsoft briefly left a Bing back-end server exposed online. Sources say the CIA has concluded that Russian President Putin is personally involved in setting the direction of operations designed to influence the U.S. elections. The deal to spin out TikTok Global to avoid a U.S. ban may not be enough. Europe looks for more control over tech companies. Activision's hack seems to be a mere rumor. Ben Yelin on Section 230 of the Communications Decency Act. Our guest is Ramon Pinero from BlackBerry on the challenges of coordinating public services during the pandemic. And a Dark Overlord cops a plea.
Dave Bittner: From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Tuesday, September 22, 2020.
Dave Bittner: Researchers at WizCase on September 13 found an unsecured back-end server leaking data from Microsoft's Bing mobile app. Data is believed to have been exposed between September 10 and 16, at which point Microsoft secured the server. The server sustained several Meow attacks while it was exposed. ZDNet calls it a rare security misstep for Redmond but notes that no particularly sensitive personally identifiable information appears to have been compromised.
Dave Bittner: WizCase said they found the following data exposed: search terms in clear text, including those entered in private mode; location coordinates; the exact time the search was executed; Firebase notification tokens; coupon data; a partial list of the URLs users visited from search results; device model for the phone or tablet; operating system; and three separate unique ID numbers assigned to each user found in the data - ADID, which appears to be a unique ID for a Microsoft account, deviceID and device hash.
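To make the exposure concrete, here's a minimal sketch, in Python, of what one leaked search-log record with those fields might look like. The field names and sample values are hypothetical illustrations; WizCase did not publish the server's actual schema.

```python
# A hypothetical illustration of a single record shaped like the fields
# WizCase reported. Field names and values are invented for clarity;
# the exposed server's actual schema was not published.
leaked_search_record = {
    "searchTerm": "coffee shops near me",    # clear text, even from private mode
    "latitude": 39.2904,                     # location coordinates
    "longitude": -76.6122,
    "searchTime": "2020-09-12T14:03:22Z",    # exact time the search was executed
    "firebaseToken": "<redacted-fcm-token>", # Firebase notification token
    "coupons": ["SAVE10"],                   # coupon data
    "clickedUrls": ["https://example.com"],  # URLs visited from search results
    "deviceModel": "Pixel 4",                # phone or tablet model
    "osVersion": "Android 10",               # operating system
    "ADID": "<redacted>",                    # appears tied to a Microsoft account
    "deviceID": "<redacted>",
    "deviceHash": "<redacted>",
}

# Print the record to show why even "non-sensitive" fields, combined,
# can profile a user: what they searched, where, when, and on what device.
for field, value in leaked_search_record.items():
    print(f"{field}: {value}")
```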
Dave Bittner: Sources tell The Washington Post that a CIA assessment completed at the end of August concluded that high-level Russian leaders, including President Putin, were directly involved in attempts to influence the U.S. presidential election. The Post reports that President Putin, while interested in disruption and division generally, is seeking to denigrate former Vice President Biden. This is consistent with either a desire to see President Trump reelected - facing an outraged opposition - or a desire to see former Vice President Biden take office in a severely weakened political condition.
Dave Bittner: ByteDance's arrangement to retain a majority controlling stake in TikTok Global, with most of the remaining shares going first to Oracle and second to Walmart, may not pass muster with the U.S. government. According to The New York Times, the administration has signaled that it wants ByteDance out of the picture as far as control is concerned and that the large chunk of ByteDance shares owned by American investors won't cut it. It's not enough to allay concerns about Chinese control of the social platform.
Dave Bittner: A Wall Street Journal article sees U.S. animadversions about TikTok and WeChat as an instance of a continuing trend toward the fracturing of the internet along national lines. China's Great Firewall is the best known of such efforts, but other national and supranational groups are moving in similar directions for various reasons. The European Union, Computing reports, is seeking expansive authority to regulate tech companies. And Facebook says, according to Vice, that if it has to put up with the restrictive data-handling practices that Ireland's Data Protection Commission - its lead regulator under the EU's one-stop-shop mechanism - is seeking to enforce, it may just stop doing business in Europe altogether, leaving some 400 million users wanting their Facebook fix.
Dave Bittner: FedTech preaches automation as the next frontier of a zero-trust cyber offensive. The Department of Defense, a cutting-edge cybersecurity player, just ordered a new tool that deploys advanced probability-based mathematics to mimic decision-making. Automation can detect and classify threats, halt incursions and data transfers, and free up human analysts for other tasks. As an added bonus, groups that invest in automation end up spending an average of $3.5 million less on breaches. That's nothing to sneeze at.
Dave Bittner: So, gamers, do you play Activision titles like the popular "Call of Duty"? Well, there's a rumor floating around that about half a million "Call of Duty" player accounts have been exposed by parties unknown who've hacked Activision. Now, Activision has consistently denied that it was hacked or that any accounts were lost. The story seems to be spreading on social media, notably YouTube, but there seems to be little to it. The claim is that account owners get locked out, lose their progress in the game and so on. Even though Activision reassures its users that there's nothing to worry about, the company does urge vigilance and sensible precautions against losing control of your account.
Dave Bittner: And, finally, remember the Dark Overlord? Or the Dark Overlords? It's hard to distinguish the fallen cyber angels when they're working together, and in any case, their name is probably hashtag Legion. Anyhoo, one Dark Overlord, Nathan Francis Wyatt, 39 years old and a British subject, entered a guilty plea yesterday to U.S. federal charges of conspiring to commit aggravated identity theft and computer fraud, The Washington Post reports. He was involved in the theft of medical records, client files and personal information from companies. The Dark Overlord demanded between $75,000 and $300,000 worth of bitcoin to return the information. The companies didn't pay, although they incurred costs associated with restoring data and operations.
Dave Bittner: Mr. Wyatt received five years and was ordered to pay $1.5 million in restitution to his victims. In fairness to Mr. Wyatt, he's said to have shown signs of remorse during his allocution and sentencing, telling the court, quote, "I can promise you that I'm out of that world. I don't want to see another computer for the rest of my life."
Dave Bittner: Among the many things the global COVID-19 pandemic has brought to the fore is the need for fast, secure, trusted communications between government agencies at all levels and their constituents. Ramon Pinero is vice president of services at BlackBerry, where he works directly with public safety organizations across the country to ensure officials have the tools and support necessary to communicate across teams and with citizens in real time.
Ramon Pinero: I think that we find ourselves in a unique place, certainly better than where we were before. I'm from California, and the way I grew up, public safety was really delivered to normal people by interrupting my Saturday morning cartoons. I'd be eating cereal and watching cartoons, and all of a sudden, I'd see the emergency broadcast message come up on the screen. It would blare and interrupt what I was watching. And that really was the extent of it, right? There were some letters that scrolled across the screen that said, hey, in an emergency, you're going to hear a loud tone, and you better heed our warning, and that was really about it.
Ramon Pinero: Now, though - and I think this comes with the advent of different technologies and with greater public awareness - the community at large expects a more comprehensive message, more rapid and more real-time information about any threat that faces them around public safety. That could be anything from a hazardous vapor or material leak - let's say there's a plume near my house - to an earthquake or, quite topically, COVID-19 guidance. I live in a county that was a hot spot here in Northern California, and the way my county communicated with me was very rapid - through my mobile device, sending me messages about safety guidelines.
Dave Bittner: You know, just yesterday, I was on my way home, and we were having some heavy rains here. And sure enough, up on my phone popped an emergency alert message that said, you know, we're under a flood advisory. And so is that the kind of thing that we're talking about today in the modern age?
Ramon Pinero: That's right. And we can automate all of these workflows, if you will. But that flood advisory is important because not only is it providing you with the awareness of where to avoid the flood, but the community now comes to expect this type of warning. With the fires that occurred here in Northern California, the community really wanted to know - and some systems were utilized quite effectively here - what's my escape route? When should I evacuate? Such that, if people stop receiving those messages, phone calls go straight to the office of emergency services, to municipalities, saying, hey, you didn't warn me this was happening. What do I do? So it's quite an expectation that's been built up in the public.
Ramon Pinero: There's a county here in Northern California, Contra Costa County, that exercises its systems every Wednesday. And if for one reason or another a Wednesday gets skipped, or they suspend the test, they really hear it from the community - hey, we didn't hear the system this Wednesday. Is everything OK? And that's music to our ears, because it means the community's invested. They know what to do when they hear a warning. And the office of emergency services in that particular county is ready to go when the next crisis occurs.
Dave Bittner: That's Ramon Pinero from BlackBerry.
Dave Bittner: And joining me once again is Ben Yelin. He is from the University of Maryland Center for Health and Homeland Security and also my co-host over on the "Caveat" podcast. Hello, Ben.
Ben Yelin: Hi, Dave.
Dave Bittner: An article from WIRED - an editorial written by David Chavern - caught my eye, and I thought it would make for good discussion between you and me. It's titled "Section 230 Is a Government License to Build Rage Machines." That's a provocative title there. Ben, can you take us through what they're getting at here?
Ben Yelin: Yeah. It certainly grabs your eye - doesn't it? - as soon as you see that headline.
(LAUGHTER)
Ben Yelin: So what he's talking about is Section 230 of the Communications Decency Act. That statute protects what are called interactive computer services, like search engines but also like social media companies, from any legal liability resulting from the posts of their users, meaning, you know, if somebody gets wind of a conspiracy from Facebook and commits a murder, Facebook can't be held accountable for that because they're not held accountable for their editorial decisions. The rationale for this is to allow these companies to do proper content moderation without worrying about legal liability. So they can make their own interpretation of which content to ban and which content to allow. And that, you know, at least in theory, will foster a more robust free speech community online.
Ben Yelin: What this article is getting at, which I think is a very serious problem, is that because companies like Facebook don't face this threat of legal liability, they make decisions really to drive their own profits, which for them means getting page views. And to get page views, you want to steer people to sensationalized stories, which is what their algorithm does, at least as alleged here. And that's leading people to, frankly, some bizarre, conspiratorial, false information. And it's really corrupting our political discourse. You know, you'll see these anecdotes of interviews on the street where people will talk about conspiracy theories. And they're always sourced back to Facebook. My Facebook friend posted this, and I posted it to my 300 followers. And, you know, all of a sudden it goes around the world. And what this op-ed is saying is that's not healthy for our democracy. Facebook should not be shielded from this liability. If they are going to have this freedom to make editorial decisions, they should be held accountable for the resulting harm. And I certainly think there's merit in that, whether you ultimately agree or disagree with the conclusion.
Dave Bittner: What's the flip side of that? If Facebook does have legal liability, how would that possibly change the way the service runs?
Ben Yelin: Well, I mean, I think what they would say is they would constantly fear lawsuits. It would affect - they would ban more accounts. They would stifle free speech because they'd always be worried about liability. And, you know, they'd be so worried about content moderation decisions that they might as well, you know, not have a platform in the first place. So, you know, they wouldn't have that public arena to foster the marketplace of ideas. There is certainly something to that. And again, that's why the law was justified in the first place. But I think you have to strike a balance here. You know, it's one thing to allow them leeway in good faith to moderate content as they see fit, to make their own decisions about what is and is not appropriate on their website. But, you know, I think a company like Facebook, which has as large of a reach as it does and permeates so deeply into the fabric of our society, needs to be held to account in some way for its role in corrupting our democracy with false information.
Dave Bittner: And it does seem like the way the Facebook algorithm works, as you said - in order to drive engagement, it just amplifies this stuff.
Ben Yelin: Right. Right. It doesn't play a passive role in spreading these conspiracy theories; it plays an active role. And it's not just true for Facebook. You see it with things like YouTube, where I've observed this phenomenon where particularly young men start to search for video game content - you know, I don't know what the young people do these days. But largely, for some reason, they seem to like to watch other people play video games.
Dave Bittner: Right.
Ben Yelin: And, you know, because of the way the YouTube algorithm works, that leads them to some pretty dark political videos - things like white nationalism and the alt-right movement. Because a lot of people who have been into gaming have felt isolated and are looking for a community, they've been attracted to those types of videos. And the algorithm kind of does its thing. And that's not good for any of us. So, you know, I don't think there is an easy solution here, because overturning Section 230 would have its own complications. I don't think that's an easy answer. But I think the first step is recognizing this problem - that there is a phenomenon of misinformation out there and that these companies are playing an active role in spreading this information, even if they're claiming that, you know, it's not our fault, we're doing our best, et cetera, et cetera.
Dave Bittner: And it turns out they have a financial incentive to do so.
Ben Yelin: It turns out they certainly do. I mean, I think it all comes down to the bottom line. And they know they can make money with views, whether those views are for legitimate news stories or for conspiratorial nonsense. It's still about making money. And, you know, in a lot of industries in this country, we put regulations on people that stop them from maximizing their profits because they do harm to the public good. You know, I'm sure if we had no environmental standards, for example, more companies would be extremely profitable because they could just dump all their coal or whatever into our rivers and streams. But we've decided as a society to put some guardrails on that, and perhaps it's time to apply that sort of logic to online disinformation, because I really do think it's becoming a larger and larger problem.
Dave Bittner: All right. Well, good insights as always. Ben Yelin, thanks for joining us.
Ben Yelin: Thank you, Dave.
Dave Bittner: And that's the CyberWire. For links to all of today's stories, check out our daily briefing at thecyberwire.com. And for professionals and cybersecurity leaders who want to stay abreast of this rapidly evolving field, sign up for CyberWire Pro. It'll save you time, keep you informed, and it's hypoallergenic. Listen for us on your Alexa smart speaker, too.
Dave Bittner: The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our amazing CyberWire team is Elliott Peltzman, Puru Prakash, Stefan Vaziri, Kelsea Bond, Tim Nodar, Joe Carrigan, Carole Theriault, Ben Yelin, Nick Veliky, Gina Johnson, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Rick Howard, Peter Kilpe. And I'm Dave Bittner. Thanks for listening. We'll see you back here tomorrow.