Who's responsible for protecting Internet users against cyberattack?
The panel's question comes down to one of balance: what's the right balance to strike within a responsibility that's clearly shared among governments, the private sector, and individuals?
Chaired by Andrew Whitaker (Her Majesty's Consul General to San Francisco, and a veteran of UK cybersecurity establishments), the panel included Betsy Cooper (Executive Director, Center for Long-Term Cybersecurity, the University of California, Berkeley), John Mills (Director, Cybersecurity Policy, Strategy, International, Defense Industrial Base, Office of the Deputy Chief Information Officer for Cybersecurity, US Department of Defense), Andrew Stalker (Chief Information Security Officer, Barclays International), and Alejandro Mayorkas (Partner, WilmerHale, and former Deputy Secretary, US Department of Homeland Security).
We have yet to experience harm that would change behavior.
A problem with expecting too much from ordinary users, in Cooper's opinion, is that consumers haven't yet been affected beyond nuisance levels. Stalker noted, from his perch in the financial sector, the rigors of international compliance, but observed that even there, "alarm fatigue sets in when repeated warnings are followed by small effect."
International norms: we're not there yet, either.
Whitaker asked about the range of possible responses to cyberattacks.
"People haven't yet, as Cooper said, suffered the harm that would change behavior," Mayorkas thought. The Sony hack was an inflection point for government. There's an obvious spectrum of possible responses to nation-state hacking of the kind Sony experienced: naming and shaming, retaliatory cyberattack, and ultimately kinetic action. Kinetic retaliation shouldn't be dismissed as an obvious non-starter, either. Mayorkas said he had heard people advocate at least consideration of a kinetic response to the Sony hack, asking, for example, if the North Koreans had blown up a building without loss of life, wouldn't we have considered a retaliatory strike? And if so, how is the cyberattack Sony Pictures sustained relevantly different?
Mills pointed out that it took twenty to thirty years to establish norms of behavior for nuclear powers. These things take time. Ambiguity is a bad thing when it comes to roles and intent. When we say "red line," we have to mean it.
Developing regulatory regimes.
Cooper, whose organization is interested in consumer awareness, thought there's also a range of regulatory responses for protection.
There should be some level of government regulation, Stalker observed, but currently there are so many regulations that compliance is very difficult, and very difficult for consumers to assess. Some form of labeling might help: "Even a notice that says you're compliant, on a product, would be a good step forward."
Mayorkas offered some cautions about government regulation: it's very difficult for the government to establish standards across many areas of expertise, including, inevitably, expertise the government may be short of. He also warned against moving from "small-r" to "big-R" regulation. This is always problematic, but especially so in cyber: the domain is too dynamic.
Managing risk without forsaking the benefits of connectivity.
Whitaker asked the panel about risk management. You want the benefits that connectivity and a strong, efficient supply chain bring, but these carry risk. How can it be managed?
Cooper argued that cyber insurance will foreseeably play an important role in risk management. Liability is now under debate, and emerging liability structures will also contribute significantly to enhanced risk management. Mills agreed, and suggested that cyber liability would become well-structured when it became a board responsibility.
Mayorkas demurred. "It's woefully inadequate to define a standard of care in the crucible of the courtroom." But if we don't establish baseline standards of care and build from them, the courtroom is where they'll be defined. We still need to distinguish system responsibilities from individual responsibilities. "It's irresponsible to push out an unpatchable product. Eventually we'll learn, and we'll learn on the back of some company."
Stalker closed the discussion by advocating a common understanding served by a common lexicon. And he offered some final advice to companies on information-sharing: by all means share. "Don't compete on internal security."