2nd Annual Senior Executive Cyber Security Conference 2015.
Sep 10, 2015


The Johns Hopkins University's Whiting School of Engineering and Information Security Institute, with COMPASS Cyber Security, held their second annual Senior Executive Cyber Security Conference at the university's Homewood campus in Baltimore on Thursday, September 10, 2015. The Conference took up questions raised by information sharing measures currently under consideration by the US Congress. Not only did the conference organizers see the tension between information sharing and privacy as a "quandary," but the symposiasts also looked at other implications of information sharing, including its prerequisite: collection. Here's some of what we heard.

Whose data are they?

Hodding Carter, author, journalist, and longtime policy advisor to several Administrations, delivered the first presentation of the conference. Noting his long experience with government, he took advantage of his status as a "dinosaur" to offer some historical perspective. Fears of terrorist threats from the late 20th Century were, he said, exaggerated and overblown. They remain so today: to think terrorism is the biggest threat facing the US is "nonsense, nonsense, nonsense." It doesn't begin to compare to the threat posed by either the Nazis or the Soviet Union, and we ought to be on our guard against committing folly in response to an exaggerated threat.

The cyber threat should be understood in a similarly measured way. Carter argued that defense against cyber threats can't be allowed to alter the nation's fundamental values. He offered a long review of the history of official assaults on civil liberties in the name of security (COINTELPRO, etc.) and he had some (measured) kind words for Edward Snowden. While Snowden should pay a price for his theft, his positive motives should also be recognized.

He noted an "eternal debate" between two camps who stood for two central Constitutional values that exist in tension: the civil liberties described in the Bill of Rights and one of the Constitution's other fundamental goals—security. He called for transparency, observing that the Constitution favors neither the government nor the press, and that these two should check and balance one another. But the necessary open conflict between them is now missing.

He closed with one observation—"all in cyber is inescapably volatile"—and one counsel—"to secure information, consider reducing needless classification."

The White House: privacy activism, cyber standards, and civil liberties.

Ari Schwartz, Director of Cybersecurity, National Security Council at the White House, spoke on privacy activism and civil liberties. Noting his own background as a privacy advocate, Schwartz lamented our "tendency to see privacy and security as conflicting, when our goal is to do both at the same time. Most of the time, they're mutually reinforcing."

He discussed cyber legislation currently before the House and Senate, and reviewed the Executive Orders the President has issued to advance cyber security. The NIST Framework, he believes, is especially important to this effort. The Framework has improved risk management, and cyber security is essentially risk management.

Another one of the Administration's goals is to improve what has been "a very deficient" incident response capability. Schwartz sees progress here, too.

The third aspect of the Administration's approach has been to improve information sharing. He said that sharing cyber information among companies has been the hard problem here, particularly given corporate fears of running afoul of anti-trust law and regulation. Such fears, he said, seem to have abated in response to clarification from the Government. Liability protection remains important, "but we must not protect negligent inaction."

Schwartz addressed current concerns about Congressional action. "Many fears of cyber legislation stem from fear it will facilitate surveillance," he said, and he described the safeguards being built to prevent this. One important safeguard he very sensibly urged was "minimization": "As information sharing develops, we see a healthy limitation of the amount of information collected."

He closed with some thoughts on threats and response. "The threats will LOOK as if they're getting worse," but he urged us to remember that our tools to detect threats are getting better too. And, commenting on the Office of Personnel Management breach, he asked us to reflect that "the person who got fired was the one who found the problem. What does that say about us?"

Legal perspectives on data privacy and information gathering.

Bruce Heiman, of K&L Gates LLP, asked whether you should be afraid to share cyber threat information with the Government. "The good news is, there are a lot of good reasons to share. But there are a host of reasons not to share." He described the downside of sharing: when you share threat information with the Government, especially when you haven't sorted out the threat, you lose control of your information and you invite regulation, oversight, prosecution, and class action lawsuits.

He noted how active the Federal Trade Commission (FTC) has become in cyber matters. The FTC hasn't really been setting standards, but rather undertaking enforcement action against what it construes as inherently unfair practices. There are many sources of standards, but the basic standard is "reasonable and appropriate technical safeguards." Heiman thought that pending legislation seems to have fairly good liability protection for good-faith sharing.

He closed with some positive speculation about the future of hacking back (why not go out and take back the data that's been stolen from you?) and even (reaching for historical examples) the coming of something like cyber letters of marque and reprisal, which he sees adumbrated in pending legislation.

Hacking the Internet-of-things.

Avi Rubin, of the Johns Hopkins University Information Security Institute, opened the afternoon session with a presentation on hacking the Internet-of-things, offered after the manner of NPR's "Wait, Wait, Don't Tell Me" quiz show. He began with some scare stories (complete with pictures) that invited the audience to consider the implications of hacked wearable technology and its potential to inflict physical damage on the wearers. Medical devices, like wearables, are, after all, things, Rubin noted, and hence part of the Internet-of-things.

He invited the audience to distinguish real hacks from fakes. (His stories showed various clever proof-of-concept hacks of device components, extraction of information from the environment, etc. The true stories included demonstration of the ability to capture audio from a device's gyroscope or keystrokes from its accelerometer, extraction of encryption keys by apparently innocent physical contact with a device, and the capture of audio from video—in this case, video of a potato chip bag's slight vibrations caused by nearby conversation.)
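The sensor side channels behind the true stories share one signal-processing idea: a motion sensor sampled often enough behaves like a crude, unintended microphone. Here's a minimal sketch of that principle (our illustration on synthetic data, not any presenter's code), recovering a faint 50 Hz vibration from noisy simulated accelerometer samples with a plain FFT. The real demos layer far more sophisticated reconstruction on top.

```python
# Illustrative only: recover a faint vibration tone from noisy
# simulated motion-sensor samples. Sampling rate and signal levels
# are assumptions chosen for the demo.
import numpy as np

FS = 400                                   # assumed sensor sampling rate, Hz
T = 2.0                                    # seconds of simulated capture
t = np.arange(0, T, 1.0 / FS)

vibration = 0.02 * np.sin(2 * np.pi * 50 * t)   # faint 50 Hz vibration
noise = 0.01 * np.random.randn(t.size)          # sensor noise
samples = vibration + noise                     # what the "sensor" reports

# A plain FFT is enough to pull the tone out of the noise.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(samples.size, 1.0 / FS)
peak = freqs[np.argmax(spectrum[1:]) + 1]       # skip the DC bin
print(f"dominant vibration frequency: {peak:.1f} Hz")
```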

Rubin finished with some lessons: with the Internet-of-things, build in security from the beginning, and always consider the security implications of any device. And finally, "just because you can connect something to the Internet…doesn't mean you should."

Information sharing and data privacy: a deeper dive.

A panel moderated by COMPASS CEO Robert Olsen took a look at the prospects for data privacy in the coming world of information sharing. The panelists included Robyn Greene (Policy Counsel, Surveillance and Cyber Security Issues, New America Foundation), Bob Butler (Senior Advisor, The Chertoff Group), and Matt Green (the Johns Hopkins University Information Security Institute). (Not all of them were as sanguine about information sharing as Ari Schwartz was in his morning presentation.)

Olsen began by asking whether information sharing was the way to turn the tide on data breaches, or whether strengthening security was preferable. And where, he asked, did privacy fit into the issue?

Robyn Greene answered by saying that "information sharing legislation has a superficial appeal, but it doesn't get at the real threats, which are preventable." She observed that pending legislation seemed to have few or no requirements to protect data, and that the sort of "blanket liability protection" some of the proposed bills include further places personally identifiable information (PII) at risk. "We shouldn't duplicate the number of places sensitive information resides."

She also warned that it would be difficult to promote cyber information sharing without enabling the Government to use the information shared for purposes other than cyber defense (and that would include using it for law enforcement).

Her preference would be to facilitate hardening defenses. "Information sharing legislation is not by any means an effective way, let alone a comprehensive way, of addressing risk using tools we all have access to." The pending bills in effect incentivize oversharing.

Bob Butler had a more positive view of the potential value of information sharing. He stressed that such sharing can be particularly valuable when it extends to exchanging solutions and lessons learned. He advocated building trust, and protecting privacy would be "a huge element of that." We should identify champions and adopt common operating principles.

Matthew Green advocated focusing on the security of open source software. "We're living in glass houses. There are kids out there throwing rocks. There will always be kids throwing rocks, and telling them not to throw rocks isn't the solution. We need to fix our houses." We should have, he said, a "Manhattan Project" for making software secure. (Subsequent discussion made it clear that what he intended by this was increasing funding for security research and engineering that would fix vulnerabilities in widely used software. He cited the Linux Foundation's Core Infrastructure Initiative, which offers grants to people to fix software libraries, as an example of what he'd like to see replicated on a larger scale.)

Green observed that information sharing has its uses, especially when the threat is state-sponsored activity. But for crooks or hacktivists? Not so much.

The panel approached a consensus: cyber hygiene and secure design are most important, and threat intelligence is vital for national security matters (that is, for defense against state-sponsored activity).

Robyn Greene returned to Congressional and Administration efforts designed to foster intelligence sharing. She said that the private sector is already sharing a lot of information. The legislation before Congress falsely suggests or assumes that there's a great deal of opacity to overcome. She again noted Congress's failure to either protect PII or incentivize its protection. The information sharing proposed by Congress and the Administration is likely to produce more noise than signal.

A questioner objected that PII are already gone and compromised, so why should we worry about them? Aren't vulnerable enterprise resource planning (ERP) systems the real business risk? Greene took the question, explaining that privacy advocates are concerned with the definition of PII and other sensitive content. At risk aren't just items like Social Security numbers and dates of birth, but your location, your activities, your financial transactions, and so on. And there are major worries about the potential for surveillance, with its attendant threat to civil liberties.

Matthew Green offered some good words for Google (with respect to its Chrome browser) and Apple as two companies that have succeeded in making security a selling point. The panel concluded with thoughts on the importance—and relative scarcity—of high-end security engineering talent. Any approach that relies on a concentration of such talent within an organization won't scale. We should, as Green put it, "commoditize security."

Information sharing and cyber education as cyber security enablers.

Michael Echols (Joint Program Management Office, National Protection and Programs Directorate, Office of Cybersecurity and Communications, Department of Homeland Security) talked up information sharing and cyber education as a cyber security enabler. He offered an overview of Government cyber roles and missions, and argued that best practices can only be instilled through education. He also contended that cyber security is fundamentally a form of risk management, and he stressed that "the Government can't protect you." Instead, the Government's role, and goal, is to make tools you can use to protect yourself.

Global cyber risk and data privacy.

Curtis Levinson, US Cyber Defense Advisor to NATO, discussed global information sharing in an engagingly louche and charmingly world-weary presentation (reminding the audience several times that the views he expressed were his alone, not those of the US Government, NATO, or anyone else).

He began with a description of the cloud as a generally unrecognized return to the old world of the centralized brick-and-mortar data center. We moved data from data centers to client-server systems, and now to the cloud, which in effect means back to a data center. Only in this case it's someone else's data center. Your cloud data can reside in some six to eight data centers worldwide, in friendly countries, hostile countries, Iceland, etc. "Iceland is a nice country," he emphasized, but some of those places aren't so nice. "Russia is a criminal nation," he said in response to a question about Russian cyber actors and companies. "Tsar Putin is kept in power by criminals." So where exactly are your data? Do you control them? If you switch providers, how do you get your data back?

Levinson reviewed various threats to people's data. The Ashley Madison affair, he thought, represents a new form of hacktivism. "And geography in cyberspace is a funny thing. What would stop me from finding and taking someone's IP address? And, by hijacking their address, assuming their personality?"

On bots, Levinson said, "I generally assume there's a 100% bot infection rate on machines that touch the public Internet." A virus is a piece of software that's installed and then does what it's been designed to do, but a bot is like a drone. ("Drones are popular today—every idiot has a drone.") Bots are controlled by botmasters. The only way we can find a bot is by "anthropomorphically" listening for it to ping its botmaster.
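That "listening" amounts to hunting for periodic beaconing in network logs. Here's a minimal, illustrative sketch (our construction, not anything presented at the conference) of the idea: group outbound connections by destination and flag those whose intervals are suspiciously regular. The log format, sample data, and threshold are assumptions for the illustration.

```python
# Illustrative beacon-detection sketch: bots often phone home to their
# controller at near-constant intervals, which stands out in flow logs.
from collections import defaultdict
from statistics import mean, stdev

# (timestamp_seconds, destination) pairs, e.g. parsed from firewall/flow logs
flows = [
    (0, "203.0.113.7"), (60, "203.0.113.7"), (120, "203.0.113.7"),
    (180, "203.0.113.7"), (12, "198.51.100.2"), (340, "198.51.100.2"),
]

by_dest = defaultdict(list)
for ts, dest in flows:
    by_dest[dest].append(ts)

for dest, times in by_dest.items():
    times.sort()
    gaps = [b - a for a, b in zip(times, times[1:])]
    if len(gaps) < 2:
        continue  # too few connections to judge periodicity
    # A low coefficient of variation means near-constant intervals: beacon-like.
    cv = stdev(gaps) / mean(gaps)
    if cv < 0.1:
        print(f"{dest}: {len(times)} connections every ~{mean(gaps):.0f}s (beacon-like)")
```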

He asked in what sense, any more, we can be said to own our data. He repeated the familiar joke: "What's the difference between Google and NSA? NSA's a nonprofit." If you don't encrypt your data, intelligently, then you don't own them. And when you consider where your data are, and who will have access to them, you should ask whether you really want to aggregate those data for easy access by attackers. Remember classification by aggregation. "There is no privacy on the Internet, whatsoever, unless you use strong encryption. And there are no takebacks."
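Levinson's counsel about encryption reduces to a simple practice: encrypt client-side, and keep the keys out of the provider's hands. A minimal sketch, assuming Python's third-party cryptography package and its Fernet recipe (our choice of tool, not Levinson's):

```python
# Illustrative client-side encryption before cloud upload, using the
# "cryptography" package's Fernet recipe (authenticated symmetric encryption).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this OUT of the cloud: whoever
                              # holds the key owns the data
f = Fernet(key)

plaintext = b"quarterly financials"
ciphertext = f.encrypt(plaintext)   # only this goes to the provider

# Switching providers or retrieving the data later requires only the key.
assert f.decrypt(ciphertext) == plaintext
```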