Georgetown's annual Cybersecurity Law Institute offered its customary mix of expert presentations and panels. Here’s a summary of the two-day event.
The Top Ten Things You Need to Know (a Primer).
The Institute opened with a preliminary session giving attorneys new to the field of cybersecurity law some quick insight into the terrain in which they’ll operate. The panelists included Harriet Pearson (Hogan Lovells), Randy V. Sabett (Cooley), and Rena Mears (DLA Piper). Pearson, who moderated, promised “a whirlwind introduction,” and opened with a quick overview of some of the bigger categories of threats attorneys can expect to encounter: APT, ICS attack, cybercrime, hacktivism, and, of course, insider threats. With that, the panel moved to its Top Ten.
#1. Cyber risk governance and duty of care. What are the legal standards against which officers and directors are measured? Begin with an assessment of compliance obligations, but recognize that this is only a beginning: securing an organization isn't a matter of completing a compliance checklist. Periodically review relevant standard-of-care sources, and confirm that your organization can meet, and show, reasonable compliance. (And consider documenting this.)
#2. Compliance. This is a regulated space, and you should closely follow regulatory actions from the FTC, NTIA, FCC, and other bodies. These actions tend to enjoin very basic measures, so ensure that your organization follows best practices with respect to authentication, network segmentation, and so on, and keep your technology current. There may be something of a “turf war” underway between the FTC and FCC; in any case, both have promised to use their "full authority" against noncompliant companies. For publicly traded companies, the SEC has issued guidance (so follow it). The Department of Justice has published a helpful list of best response practices. The states have also offered guidance: California, for example, says that failure to implement the 20 CIS Critical Security Controls "constitutes a lack of reasonable security.”
#3. Counseling the C-Suite. Counsel is expected to provide executives legal advice at both the strategic and the day-to-day operational levels. Recognize the problems the C-suite faces, and note that the CXOs’ interests are disparate, and not necessarily aligned. Realize that counsel is there to advise on—and help manage—risk, and that managing communications is a very important aspect of risk management. Assess risk within the context of the company's risk portfolio. Develop a strong relationship with the CISO.
Technical people talk freely, and so you should help them understand the implications of communication from a legal perspective. In particular, it will be very helpful if you can explain what communications are under privilege and what are not (and this is not always an easy issue to address). You also will find yourself serving as a translator within the C-suite, helping them communicate across cultural and jurisdictional differences. Develop a common language and communication protocols, benchmark industry practices, and express requirements in operational language.
Your counseling will include an educational and training dimension. Don’t neglect tabletop exercises, and focus them by level. The worst time for the C-suite or the board to learn their responsibilities is during a breach. Counsel should participate fully in cybersecurity training.
#4. Corporate and commercial transactions. Don't forget deal activity: cybersecurity takes on a heightened importance during such periods. Part of diligence should always be an assessment of other parties’ cyber security. Should you hear, in the course of due diligence, a party say, "we've never had a breach," that’s a red flag. It may indicate a low level of organizational attention or self-awareness. Don't ask yes-or-no questions—get the other parties to respond in more open-ended discussions. Don’t forget to look at third parties, and watch downstream liabilities. Remember that “you'd much rather deal with any nastiness during diligence than after you've acquired the other party.” Finally, as you think about risk, remember that high-stakes business deals, including mergers and acquisitions, are major targets of threat actors.
#5. Information governance. Consider your duty, the value of the information, and the assets that contain it.
#6. Workforce matters. Reinforce a culture of cyber security awareness and accountability. Doing so requires considering privacy, labor, and other legal implications of implementing measures like continuous monitoring and background checks. You may not really be able to train an organization and its people into security, but there are some reasonable expectations you can bring to the problem. One important goal of training is to help winnow out sloppy language: not every “incident” is an “attack” or a “breach,” for example. Lawyers typically have a bird's eye view of an organization, which gives you an opportunity to spot gaps in training and practices. Consider how your organization should handle internal investigations, and develop protocols for conducting them. Consider what should be done in the event of a violation of rules. What happens to the person who was responsible?
#7. Partnering with privacy. A paradox: the more secure you are, the more surveilled you tend to become. Security and privacy people come from very different backgrounds (respectively, technology and policy). But the relationship between them has increasingly come to be seen as a symbiosis, not opposition. Help translate between privacy and security. Talk about aligned interest and identified risk. Recognize that security measures have different cultural and regulatory implications, and they should always be considered in that context. And be sure to think through the sort of control an organization may need to exercise over the devices individuals in that organization use.
#8. Guiding information sharing and law enforcement interactions. Security professionals believe that sharing indicators of compromise and cyber threat intelligence is valuable for collective defense and enhanced resilience. In many sectors sharing is now part of the standard of care. It’s important to develop a strategy for sharing information, and establishing contact with law enforcement early gives counsel a good opportunity to understand what might be disclosed. Have a strategy, implement it, look at CISA, and participate in some industry information-sharing mechanism. But watch privacy as you do so, and consider sanitizing personal data where appropriate. Know why you're going to talk to law enforcement; it can help in many ways.
#9. Cyber insurance considerations. The market for cyber insurance is still maturing, but insurance in this sector is used, as it is elsewhere, to reduce the cost of risks associated with doing business. Approach and understand it in this light.
#10. Preparing for and handling an incident. Your response to an incident should be swift, coordinated, and thoughtful. This means that response must be planned, and that those plans be exercised.
Congratulations, You’re Responsible for Cybersecurity Legal Matters! What Must Corporate Counsel Learn?
Chaired by cybersecurity consultant Christina Ayiotis, this panel took up some of the challenges corporate counsel face. Panelists included Michelle Beistle (Unisys), Suzanne Rich Folsom (United States Steel), Cristin Flynn Goodwin (Microsoft), and Karen I. Moreno (Exxon Mobil).
Discussion opened with consideration of the leadership challenges general counsel face. Beistle (who said she came from a background in privacy, which afforded a natural path into cyber security) advised getting to know your CISO, and recognizing that tension isn’t inevitable.
Folsom thought it important not to delay “jumping into a cyber problem when you need to do so.” She recounted some lessons from the Chinese People's Liberation Army hack of US Steel. When the PLA came in, their second biggest target, after technical intellectual property, was legal information. We are, she feels, in a cyber war. Once the PLA engaged US Steel, all aspects of their digital lives received attention from Chinese operators. We now speak about cyber security, Folsom observed, at every board meeting.
Goodwin argued that shaping the risk management conversation is an important role for counsel, especially insofar as this helps an organization unify around a common approach. This shapes the conversations the board must have, and that conversation cascades down throughout an organization in virtuous ways. “There's a language to cyber,” Goodwin remarked. Counsel should be comfortable with that language, and security people need to be comfortable with counsel’s team.
Noting that cyber security law is a relatively new field, Moreno pointed out that education is vital. We may know what to do in fields like antitrust, but cyber law is still taking shape.
Cyber security issues cut across all aspects of corporate life, and counsel needs to recognize this fact. The very important role of the CISO is a relatively recent development—IT, security, and legal are major stakeholders, but the CISO takes point. Document your activities. (“It’s sad,” remarked Folsom, “to see people fail to document.”)
What about information sharing? Folsom recommended that whoever has responsibility for information sharing with the government should have some access to classified information. Whether or not you have a cleared facility, there are ways of partnering with the Department of Homeland Security and the FBI to get someone cleared with appropriate access. Goodwin offered a qualification and a warning: “Classified information can be a double-edged sword. If you can't action it, you may not want to know it.”
To consider what sort of intelligence might be actionable, Folsom returned to the PLA’s hack of US Steel. US Steel wanted very much to know who hacked them (and why) so they could understand how to implement better controls. Knowing that the PLA officer (handle “Ugly Gorilla”) was at the hostile keyboard helped.
As conversation turned to the NIST Framework, the panel seemed to agree that this framework is emerging as the de facto reasonable standard of care. Moreno has used it at ExxonMobil, and she recommends it to all corporate counsel. In Goodwin’s view the NIST Framework is a very valuable starting point, in part because it’s so easily consumable. But you can’t stop there. You must also be comfortable with other verticals you deal with. Start with the NIST Framework, but build a strong control-set mapping team, and take a lifecycle approach to standards. “Standards do evolve, and they age out.”
Goodwin sees law firm security as effectively a supply chain issue. How old is the equipment in your legal supply chain? Do you understand the risk of not patching, for example? Folsom agreed, adding that you should have cyber security conversations with everyone in your supply chain. If they won't cooperate, then they should go. Ayiotis noted that the risks aren’t just third-party risks: “they’re nth party risks.”
Turning to the question of how counsel should work with relevant internal and external constituencies, Folsom advised against discussing sensitive matters in email or online. Goodwin noted the value of a capable, quick incident response team. Returning again to lessons learned from the PLA attack on US Steel, Folsom advised being able to deal with matters outside the corporate network. Once US Steel brought suit against Chinese firms in relation to this incident, they noticed a "tremendous" amount of attack traffic hitting them. These approaches are creative and difficult to recognize. She described one attack mounted via LinkedIn: a catphish from a foreign country who arrived with a plausible story.
As the panel concluded, Ayiotis noted the frequency with which training in incident response has come up during this institute. She recommended gamification of training wherever possible. Beistle advised attorneys to focus on developing their knowledge. Talk about cyber, and teach other lawyers in your shop to do so as well. Moreno advised making friends with your IT folks. Folsom summarized her advice for corporate counsel as follows: “Be involved, be prepared, and be the voice of reason.”
Interview with a Former FTC Commissioner
The second day opened with a discussion of regulation.
Kimberly Peretti (Alston & Bird LLP) interviewed former FTC Commissioner Julie S. Brill (now with Hogan Lovells). Brill began by reviewing the events of the last few years to establish a context for her reflection on the role of regulators like the Federal Trade Commission (FTC): a Fortune 500 CEO was fired over a breach, Sony was hacked, we’ve seen a rise of arguably moralistic hacking (as in the Ashley Madison breach), and so on. Cyber security has thus evolved rapidly in some unanticipated directions.
Brill sees a role for the FTC in setting a standard of reasonableness in data hygiene and data security. Achieving reasonable security, as defined by the FTC, will solve a lot of problems. Consumers should have assurance that their data are generally protected, and their devices generally secure.
The FTC has demonstrated that it's not afraid to litigate, if it feels it has a strong case. Every case presents learning opportunities, and the FTC "is very much a learning organization." But Brill foresees no large changes in the agency’s direction.
Unfairness jurisprudence is central to the FTC. The Commission looks for injunctions that will force companies to remediate their systems. It can also seek restitution. Its penalty authority is more limited, but it does have some penalty authority for violations of consent orders, statutes, or rules.
Brill thinks it a good thing for regulators to “get together to solve problems consumers face.” The rapid expansion of connected devices, for example, will have to be addressed. Connected devices offer benefits as well as vulnerabilities, and we need to remember the (enormous) benefits. Brill advocates applying data-security frameworks to device security, and hopes to see companies properly incentivized to correct and patch problems.
With respect to connected devices and their design, she spoke with approval of the “Start with Security” program, a “roadshow” that aims to send startups the message that security will help them toward a successful exit. Start with Security "is designed to tap into fear in the development community,” turning that fear into positive gains for design security.
Jurisdictional restrictions keep agencies in their lanes, but there's inevitably a lot of overlap. This isn't, Brill observed, a new story; compare antitrust. In most cases it's clear who the responsible agency is; in others some discussion may be required. She found turf wars, during her time at the Commission, to be “minuscule." And regulatory overlap, she believes, can even be a positive thing, as a complex regulatory environment with multiple silos enables effective tailoring of regulation to individual cases.
She concluded by saying she's optimistic (although no Pollyanna) about cyber security threats. Constant attention, she suggested, should provide grounds for optimism.
In response to a question about backdoors (alluding to the current debates in the crypto wars) Brill said she worried about creating "a door" through encryption that is itself designed to protect consumers. She made the point that no one really thinks it possible to magically create a door usable only by good guys. Absent such magic, she thinks we should do everything we can to strengthen encryption. She noted that the Government has no monolithic position here, and she also sees a regulatory interest in pushing strong encryption.
Asked about international reaction to the Snowden leaks, she said that she doesn't see a great difference between US and EU reactions. On both sides of the Atlantic, people are trying to protect their consumers.
Other questions addressed responses to vulnerability reports, and the nature and scope of regulatory authority. Part of reasonable security, Brill said, is appropriately addressing vulnerability reports. She believes the FTC wants to see process, and that an organization’s reaction to vulnerability reporting should be part of such a process.
To the final questions about regulatory authority, Brill expressed some skepticism about the need for more fully articulated rules. She believes the FTC's blog should be very helpful to people. “Why would making the blog a rule help?” She also believes flexibility in regulation to be a good thing. Regulatory agencies need agility.
And above all, she thinks reputational concerns are the best motivator. While it may be "a very good thing" that companies want to be “outside the FTC's crosshairs,” they should think first of their reputation in the marketplace.
The National Security Side of Cyber Intrusions
Not all, perhaps not even most, national security cyber incidents involve attacks against Defense or Intelligence Community targets. Businesses are involved with surprising frequency, and this isn’t going to stop any time soon. Moderated by Andrew Tannenbaum (IBM), the panel included John P. Carlin (Assistant Attorney General for National Security), Rajesh De (Mayer Brown), and Steven R. Chabinsky (CrowdStrike).
Carlin opened by describing what he saw as agencies began cooperating in cyber defense. He saw how good NSA was at mapping cyber threats, and the picture that mapping revealed was “horrifying.” The Government changed its approach to sharing intelligence, shifting (a bit) from need-to-know toward responsibility-to-share. This shift enabled the first indictment of foreign officers for hacking US companies: the case brought in Pennsylvania against members of China’s People's Liberation Army who were charged with hacking US Steel and other companies. “Hacking was the day job of the PLA uniformed officers involved. We had to change that behavior.” Our next challenge, after improving interagency cooperation, Carlin noted, was working with the private sector to understand the theft. We had to figure out how to receive information from the companies involved.
Most organizations in the private sector, De said, had worried about theft of information (including intellectual property). But since 2012 we've begun to see disruptive attacks, and since 2013 we’ve seen an increased public focus on destructive attacks. Now we’re beginning to see growing concerns about manipulation: either data or things could be manipulated. So one needs to ask: why might I be a target?
Chabinsky said, “We've seen so much of the world become a national security target” that increasingly government and private interests are converging. This convergence has changed, and continues to change, the way the sectors interact. Most people think that attacks on critical infrastructure would be the most serious, damaging threat. But, while serious, such an attack might actually be less worrisome “than loss of citizen confidence.”
Reviewing intrusions and how we've handled them affords a way of discussing progress. Consider, Carlin suggested, the Sony hack. “Even the entertainment industry is a nation-state target.” Sony did several things right, in Carlin’s view—above all, they knew whom to call for help. Attribution, publicity, and the imposition of consequences were made possible by the cooperation that ensued. The gaming industry has also been hit with destructive attacks. We've also seen blended threats—theft plus extortion under threat of destruction. Carlin described one such case (noting that too many victims continue to pay extortion demands): a retailer was hit by ISIS-connected hackers, who would have used stolen personal information to prepare assassinations. The lesson Carlin drew from this is that you often don’t know who's behind the attack on your business. Determining that, and determining what motivation lies behind the attack, can be valuable.
We've seen an increase in high-level attention to cyber issues. Recognition is clearly there, De said. The Government is organizing itself for information sharing, and the private sector is following suit. “Now the question has become, how well will government and business execute?” We must recognize, however, that it's not crazy to worry about how regulators will react to a company's approaching law enforcement. “Regulators scare businesses in ways law enforcement agencies do not.”
Chabinsky urged thinking strategically about the diverse roles of ISPs, government agencies, vendors, manufacturers, and so on. “We're not strategic about what we're doing.” He considered the role of indictments. “Prosecution is a tool in the kit. It should be focused on deterrence.” Investigation, attribution, and specificity are important to deterrence. “So, figure out who did it, publicize it, and impose consequences” including indictments or sanctions.
The panel turned to threats to infrastructure, illustrated with the now-famous hack of the Bowman Street Dam in Rye, New York. Tannenbaum showed pictures of the dam (partly, one thinks, out of local pride). The Bowman Street Dam “essentially stops a brook from babbling." It’s small, and it’s basically designed to keep some basements from flooding (and perhaps, inter alia, to keep an adjacent Little League field playable). But the incident shows the opportunism of nation states: everything can be a target. The attack on the flood control dam was attributed to Iranian actors. While they were intruding into the dam’s control system, however, a more serious attack was under way. “While the Bowman Street Dam was being hacked, so was the financial services industry,” Chabinsky said. “And they got no help from the Government.” Simply saying “if there’s an intrusion, report it” can’t be the answer. “How many times,” Chabinsky asked, “are you going to let someone rattle the handle” before you take action?
Carlin noted that much remains unknown about the motivation for hacking the Bowman Street Dam. “For all we know, it may have been a mistake”—there is, after all, a very large Bowman Dam in Oregon—or it may have been a trial run.
But “enough with the data; we've got the data,” Chabinsky said. Whether those data are on China, Iran, or any other actor, what we need is a policy. And formulating such policy should be a contribution to risk management. He offered two suggestions, the first for useful research: “Most of what we're seeing is done through botnets. Why not ask how we could eliminate botnets?” Pursuing such a line of inquiry, whether or not it was fully successful in yielding a solution, might well give us valuable insights into defense.
And his second suggestion involved an analogy he thought might be instructive as we evolved our model of public-private partnership in cyberspace. Consider—this university, like most others, has its own police force. And this is unproblematic. No one, least of all the DC Metropolitan Police, thinks Georgetown is running a rogue operation of mercenaries or vigilantes. Rather, what the private sector does in physical space is stabilize a situation until government authorities arrive. Then it turns that situation over to law enforcement. So what’s the analogy to cyberspace, and how can we align deterrence and stabilization?
As the panel closed it took a final question: why isn't the answer to all this end-to-end encryption? De reminded the audience that the human is the weakest link: no one technical solution works everywhere, all the time. Carlin reminded people that keys can always be stolen, and Chabinsky observed that data must eventually be decrypted if they are to be used.
Responding to a Data Breach: How to Run a Cyber Investigation and Learn from the Breach
The final general session of the institute drew lessons for incident response. Erez Liebermann (Prudential Financial) moderated a panel consisting of W. Scott Nehs (BlueCross/BlueShield), Timothy Ryan (Kroll), and Tara Swaminatha (DLA Piper).
Discussion opened by noting a common problem: IT departments’ tendency to want to clean up and restore a system after an attack. You, on the other hand, want to preserve evidence, and handling evidence is very different from handling data. You can't just move, replicate, and delete evidence.
Blocking an IP address implicated in an incident is typically IT’s first reaction. That's usually a mistake, noted Ryan. Swaminatha added that there's a tendency for people to panic, and talk to lots of people about what to do. Don't necessarily start talking to employees on day one. Rather, engage your vendors under a privileged engagement, and establish privilege protocols. Remind people what they should do, and what they should say. If there's a chance the attackers are in your email system, establish an alternative channel of communication.
Nehs advised finding out whether the enemy's still inside your perimeter. Notify your core team and prepare for a two-week slog. Activate not only the core team, but their backups. Call in the people you have on retainer—outside counsel, forensics, crisis communication specialists. Nehs said that his organization already had master service agreements in place for such outside assistance. Swaminatha noted that there’s a range of actions outside counsel might take: “We may go into crisis management mode, or we may simply monitor.” Work with forensics to determine your greatest legal exposure. Preserve evidence. Stop log deletion. And then contain the threat. Deleting malware is not containing an attack; it's just removing footprints.
Privilege is of course important, but it’s problematic. Contrary to what clients may think, in their preliminary and inchoate way, privilege isn’t a kind of super power attorneys bring to every engagement.
We don't, Swaminatha observed, have a great deal of case law yet on cyber forensic investigation and privilege. The forensic team's work may or may not be discoverable, so take care when committing things to writing. Counsel doesn't need to look over forensic investigators’ shoulders, but counsel should be in on major calls and copied on all email. (But, she cautioned, note that this doesn't guarantee privilege.)
Nehs advised thinking through the questions you'll be trying to answer as you assemble and manage your team. Tabletop exercises can help inform that thinking. Be clear about what you're seeking to protect, and “don't make it a kitchen sink.” Recognize that you’ll need to communicate with a lot of people, and that not all such communication will be privileged.
“Don't insist on being on every call if you can't be there,” Ryan warned. “You'll slow it down.” The panel’s consensus was to minimize written reports during incident response. Swaminatha advised clearly articulating that investigation and remediation are distinct projects. Liebermann observed that technical people “want to fix things,” and implied that you may need to rein in this tendency during an investigation.
With respect to outside help, Ryan recommended considering “capacity, capability, and conflicts” as the three triggers for bringing outsiders in to handle an incident. Liebermann asked how one should choose the right forensic vendor. Ryan divided vendors into three classes: (1) dead-box vendors (very expensive and often slow), (2) scanning vendors (who look for artifacts), and (3) answer-plus-assurance vendors. Ryan likes the third class, and warns against the first.
Swaminatha judges vendors by the individual people they bring to the investigation. She wants to know how long they’ve been at it, and how long they’ve been doing the same thing. “You've got to have the ability and the guts to kick people off the team.” You may have to switch to a new forensic team, if necessary, during the investigation. Nehs agreed, adding that you need to establish clear responsibilities.
When you’re bringing in outside counsel, Nehs observed, many firms are announcing they've got a new cyber practice. In judging them, you want experience; you might want someone who's closer to being a digital native than many senior partners are; you may want somebody who's done something other than practice law; and you might want to find someone who has contacts with law enforcement.
A lot of people in the industry, Ryan noted, come from the Intelligence Community, and they may not really understand investigation. (“They tend to watch.”) Liebermann advised always hiring the A-team, not the B- or C-team. You need to know who the best people are. The panel's consensus was that you should hire the person, not the brand and not the marketing.
Asked about attribution, the panel thought that one might have some obligation to investigate, but realistically you're unlikely to reach firm attribution in a major incident. The panel as a whole was tepid about attribution except insofar as it can inform prevention. Knowing where to look and what to protect is valuable, but beyond that, unless you’re selling newspapers, attribution is probably less interesting. So don’t get caught up with “it was PLA unit so-and-so from Shanghai” (though, on the other hand, you sometimes look better if it turns out your hacker was a well-resourced nation state). You should also be interested in knowing whether your attacker was an insider.