Developing an Insider Threat Program: Risk Mitigation and Compliance
Wednesday, November 30, 2016, marks the deadline by which affected contractors must comply with new US Government insider threat mitigation requirements. The National Industrial Security Program Operating Manual (NISPOM) mandates the measures companies must take to secure classified information. On May 18, 2016, the Department of Defense issued Change 2 to NISPOM, significant because it requires contractors (defined as any "industrial, educational, commercial, or other entity that has been granted a facility security clearance (FCL) by a Cognizant Security Agency") to implement an insider threat program no later than November 30, 2016. With that deadline two weeks away, the Chesapeake Regional Technology Council yesterday convened a forum at the Chesapeake Innovation Center in Odenton, Maryland, in which experts discussed mitigating the insider threat and offered companies some perspective on what NISPOM Change 2 means to them.
Chuck Ames, Director of Cyber Security at the Maryland Department of Information Technology, chaired the panel, which consisted of Mike Miller (VP Insider Threat, Tanager), Shawn Thompson (founder and president, Insider Threat Management Group), and Keith Moulsdale (Whiteford Taylor Preston).
As he opened the discussion, Ames offered a few statistics to underscore the seriousness of the insider threat. According to FireEye, he noted, 90% of attacks originate with email. Microsoft holds that some 85% of attacks are mitigated through sound privilege management. And according to Verizon's data breach report, 97% of breaches originate with some third party. So here's one way of thinking about the insider threat: it may not be your own people so much as the people your people hire. With that he turned the floor over to the panel.
NISPOM Change 2: an overview.
Mike Miller saw NISPOM Change 2, which, recall, sets industrial security requirements, as amounting to a shift in focus. Without dismissing external threats, it now gives due recognition to the risks insiders pose. Those insider threats may be malicious, but they can also be inadvertent. Any data within an organization can be at risk from insiders: classified information, personally identifiable information (PII), intellectual property, and so forth. He reviewed the relevant requirements of NISPOM Conforming Change 2, which apply to companies that hold facility clearances. They include:
- Establish an insider threat program.
- Designate an Insider Threat Senior Official, who must be an employee and a US citizen, cleared in connection with, and to the level of, the facility security clearance.
- Report insider threat information to the Cognizant Security Authority.
- Train relevant personnel.
- Provide pertinent records.
- Implement protective measures.
Industrial Security Letter 2016-02 provides guidance on how to comply, but Miller cautioned that this isn't a simple process: there are more than forty relevant references in NISPOM and the Insider Threat Handbook. (It's worth noting that many of the requirements will look familiar to anyone who's dealt with personnel security.) By the November 30 deadline, companies must self-certify their compliance with NISPOM.
A problem generally recognized, but (still) not widely addressed.
Shawn Thompson spoke next. He said that, while the insider threat problem might be well-recognized, it's still not widely addressed. Around 80% of the dollars spent on security still go to the external threat despite general acknowledgement that the insider threat is probably the more serious. Why the disconnect? We recognize the problem, but Thompson thinks we don't appreciate it. We don't appreciate the harm insiders can do, and the costs they can impose on an organization. There are also, he believes, historical reasons for the tendency to give external threats priority in terms of resources. We've tended, in information security, to focus on networks, and that tends to induce a kind of target fixation on external threats. We've also tended to try to solve what's fundamentally a human problem with technical solutions.
Consider the laws passed in recent years—HIPAA, Sarbanes-Oxley, and so on—all of which stress protection of information. Thompson foresees the expectation that an organization will maintain an insider threat program emerging as a general standard of care. Asked how that might affect organizations that don't hold facility clearances, and may not even be Government contractors, Thompson advised structuring a program around four objectives: know your people, know your assets, obtain visibility, and mitigate risk. He noted that unintentional, non-malicious activity can constitute an insider threat, and such activity can also expose a business to liability. The more—and better—you manage risk, the less your exposure.
He closed by recommending that any organization assess itself programmatically, and then decide where it should apply its security resources. And, Thompson said, insider threat management has significant cultural implications for any business: privacy, human resources policies, recruiting, and morale are all in play.
The law is by no means simple (in part because there's so much of it).
Keith Moulsdale offered the perspective that the law requires businesses to take financially reasonable measures. The courts recognize that mom-and-pop businesses "aren't Northrop Grumman."
There was considerable discussion from the audience about the ways in which security and productivity are often in tension. In particular, when security degrades productivity, people will look for ways to work around security policies they perceive as impossibly burdensome. Ames agreed, and noted that expanding a business also expands a business's regulatory burdens, but that businesses must still in the end deliver good products. He suggested that if you leave the CISOs and CTOs to craft your HR plan, you'll probably wind up with something unsatisfactory.
Moulsdale resumed by observing that, when we preach security in a business, we're preaching to the choir—everyone knows we have to take appropriate security measures. But we don't always appreciate the compliance consequences. These can be both large and surprising, given the complex cross-currents that law and regulation have created. He reviewed several of them.
Security law is often in tension with privacy law. Electronic monitoring laws—mostly drafted decades ago to cover things like wiretapping the Mob—will still affect what you can do in the way of monitoring. There are generally exceptions to restrictions on monitoring for service providers, but it's no longer clear, as networks are currently constituted, who counts as a service provider. Google? You, if you provide web access to your employees? And, of course, consent to monitor is a good thing to obtain, but even here jurisdictions differ: how many parties to an exchange must consent to monitoring?
Computer crime laws (like the Computer Fraud and Abuse Act, CFAA) also apply, and these, too, offer contradictory guidelines that await Supreme Court resolution.
How, he asked, do you address the commingling of data on company and personal devices? Bring-your-own-device (BYOD) policies complicate these issues further. And when you're buying products and services, are you complying with all applicable laws? Privacy tort law brings exposure, as do employment and credit laws, which limit what you can do in an insider threat program. So do Equal Employment Opportunity Commission guidance, state social media account laws, and National Labor Relations Board guidance.
Private contracts, often shaped by company culture, must also be considered in program development. Finally, of course, there are simple conflicts of law, especially internationally. Consider US-EU issues—we're no longer under Safe Harbor, but rather Privacy Shield. And Privacy Shield itself is no sure thing—it's under legal challenge inside the European Union.
Thus we see a legal and regulatory system that continues to evolve, and to grope toward generally acceptable and understood standards of care. Compliance with NISPOM Change 2 is a requirement for contractors with facility clearances, but it probably also represents a prudent next step for business as a whole.
Follow-up: any advice for the perplexed?
We followed up this morning with Keith Moulsdale, asking him what a business should do, given the difficult and puzzling legal maze it must negotiate. He offered this advice:
"Indeed, the laws are complex and daunting. If only the federal government would step in with an omnibus law and put us all out of our misery ….
"In the meantime, the best advice for a small USA business is to keep in mind that written notice, coupled with express consent, can solve most (but not all) privacy-related risks arising from industry standard data security programs implemented in the workplace. So, small businesses should adopt a written company-wide policy, signed by all employees, that creates a business-only workplace. Generally speaking, that policy would: (a) designate company accounts + assets as non-personal; (b) limit business activities to non-personal accounts; and (c) give the employer the right to monitor, review and use all accounts and communications for security and other business purposes. This policy would not likely protect employers against violations of certain employment-specific laws (like NLRB rules which prohibit spying on “protected and concerted” employee activities concerning the terms and conditions of employment, or employee-owned social media account laws), or applicable contracts (such as union contracts), but would likely do the trick with respect to claims based on privacy tort law, computer crime laws and electronic monitoring laws.
"Special scrutiny would need to be applied for any business with employees in other countries, but a recent ruling (on Jan. 12, 2016) by the EU Court of Human Rights gives me hope (Barbulescu v. Romania). In the case, the court ruled that monitoring personal messages on a work-related internet messaging account did not breach an employee’s right to privacy."