At a glance.
- More on the SEC’s new cyber rules.
- US Senate committee approves online child safety bills.
- US lawmaker says Microsoft should be held accountable for recent data breach.
More on the SEC’s new cyber rules.
Government and industry experts alike continue to respond to the US Securities and Exchange Commission’s (SEC) new cybersecurity rules for publicly traded companies. As we discussed previously, the SEC announced Wednesday that covered firms will now be required to report details about "material" cyberattacks within four days of the event, describing "the incident's nature, scope, and timing, as well as its material impact or reasonably likely material impact on the registrant."
Black Kite offers a summary of the SEC’s new rules and discusses the part that materiality will play in assessing the necessity for disclosure. Writing for Fortra, cybercrime researcher Graham Cluley notes that many firms will likely be less than pleased with this requirement, especially given that the term “material” can be defined very broadly. (For an example of how a judgment of materiality might be made, Evotec discusses how a recent data breach at the German drug development firm affected revenues and how this was communicated to stakeholders.) What’s more, it could be very difficult for a firm to assess the full impact of a breach within the first few days. An organization risks underestimating the scope, which would mean having to issue a corrected assessment later, or overestimating it, in which case the damage of such an announcement could be impossible to undo. That said, Cluley admits, “Disclosing breaches in a ‘more consistent, comparable, and decision-useful way’ (the words of SEC chair Gary Gensler) does sound helpful, and should enhance transparency.”
The Wall Street Journal notes that the new rules also change how much companies must share about the make-up of their boards. No longer must a firm report on the cybersecurity expertise of its board members. The regulations the SEC proposed last year would have required companies to disclose whether any of their board directors had significant knowledge of cybersecurity, and this was met with pushback. Several organizations, along with the National Association of Corporate Directors (NACD), argued that the board wasn’t the right place to house cybersecurity expertise. Speaking about the SEC’s decision, NACD president and chief executive Peter Gleason stated, “I’m thankful that it was dropped because I think it would have presented an undue prescription on boards. Risk management resides in management.”
Darren Williams, CEO and founder of BlackFog, foresees a major shift in how companies report incidents. "These new regulations should dramatically change the way companies report breaches since they are now mandatory requirements. BlackFog has tracked the ratio of unreported to reported ransomware attacks since January of 2023 and has typically seen a 10:1 ratio. We hope to see this drop dramatically with these mandatory reporting rules. Data exfiltration is the preferred tactic of virtually all ransomware today (89%) and something that nearly all companies have overlooked. Consequently, attacks are now at an all-time high and organizations have not kept pace with new methods to prevent these attacks. We hope these rules stop the general trend of trying to hide any attacks for fear of retribution as well as stop ransomware payments to cybercriminals in the process.”
Ronen Slavin, co-founder and CTO of Cycode, also welcomed the new regulations. “The SEC's new rule is a welcome step towards protecting investors and the financial markets from the risks of cybersecurity breaches. Requiring companies to disclose breaches faster will help other companies learn and take steps to protect themselves from similar threats, as well as help companies improve their own cybersecurity posture and contain issues that have industry-wide impact. In addition, this rule will force companies to be more prepared for such events, as they will need to have a plan in place for how to respond to a breach and how to share information with the SEC and other companies. This change will make the financial markets more resilient to future attacks.”
Reed Loden, VP of Security, Teleport, thinks the rule is overdue, and that it should spur organizations toward greater transparency. “Yesterday’s rule adoption by the SEC to require companies to disclose cybersecurity incidents is long overdue, and something the industry has needed for a while now," Loden writes. "This rule emphasizes the need for companies to quickly disclose ‘material’ cybersecurity incidents and share the details of their cybersecurity risk management, strategy, and governance with the commission on an annual basis. Other requirements include describing an organization’s process for handling cybersecurity threats and risks and ensuring cybersecurity is addressed at the board level; this aims to give security professionals at companies more of a say in ensuring risks are heard and addressed. I’m hopeful that this rule will act as a catalyst for all organizations to remain open and transparent about their incidents and share as much information as possible. Sharing information means other organizations can learn from others’ mistakes to better address their own issues; in my opinion, this full disclosure should eventually extend to private/public collaboration as well." Like others, however, Loden has questions about how "materiality" will come to be defined. "While I do agree this ruling is a great start, it does leave some unanswered questions around the materiality of the incident. Because it leaves it up to each company’s discretion, it allows for some leeway. That said, I do think it’s likely the right approach, as it’s hard to set more specific requirements because each organization operates and is impacted differently. I suspect we’ll find some organizations may be less willing to disclose things, so it’ll be interesting to watch how forceful the SEC will be with this if it’s later revealed that certain companies failed to disclose a serious security incident."
And he suggests that the private equity community might take a page from the SEC's book. "As of right now, this ruling only applies to public companies; I’d like to see VCs and other groups enforce similar requirements on their portfolio companies, even if it just means notifying their investors of material security incidents. This would help prepare them early on and make security more of a priority from a company’s start.”
(Added, 8:15 PM ET, July 28th, 2023.) Christopher Prewitt, CTO of Inversion6, is another industry leader who welcomes the SEC's new regulations. "After years of rumor and innuendo, it’s great to see the SEC act, requiring disclosure. This may force some needed attention on the criticality of cyberattacks on companies. More and more organizations fully depend on IT to perform almost every business process, and given the interconnected nature of business in 2023, it can sometimes feel like a house of cards when you see the impact of an event," he wrote, adding that enforcement of the rules through a system of fines would become an important feature of the system. "It would be expected that there will be associated fines for those who don’t meet the four-day window. The other requirement of disclosing on an annual basis material information regarding cybersecurity risk management, I believe, is an even more important action. This will likely bring the cybersecurity program to the table in the board room in a more effective manner."
Ani Chaudhuri, CEO of Dasera, offered some extensive comment on the SEC's move. Transparency, in Chaudhuri's view, is the regulations' most important feature. "The new rules implemented by the SEC are a notable stride towards transparency in a world where cybersecurity incidents are increasingly common. With digital assets becoming increasingly critical to businesses, timely and comprehensive disclosure of such incidents to shareholders is pivotal." Chaudhuri offers an account of materiality, which won't be exclusively or even principally a technical issue. "Material incidents are those that have a significant impact on a company's financials, operations, or reputation - elements which shareholders would indeed consider crucial in making an investment decision. The same principles apply whether we're talking about a physical asset like a factory, or digital data. Cybersecurity is no longer a domain exclusive to IT professionals; it's a concern for everyone."
There are also some questions about how the new rules will affect businesses and the larger markets in which they trade. "While the SEC's approach is admirable, it does bring a set of new challenges to the table. The reporting timeline may indeed seem tight, especially for complex incidents where an understanding of the scope and impact may take longer than four days. Given the technical and complex nature of cyber incidents, it's important to strike a balance between providing timely information and ensuring that information is accurate and complete," Chaudhuri wrote. "The additional 180 days granted to smaller companies is also a thoughtful concession, acknowledging that not all entities have the same resources to manage and report cyber incidents. However, it is the clause about the potential postponement of disclosure in instances where it might pose a significant risk to national security or public safety that can be more contentious. While the intent is certainly valid, the execution must be handled carefully. Defining 'significant risk' might be a potential gray area, and companies should not misuse it as a loophole to delay disclosure. Furthermore, while the rules require companies to provide a concise description of the incident, its impact, and the data compromised, they do not require companies to disclose specifics of their incident response plans or details about potential vulnerabilities. In this sense, the rules are a missed opportunity to push companies towards better preparedness and proactive planning. The more information available, the more we can learn and improve our defenses. Lastly, let's not forget that this rule is reactive. Disclosing an incident after it has happened does not prevent the incident in the first place. The real need of the hour is to invest more resources in proactive measures that would make our systems more resilient and reduce the chances of such incidents happening in the first place."
Chaudhuri summed up, "The SEC's new rules are a positive step towards more transparency in handling cybersecurity incidents. Still, valid concerns and potential challenges must be addressed in implementing these rules. As we continue to rely more heavily on digital assets, the onus is on us to evolve our approach towards cybersecurity, making it a key part of strategic decision-making."
We also heard from Maurice Uenuma, VP & GM at Blancco. Like many others, he welcomed the promise of greater transparency. “Overall, the new SEC disclosure rules will increase visibility into governance of cybersecurity and should enhance data and systems security by placing greater pressure on boards and management teams to ensure they have the necessary expertise and oversight." He also found the requirement for quick disclosure a positive. "The prompt public notification of a breach is also a good step, as it pertains to better informing investment decisions and ensuring shareholder confidence in the governance of public companies. However, it is not (yet) clear how this breach reporting requirement will lead to enhanced cybersecurity overall. If these notifications inform collective learning – namely, the process by which public companies (and other entities) glean actionable lessons learned from the latest breach – then this is a good step for the protection of sensitive data and critical systems. But that, in turn, depends on the type of information that is disclosed, and how that information is turned into actionable lessons learned." There may, he suggested, be a role for the Cyber Safety Review Board. "Perhaps the new Cyber Safety Review Board can serve a key role in leveraging disclosed information for collective benefit, in much the same way that the NTSB’s investigations and recommendations following an airplane crash enhance air travel safety. But, if these breach notifications just become more noise for a world becoming numb to the steady drumbeat of breaches, the effort won’t yield much benefit.”
(Added, 6:00 PM ET, July 31st, 2023.) Mariano Nunez, CEO and Co-Founder of Onapsis, draws an apt comparison between the new SEC rules and the Sarbanes-Oxley Act's effect on enterprise resource planning (ERP) security regulation. "The SEC ruling emerges two decades after the Sarbanes-Oxley Act (SOX), which triggered the initial wave of ERP security requirements. In today’s business landscape, where ERP systems and other critical applications are vulnerable to cyberattacks, the importance of ERP security aligns with both the new SEC ruling and the underlying principles of SOX," Nunez wrote. "For ERP and other business-critical applications, the materiality of a cybersecurity breach becomes particularly relevant due to its central role in a company's operations and financial management. These applications often handle sensitive financial data, customer information and critical business processes. Any cyber incident that disrupts or compromises these vital systems could have far-reaching consequences, making it “material” under the SEC's definition. Organizations must now rapidly assess their ERP security posture and threat detection readiness, and carefully evaluate the extent of an ERP breach's consequences to consider potential financial losses, operational disruptions, legal fines and damage to customer trust.”
US Senate committee approves online child safety bills.
The US Senate Commerce Committee Thursday approved two bills focused on improving child internet safety: the Kids Online Safety Act (KOSA) and a revised version of the Children’s Online Privacy Protection Act (COPPA 2.0). The Verge explains that President Joe Biden has made it clear that protecting children on the web is of the utmost importance to his administration, and proponents of the legislation say the bills address the impact of social media and other online platforms on kids’ mental health. The Record notes that Committee Chair Senator Maria Cantwell applauded the bills’ approval, saying that minors can be “overwhelmed with the complexities of online content that is manipulated and targeted at them.”
Jim Steyer, founder and CEO of Common Sense Media, said, “KOSA and COPPA 2.0 would finally hold social media platforms accountable for how they impact young users and contribute to a worsening youth mental health crisis. Today's bipartisan committee vote is a big step forward for the mental health and overall well-being of kids and their families.” However, some critics say the bills could do more harm than good, since they would require such platforms to collect a copious amount of user data, possibly violating user privacy. As the Washington Post discusses, this parallels objections privacy advocates have raised in debates over encryption.
Aliya Bhatia, policy analyst for the Center for Democracy and Technology’s Free Expression Project, stated, “KOSA requires online services to limit by default minors’ ability to communicate with other users and to enable a parent or caregiver account to manage their child’s privacy and account settings. These sorts of settings would not be appropriate to apply to adult users’ accounts, as they would limit key functionality for adult users and put adults’ privacy and safety at risk by giving another user the ability to control their communications.” Some tech industry groups are also opposed to the bills’ reliance on collection of user data. NetChoice’s vice president and general counsel Carl Szabo stated, “If passed, KOSA and COPPA 2.0 will create massive privacy and security problems for American families. When it comes to determining the best way to help kids and teens use the internet, parents and guardians should be making those decisions, not the government.”
A number of other industry experts have expressed animadversions about the measures. Chris Hauk, consumer privacy champion, Pixel Privacy, notes that holding more data inevitably increases risk. “While I applaud any effort to better protect kids while they are online, COPPA 2.0 and KOSA could actually lead to more private information being collected from family members. I also believe that when it comes to kids and teens' usage of the internet, parents are best qualified to make decisions, not some government agency. That said, it is also time for parents to take control of their offspring’s internet usage, instead of relying on privacy laws or net nannies to protect them.”
Paul Bischoff, privacy advocate at Comparitech, thinks it won't be easy to determine, with any consensus, what content is intended for minors (and what content is appropriate for minors). “COPPA was mostly ineffective at protecting children online until recently. Now more lawsuits like those against Epic and Google show the law can be effective if actually enforced," Bischoff wrote. "COPPA 2.0 expands COPPA's protections from kids up to 13 years old to those up to 16 years old. Overall I think that's a good thing, but it does muddy the waters when it comes to determining what content is intended for kids and what's not. A 16-year-old engages with a lot of content that is not directed at children, so where do you draw the line?" He also notes the possibility of unintended consequences. "KOSA could have unintended consequences for adult users. It forces tech companies to give parents control over their kids' privacy and account settings. That's great for parents, but it's also great for domestic abusers who want control over their victims. Usually these systems work by the child allowing the parent to scan a QR code on their account that enables parental control. This could easily be leveraged in abusive adult relationships as well. One might argue, 'why don't you just verify that the user is a child before allowing parental control?' That brings us to another sticking point in both of these bills. How do you verify someone's age and still maintain their privacy? Many kids don't have a state-issued ID. Face recognition is not reliable. Both of those solutions, as well as every other solution I've seen, involve uploading sensitive private information, such as a face or ID scan, to a privately owned company. Why should users, or the government for that matter, trust these companies to manage age verification properly and protect collected data from abuse and data breaches? There is also the concern that LGBTQIA+ content could be censored for kids if categorized as 'adult' content.”
US lawmaker says Microsoft should be held accountable for recent data breach.
Ars Technica reports that US Senator Ron Wyden has penned a letter urging the Justice Department, along with the Federal Trade Commission and the Cybersecurity and Infrastructure Security Agency (CISA), to hold Microsoft responsible for what he calls “negligent cybersecurity practices.” Specifically, he’s referring to a Beijing-linked cyberattack that resulted in the theft of hundreds of thousands of emails from cloud customers, including top US officials. A Democrat from Oregon, Wyden wrote, “Government emails were stolen because Microsoft committed another error. Holding Microsoft responsible for its negligence will require a whole-of-government effort.”
As SecurityWeek notes, when Microsoft first acknowledged the hack, the company said Outlook.com and Exchange Online were the only applications impacted, but new data indicates the hackers accessed data beyond these platforms. The Wall Street Journal explains that full details of the incident aren’t known, but the breach came to light earlier this month when it was reported that hackers had accessed the email account of the US ambassador to China, and critics say Microsoft has not been forthcoming with information about the incident. Reuters reports that Wyden has asked CISA to have the Cyber Safety Review Board launch a probe into the incident, and the FTC to determine whether the incident violated a 20-year-long consent decree. As the Record explains, Wyden noted that this is not the first time Microsoft has mishandled a cyber incident, claiming the company never took full responsibility for its part in the SolarWinds hacks. “It blamed federal agencies for not pushing it to prioritize defending against the encryption key theft technique used by Russia, which Microsoft had known about since 2017,” Wyden said. “It blamed its customers for using the default logging settings chosen by Microsoft, and then blamed them for not storing the high-value encryption keys in a hardware vault.”