At a glance.
- Australian privacy commissioner says speed is key.
- The SEC’s new cyber disclosure rules go into effect.
- US prosecutors call for laws to combat AI child sexual abuse images.
Australian privacy commissioner says speed is key.
Angelene Falk, head of the Office of the Australian Information Commissioner (OAIC) and Privacy Commissioner, is urging organizations to notify individuals affected by data breaches more promptly. “Prompt notification ensures individuals are informed and can take further steps to protect themselves, such as being more alert to scams,” Falk stated. “The longer organizations delay notification, the more the chance of harm increases.” The OAIC says 74% of breached organizations reported the incidents within thirty days – not bad, but Falk says there’s room for improvement. She noted that one organization’s process took more than five months because it ran its assessment before beginning an investigation. As Bank Info Security explains, Falk recommends that organizations conduct assessments and investigations simultaneously to be more efficient. The OAIC stated, “Conclusive or positive evidence of unauthorized access, disclosure or loss is not required for an entity to assess that an eligible data breach has occurred. An eligible data breach can occur based on unauthorized access alone and individuals' data can be stolen by less traceable means, such as screenshots.”
The SEC’s new cyber disclosure rules go into effect.
The US Securities and Exchange Commission’s new cyber incident rules for businesses went into effect yesterday, and SC Media offers an overview of what companies can expect. The new rules require breached companies to report an incident to the SEC within four business days of determining that the incident is “material” in nature. While this portion of the rules won’t go into effect until December, experts say companies should begin preparing for the change now. Harley Geiger, a cybersecurity policy lawyer at Venable, states, “Because of the required timeline for disclosure, companies should be prepared to perform these assessments and disclosures even if the cybersecurity incident is ongoing. Public companies' security, legal, and corporate communication teams should collaborate to adjust cyber incident response plans and financial reporting processes to accommodate these obligations.”
There’s also a bit of uncertainty surrounding what constitutes “materiality” in the SEC’s eyes. Guidance from the agency says a breach is considered material when “it is probable that the judgment of a reasonable person relying upon the report would have been changed or influenced by the inclusion or correction of the item.” However, some cyber experts say this definition leaves much to interpretation, especially when time is of the essence. George Gerchow, a faculty member of IANS Research and chief security officer and senior vice president of IT at Sumo Logic, explains, “We are trying to understand what a ‘material incident’ means, but it’s still too ambiguous. Furthermore, there is very little guidance on how companies should handle third-party attacks. Supply chain attacks are on the rise and add another layer of complexity to reporting the full nature and scope of an incident. So, how are companies going to pull in third parties and their team to handle an incident within such a short timeframe?”
US prosecutors call for laws to combat AI child sexual abuse images.
On Tuesday, the top prosecutors in all fifty US states submitted a letter to the leaders of the House and Senate urging them to pursue legislation to protect against the use of artificial intelligence to produce child sexual abuse materials. As the AP explains, the letter calls on Democratic and Republican lawmakers to create a commission of experts focused on investigating how AI technology can be used to generate child sexual abuse images, and then to use that research to expand existing legislation. The prosecutors wrote, “We are engaged in a race against time to protect the children of our country from the dangers of AI. Indeed, the proverbial walls of the city have already been breached. Now is the time to act.”
South Carolina Attorney General Alan Wilson, a Republican and leader of the effort, said this is a bipartisan initiative. Wilson stated, “My hope would be that, no matter how extreme or polar opposites the parties and the people on the spectrum can be, you would think protecting kids from new, innovative, and exploitative technologies would be something that even the most diametrically opposite individuals can agree on — and it appears that they have.” Earlier this year, the Senate held hearings to discuss the potential threats of AI, but the US has yet to pass AI regulations like those advanced by lawmakers in the EU.