At a glance.
- AI regulation focuses on potential data privacy violations.
- State Department official calls for cyber assistance fund.
AI regulation focuses on potential data privacy violations.
The debate over the potential threats posed by artificial intelligence continues, and the powerful chatbot ChatGPT is at the forefront of deliberations. Last month Italian officials announced a temporary ban of the platform amid allegations that it violates the EU’s General Data Protection Regulation, and Politico posits that the move is just the first pothole in the bumpy regulatory road ahead in the EU. AI-powered platforms like ChatGPT sweep the public web to compile the massive datasets needed to train their models, and much of that data is used without the creators’ consent. And of course there’s also the potential for AI to be used to spread misinformation (potentially far worse than posing the Pope as a fashion icon) or to conduct fraud and other cybercrimes.

Because ChatGPT parent company OpenAI has no headquarters in an EU member state, any member country has the authority to follow in Italy’s footsteps and launch its own investigation or implement a ban. Data protection authorities in Belgium and Ireland have stated they’re looking into ChatGPT’s potential violations, and France’s data protection authority CNIL has received at least two complaints about the platform’s alleged privacy abuses. What’s more, advocacy groups and AI experts alike are warning about the risks AI presents to humanity: a group of tech moguls led by Elon Musk has called for a pause in AI development, and the US’s Center for AI and Digital Policy has asked the Federal Trade Commission to block further releases of ChatGPT. Gabriela Zanfir-Fortuna of think tank the Future of Privacy Forum stated, “Data protection regulators are slowly realizing that they are AI regulators.”
Meanwhile, the companies behind the AI, as well as their financial backers, remain evasive about the tech that underpins their operations. OpenAI is so secretive about how exactly ChatGPT works that even Microsoft, the company’s chief investor, has admitted it does not understand exactly how the chatbot acquires its datasets. Michael Wooldridge, a professor of computer science at the University of Oxford, told the Guardian that the large language models, or LLMs, that support AI tech hinge on vacuuming up massive amounts of data that inevitably include personal info: “This includes the whole of the world wide web – everything… In that unimaginable amount of data there is probably a lot of data about you and me. And it isn’t stored in a big database somewhere – we can’t look to see exactly what information it has on me. It is all buried away in enormous, opaque neural networks.”

There are also legal issues that arise when copyrighted material is used to fuel AI. Stock photo giant Getty Images is suing Stability AI, the British company behind the AI image generator Stable Diffusion, alleging that the firm violated copyright by using Getty’s photos to train the system, and in the US a group of artists is filing similar lawsuits. Wooldridge warns, “Many artists are gravely concerned that their livelihoods are at risk from generative AI. Expect to see legal battles.”
State Department official calls for cyber assistance fund.
The US State Department is asking for special, flexible funding to support American allies that suffer cybersecurity emergencies. During an Atlantic Council panel last week, Nate Fick, the department’s cybersecurity ambassador, explained, “First, we’re making a push for a dedicated cyber assistance fund. We did it after 9/11 for counterterrorism, we should do it now. We don’t have the mechanisms in place for rapid, dedicated response. That would help a lot, and I think there’s support for it on the Hill.”

Fick says the fund would be one component of a three-pronged approach to bolstering global capacity building. The second prong would use online tools to complement in-person delivery of capacity-building resources, and the third would integrate more support from the private sector. Fick explained that the war in Ukraine taught policymakers that “there’s a large role for the private sector here, where we can play a brokering and introduction kind of role, but they’re not government dollars being used, and we can bring a lot of private sector capacity to bear quickly.”

Breaking Defense notes that Fick’s plan emphasizes speed, allowing for a quicker response to cybersecurity crises. Attacks often arrive without warning, and since any delay in response can compound the damage to a penetrated system, it’s essential that help be offered as soon as possible. Marshall Miller, principal associate deputy attorney general at the Department of Justice, commented, “When people experience cyber attacks, whether it’s companies, whether it’s individuals, whether it’s nation-states, they’re at their most vulnerable. When we as a government can help those folks at that moment, that’s an incredible relationship-building opportunity.”