At a glance.
- Warwick University hacked.
- Chegg discloses data breach.
- Privacy challenges of decentralized exposure notification.
- Privacy concerns surrounding centralized contact tracing, including the possibility of mission creep.
- Ransomware, data theft, and persistence.
Warwick University was hacked (and was slow to notify those affected).
According to Sky News, Britain's Warwick University was hacked last year. The university did not, according to reports, notify affected individuals until March of this year. An Information Commissioner's Office audit, to which the university voluntarily submitted last month, found significant deficiencies in Warwick's information governance, SC Magazine writes. Sky News says those deficiencies left the university unable to determine what information had been exposed in the attack.
Chegg discloses data breach.
California-based education tech company Chegg has disclosed its third data breach in as many years, TechCrunch reports. Tripwire says the lost data include names and Social Security numbers of some 700 employees.
Decentralized contact tracing (or exposure notification).
The New York Times has a rundown of the concerns, many of them related to privacy, that even decentralized exposure notification systems have raised. Apple and Google have released the first, “developer-focused” version of their jointly developed exposure notification API, TechCrunch reports. “Exposure notification” has replaced “contact tracing,” and that’s probably a more accurate description given the system’s decentralized design. The beta version allows developers to tailor alerts to specific exposure criteria, including proximity and duration, and it allows users to toggle their alerts on or off. Users may also opt in to sharing a COVID-19 diagnosis anonymously.
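Those criteria are, in effect, inputs to a risk decision made on the device. As a rough illustration only (the names, thresholds, and structure below are hypothetical, not Apple's or Google's actual API), an app built on such an API might translate Bluetooth signal attenuation (a proxy for proximity) and exposure duration into an alert decision along these lines:

```python
from dataclasses import dataclass

@dataclass
class ExposureEvent:
    """One detected encounter, as an exposure-notification framework might report it."""
    attenuation_db: int      # Bluetooth signal attenuation; lower roughly means closer
    duration_minutes: int    # how long the two devices stayed in range

def should_alert(event: ExposureEvent,
                 max_attenuation_db: int = 55,
                 min_duration_minutes: int = 15) -> bool:
    """Hypothetical policy: alert only on close, sustained contact.

    Health authorities would tune thresholds like these; the numbers here
    are illustrative defaults, not values from the real API.
    """
    close_enough = event.attenuation_db <= max_attenuation_db
    long_enough = event.duration_minutes >= min_duration_minutes
    return close_enough and long_enough

# A brief, distant encounter does not trigger an alert; a long, close one does.
print(should_alert(ExposureEvent(attenuation_db=70, duration_minutes=5)))   # False
print(should_alert(ExposureEvent(attenuation_db=50, duration_minutes=20)))  # True
```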
The Electronic Frontier Foundation (EFF) has expressed concerns, Threatpost says, that the exposure notification system suffers from a security vulnerability. There’s no reliable way, the EFF warns, of ensuring that the devices sending proximity warnings are in fact the devices they’re supposed to be, so trolling can’t effectively be ruled out. “A well-resourced adversary could collect RPIDs [rolling proximity identifiers] from many different places at once by setting up static Bluetooth beacons in public places, or by convincing thousands of users to install an app. The tracker will receive a firehose of RPIDs at different times and places. With just the RPIDs, the tracker has no way of linking its observations together.”
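The mechanism the EFF is describing rests on how rolling proximity identifiers are derived: a device broadcasts frequently rotating pseudonyms generated from a key that stays on the phone unless the user opts to share a diagnosis. A minimal sketch of that idea, assuming HMAC-SHA256 as a simplified stand-in for the protocol's actual key-derivation and encryption steps (the real design uses different primitives), might look like this:

```python
import hashlib
import hmac
import os

def rolling_identifier(daily_key: bytes, interval: int) -> bytes:
    """Derive the pseudonymous identifier broadcast during one short time interval.

    Simplified stand-in for the real derivation; the property that matters is
    that identifiers from different intervals look unrelated to an observer
    who does not hold the daily key.
    """
    return hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# The daily key never leaves the phone unless the user chooses to share a diagnosis.
daily_key = os.urandom(16)

# What a static Bluetooth beacon would overhear from one device over a day:
# a stream of rotating identifiers, unlinkable on their own.
overheard = [rolling_identifier(daily_key, i) for i in range(144)]

# Only someone holding the daily key can recompute and match the identifiers,
# which is why the collected "firehose" of RPIDs is unlinkable by itself.
recomputed = {rolling_identifier(daily_key, i) for i in range(144)}
print(all(rpid in recomputed for rpid in overheard))  # True
```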
Other problems with false positives don’t require bad actors’ involvement. To take some of the examples the EFF considers—two cars with windows rolled up passing side-by-side in traffic, a patient near a nurse in full protective gear, and two people kissing—all these look about the same to Bluetooth.
The EFF is also concerned that data collected by the apps could eventually be de-anonymized. One scenario the group paints combines increasingly common networked security cameras, facial recognition software, and the Bluetooth exposure notification system to locate individuals and associate them with sensitive data.
Centralized contact tracing: efficacy and privacy, and long-term concerns.
As the UK’s National Health Service proceeds with plans for a centralized contact tracing system, the Government Communications Headquarters (GCHQ) will receive such access to the NHS system as it requires to ensure the system’s integrity and security: “During the emergency, the network and information systems held by or on behalf of the NHS in England or those bodies which provision public health services in England must be protected to ensure those systems continue to function to support the provision of services intended to address coronavirus and Covid-19.” Computing and others quote GCHQ as saying that it has no interest in acquiring personal health data, and that the agency’s interest is solely the security of NHS systems.
ZDNet reports that more than 170 privacy and information security researchers in the UK have signed an open letter about NHSX’s development of a centralized COVID-19 contact tracing system. The signatories “urge that the health benefits of a digital solution be analysed in depth by specialists from all relevant academic disciplines, and sufficiently proven to be of value to justify the dangers involved.”
They raise, roughly speaking, three questions. First, they wish for some reasonable assurance that any contact tracing system would actually work as intended and help to control the pandemic. Second, while politely expressing their appreciation for the NHS’s commitment to transparency, they ask for assurances that anonymized data won’t be de-anonymized to associate individuals with the information being collected. And, third, they’re concerned that the system might be adapted to other purposes and retained even after it has served its purpose and the UK has emerged from the pandemic: “Finally, we are asking NHSX how it plans to phase out the application after the pandemic has passed to prevent mission creep.”
Ransomware, data theft, and persistence.
A report this week from Microsoft's Threat Protection Intelligence Team concludes that it’s not just the ransomware gangs who openly threaten data theft who are actually stealing information: even the criminals who make no such threat are doing so anyway. The report also concludes that ransomware attackers don't necessarily leave a victim's network even after the victim has paid. Instead, they'll maintain persistence as long as possible, the better to position themselves for subsequent attacks.