The British data protection authority ICO (Information Commissioner’s Office) intends to fine Clearview AI just over 17 million pounds (around 20 million euros). In addition, the data protection commissioner, Elizabeth Denham, has ordered the US company, which specializes in automated facial recognition, to stop processing the personal data of British citizens and to delete it.
Denham accuses Clearview of “serious breaches of UK data protection laws”, which are still essentially based on the General Data Protection Regulation (GDPR). She cites a joint investigation by the ICO and Australia’s data protection authority, which focused on Clearview’s scraping of images and data from the internet and their use for facial recognition. The company’s app reportedly compares uploaded images against a database of more than 10 billion photos.
Affected individuals not informed
These images are also likely to include data on a significant number of people in the UK and may have been compiled without their knowledge from publicly available online sources such as social media, according to the authority. The ICO is also aware that the recognition service offered by Clearview AI has been used by various UK law enforcement agencies on a free trial basis. In the meantime, however, the service is no longer offered in Great Britain.
Specifically, the ICO accuses the company of failing to process the data of UK citizens in a way “that they can reasonably expect or that is fair.” There is also no deletion routine, and the authority sees no legitimate reason for processing sensitive biometric data. Those affected were not informed about what happened to their data. In addition, Clearview demanded further personal information from citizens who wanted to object to the collection of their data.
“Clear Message”
The company now has the opportunity to respond to the alleged violations. Denham intends to announce her final decision in mid-2022. The civil rights organization Privacy International, which had complained to the ICO, welcomes the initiative. It speaks of a “clear message to companies whose toxic business model is based on the exploitation of moments that we and our loved ones put online.”
Previously, Hamburg’s former data protection commissioner Johannes Caspar had demanded to know from Clearview which data processing model the service is based on. After some back and forth, the supervisory authority ordered the company to delete the complainant’s hash value and biometric template. Caspar saw a need for further action by all European supervisory authorities.