Britain’s data protection authority, the Information Commissioner’s Office (ICO), plans to fine Clearview AI just over £17 million (about €20 million). In addition, Information Commissioner Elizabeth Denham has asked the US company, which specializes in automated facial recognition, to stop further processing and to delete the personal data of UK residents.
Denham accuses Clearview of committing “serious breaches of UK data protection laws”, which are largely based on the General Data Protection Regulation (GDPR). She cites a joint investigation by the ICO and the Australian data protection authority, which focused on Clearview’s harvesting of images and data from the internet and their use for facial recognition. Through the company’s app, submitted images are matched against a database of more than 10 billion photos.
Those affected were not reached
There is a strong likelihood that these images included data from a large number of people in Great Britain, collected without their knowledge from publicly available sources online such as social networks, the authority stated. The ICO is also aware that the recognition service offered by Clearview AI was used by a number of UK law enforcement agencies on a free-trial basis. The service is meanwhile no longer available in Great Britain.
Specifically, the ICO accuses the company of failing to process UK citizens’ data in a way they “would most likely expect or that is fair”. The company has no routine for deleting the data. Furthermore, no lawful basis for collecting this sensitive biometric information could be identified. Those affected were also not informed about what was happening with their data. Clearview has even requested additional personal information from citizens who wanted to object to the practice.
The company now has the opportunity to respond to the alleged violations. Denham intends to reach a final decision in mid-2022. The civil rights organization Privacy International, which had filed a complaint with the ICO, welcomed the initiative. It speaks of “a clear message for companies whose toxic business model is based on exploiting the moments we and our loved ones put online.”
Previously, Johannes Caspar, the former Hamburg data protection officer, had demanded to know from Clearview what data processing model the service was based on. After some back and forth, the supervisory authority ordered the company to delete the complainant’s hash value and biometric profile. Caspar argued that all European supervisory authorities were called upon to take further steps.