The New York City Police Department has been using controversial facial recognition software to track people without their knowledge or permission, according to emails released by the nonprofit news site MuckRock. Emails between the department and Clearview AI show a two-year relationship in which the company was paid thousands of dollars for its services, including assistance with fugitive-tracking investigations.
The documents, obtained by MuckRock and a journalist through a freedom of information request, reveal an alarmingly friendly relationship between Clearview and the NYPD, even as officials warned of the technology’s potential to snoop on people. “Facial recognition technology has the capacity to be not a tool for law enforcement, but a threat to it,” says Public Advocate Jumaane Williams, who urged the department to stop using the technology.
In a statement to BuzzFeed, Clearview CEO Hoan Ton-That said the data had been scraped from Facebook, YouTube and other social networks and was used for “security purposes” only. But an investigation by cybersecurity firm SpiderSilk revealed that the platform’s source code sat in a misconfigured repository that let anyone register an account and access the system.
This is just the latest in a series of international privacy violations and sanctions involving Clearview. The company was recently fined €20 million by the Athens-based Hellenic Data Protection Authority for breaching European rules and was ordered to delete the user data it had collected. In addition, last month the U.K.’s data protection watchdog announced a joint investigation with Australia’s privacy watchdog into how Clearview handles personal information.
France’s CNIL, meanwhile, issued a privacy order against the U.S. firm last week for violating the GDPR and breaching multiple user rights. It ordered Clearview to stop collecting data on French citizens and to erase the images of those citizens already in its database.
CNIL’s decision comes on the heels of a March ruling by Italy’s data protection agency that found Clearview liable for GDPR violations. The Italian authority also found that Clearview had broken several other data protection rules by failing to inform people that it was using their photos, and that it had violated purpose-limitation rules by processing photos for purposes other than those permitted under the GDPR.
The CNIL’s finding adds to that string of breaches: this year the firm has been found in violation of data rules in Canada, Australia and the U.K., and it currently faces a class-action lawsuit in Illinois over its compliance with the state’s biometric protection laws.
Aside from these international rulings, Clearview has also faced a series of legal challenges at home in the U.S., mainly over its use of facial recognition on publicly posted photos. Earlier this year the company agreed to settle a class-action suit filed by the ACLU in Illinois, which alleged that it had violated the state’s Biometric Information Privacy Act, or BIPA.