Ukraine is using Clearview’s controversial AI facial recognition during war; the company already has 2 billion images from Russian social media in its database
We’ve been covering the controversial AI facial recognition tech startup Clearview for at least three years. The startup first made headlines in January 2020 after The New York Times ran a story titled: “The Secretive Company That Might End Privacy as We Know It.”
The Peter Thiel-backed Clearview uses AI to automatically scrape and collect billions of publicly available photos of faces from social media and other websites to build out its biometric database. Clearview sells access to the database, via a proprietary search engine, to law enforcement agencies and private companies.
Clearview has been used for several years by the FBI and the Department of Homeland Security to collect photos from social media sites for inclusion in a massive facial recognition database. Now, Clearview is a surveillance tool of choice for governments around the world.
Over the weekend, Ukraine’s defense ministry said it has begun to use Clearview AI’s facial recognition technology to identify Russian assailants, combat misinformation, and identify the dead. Ukraine began using the technology after CEO Hoan Ton-That offered it to the country, Reuters reported.
Ton-That told Reuters that the company had not offered the technology to Russia. As part of the offer, Ukraine received free access to Clearview AI’s powerful face search engine, letting authorities potentially vet people of interest at checkpoints, among other uses.
Ton-That also told Reuters that the startup has over 2 billion images from the Russian social media service VKontakte in its database of more than 10 billion photos total.
“The VKontakte images make Clearview’s dataset more comprehensive than that of PimEyes, a publicly available image search engine that people have used to identify individuals in war photos, Wolosky said. VKontakte did not immediately respond to a request for comment; U.S. social media company Facebook, now Meta Platforms Inc, had demanded Clearview stop taking its data.” – Reuters
Ukraine’s Ministry of Defense did not reply to Reuters’ requests for comment. Earlier, a spokesperson for Ukraine’s Ministry of Digital Transformation said it was considering offers from U.S.-based artificial intelligence companies like Clearview.
That database can help Ukraine identify the dead more easily than matching fingerprints, and it works even when a face is damaged, Ton-That said. At least one critic warns that facial recognition could misidentify people at checkpoints and in battle. A mismatch could lead to civilian deaths, just as wrongful arrests have resulted from police use of the technology, said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project in New York.
“We’re going to see well-intentioned technology backfiring and harming the very people it’s supposed to help,” he said.
Ton-That added that Clearview should never be wielded as the sole source of identification, and that he would not want the technology used in violation of the Geneva Conventions, which established legal standards for humanitarian treatment during war.
Like other users, those in Ukraine are receiving training and must enter a case number and a reason before running a search, he said.
Clearview, which primarily sells to U.S. law enforcement, is fighting lawsuits in the United States accusing it of violating privacy rights by taking images from the web. Clearview contends its data gathering is similar to how Google search works. Still, several countries including the United Kingdom and Australia have deemed its practices illegal.
Cahn described identifying the deceased as probably the least dangerous way to deploy the technology in war, but he said that “once you introduce these systems and the associated databases to a war zone, you have no control over how it will be used and misused.”
Over the last two years, Clearview has become one of the highest-profile developers of facial recognition because it sends authorities matches from an ever-growing database of over 10 billion photos that it finds posted publicly on the internet.
Last August, a government audit found that Clearview’s facial recognition tool is used by a dozen U.S. federal agencies, including the Federal Bureau of Investigation (FBI), Immigration and Customs Enforcement, and the Fish and Wildlife Service. The startup also won about $50,000 to research augmented reality glasses with facial recognition for securing Air Force base checkpoints.
“Clearview AI has a pattern of deception: the company has been publicly defending its mass surveillance by claiming it will only sell to law enforcement while privately pitching an expansion into finance, retail, and entertainment,” said Jack Poulson, executive director of tech accountability group Tech Inquiry.
Speaking at the All Things Digital Conference in 2011, former Google Chairman Eric Schmidt said Google had decided not to implement facial recognition technology because of privacy concerns. He said he thought it was something that could be used in a “very bad way as well as a very good way.”
In 2021, Clearview was hit with legal complaints over its controversial face scraping in Europe after EU privacy watchdogs announced that the company’s image-scraping methods violate European law. Privacy International (PI) and several other European privacy and digital rights organizations filed complaints in France, Austria, Greece, Italy, and the United Kingdom, arguing that the company’s automatic extraction of facial images from public websites violates European privacy laws. But the privacy complaints did little to stop investors from pouring more money into the company.
Founded in 2017 by Hoan Ton-That, an Australian techie and onetime model, Clearview describes its product as a research tool for law enforcement agencies to identify perpetrators and victims of crimes. According to the company, its technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists, and sex traffickers, and has also been used to help exonerate the innocent and identify victims of crimes such as child sex abuse and financial fraud. Clearview says its tool lets law enforcement catch the most dangerous criminals, solve the toughest cold cases, and make communities safer, especially for the most vulnerable.