Nowhere To Hide
Your Face Belongs to Us

New York Times tech reporter Kashmir Hill explains why governments and individuals are fighting against facial recognition technology to protect personal privacy. The Clearview company is a case in point.

New York Times tech reporter Kashmir Hill has crafted a substantial explanation of how a wrong or wrongheaded decision by a facial recognition technology company could decimate personal privacy. She uses the saga of the Clearview company to ground her case about the dangers of this invasive technology.

Faces

In the late 2000s, software programmer Hoan Ton-That made his first big impression by creating quiz applications that more than six million Facebook users liked well enough to install.

In 2016, Ton-That, who had moved from California to New York after feeling his career had stalled, attended that year’s Republican National Convention with Charles Carlisle Johnson, founder of the GotNews website. Twitter had banned Johnson in 2015 for seeking donations to “take out” Black Lives Matter activist DeRay Mckesson. Johnson introduced Ton-That to conservative venture capitalist Peter Thiel and to Richard Schwartz, a former aide to Rudy Giuliani.

By 2017, government authorities and law enforcement had been getting pitches for subpar facial recognition technology for decades.
Kashmir Hill

The trio of Ton-That, Johnson, and Schwartz considered the commercial possibilities of identifying people solely from photographs of their faces, asking in particular whether analysts could use those faces to predict people’s group behavior and marketing preferences. However, as Hill reports, facial recognition technology had developed a long history before the three came together.

Early Facial Recognition

In 1960, mathematician Woody Bledsoe started a Palo Alto, California, technology firm called Panoramic. In 1965, its “government patron,” the Central Intelligence Agency, funded Panoramic’s development of “a manmade machine system for facial recognition.”

Engineer Matthew Turk became curious about facial recognition technology in 1986 when he was working for Martin Marietta. The defense contractor was developing a self-driving vehicle – the Autonomous Land Vehicle, nicknamed “Alvin” – for its client, the federal Defense Advanced Research Projects Agency (DARPA). Martin Marietta equipped it with a form of computer vision, but shadows and mud distorted the results, so the vehicle’s tests were disappointing.

Seeking to improve computers’ “sight,” Turk enrolled at MIT in 1987. He produced a composite face from images of 115 students and stored each facial image along with a calculation of its mathematical variation from that composite. Turk then photographed 16 volunteers and derived a set of component faces that, combined in the right proportions, could represent each volunteer. He called these components “eigenfaces,” after the eigenvectors used to compute them (“eigen” is German for “own” or “characteristic”). His work proved that computers could sift data to “see” people.
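To make the idea concrete, here is a minimal, illustrative sketch of the eigenfaces technique in Python with NumPy. It is not Turk’s original code: it uses randomly generated stand-in images rather than real photographs, and the image size and number of components are arbitrary assumptions.

# Illustrative sketch of the eigenfaces idea (not Turk's original code).
# Faces are same-size grayscale images flattened into vectors; the data here is
# random stand-in material so the example runs without any real photographs.
import numpy as np

rng = np.random.default_rng(0)
faces = rng.random((16, 64 * 64))          # 16 flattened 64x64 "face" images

mean_face = faces.mean(axis=0)             # the composite ("average") face
centered = faces - mean_face               # each face's variation from the composite

# The principal components of that variation are the eigenfaces; a singular value
# decomposition of the centered data yields them as the rows of vt.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:8]                        # keep the 8 strongest components

# Each face is approximately the composite plus a weighted combination of
# eigenfaces; the weight vector serves as that face's compact signature.
weights = centered @ eigenfaces.T
reconstruction = mean_face + weights @ eigenfaces
print("mean reconstruction error:", float(np.abs(reconstruction - faces).mean()))

# Recognition: compare weight vectors; the closest stored signature is the match.
query = weights[0]
distances = np.linalg.norm(weights - query, axis=1)
print("best match index:", int(np.argmin(distances)))   # 0, the query itself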

Even a highly accurate algorithm, deployed in a society with inequalities and structural racism, will result in racist outcomes.
Kashmir Hill

Move ahead to January 2017, when Ton-That, Johnson, and Schwartz became co-founders and equal owners of SmartCheckr LLC. Their company’s facial recognition technology vetted the attendees at the celebratory “DeploraBall” held on the eve of Donald Trump’s inauguration.

When SmartCheckr made a presentation to Hungarian government officials in hopes that they would buy its facial recognition tech, it took credit for keeping the DeploraBall secure. SmartCheckr also “fine-tuned” its technology to identify the people Hungary’s Prime Minister Viktor Orbán, an authoritarian populist, considered his enemies.

After Peter Thiel invested $200,000 in SmartCheckr in July 2017, Schwartz and Ton-That registered the company and listed themselves as its sole shareholders, omitting Johnson. They changed the firm’s name to Clearview AI. Eventually, Johnson signed a nondisclosure and non-disparagement agreement in exchange for 10% of Clearview.

By the end of 2018, Clearview had amassed a library of one billion photos of faces, images that it had found online. That year, Hill reports, venture capitalist David Scalzo discovered that Clearview AI’s computer vision worked well even with badly lit photos, pictures of people wearing hats or eyeglasses, and images of men who had grown beards after their database photos were taken. It could even distinguish between sisters. Impressed, Scalzo invested $200,000 in Clearview, despite the potential legal liability the company could face for lifting millions of images from Facebook, LinkedIn, and other platforms without consent.
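Hill does not detail Clearview’s internals, and the following is not the company’s actual pipeline; it is only a generic sketch of how matching a query face against a large photo library is commonly done: each face is converted into a fixed-length numeric embedding, and the query is matched to the stored embedding it most resembles. The embeddings below are random stand-ins, and the sizes are arbitrary assumptions.

# Generic sketch of face matching by embedding similarity (not Clearview's code).
# A real system would use a trained model to turn photos into embeddings; here the
# "library" is random data so the example runs without images or a model.
import numpy as np

rng = np.random.default_rng(1)
library = rng.normal(size=(1000, 128))                      # embeddings for 1,000 stored faces
library /= np.linalg.norm(library, axis=1, keepdims=True)   # normalize for cosine similarity

# A new photo of person 42 (different lighting, hat, beard) lands near the stored embedding.
query = library[42] + rng.normal(scale=0.05, size=128)
query /= np.linalg.norm(query)

scores = library @ query                                    # cosine similarity to every stored face
best = int(np.argmax(scores))
print(f"best match: index {best}, similarity {scores[best]:.2f}")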

USA PATRIOT Act

Congress passed the USA Patriot Act after the September 11, 2001, terrorist attacks. Hill explains that the law required the government to determine the reliability of any technology, including facial recognition, that tracks people based on their biometric data.

In March 2011, CNN reported that Google was developing its own facial recognition app. However, Google chairman Eric Schmidt told journalists that the company would not release the app because of its dangerous potential in the hands of a wrongdoer, such as an “evil dictator.”

That July, Google acquired Pittsburgh Pattern Recognition, or “PittPatt,” a company that had developed a facial recognition product. That December, the Federal Trade Commission sponsored the Face Facts conference to discuss ways to prevent facial recognition from becoming a threat to society. Those who attended, including representatives from Facebook and Google, agreed that companies should not distribute software that can identify faces.

In Hill’s estimation, everyone who contributed to the advance of facial recognition technology seemed to assume that someone else would come along to protect society from its potential misuse or excesses.

30 Billion Faces

In 2019, the Indiana State Police paid Clearview a fee of $49,500 for a year’s subscription to its facial recognition technology, thus becoming the first law enforcement agency to sign up. Clearview offered free software trials to law enforcement agencies in Australia, Brazil, Sweden, Canada, the United Kingdom, and elsewhere, while its website carried “Success Stories” highlighting how it had helped police identify suspects.

The significance of what Clearview had done was not a scientific breakthrough, it was ethical arbitrage.
Kashmir Hill

New Jersey’s attorney general banned the state’s police from using Clearview. Privacy protection agencies in the United Kingdom, Canada, Australia, France, Italy, and Greece investigated the company. Six countries outlawed its use as Clearview amassed $70 million in fines.

When Clearview settled an ACLU lawsuit in May 2022, the company agreed not to sell facial recognition technology to private firms and individuals. However, the settlement allowed Clearview to keep selling its facial recognition software to law enforcement agencies. Amazon, IBM, and Microsoft, by contrast, say they will not sell their facial recognition products to law enforcement agencies in the United States. Meanwhile, Clearview has expanded its database to more than 30 billion portraits.

Multilayered Understanding

Tech reporter Kashmir Hill deploys the technical, political, and social understanding that this complex, multilayered topic demands. She tracks Clearview’s creation and ascent without neglecting the classic Silicon Valley infighting many startups face. Those squabbles add compelling drama to her tale and underscore the distastefulness of Clearview’s executive team and its mission.

Hill weaves political and technological history throughout her strongly cautionary tale. Though she is crystal clear about her belief that nothing will stop the widespread private use of facial recognition software, she makes a strong case for imposing legal limits on this invasive technology.
