“Dystopian” and “‘Black Mirror’-esque” are among the ways critics have described Clearview, a facial recognition technology startup founded in 2016. The program’s ability to scrape photos off the web and instantly aggregate information on just about anyone with an online presence, without their knowledge, has drawn the ire of privacy advocates, Democratic lawmakers and the same social media companies it relies on for data.

The system has been used by more than 600 law enforcement agencies in the United States and abroad — including, as newly obtained records show, the Allegheny County District Attorney’s Office. 

Emails obtained by PublicSource through an open records request show that Clearview trial accounts were linked to the email addresses of four employees in the office of District Attorney Stephen Zappala Jr.: analysts Andrew Colvin, Ted DeAngelis and Norah Xiong, and detective Lyle Graber. The trials started at different times; the emails first reference a trial on Feb. 7 and last note a login on March 17. Three of the accounts were signed into more than once. The emails do not detail how the system was used.

But they do show some of the interactions between the county employees and Clearview representatives. According to the emails, Clearview invited the four employees with trial access, along with one additional employee, Matt Baumgard, to a Feb. 28 introductory Zoom briefing on the technology.

An email from a Clearview representative to Colvin encouraged him to use the technology extensively, claiming that the more searches an investigator does, the more cases they solve. “Don’t stop at one search. Or ten,” the email reads. “Try to reach 100 searches with Clearview.”

(Excerpts of emails provided by the Allegheny County District Attorney’s Office. Email addresses have been redacted by PublicSource.)

In an email response to PublicSource’s records request, Assistant District Attorney Kevin McCarthy said the DA’s office “has never purchased or contracted with any entity to obtain facial recognition software” other than the Clearview trials shown in the emails. There is no record of a relationship with Clearview after the point when the last 30-day trial would likely have ended.

In a series of email responses to PublicSource on July 14, DA spokesperson Mike Manko wrote that the DA’s office does not use any type of facial recognition technology. “…If a school district or municipality were to approach District Attorney Zappala believing that that type of technology would be productive for their needs, then the District Attorney would assist them in obtaining that technology,” he said. 

Manko did not address the previous trial accounts, despite several inquiries from PublicSource. Unanswered questions include how the Clearview trial was used, how many searches were made, whether the system was used in relation to any past or present cases, whether any ethical guidelines were issued and whether the DA’s office intends to use facial recognition technology in the future.

Clearview did not respond to multiple requests for comment. 

In an email dated Feb. 7, Xiong from the DA’s office wrote to colleagues that “the premise of [Clearview] sounds like it’d be very helpful for our OSINT work.”

According to Freddy Martinez, a policy analyst at the transparency advocacy organization Open the Government, OSINT stands for “open source intelligence” and is “a growing staple of policing work, with a lot of officers using Facebook/Twitter/Youtube/Snapchat/TikTok to search for suspects.” He noted that OSINT is cheap, requiring minimal setup to learn about suspects’ lives.

(Excerpts of emails provided by the Allegheny County District Attorney’s Office. Email addresses have been redacted by PublicSource.)

Clearview has been used elsewhere in Pennsylvania. The Philadelphia Police Department used a trial version in November 2019, running roughly 1,000 searches, the Philadelphia Inquirer reported in March. According to Buzzfeed, police in Wyomissing had an account, as did Central Montco Technical High School in suburban Philadelphia, though, at most, five searches had been made from the school’s account as of February.

Company founder Hoan Ton-That has described the technology as a “search engine for faces.” By scraping the open web, Clearview has compiled an ever-growing database of billions of photos. Users can upload a person’s photo and, within seconds, see all publicly available photos of the person and the sources of the photos, such as social media accounts or news sites. The New York City-based company markets itself as a tool for law enforcement and says its system has helped investigators track down “hundreds of at-large criminals,” including pedophiles, terrorists and sex traffickers. “It’s a very powerful thing, and used responsibly… it can have so much benefit to the world,” Ton-That said in a March CNN Business interview.   

Martinez warned that the company’s system has serious privacy implications. 

“They claim that it’s a facial recognition software app, but really what they’re doing is building really intimate profiles of people online,” said Martinez, who has researched Clearview and aided in a January New York Times investigation of the company. A person’s LinkedIn page might reveal their place of work; their Facebook page might list their family members or the neighborhood groups they belong to. “You can imagine that that kind of information, in aggregate, over time, is really revealing,” Martinez said. 

In 2019, San Francisco became the first major U.S. city to ban police from using facial recognition technology, followed by Boston in June. New Jersey ordered police to stop using Clearview in January, following the New York Times investigation. 

The use of facial recognition technology by law enforcement is controversial, and ongoing protests over the deaths of George Floyd, Breonna Taylor and other Black people at the hands of police have elevated already existing concerns from activists and privacy advocates. During protests against the death of Freddie Gray in 2015, Baltimore police used facial recognition technology to identify and arrest protesters with outstanding warrants. 

The current wave of protests has caused some major facial recognition companies to rethink their practices. In June, IBM announced it would stop selling facial recognition technology to law enforcement. Amazon followed suit, announcing it would temporarily ban police use of its facial recognition technology. 

There are no records indicating that the DA’s office used Clearview during the Black Lives Matter protests. The 30-day trials would have ended before the recent protests began, and Manko said the system is not currently being used. 

Facial recognition technology has been useful in solving some crimes, and more than half of U.S. adults at least somewhat trust law enforcement to use facial recognition technology responsibly, a Business Insider poll found. Yet Clearview specifically has generated significant pushback, including a lawsuit from the American Civil Liberties Union and investigations in Canada, the United Kingdom and Australia. Facebook, Instagram, Twitter, Google, YouTube and Venmo have demanded Clearview stop using photos from their sites, a demand the company has not agreed to. The company’s system was also featured in a critical segment by comedian John Oliver in June.

Michael Skirpan, the executive director of Community Forge in Wilkinsburg and member of the Pittsburgh Task Force on Public Algorithms, voiced concern about how Clearview uses information outside of what he described as its intended context. 

“Clearview is sort of a very ‘Black Mirror’-esque breaking of trust, in the fact that our data, once we put it on one platform, in aggregate it’s just sort of out there,” without the privacy or consent that was originally intended, he said. He compared it to Facebook’s Cambridge Analytica scandal, in which Cambridge Analytica used Facebook user data to sell profiles of American voters to political campaigns. “No one is really protecting or stewarding your information the way you think,” he said.

Previous facial recognition technologies have been found to have significant racial and gender biases in their algorithms, meaning they’re less accurate at correctly identifying faces of people of color, especially Black people, and women. Clearview has said that, according to an independent review panel, its algorithm achieved 100% accuracy and did not show any racial or gender bias. Yet some researchers and advocates find the claims dubious. 

“…From what we’ve seen, I wouldn’t even say it’s misleading. It’s just intentionally claiming things that have no basis in reality,” Martinez said.

Bonnie Fan, a student organizer at Carnegie Mellon University and member of the Coalition Against Predictive Policing in Pittsburgh, pointed out that facial recognition technology has led to wrongful arrests before. “If you have a technology that does a poor job of recognizing racial and gender minorities, you’re much more likely to target the wrong people… and that’s on top of the current racial inequitable structures,” Fan said.

In an email to PublicSource, Pittsburgh-based cybersecurity professional and Women in Technology Pittsburgh founder Alison Falk noted the implications Clearview could have if law enforcement officers used it for personal, non-policing reasons. “Privacy scholars and researchers have been extremely vocal about the abuses that can occur when utilizing facial recognition, and to disregard experts in these instances is straight ignorance at the expense of the citizens they aspire to ‘protect and serve,’” she wrote.

As both a tech professional and community advocate, Black Tech Nation founder Kelauni Cook said she is often torn about the advancement of technology. “[W]hile I do think facial recognition technology could have some very amazing effects on our world, I think that bad actors and people who are irresponsible with it will only exacerbate the effects that we’re already seeing, especially with policing,” she said, speaking generally about facial recognition technology.  

She noted that the people who use such technology are often not part of the communities most affected by it. “I don’t think they’re consciously saying, ‘Oh, and I hope that it racially profiles. I hope that it helps us be more racist,’” she said. “But again, this is why it’s so important that certain people are at the table when decisions are being made.”

Juliette Rihl is a reporter for PublicSource. She can be reached at juliette@publicsource.org or on Twitter at @julietterihl.

This story was fact-checked by Emma Folts.
