The Capitol riot marks another notable moment in the ongoing facial recognition debate. 

The facial recognition app Clearview AI saw an increase in use the day after the Jan. 6 storming of the U.S. Capitol, the New York Times reported. As police departments across the country help the FBI identify rioters, some are reportedly turning to facial recognition technology. 

The use of facial recognition last year to investigate suspected crimes related to Black Lives Matter protests raised privacy and First Amendment concerns among activists, advocates and some lawmakers. Studies show the technology, which attempts to match an uploaded image of a person against other images in a photo database, is less accurate at identifying people of color and women. Facial recognition has also led to at least three Black men being wrongfully arrested.
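For context on what that matching step involves, here is a minimal, purely illustrative sketch: a face photo is reduced to a numeric "embedding," and the system searches a database for the stored embeddings closest to it. This is a generic outline, not the implementation used by Clearview AI or any agency named in this story; the open-source `face_recognition` library, the file names and the 0.6 cutoff below are assumptions chosen for illustration.

```python
# Illustrative sketch only: a generic nearest-match search over face embeddings.
# Not the system used by Clearview AI or any agency described in this article.
# Assumes the open-source `face_recognition` library (pip install face_recognition).
import face_recognition

# A small, hypothetical "database": photo file paths mapped to names.
known_people = {
    "alice.jpg": "Alice Example",
    "bob.jpg": "Bob Example",
}

# Precompute one embedding (a 128-number vector) per known photo.
known_names = []
known_encodings = []
for path, name in known_people.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip photos where no face is detected
        known_names.append(name)
        known_encodings.append(encodings[0])

# Embed the uploaded/query photo the same way.
query = face_recognition.load_image_file("query.jpg")
query_encodings = face_recognition.face_encodings(query)

if query_encodings and known_encodings:
    # Distance between embeddings; smaller means "more similar."
    distances = face_recognition.face_distance(known_encodings, query_encodings[0])
    best = distances.argmin()
    # 0.6 is the library's conventional cutoff; real systems tune this value.
    if distances[best] < 0.6:
        print(f"Possible match: {known_names[best]} (distance {distances[best]:.2f})")
    else:
        print("No match below threshold")
```

How well such a search performs depends on the embedding model's training data and the cutoff chosen, which is one reason the studies cited above found error rates that differ across demographic groups.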

Privacy and ethical technology advocates PublicSource spoke to warned against ramping up surveillance technology following the overwhelmingly white and male insurrection at the Capitol.

Since last summer, Pittsburgh and Allegheny County lawmakers have been grappling with the ‘ifs’ and ‘hows’ of regulating facial recognition. Pittsburgh City Council in September passed a bill regulating the technology. Allegheny County introduced a similar bill in October. It has not yet come up for a vote. 

Pittsburgh’s legislation requires city council’s approval of facial recognition and predictive policing technologies but allows, in emergency circumstances, for the public safety director or police chief to temporarily authorize their use without council’s approval. 

Councilman Corey O’Connor, who sponsored the city bill, said that the emergency exception was included to account for extreme situations like the Capitol riot. “You’re dealing with, in my opinion, domestic terrorism there,” O’Connor said, as opposed to “robbing a convenience store.”

The city legislation was prompted by a PublicSource investigation of facial recognition use by local law enforcement, including a trial of Clearview by the Allegheny County District Attorney’s office.

Clearview's method sets it apart from other facial recognition companies: it harvests photos from social media and other public websites and stores them in a database of more than three billion images. The technology has come under fire in the past from privacy advocates, Democratic lawmakers and the social media companies it scrapes for images. The company has been sued several times for allegedly violating privacy laws, including by the ACLU.

According to Clearview co-founder and CEO Hoan Ton-That, the app saw a 26% increase in searches on Jan. 7. “We have been informed that multiple identifications have been made by law enforcement using Clearview AI technology,” Ton-That wrote through a spokesperson in an email to PublicSource.

The Pittsburgh Bureau of Police does not use Clearview or possess facial recognition technology, spokesperson Chris Togneri said — though the bureau has access to facial recognition through the statewide law enforcement system JNET. In an email to PublicSource, Togneri said the bureau “is always willing to help our law enforcement partners at the local, state and federal levels” but did not specify whether the bureau is actively aiding in any Capitol investigations.

Adam Scott Wandt, an assistant professor of public policy at John Jay College of Criminal Justice, has trained law enforcement to conduct investigations using social media. He estimates thousands of officers and prosecutors around the country are collecting photo and video evidence to identify Capitol protesters, using facial recognition as a “primary investigative method.”

Law enforcement officers aren’t the only ones collecting evidence. Wandt pointed out that citizens are creating their own public archives of photos from the Capitol on websites like Reddit so that they can’t be deleted. He also noted that unlike other criminals and insurgents, most Capitol rioters took no steps to cover their faces or obscure their identities — making it even easier for law enforcement to identify them.

Freddy Martinez, a policy analyst with the transparency advocacy organization Open the Government, cautioned against expanding law enforcement’s surveillance powers following the riot. In an email to PublicSource, Martinez noted the privacy ramifications of “giving the government the ability to search a population’s entire social media history based on a person’s biometrics.”

“Instead, we should ask hard questions about why many of the law enforcement agencies ‘missed’ a threat organized on the same social media platforms it wants to collect data from,” Martinez wrote.

Bonnie Fan, a member of the Coalition Against Predictive Policing in Pittsburgh, agreed that the Capitol riot does not justify facial recognition’s use. “The legacy of facial recognition has thus far been disproportionately impacting marginalized populations, as legal systems work in concert with technical ones in criminalizing the poor and historically disenfranchised,” Fan wrote in an email to PublicSource.

Michael Skirpan, the executive director of Community Forge in Wilkinsburg and member of the Pittsburgh Task Force on Public Algorithms, said more consistent standards of accountability and privacy are needed “across the board” for facial recognition. 

There shouldn’t be a new standard simply because people “feel OK” with using the technology now, Skirpan said. But because the technology was used during Black Lives Matter protests, he said, its use following the Capitol riot is only fair. “I believe with the correct checks and balances, we can make sure tools are used for the best cases and withheld for the problematic ones,” he wrote in an email to PublicSource. “However, that is not where we are today.”

According to Wandt, law enforcement agencies are increasingly turning to facial recognition. He said Clearview is especially “game-changing” because it is designed to be used by all types of officers, including those without advanced training or directives from their departments. “I think that’s a little bit dangerous at this point, but that’s how it’s happening,” he said. 

Wandt stressed the need for anti-bias training and not relying solely on facial recognition matches. “A full human investigation still needs to be done,” he said, “because ultimately the person that you think you have may not be the suspect.”

Juliette Rihl is a reporter for PublicSource. She can be reached at juliette@publicsource.org or on Twitter at @julietterihl.
