SAN FRANCISCO (KGO) -- A facial recognition company is accused of violating state privacy law - including by scraping images online without consent and selling them to police. A new report suggests the company's alleged practices disproportionately impact communities of color.
The report, released by Consumer Watchdog, says Clearview AI holds more than 40 billion images scraped from the internet, often from social media sites. Consumer Watchdog accuses the facial recognition company of selling access to these images to public agencies for profit.
"And in fact, it's virtually impossible to avoid being in that database," said Consumer Watchdog's Litigation Director Jerry Flanagan. "Clearview AI then uses its technology to do a mathematical map of your face and then sell the data to state agencies."
Consumer Watchdog says the company's tech is invasive and a violation of state law.
"California's law makes it very clear -- it's illegal for Clearview to collect your image without your consent, as well as to sell that data or share it to others without your consent," said Flanagan.
According to Flanagan, a recent ACLU settlement barred Clearview from selling its product to private businesses, but he says no statewide restrictions prevent the company from selling its tech to government and law enforcement agencies in California.
"The CEO has made the point that they are selling broadly throughout the United States to law enforcement," Flanagan told the I-Team.
Clearview AI advertises on its website that it is "Advancing Public Safety," adding that its investigative platform allows law enforcement to rapidly generate leads to help identify suspects, witnesses and victims and to close cases faster. The company says the technology only searches publicly available information from the internet and is used to help solve crimes such as drug trafficking, human trafficking and child abuse.
"It is not intended to be used as a real-time surveillance tool. Clearview AI requires its law enforcement customers to provide a case number and crime type to ensure an audit trail and to enforce responsible usage of facial recognition and provides training for its customers," said Hoan Ton-That, Clearview AI's CEO.
But numerous reports highlight concerns about inaccuracies that disproportionately affect communities of color - including a study published in November by Georgetown University's Law Center that found facial recognition software generally to be "particularly prone to errors." The study went on to say the real-world consequences of such errors include the "investigation and arrest of an unknown number of innocent people and the deprivation of due process of many, many more."
This comes after an M.I.T. study found that facial recognition software incorrectly identified dark-skinned women up to 35 percent of the time.
In a statement, Clearview AI said Consumer Watchdog's report draws mistaken conclusions about the law:
"Consumer Watchdog's assertions about Clearview AI's opt-out processes are incorrect. Images of Californians who have exercised their right to opt-out of our data processing are blocked from any subsequent collection. Consumer Watchdog's assertions about facial recognition accuracy are incorrect. Clearview AI's algorithm has been assessed by the National Institute of Standards and Technology, a U.S. government office, and found to be highly accurate across all demographics."
But alleged misidentification is not the only controversy surrounding the technology.
Consumer Watchdog also says Clearview AI is not allowed to sell or publish pictures of children without their consent, but the nonprofit alleges the company appears to do so knowingly, in violation of the law.
Clearview AI disputed that claim in a response to the I-Team, saying, "Clearview AI is fully compliant with California law, including law relating to the data of minors, and has completely and effectively opted-out many Californians."
The company added, "Images of Californians who have exercised their right to opt out of our data processing are blocked from any subsequent collection."
But Consumer Watchdog says there's reason to be skeptical.
"So who knows at this point where this data has showed up," Flanagan said. "That's one of the problems is that we don't know a lot about how the data is being used."
There are also questions about the right to privacy at political rallies. Clearview AI CEO Ton-That told the I-Team, "I was appalled by the tragic events on January 6th and the attack on the Capitol and our democracy. While we cannot provide specifics on these cases, it is gratifying that Clearview AI has been used to identify the Capitol rioters who attacked our great symbol of democracy."
But Consumer Watchdog argues the use of the technology is infringing upon First Amendment rights.
"When you're in a public space exercising your First Amendment rights to protest or to associate with a political party or an issue that you care about, you have the right to do so without unwarranted government surveillance," said Flanagan. "Because the surveillance aspect in our public places chills that participation."
Consumer Watchdog sent a letter to California Attorney General Rob Bonta and the California Privacy Protection Agency requesting enforcement of state law and an audit of the company to find out how they're using the data.
The AG's office told the I-Team, "Protecting minors' personal information is a top priority for the California Department of Justice. However, to protect its integrity, we're unable to comment on, even to confirm or deny, a potential or ongoing investigation."
Several California cities, including San Francisco and Oakland, have adopted ordinances barring city agencies from using facial recognition technology. According to court documents, the technology has also been used in Antioch and Alameda. Court filings show numerous lawsuits have been filed against the company over the use of its technology, including one case out of Alameda.