
AURORA | Aurora police are asking city lawmakers for permission to use facial recognition software to help identify suspects and solve crimes.
“This is very much what we do now, just relying on human recognition,” Police Commander Chris Poppe of Aurora’s District 3 said. “We’re just going to use software to do that now.”
Government use of the software has long drawn criticism from those who say the technology can produce grievous errors and has led to civil rights abuses around the globe. Proponents say it could be a huge boon in quickly identifying suspects before they get away.
As required by state law, Aurora police officials are asking city leaders to authorize the use of facial recognition technology, a step that would formalize a program the department has already been building for nearly three years, according to Poppe.
Poppe presented the proposal during a city council Public Safety, Courts and Civil Service Policy Committee meeting Sept. 11, describing it as a “deliberate and paced” effort shaped by state regulations, outside consultants and national best-practice standards.

“We’re hoping to enhance productivity, increase crime solvability and ultimately just make the community safer,” Poppe said. “The more efficient we are at solving crimes, the safer Aurora will be.”
If city lawmakers do approve it, facial recognition would be added to the department’s existing biometric tools, such as DNA and fingerprinting, Poppe said. Investigators often collect video or still images from doorbell cameras, as well as those from businesses or city cameras, if individuals and businesses are willing to provide them, Poppe said. Currently, those images are circulated through bulletins or Crime Stoppers, with the hope that someone will recognize the suspect.
Now, the police department hopes to use software as an additional tool for this purpose. If the proposal is approved, investigators will be able to take an image and compare it to a “volume of images” they already have, such as mugshots, or to images from the internet, including social media pictures, Poppe said.
Aurora police propose using two widespread systems, Lumen and Clearview AI, Poppe said.
Lumen is a statewide database of mugshots from Colorado jails, with which the city already partners. Clearview AI is a private company that scrapes publicly available images from social media and the internet.
Both systems would be used only after investigators establish reasonable suspicion in an ongoing case, he said. Matches generated by the software would be treated only as investigative leads, not as probable cause for arrest.
Missteps in other cities have led to wrongful arrests, he said.
“It’s a tip,” he said. “It does not establish probable cause.”
Facial recognition has been controversial for several years, with critics saying people of color are misidentified at higher rates than white people. The point lawmakers and police departments stress most is that a match cannot be used as singular evidence, as happened in 2023 in Georgia, where a man named Randal Quran Reid was held in jail for six days after being misidentified by facial recognition technology.
Critics also raise civil liberties and privacy concerns about the technology, and some groups, such as the ACLU, have asked police departments to stop using it altogether, as in Detroit, where a Black woman who was eight months pregnant was held in jail for hours after being misidentified as a suspect in a robbery and carjacking.
In 2022, the Colorado Legislature established guidelines for the use of facial recognition by police, requiring agencies to adopt policies, submit accountability reports and obtain approval from their governing bodies, Poppe said.
For Aurora, this means the city council must formally approve the implementation, and the public must be informed about it.
“Prior to that, we were using facial recognition, and it didn’t have a lot of guidelines,” Poppe said. “It was across the country, not being used necessarily responsibly. We weren’t providing accountability or much transparency. So this gave us some framework to move forward.”
The police department has already posted accountability reports on its website and opened a public comment portal. By law, police must keep that feedback channel open for at least 90 days and hold at least three community meetings before proceeding.
Aurora Police will evaluate and incorporate that information into their final implementation, Poppe said.
“This is the start of the community input process,” Poppe said. “We’ve put a lot of effort into making sure this is done responsibly, transparently and with accountability.”
Aurora’s draft policy also outlines prohibited uses. If fully approved, police would not be allowed to use facial recognition for live or ongoing surveillance, immigration enforcement or harassment, Poppe said.
Use for marketing or commercial purposes is also barred unless authorized by a court order. Poppe said that Nordstrom, for example, uses its cameras for facial recognition tracking for advertising and other purposes.
Poppe said that all searches would undergo “meaningful human review,” with at least two analysts and a supervisor reviewing the results before they are passed on to detectives. About a dozen specially trained officers would operate the software, with broader training planned for roughly 140 detectives over time.
The department estimates Clearview AI would cost about $32,000 in the first year, $36,000 the second year and $67,000 in the third. Training expenses would add to the total, Poppe said. Aurora’s participation in the Lumen database carries no additional cost.
Council members Amsalu Kassaw and Danielle Jurinsky asked questions about training numbers. Kassaw asked whether the system would be used for live identification, such as during arrests or at detention centers, and Poppe said no.
The proposal will be reviewed by the full city council in the coming months.