A Ring camera. Sentinel File Photo

City lawmakers should give tentative permission to Aurora Police to add facial recognition software to its investigative toolbox, with some critical caveats.

The technology could greatly help police solve crimes, track down dangerous offenders and bring justice to crime victims. There is no doubt that it also comes with risks that demand strict rules, full transparency and careful oversight.

Aurora police leaders say they have been working for nearly three years on a plan to adopt the software. Essentially, the program being proposed would allow investigators to take images of suspects and then search most of the internet for possible matches, providing not only an identity but possibly even a location.

So far, police have posted accountability reports on the city’s police website, opened a public comment portal, and pledged to hold at least three public meetings before moving forward. Police also stress that the technology would be used only as an investigative lead, never as probable cause for an arrest, and that all matches would be reviewed by multiple trained analysts.

Those are encouraging steps. They suggest the department understands both the promise and the peril of facial recognition.

Still, the community should not rush. Technology that scans faces against vast databases, including billions of images scraped from the internet by private firms like Clearview AI, is unlike fingerprints or DNA. It reaches into public places, into people’s lives without consent, and it carries a history of mistakes that cannot be brushed aside.

As reported by The Associated Press and other media, errors have led to wrongful arrests across the country, often of Black men misidentified by flawed systems. In Georgia, a man spent nearly a week in jail after being misidentified by facial recognition software. In Detroit, a woman eight months pregnant was jailed for hours after a bad match linked her to a robbery she had nothing to do with.

These are not minor errors. They are profound injustices, and they undermine faith in both policing and technology. Aurora police cannot afford any loss of trust or credibility.

Aurora police officials say they want to avoid those tragedies by making sure every search result is treated only as a tip, not as definitive evidence. That distinction matters. But it also requires discipline and transparency.

The Colorado Legislature, to its credit, established rules in 2022 requiring police agencies to adopt policies, publish accountability reports and secure approval from their governing bodies before deploying facial recognition. It also made clear that no arrest can be based solely on a facial recognition result.

Aurora must go further. The city should require:

• Clear prohibitions on using the software for live surveillance, immigration enforcement or tracking political demonstrations. Police leaders say these restrictions are already in place, but they must be written into policy and enforced.

• Mandatory transparency. Every search conducted with facial recognition should be logged, audited and reported publicly on a regular basis. The community has a right to know how often the technology is used, for what kinds of cases, and with what outcomes.

• Independent oversight. An independent body, such as a civilian review board or an outside inspector, should be empowered to review use of the technology and investigate complaints. Police cannot police themselves on this issue.

• Strong training requirements. Only officers who are specially trained should operate the software, and they must be instructed repeatedly that the results are investigative leads, not conclusions. Aurora’s plan to start with about a dozen trained officers is sensible, but expansion must be paced and carefully monitored.

• Strict limitations on databases. Lumen, which provides a statewide mugshot database, is one thing. Clearview AI’s massive collection of billions of social media and internet images, however, is another. Aurora must explain, clearly and publicly, what images will be used, how they will be safeguarded, and how long they will be kept.

The financial cost is not insignificant. Clearview AI alone would cost an estimated $135,000 over three years. Taxpayers have a right to demand clear and accountable evidence that the benefits justify the expense.

Proponents of facial recognition say it can speed up investigations, help catch all manner of criminals and make communities safer. That may well be true. A fuzzy doorbell camera image might one day provide the crucial lead that solves a homicide or prevents another crime.

But justice is not just about catching criminals. It is also about protecting the innocent, preserving civil rights and ensuring equal treatment under the law. A technology that increases the risk of wrongful arrest erodes justice. A tool that chills free speech by making people fear they are being watched at protests or public gatherings weakens democracy.

Aurora is right to study the issue slowly, invite public comment and put safeguards in writing. Aurora should approve any program for no more than one year, allowing time for careful review and for the department to demonstrate to the community whether the technology has any value.
