EDITORIAL: Zoom in on Colorado’s criminal element
AI technology’s rapid expansion was bound to intersect with society’s need to collar criminals. Hence, the latest generation of facial-recognition technology now employed by several Colorado law enforcement agencies and soon to be implemented in the state’s No. 3 city.
When used within reasonable parameters to safeguard citizens’ civil liberties, facial-recognition technology holds promise in advancing the crime fight and shoring up public safety. This week, Aurora tentatively OK’d facial recognition for use by the city’s police, making its police department the largest law enforcement agency in the state to adopt the technology thus far. The decision on Monday by the Aurora City Council requires a follow-up vote before it is officially implemented.
Larger law enforcement agencies, like the Denver and Colorado Springs police departments and the El Paso County Sheriff’s Office, haven’t yet put facial recognition to work, but it’s probably just a matter of time.
Facial recognition’s use in criminal investigations piggybacks on another phenomenon — society’s proliferating penchant for capturing just about anything that moves on video. A doorbell camera might record a prowler, for example, and criminal investigators can compare those images to databases of booking mugshots or driver’s license photos.
Only, instead of investigators spending days poring over thousands upon thousands of photos with a magnifying glass and inexact human eyes, AI-driven facial recognition technology can make the same comparisons in moments and yield some likely matches with precision. It can be a big stride in cracking some cases and identifying suspects.
“Cameras are everywhere, and oftentimes we have the benefit of having a criminal captured on some kind of film, either doing the crime or arriving or after,” Aurora Police Commander Chris Poppe told council members.
Considerable care has been taken to ensure authorities don’t nab the wrong suspect, he said.
As reported in The Gazette, Poppe assured the City Council that facial recognition technology would not allow police to detain somebody merely by a photo match. Instead, police would use it to guide investigations, pointing them in the right direction much as fingerprints or eyewitness testimony do.
A 2022 state law in fact established guardrails for facial recognition’s use by police as well as prerequisites to implementing the technology, including holding public meetings to hear community feedback on using facial recognition and publishing a webpage informing the public about its use.
Poppe also assured the council that police wouldn’t use facial recognition to harass or intimidate someone; to violate constitutional rights; for civil immigration enforcement; or for ongoing surveillance or persistent tracking without a search warrant.
What about false positives? Poppe told the council it is actually human witnesses to crimes who are more likely to present that challenge; AI would be a more precise tool than relying on human memory. In any event, a photo match doesn’t in itself provide authorities with probable cause for an arrest, he said. It just helps investigators ask the right questions.
Given those ample precautions, policymakers in other communities would be wise not to let implausible, contrived concerns about civil liberties dissuade them from employing the technology.
Facial recognition and other next-gen innovations — like AI-assisted license plate readers, which can curb auto theft and find missing persons, among other uses — are increasingly critical to today’s crime fight.
So long as new technology is used in ways that respect civil rights, with clearly defined boundaries, its use should come as a comfort to all law-abiding Coloradans.