AI-powered police body cameras, once taboo, get tested on Canadian city's watch list of faces
An Axon body camera is worn by MSGT Matt Gilmore, one of the officers using Axon's Draft One AI software, at OKCPD headquarters on Friday, May 31, 2024, in Oklahoma City, Oklahoma. (AP Photo/Nick Oxford, File)

December 7, 2025

Police body cameras equipped with artificial intelligence have been trained to detect the faces of about 7,000 people on a high-risk watch list in the Canadian city of Edmonton, a live test of whether facial recognition technology shunned as too intrusive could have a place in policing throughout North America.

But six years after leading body camera maker Axon Enterprise, Inc. said police use of facial recognition technology posed serious ethical concerns, the pilot project switched on last week is raising alarms far beyond Edmonton, the continent's northernmost city of more than 1 million people.

A former chair of Axon's AI ethics board, which led the company to temporarily abandon facial recognition in 2019, told The Associated Press he's concerned that the Arizona-based company is moving forward without enough public debate, testing and expert vetting about the societal risks and privacy implications.

"It's essential not to use these technologies, which have very real costs and risks, unless there's some clear indication of the benefits," said the former board chair, Barry Friedman, now a law professor at New York University.

Axon founder and CEO Rick Smith contends that the Edmonton pilot is not a product launch but early-stage field research that will assess how the technology performs and reveal the safeguards needed to use it responsibly.

"By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States," Smith wrote in a blog post.

The pilot is meant to help make Edmonton patrol officers safer by enabling their body-worn cameras to detect anyone authorities have classified as having a flag or caution for categories such as violent or assaultive; armed and dangerous; weapons; escape risk; and high-risk offender, said Kurt Martin, acting superintendent of the Edmonton Police Service.

So far, that watch list has 6,341 people on it, Martin said at a Dec. 2 press conference. A separate watch list adds 724 people who have at least one serious criminal warrant, he said.

"We really want to make sure that it's targeted so that these are folks with serious offenses," said Ann-Li Cooke, Axon's director of responsible AI.

If the pilot expands, it could have a major effect on policing around the world. Axon, a publicly traded firm best known for developing the Taser, is the dominant U.S. supplier of body cameras and has increasingly pitched them to police agencies in Canada and elsewhere. Axon last year beat its closest competitor, Chicago-based Motorola Solutions, in a bid to sell body cameras to the Royal Canadian Mounted Police.

Motorola said in a statement that it also has the ability to integrate facial recognition technology into police body cameras but, based on its ethical principles, has intentionally abstained from deploying this feature for proactive identification. It didn't rule out using it in the future.

The government of Alberta in 2023 mandated body cameras for all police agencies in the province, including its capital city Edmonton, describing it as a transparency measure to document police interactions, collect better evidence and reduce timelines for resolving investigations and complaints.

While many communities in the U.S. have also welcomed body cameras as an accountability tool, the prospect of real-time facial recognition identifying people in public places has been unpopular across the political spectrum. Backlash from civil liberties advocates and a broader conversation about racial injustice helped push Axon and Big Tech companies to pause facial recognition software sales to police.

Among the biggest concerns were studies showing that the technology was flawed, demonstrating biased results by race, gender and age. It also didn't match faces as accurately on real-time video feeds as it did on faces posing for identification cards or police mug shots.

Several U.S. states and dozens of cities have sought to curtail police use of facial recognition, though President Donald Trump's administration is now trying to block or discourage states from regulating AI. The European Union banned real-time public face-scanning police technology across the 27-nation bloc, except when used for serious crimes like kidnapping or terrorism. But in the United Kingdom, no longer part of the EU, authorities started testing the technology on London streets a decade ago and have used it to make 1,300 arrests in the past two years. The government is considering expanding its use across the country.

Many details about Edmonton's pilot haven't been publicly disclosed. Axon doesn't make its own AI model for recognizing faces but declined to say which third-party vendor it uses. Edmonton police say the pilot will continue through the end of December and only during daylight hours.

"Obviously it gets dark pretty early here," Martin said. "Lighting conditions, our cold temperatures during the wintertime, all those things will factor into what we're looking at in terms of a successful proof of concept."

Martin said the roughly 50 officers piloting the technology won't know if the facial recognition software made a match. The outputs will be analyzed later at the station. In the future, however, it could help police detect if there's a potentially dangerous person nearby so they can call in for assistance, Martin said.

That's only supposed to happen if officers have started an investigation or are responding to a call, not simply while strolling through a crowd. Martin said officers responding to a call can switch their cameras from a passive to an active recording mode with higher-resolution imaging.

"We really want to respect individuals' rights and their privacy interests," Martin said.

The office of Alberta's information and privacy commissioner, Diane McLeod, said it received a privacy impact assessment from Edmonton police on Dec. 2, the same day Axon and police officials announced the program. The office said Friday it is now working to review the assessment, a requirement for projects that collect highly sensitive personal data.

University of Alberta criminology professor Temitope Oriola said he's not surprised that the city is experimenting with live facial recognition, given that the technology is already ubiquitous in airport security and other environments.

"Edmonton is a laboratory for this tool," Oriola said. "It may well turn out to be an improvement, but we do not know that for sure."

Oriola said the police service has had a sometimes frosty relationship with its Indigenous and Black residents, particularly after the fatal police shooting of a member of the South Sudanese community last year, and it remains to be seen whether facial recognition technology makes policing safer or improves interactions with the public.
Axon has faced blowback for its technology deployments in the past, as in 2022, when Friedman and seven other members of Axon's AI ethics board resigned in protest over concerns about a Taser-equipped drone.

In the years since Axon opted against facial recognition, Smith, the CEO, says the company has continued controlled, lab-based research of a technology that has become significantly more accurate and is now ready for trial in the real world. But Axon acknowledged in a statement to the AP that all facial recognition systems are affected by factors like distance, lighting and angle, which can disproportionately impact accuracy for darker-skinned individuals.

Every match requires human review, Axon said, and part of its testing is also learning what training and oversight human reviewers must have to mitigate known risks.

Friedman said Axon should disclose those evaluations. He'd want to see more evidence that facial recognition has improved since his board concluded that it wasn't reliable enough to ethically justify its use in police cameras. Friedman said he's also concerned about police agencies greenlighting the technology's use without deliberation by local legislators and rigorous scientific testing.

"It's not a decision to be made simply by police agencies and certainly not by vendors," he said. "A pilot is a great idea. But there's supposed to be transparency, accountability. ... None of that's here. They're just going ahead. They found an agency willing to go ahead and they're just going ahead."

AP writer Kelvin Chan in London contributed to this report.

MATT O'BRIEN
O'Brien covers the business of technology and artificial intelligence for The Associated Press.

GARANCE BURKE
Burke is a global investigative journalist with The Associated Press based in San Francisco. She focuses on artificial intelligence and government accountability, and her work has been honored as a Pulitzer finalist and with a documentary Emmy Award. She can be reached on Signal at garanceburke33.