Edmonton police are testing out artificial intelligence facial-recognition bodycams without approval from Alberta’s information and privacy commissioner Diane McLeod.
Police say they don’t legally require what they describe as “feedback” from the commissioner during the trial or proof of concept stage.
But in an interview Wednesday on CBC’s Edmonton AM, McLeod said they do.
Liam Newbigging, Edmonton Journal:
Police at the Tuesday event to unveil the pilot said the assessment was sent to Alberta’s privacy commissioner Diane McLeod to ensure a “proof of concept test” for body-worn video cameras with new facial recognition technology is fair and respects people’s privacy.
But the office of the information and privacy commissioner told Postmedia in an email that the assessment didn’t reach it until Tuesday afternoon and that it’s possible that the review of the assessment might not be finished until the police pilot project is already over.
This looks shady, and I do not understand the rush. Rick Smith, the CEO of Axon, which markets body cameras and Tasers, points out that the company had not supported facial recognition in its cameras since rejecting it on privacy grounds in 2019. Surely, Edmonton Police could have waited a couple of months for the privacy commissioner’s office to examine the plan for compliance.
Smith:
The reality is that facial recognition is already here. It unlocks our phones, organizes our photos, and scans for threats in airports and stadiums. The question is not whether public safety will encounter the technology—it is how to ensure it delivers better community safety while minimizing mistakes that could undermine trust or overuse that encroaches on privacy unnecessarily. For Axon, utility and responsibility must move in lockstep: solutions must be accurate enough to meaningfully help public safety, and constrained enough to avoid misuse.
Those three examples are not at all similar to each other; only one of them is similar to Axon’s body cameras, and I do not mean that as a compliment.
We opt into using facial recognition to unlock our phones, and the facial recognition technology organizing our photo libraries is limited to saved media. The use of facial recognition in stadiums and airports is the closest thing to Axon’s technology, in that it is used specifically for security screening.
This is a disconcerting step toward a more surveilled public space. It is not as if the Edmonton Police are a particularly trusted institution. Between 2009 and 2016 (PDF), roughly 90% of people in Edmonton strongly agreed or somewhat agreed with the statement “I have a lot of confidence in the EPS [Edmonton Police Service]”. This year, that number has dropped to around 54% (PDF), though the newer survey also allows for a “neither confident nor unconfident” response, which 22% of respondents chose. Among Indigenous, 2SLGBTQI+, and unhoused populations, the level of distrust in the EPS rises dramatically.
Public trust does not reflect the reality of crime in Edmonton, which has declined somewhat over the same period even as the city has grown by half a million people. Institutional trust, however, is a prerequisite for such an invasive practice. A good step toward earning that trust would be to ensure the pilot has clearance from the privacy commissioner’s office before the trial begins.