Axon (formerly Taser) says that face recognition on police body cameras is unethical

Axon (formerly known as Taser) has shifted increasingly toward body cameras for police officers in recent years, but today the company is making a major change. On the advice of its AI ethics board, "Axon will not be commercializing face-matching products on our body cameras," the company announced in a blog post today.


Axon established its AI and Policing Technology Ethics Board last April to advise the company on developing its products ethically. The board's first report was published today, and its central recommendation is that Axon avoid face recognition technology.

According to the board's report, "face recognition technology is currently not reliable enough to justify its use" on body-worn cameras. At a minimum, the board says, more accurate technology would be needed that "performs equally well across races, ethnicities, genders, and other identity groups." Whether police body-camera face recognition could ever be considered ethical at all is a conversation the board is only beginning to explore.

The board also advocated against letting users (i.e., police officers) customize face recognition software should it become part of future products, in order to prevent abuse, while advising that any jurisdiction planning to use face recognition technology should do so through "open, transparent, democratic processes."

Axon's decision not to put face recognition software on its body cameras does not go quite as far as the board's recommendations. The company's blog post makes clear that Axon will continue to research and pursue face-matching technology, including an effort to de-bias the algorithms in the future. That means the company still hopes it is a matter of 'when,' rather than 'if,' it can add the technology to its products.

Still, the fact that Axon, a major supplier of police cameras in the US, is stepping away from the technology is significant, given how controversial it is. Earlier this year, San Francisco became the first city in the US to ban government agencies from using face recognition software, and Microsoft has refused to sell face recognition software to law enforcement agencies, citing human rights concerns.

At the same time, Amazon came under fire last year for doing exactly that, selling its Rekognition software to police in Orlando, Florida, and Washington County, Oregon, while New York City was caught abusing its own face recognition software earlier this year.