Report: Live Facial Recognition Tools 'Staggeringly Inaccurate'

According to the BBC, U.K. police using live facial recognition tools to help identify wanted criminals in large crowds are encountering a significant number of false positives.

One report given to British privacy group Big Brother Watch stated that the South Wales Police used the technology from March 2017 to March 2018 to make 2,685 "matches"; however, 2,451 of those -- or a whopping 91 percent -- were "false alarms."
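For readers who want to check the math, here is a minimal sketch (the variable names are illustrative, not taken from any police system) that reproduces the roughly 91 percent figure from the numbers in the report:

```python
# Back-of-the-envelope check of the figures reported to Big Brother Watch.
total_matches = 2685  # "matches" flagged by South Wales Police, March 2017 - March 2018
false_alarms = 2451   # matches the report classified as "false alarms"

false_alarm_rate = false_alarms / total_matches
print(f"False alarm rate: {false_alarm_rate:.1%}")  # -> 91.3%, the ~91 percent cited
```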

Another report uncovered by Big Brother Watch said that a police department used the software to scan people at a couple of one-day events, generating 102 matches but no arrests.

And one police department reported that it stopped using the technology altogether.

The reports did not name the systems or technologies being used by each department.

In response to the BBC's reporting, South Wales Police said that part of the reason its false positive count was so high is that it was initially working with very low-quality images, and that its accuracy rate has improved over time. The department also stated that no one was wrongly arrested.

Calls to improve the accuracy of facial recognition software have also been raised in the United States, among other places.

About the Author

Becky Nagel serves as vice president of AI for 1105 Media, specializing in developing media, events and training for companies around AI and generative AI technology. She also regularly writes and reports on AI news and is the founding editor of PureAI.com. She's the author of "ChatGPT Prompt 101 Guide for Business Users" and other popular AI resources with a real-world business perspective. She regularly speaks, writes and develops content around AI, generative AI and other business tech. Find her on X/Twitter @beckynagel.
