
London coppers ready to roll out inaccurate facial recognition system

02 July 2018


What could possibly go wrong?

Millions of people face the prospect of being scanned by police facial recognition technology that has sparked human rights concerns and has been found to be "staggeringly inaccurate".

The Met Police is set to expand a trial across six locations in London over the coming months, despite The Independent revealing that 98 percent of the software's alerts were "false positives" flagging people who were not on a police database.
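For the arithmetic-minded, that 98 percent figure is less surprising than it sounds: when genuine suspects are a tiny fraction of the faces scanned, even a matcher that is rarely wrong on any single face will produce alerts dominated by false positives. The sketch below illustrates the base-rate effect with made-up numbers; the crowd size, watch-list count and per-face error rates are assumptions chosen for illustration, not figures from the Met or The Independent.

```python
# Illustrative sketch of the base-rate effect behind a high false-alert rate.
# All numbers are assumptions for illustration only.

crowd_size = 100_000          # faces scanned at an event (assumption)
wanted_in_crowd = 20          # people actually on the watch list (assumption)
true_positive_rate = 0.90     # chance a wanted face triggers an alert (assumption)
false_positive_rate = 0.01    # chance an innocent face triggers an alert (assumption)

innocent = crowd_size - wanted_in_crowd
true_alerts = wanted_in_crowd * true_positive_rate   # ~18 genuine alerts
false_alerts = innocent * false_positive_rate        # ~1,000 false alerts

total_alerts = true_alerts + false_alerts
share_false = false_alerts / total_alerts

print(f"Alerts: {total_alerts:.0f}, of which {share_false:.0%} are false")
# With these assumed rates, roughly 98% of alerts point at innocent people,
# even though the matcher is correct 99% of the time per innocent face.
```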

Detective Superintendent Bernie Galopin, the lead on facial recognition for London's Metropolitan Police, said the operation was targeting wanted suspects to help reduce violent crime and make the area safer. "It allows us to deal with persons that are wanted by police where traditional methods may have failed", he said.

Of course, if the system is targeting innocent people, that would explain why it appears to work: innocent people who have not committed a crime are easier to catch, because they are not expecting it.

Det Supt Galopin said the Met was assessing how effective facial recognition was at tackling different challenges in British policing, which is currently being stretched by budget cuts, falling officer numbers, rising demand and the terror threat.

A policy officer from the National Council for Civil Liberties called the technology "lawless", adding "the use of this technology in a public place is not compatible with privacy, and has a chilling effect on society".

But a Home Office minister said the technology was vital for protecting people from terrorism, adding: "we must ensure that privacy is respected. This strategy makes clear that we will grasp the opportunities that technology brings while remaining committed to strengthening safeguards".
