Metropolitan Police’s facial recognition technology wrong in 98 per cent of cases

Facial recognition software used by the UK’s biggest police force has returned false positives in more than 98 per cent of alerts generated, The Independent can reveal, with the country’s biometrics regulator calling it “not yet fit for use”.

The Metropolitan Police’s system has produced 104 alerts, of which only two were later confirmed to be positive matches, a freedom of information request showed. In its response, the force said it did not consider the inaccurate matches “false positives” because alerts were checked a second time after they occurred.

Facial recognition technology scans people in a video feed and compares their images to pictures stored in a reference library or watch list. It has been used at large events like the Notting Hill Carnival and a Six Nations Rugby match.

The system used by another force, South Wales Police, has returned more than 2,400 false positives in 15 deployments since June 2017. The vast majority of those came during that month’s Uefa Champions League final in Cardiff, and overall only 234 alerts – fewer than 10 per cent – were correct matches.
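The percentages reported for both forces follow directly from the alert counts. A minimal sketch of the arithmetic (the function name is illustrative, not from the article):

```python
def false_positive_rate(total_alerts: int, true_matches: int) -> float:
    """Share of alerts that were not genuine matches, as a percentage."""
    return 100 * (total_alerts - true_matches) / total_alerts

# Metropolitan Police: 104 alerts, of which 2 were confirmed matches
met_rate = false_positive_rate(104, 2)
print(f"Met false-positive rate: {met_rate:.1f}%")  # just over 98 per cent

# South Wales Police: more than 2,400 false positives alongside 234 correct
# matches, so correct matches were fewer than 10 per cent of all alerts
swp_total = 2400 + 234
swp_correct_share = 100 * 234 / swp_total
print(f"South Wales correct-match share: {swp_correct_share:.1f}%")
```

Note the framing difference: the Met counts only post-review confirmations as matches, which is why it disputes the “false positive” label for the remaining 102 alerts.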

Both forces are trialling the software.

Source: Metropolitan Police’s facial recognition technology wrong in 98 per cent of cases
