Are you being watched? Police admit to taking photos of innocent people to hunt for crims
UK police have admitted to using facial recognition cameras – despite the tactic being labeled “dangerous and inaccurate” by privacy groups, who say the software has flagged thousands of innocent people.
Police facial recognition cameras have already been trialed at large events across the UK, including football matches and festivals. The HD cameras detect all the faces in a crowd and compare them with existing police photographs, including mug shots. Potential matches are then flagged, allowing police to investigate further.
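For illustration only, here is a minimal sketch of how such a detect-and-match pipeline can be wired together, using the open-source face_recognition Python library. The file names, the watch-list contents and the 0.6 match tolerance are assumptions made for the example; they are not details of any police deployment.

```python
# Illustrative sketch only: a detect-and-match pipeline built on the
# open-source face_recognition library. File names, the watch list and
# the 0.6 tolerance are assumptions, not details of any police system.
import face_recognition

# Build a small "watch list" of face encodings from reference photographs.
reference = face_recognition.load_image_file("mugshot.jpg")  # hypothetical file
watch_list = face_recognition.face_encodings(reference)

# Detect every face in a crowd frame, then encode each detected face.
frame = face_recognition.load_image_file("crowd_frame.jpg")  # hypothetical file
locations = face_recognition.face_locations(frame)
encodings = face_recognition.face_encodings(frame, known_face_locations=locations)

# Compare each detected face against the watch list; anything within
# tolerance is flagged for a human operator to review, as described above.
for location, encoding in zip(locations, encodings):
    matches = face_recognition.compare_faces(watch_list, encoding, tolerance=0.6)
    if any(matches):
        print(f"Potential match at {location} - flag for operator review")
```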
Big Brother Watch, a British campaign group that opposes state surveillance and threats to civil liberties, has raised concerns that such programs infringe on the rights of Britons. The group submitted freedom of information (FoI) requests to every police force in the UK to find out the extent to which police are trialing facial recognition methods.
Two forces – the Metropolitan Police and South Wales Police – admitted to using the software.
The Metropolitan Police used facial recognition at London's Notting Hill Carnival in both 2016 and 2017, and also at a Remembrance Sunday event. The systems incorrectly flagged 102 people as potential suspects, though none were arrested. Big Brother Watch said police systems had wrongly flagged thousands of innocent people, and was concerned that photos dubbed "false alarms" were sometimes kept by police for weeks.
In figures handed over to the privacy group, South Wales Police revealed that its facial recognition technology had made 2,685 "matches" between May 2017 and March 2018 – and 2,451 of those were false alarms.
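To put those figures in proportion, the arithmetic is worth spelling out; a trivial sketch:

```python
# The false-alarm rate implied by South Wales Police's own figures.
total_matches = 2685   # "matches" flagged, May 2017 - March 2018
false_alarms = 2451    # of those, flagged in error
print(f"False-alarm rate: {false_alarms / total_matches:.1%}")  # 91.3%
```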
South Wales Police defended its use of the facial recognition software, insisting that the system has improved over time. "When we first deployed and we were learning how to use it... some of the digital images we used weren't of sufficient quality," Deputy Chief Constable Richard Lewis told the BBC. "Because of the poor quality, it was identifying people wrongly. They weren't able to get the detail from the picture."
South Wales Police added that a "number of safeguards" stopped police taking action against innocent people. "Firstly, the operator in the van is able to see that the person identified in the picture is clearly not the same person, and it's literally disregarded at that point," Lewis said.
"On a much smaller number of occasions, officers went and spoke to the individual... realised it wasn't them, and offered them the opportunity to come and see the van. At no time was anybody arrested wrongly, nobody's liberty was taken away from them."
The Metropolitan Police said that "all alerts against the watch list are deleted after 30 days," adding that any "faces in the video stream that do not generate an alert are deleted immediately."
Big Brother Watch said it was concerned that facial recognition cameras would affect "individuals' right to a private life and freedom of expression."
The privacy group also said: "Automated facial recognition technology is currently used by UK police forces without a clear legal basis, oversight or governmental strategy."
The Home Office said it plans to publish its biometrics strategy in June, adding that it "continues to support police to respond to changing criminal activity and new demands."
"When trialling facial recognition technologies, forces must show regard to relevant policies, including the Surveillance Camera Code of Practices and the Information Commissioner's guide," it said in a statement.