More facial recognition is coming to London
The city's police force has just been given a big confidence boost over its use of AI
Morning — thanks to ChatGPT, there’s a lot of hype around artificial intelligence at the moment. And now London’s police force is joining in: a new study has given the Met a big confidence boost over its use of facial recognition technology. We explain why after your Thursday briefing below.
P.S. We hope everyone enjoys the big Easter weekend. If you’ve got a spare mo over those bank holidays, why not give the Spy a share using the button below?
What we’ve spied
🚇 Plans to expand the Night Tube are on hold until TfL gets more cash. At the moment the 24-hour service runs only on the Central, Victoria, Jubilee, Northern and Piccadilly lines (plus the Overground) on Fridays and Saturdays, but in 2018 the mayor said he was looking to bring it to other parts of the Underground. City Hall now says this is unlikely, given TfL’s finances took a turn for the worse during the pandemic.
😡 London schools are holding anti-extremism workshops to counter the growing popularity of misogynist Andrew Tate’s views among pupils. The BBC popped in on one this week to see how teens are being warned about the power of influencers like Tate.
👮 Downing Street and Sadiq Khan are among those calling for Sarah Everard’s murderer Wayne Couzens to be stripped of his second pension. The former officer already had his Met pension taken away in January, but a campaign is underway to remove the one he gets from the Civil Nuclear Constabulary too.
✈️ A London airport is now the second in the country to scrap the 100ml liquid limit. London City Airport will use high-tech scanners instead, which also allow electronics to be kept in hand luggage at security.
💻 Tech industry layoffs had a visible impact in the capital this week. Hundreds of Google employees staged a walkout at the company’s London offices on Tuesday in a dispute about the 12,000 employees it’s cutting worldwide. Meanwhile, Microsoft, which is laying off 10,000 people from its global workforce, announced it was abandoning plans for a new London HQ.
📣 A march was held in Croydon at the weekend over a recent rise in violent incidents in the borough. Four young people have been stabbed in the past two weeks, prompting local residents, church and community leaders and politicians to take a public stand.
🦜 Finally: the BBC Earth Experience opened in Earl’s Court on Thursday. The exhibition showcases nature documentary footage using massive screens and a bespoke narration by Sir David Attenborough.
A turning point for Met facial recognition
The Metropolitan Police is set to use more ‘live facial recognition’ in London after a study found it’s getting more accurate. Though the force has been experimenting with the tech for a while, it has done so tentatively, amid concerns that it’s too inaccurate, prone to discrimination and still something of a legal grey area. That could be about to change — on Wednesday the Met publicly welcomed an independent report that found a “substantial improvement” in the accuracy of its automatic cameras. Here’s what Londoners need to know:
What exactly is this technology? Live facial recognition (LFR) cameras let the Met scan London’s crowds and streets in real time, using artificial intelligence to identify faces from a predetermined watchlist. That watchlist might be made up of wanted criminals or terrorists, but it can also include people of wider concern to the police, such as missing persons. Nearby officers receive an alert when a camera flags a person of interest and then, after reviewing the match themselves, can take action, such as making an arrest. The faces of those who aren’t matched — i.e. the random members of the public passing by — are pixelated and deleted by the system rather than being permanently saved. The cameras are designed to be portable: they’re sometimes mounted on top of poles, almost like a boom mic, and sometimes on top of police vans.
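For the technically curious, the matching step behind systems like this works roughly as in the minimal Python sketch below. This is an illustration only, assuming a generic face-embedding model; the function names, threshold value and data layout are hypothetical and are not details of the Met’s actual system.

```python
import numpy as np

# Hypothetical sketch of watchlist matching in a live facial recognition
# system. NOT the Met's implementation; the threshold and names are
# illustrative assumptions.

SIMILARITY_THRESHOLD = 0.6  # an assumed operator-tuned setting


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: list[tuple[str, np.ndarray]]) -> str | None:
    """Return the best watchlist match above the threshold, or None.

    watchlist: (person_id, reference_embedding) pairs built from the
    predetermined list of wanted or missing people.
    """
    best_id, best_score = None, SIMILARITY_THRESHOLD
    for person_id, ref_embedding in watchlist:
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    # No match: in the real system the face is pixelated and deleted,
    # not stored. A match only raises an alert; a nearby officer still
    # reviews it manually before taking any action.
    return best_id
```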
How has the Met used it? The force started trialling LFR in 2016, with the first-ever deployment being at Notting Hill Carnival. For 12 hours, cameras on poles were extended over a street and hooked up to a watchlist of 266 people that included wanted sex offenders and individuals banned from attending the carnival. Though the cameras made no positive identifications, the force was impressed with the technology’s “potential” and returned to the carnival the next year with cameras mounted on vans. This time they got their first match.
So trials continued through to 2019, with further deployments at events like Remembrance Sunday and in busy London locations like Soho and Stratford shopping centre. By the final four trials, the Met had managed to make eight arrests using the cameras. Though the technology falsely matched 1 in every 1,000 passersby, the force concluded: “The trial indicates that LFR will be an effective policing tool that helps the MPS stop dangerous offenders and make London a safer place.”
Yet by this point, scrutiny of the use of LFR by UK police forces had increased significantly. In May 2019 the UK’s data regulator warned it represented a threat to privacy and could violate data protection laws. The next month, MPs on the House of Commons Science and Technology Committee called for police forces to suspend using the technology until new regulations were in place.
But no regulations came. So the Met continued testing LFR, with deployments as recent as last summer, when cameras were used on Oxford Street following a spate of robberies. What the increased pressure did produce, though, is the new study published this week. Funded by the Home Office, it was commissioned to further test LFR’s accuracy and the possibility of discrimination. Importantly, unlike the first trials, the analysis was conducted independently of the Met, by the National Physical Laboratory.
So what did this new study find? The chance of an incorrect match is now 1 in 6,000 — substantially lower than the 1 in 1,000 seen in the first trials. It’s a significant improvement, meaning the Met is far less likely to accidentally flag innocent people while using LFR.
The study also found that the accuracy of the Met’s LFR doesn’t vary significantly by race or sex. Campaigners have previously warned that facial recognition software is less accurate for black people and women, but the study couldn’t detect a statistically significant difference of that kind. There was a key caveat, though — the result depends on the camera’s settings. If officers don’t set up a camera properly, the study found, bias could still creep in.
Are there still concerns? Some have already pointed out that a 1 in 6,000 error rate still leaves a lot of room for false flags, especially when thousands of faces are being scanned per day. “If rolled out across the UK, this could mean tens of thousands of us will be wrongly flagged as criminals and forced to prove our innocence,” said Madeleine Stone, legal and policy officer from Big Brother Watch.
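To see where that worry comes from, the back-of-the-envelope arithmetic below shows how even a 1-in-6,000 error rate scales with footfall. The daily scan figure is purely an assumption for illustration, not a number from the NPL study or the Met.

```python
# Rough illustration only: the scan volume is an assumed figure,
# not one reported by the NPL study or the Met.
false_match_rate = 1 / 6_000   # error rate reported in the new study
scans_per_day = 60_000         # assumed footfall past a busy-location camera

expected_false_flags = scans_per_day * false_match_rate
print(f"Expected false flags per day: {expected_false_flags:.0f}")  # -> 10
```

In other words, a low per-scan error rate can still add up once cameras are watching crowds of this size day after day.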
But other concerns are more philosophical and centre around the intrusiveness of the tech. Campaigners from human rights groups Amnesty, Liberty and Big Brother Watch have all spoken out against it, with Big Brother Watch branding it “Orwellian” and saying it “turns us into walking ID cards”.
There’s also the wider backdrop to how the Met is being scrutinised at the moment. With the Casey Review finding last month that the force is institutionally racist, sexist and homophobic, there’s not exactly a lot of trust in the force using the technology properly. “It’s virtually impossible to imagine that faulty facial recognition technology won’t amplify existing racial prejudices within policing,” said Oliver Feeley-Sprague, Amnesty International UK’s military, security and police director.
What now? Well, it’s clear the Met is pleased with the study. Lindsey Chiswick, the force’s director of intelligence, reacted by saying: “This is a significant report for policing, as it is the first time we have had independent scientific evidence to advise us on the accuracy and any demographic differences of our Facial Recognition Technology.”
"We commissioned the work so we could get a better understanding of our facial recognition technology, and this scientific analysis has given us a greater insight into its performance for future deployments.”
And that last bit is key — “future deployments”. It’s fair to assume the Met will now use LFR in London with even more confidence than before.