What is live facial recognition?
Live facial recognition (LFR), also known as automatic facial recognition, identifies people in live video in real time, using a set of photographs as a reference. When used in public, cameras scan the crowd and the software highlights any matches between members of the public and the people in its database.
How does live facial recognition work?
The live video feed is scanned for faces. Each face that is found is then mapped by the software, taking measurements of facial features, such as the distance between the eyes and the length of the jawline, to create a unique set of biometric data. This dataset is then compared to a database of people to be identified; for the police, this database contains people with outstanding warrants. If the system judges the face to be sufficiently similar to someone in its database, this match is highlighted.
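The pipeline described above, reducing a face to measurements and comparing it against a watchlist, can be sketched in a few lines of Python. The names, feature vectors and similarity threshold here are invented for illustration; real systems use much richer templates learned by a neural network.

```python
import math

# Hypothetical biometric templates: each face reduced to a short vector of
# normalised facial measurements (eye distance, jawline length, and so on).
WATCHLIST = {
    "suspect_a": [0.42, 0.91, 0.33, 0.58],
    "suspect_b": [0.75, 0.12, 0.66, 0.40],
}

THRESHOLD = 0.95  # similarity needed before an alert is raised (illustrative)

def cosine_similarity(u, v):
    """Similarity between two templates: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def match_face(template):
    """Compare one detected face against every watchlist entry and return
    the best match if it clears the threshold, otherwise None."""
    best_name, best_score = None, 0.0
    for name, reference in WATCHLIST.items():
        score = cosine_similarity(template, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= THRESHOLD else None

# A face very close to suspect_a's template triggers an alert...
print(match_face([0.41, 0.90, 0.34, 0.57]))
# ...while an unrelated face does not.
print(match_face([0.10, 0.20, 0.95, 0.05]))
```

The threshold is the crucial tuning knob: set it too low and the system floods officers with false alerts; set it too high and it misses the people it is looking for.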
Who is using live facial recognition?
In the UK, the London Metropolitan Police (the Met), the South Wales Police and Leicestershire Police have all trialled the technology in public since 2015. The system was tested at Download Festival in 2015, the Champions League final and Notting Hill Carnival in 2017, among other events, with the Met’s final test taking place on 14 February 2019.
In these cases, the database consists of photos of people wanted by the police or courts. If the system makes a match, it presents the police with both images, so they can decide whether to stop and speak to the person. Unmatched faces are deleted straight away, and matched images are deleted after 30 days.
Other facial recognition systems are already in use in the UK. In 2004, EU countries began to incorporate biometric data into new ePassports, identifiable by a small, gold camera logo on the front cover. These became available in the UK in 2006, and the microchip embedded in the cover contains both the holder’s personal information and photograph.
In many airports across the UK and Europe, as well as Eurostar terminals in Paris and Brussels, travellers can verify their identity with the facial recognition systems in automated ePassport gates in immigration halls.
Trials of facial recognition software were also carried out in three prisons in an attempt to combat drug smuggling. HMP Hull saw a 40 per cent drop in visitors during this time, and a spokesperson for the Ministry of Justice described this as a successful deterrent.
Amazon’s facial recognition system, Rekognition, has been tested by police forces in the US, including Orlando Police Department in Florida and Washington County Sheriff’s Office in Oregon. It was reported in July that the system had been abandoned in Orlando after 15 months of unsuccessful trials. The FBI and Immigration and Customs Enforcement (ICE) have also used the technology to identify undocumented immigrants from their driving licence photographs.
Is live facial recognition legal?
Currently, there is no UK legislation specifically regulating facial recognition technology. Although Home Secretary Sajid Javid has supported the police trials, he told the BBC that legislation would have to be put in place before the technology could be used long-term.
The House of Commons Science and Technology Committee published a report in July calling for a moratorium on all facial recognition technology until legislation has been put in place.
Meanwhile, San Francisco and Oakland in California, and Somerville in Massachusetts, have all banned the use of the technology.
How effective is facial recognition?
The issues raised by the committee’s report include the technology’s effectiveness. For example, at the 2017 Champions League final, the facial recognition system flagged 2,470 matches to individuals on the police’s database. Only 173 of these were correct, an error rate of more than 90 per cent. When the Home Office tested the Police National Database’s system in 2015, it found the system to be half as effective as a human, and a University of Essex study of the Met’s system found it to be correct only 19 per cent of the time.
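The Champions League figures make the arithmetic easy to check, using only the numbers quoted above:

```python
alerts = 2470        # faces the system flagged as matches at the 2017 final
true_matches = 173   # alerts that turned out to be correct

false_alerts = alerts - true_matches
error_rate = false_alerts / alerts
print(f"{false_alerts} false alerts, error rate {error_rate:.0%}")
```

A system can therefore generate thousands of alerts while being wrong in the overwhelming majority of cases, which is why raw match counts say little on their own.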
Facial recognition systems also have the potential to be biased. If the database that ‘trains’ them is predominantly white and male, it will be much more effective at distinguishing white, male faces. The risk is that women and ethnic minorities are much more likely to be falsely identified. A 2019 paper from MIT found that Amazon’s Rekognition had a 0 per cent error rate among light-skinned men, which rose to over 30 per cent for dark-skinned women.
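One standard way to surface this kind of bias is to break error rates down by demographic group rather than reporting a single overall figure, which is essentially what the MIT audit did. The numbers below are invented purely to show the mechanics:

```python
# Invented evaluation results: (group, wrong identifications, faces tested)
results = [
    ("light-skinned men",    0, 200),
    ("light-skinned women",  8, 200),
    ("dark-skinned men",    12, 200),
    ("dark-skinned women",  62, 200),
]

# Per-group error rates expose gaps that a single headline number hides.
for group, errors, total in results:
    print(f"{group:>20}: {errors / total:.1%} error rate")

# The overall average looks far more reassuring than the worst group.
overall = sum(e for _, e, _ in results) / sum(t for _, _, t in results)
print(f"{'overall':>20}: {overall:.1%} error rate")
```

In this toy data the overall error rate is around 10 per cent, even though the worst-served group experiences errors three times as often, which is exactly the pattern the MIT researchers reported for real systems.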
Can we stop our biometric data being collected?
The Met’s website says that “Anyone can refuse to be scanned; it’s not an offence or considered ‘obstruction’ to actively avoid being scanned.” However, the BBC reported in May that one man was fined £90 for disorderly conduct after attempting to cover his face during a trial of the system in East London.