Kenny Long recalls that when the London Metropolitan Police first identified his talent for spotting faces, back in 2015, he had an interesting gut reaction to being called a “super-recognizer.”
“It sounds like a rubbish superhero,” he told his commanding officer.
But in law enforcement circles in London, and with British intelligence, it's no joke. Long's talent, the ability to put names to faces, even of people he hasn't seen in years, is akin to a superpower. It's led to him linking offenders to multiple crimes and has resulted in court convictions. New Scotland Yard even recruited Long to a special unit, though he now runs his own company to identify and train super-recognizers around the world.
You might recognize your neighbors if you glimpsed them in an unfamiliar context, but you might not have the observational skills to identify a hooded figure in a blurry video as the same person who sat opposite you on the train last week. A small handful of people can, though, and this is what separates super-recognizers from the rest of us.
The drawback is the scarcity of people who qualify as super-recognizers. The ability to recognize an extraordinary number of faces is a cognitive quirk first identified by researchers at Harvard University and University College London in 2009, and one that’s thought to affect only 2 percent of the population. There’s nothing you can do to teach yourself this skill, according to Josh Davis, a reader in applied psychology at the University of Greenwich who studies super-recognizers. You’re either born with it or you aren’t. (There’s even an online test you can take.)
Super-recognizers are an oddity in law enforcement at a time when government agencies and businesses are pushing forward with facial recognition technology. The technology, which boils the features of your face down to a numerical code, is increasingly widespread, even powering the Face ID feature on the iPhone. But civil liberties groups are concerned about invasion of privacy and authoritarian overreach.
In the midst of this debate, super-recognizers offer a human alternative or an opportunity to complement the technology. That’s if the humans and machines can get along.
London’s Metropolitan Police employ a special team of super-recognizers who scour CCTV footage of crime scenes from all over the British capital looking for familiar faces. It might seem like a tedious and archaic way of doing things, but the accuracy of super-recognizers is such that it makes them a worthwhile investment for law enforcement.
At the same time, police forces around the world are embracing the use of facial recognition algorithms to identify criminals from crime scenes or find specific faces in a crowd. The technology promises faster and more efficient identification, comparing faces captured at a scene against watchlists of suspects, much as a DNA sample is checked against a database.
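The watchlist comparison described above can be sketched in a few lines of code. In this purely illustrative example, each face has already been reduced to a short numerical vector (an "embedding"); the names, vectors, and the `match_watchlist` helper are hypothetical, not drawn from any real police system.

```python
import math

def distance(a, b):
    """Euclidean distance between two face embeddings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_watchlist(probe, watchlist, threshold):
    """Return the closest watchlist name within the threshold, or None.

    probe: embedding of the face captured at the scene.
    watchlist: dict mapping names to stored embeddings.
    """
    best_name, best_dist = None, float("inf")
    for name, embedding in watchlist.items():
        d = distance(probe, embedding)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Toy example: three enrolled suspects, one probe face from CCTV.
watchlist = {
    "suspect_a": [0.1, 0.9, 0.3],
    "suspect_b": [0.8, 0.2, 0.5],
    "suspect_c": [0.4, 0.4, 0.9],
}
probe = [0.12, 0.88, 0.31]
print(match_watchlist(probe, watchlist, threshold=0.2))  # suspect_a
```

Real systems use far higher-dimensional embeddings produced by neural networks, but the principle is the same: the probe face becomes a point in space, and matching is a nearest-neighbor search over the watchlist.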
Rather than some John Henry, man-versus-machine dynamic, there’s an opportunity for super-recognizers and computers to work together. A major study published in the journal PNAS last year concluded that “optimal face identification was achieved only when humans and machines worked in collaboration.”
Super-recognizers, and the experts who work with and study them, agree.
“Let’s say the police had a credible threat that someone was going to set something off in London and you had the computer systems trawling through vast amounts of feeds from CCTV — it would be great to have a bunch of super-recognizers looking at all the potential matches that are flagged up,” said Davis.
The all-seeing eye of the law
The Met Police facial recognition trials have just concluded, and the agency is set to publish the results next month.
But the trials have already sparked outrage among privacy activists and human rights groups such as Big Brother Watch and Liberty, in part because of a perceived lack of transparency about when and where they’re taking place.
“Live facial recognition is a form of mass surveillance that, if allowed to continue, will turn members of the public into walking ID cards,” Big Brother Watch’s director, Silkie Carlo, said in a statement to CNET.
One particular criticism of the UK police trials is that they haven't even been successful. Back in 2017, when the Met Police tested the technology at the Notting Hill Carnival, it was found to be wrong 98 percent of the time, according to an analysis conducted by Big Brother Watch.
The Met Police stands by the trial.
“Tackling violent crime is a key priority for the Met, and we are determined to use all emerging technology available to support standard policing activity and help protect our communities,” a spokeswoman for the force said in a statement.
The South Wales police force has attempted to increase transparency around its own trials by publishing details online about the number of alerts from the facial recognition technology and the number of correct matches made.
Professor Martin Innes, along with his fellow researchers at Cardiff University, conducted an independent review into the South Wales trial, which took place in live environments but also used existing CCTV footage of crime scenes.
Innes’ review found that though facial recognition technology can help police identify persons of interest and suspects when they otherwise probably couldn’t have, considerable investment and changes to police operating procedures are required to generate consistent results.
Rather than thinking of AFR as “automated facial recognition technology,” he prefers to think of it as “assisted facial recognition technology,” he said. “At this point in time our sense is that it’s probably best put alongside some sort of human operator.”
Superhumans and supercomputers
Beyond finding and training super-recognizers, Long assists a facial recognition technology company called Digital Barriers.
Long worked with Digital Barriers to conduct trials at the Brit Awards and the National Television Awards in London earlier this year, where he used his skills as a super-recognizer to confirm that its facial recognition software was correctly identifying people.
“It’s very important to have human verification, because facial recognition technology is absolutely outstanding, but you always need someone at the end to confirm it,” said Long.
For him, it makes sense that to get the most out of the technology, you’d use people with his skill set to get a positive ID on a suspect. “Why would you want such fantastic technology without the right people using it?” he said.
When it comes to identifying suspects in a live environment, the two can perform very different functions, with the software filtering massive amounts of live data, and the police officer who’s familiar with the suspect making a call on the correct match.
Facial recognition systems have accuracy thresholds that, depending on the particular circumstances, can be adjusted to cast a wide net rather than be specific, explained Davis. In some of the trials, the bar has been set purposefully low to scoop up a wide range of potential matches, he said, adding that this has contributed to some of the media coverage of success rates being overly critical.
If you’re one of the people mistakenly arrested as part of the trial and later released without charge, you probably have a different view on this. But according to Davis, there may even be times when it’s in the police’s favor to have a wide range of false positives — for example if they’re searching for terrorist suspects in real time and the stakes are high. “It doesn’t matter if the system flags up false alarms for completely innocent people, because you just need to find that one suspect,” he said.
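The trade-off Davis describes can be illustrated with the same kind of toy embeddings: lowering the match threshold widens the net, flagging more innocent bystanders along with the genuine target. Everything here is a hypothetical sketch of the trade-off, not any deployed system.

```python
import math

def distance(a, b):
    """Euclidean distance between two face embeddings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def flag_matches(crowd, target, threshold):
    """Return the ids of all crowd faces within `threshold` of the target."""
    return [pid for pid, emb in crowd.items()
            if distance(emb, target) <= threshold]

# One genuine suspect plus bystanders whose faces are merely similar.
crowd = {
    "suspect":    [0.50, 0.50],
    "bystander1": [0.55, 0.48],
    "bystander2": [0.70, 0.30],
    "bystander3": [0.10, 0.90],
}
target = [0.50, 0.50]

# A strict threshold flags only the suspect; a loose one
# sweeps up false alarms for human reviewers to dismiss.
print(flag_matches(crowd, target, threshold=0.01))  # ['suspect']
print(flag_matches(crowd, target, threshold=0.30))  # suspect plus two bystanders
```

Setting the bar low, as in the second call, is exactly the "wide net" Davis describes: acceptable when missing the one real suspect is the costlier error, and tolerable only if a human, ideally a super-recognizer, reviews every flag.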
Super-recognizers don’t do a huge amount to challenge the privacy concerns raised by facial recognition technology. In fact, as the London Mayor’s Policing Ethics Panel report published in July 2018 points out, they pose some of the same problems. But the increased accuracy they bring to detection methods could potentially help strengthen the case for police use of technology.
Somewhere, at the end of the line, a human will have to make a choice, and for experts like Davis, it makes sense if those humans are super-recognizers.
“The computer can’t make a final decision,” he said. “But it can give you information to make a final decision.”