The use of facial recognition technology is often controversial because of privacy and bias concerns, and some even suggest it should be banned. But ethical considerations aside, is it really all that effective?

Apparently not.

U.S. Customs and Border Protection is one user of facial recognition technology. In 2020, a year that saw a drop in travel amid the COVID-19 pandemic, CBP ran some 23 million facial scans, and the technology identified exactly zero imposters: travelers attempting to enter the U.S. at airports using a false identity.

The agency disclosed the results in its annual report on trade and travel, published earlier this month. The points of entry where CBP used the technology included airports, seaports and pedestrian crossings.

Although the facial recognition technology did detect fewer than 100 imposters at pedestrian crossings, as Gizmodo noted, the overall results point to one of two things: very few imposters use fake credentials to enter the U.S., or the CBP system is highly ineffective at finding them.

In late 2020, OneZero reported that the Government Accountability Office “lambasted” CBP over lackluster accuracy audits, poor signage notifying the public that the technology is in use, and scant information on how the system works. The GAO also noted that CBP didn’t audit the technology’s efficacy until March 2020 and has put further audits on hold because of COVID-19.

The lack of detections could be read as a sign that the facial recognition technology is not working as well as it should, though some argue that other factors may be at play.

“We should not assume that the CBP facial recognition tools have failed simply by a lack of imposter identification, as this may simply be the result of fewer individuals attempting to enter the country as a result of COVID,” Stuart Sharp, vice president of technical services at identity and access management service provider OneLogin Inc., told SiliconANGLE. “Nevertheless, while biometrics have a role to play in identification, it does face significant limitations.”

When comparing a facial or fingerprint scan to the stored value, the system accepts a degree of variation, Sharp explained. “This is called the False Acceptance Rate (FAR) metric, which is the probability that the system will incorrectly identify a user as valid,” he said. “Realizing that facial recognition is simply verifying that the scan is ‘similar’ to the stored image, you can see that there is a real risk that the CBP tools are not detecting skillful imposters.”
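
Sharp's point can be illustrated with a minimal sketch, not CBP's actual system: a face-matching system accepts a scan when its similarity to the stored image clears a threshold, and the FAR is the fraction of impostor comparisons that clear that threshold anyway. The cosine-similarity measure, the threshold values and the random "embeddings" below are assumptions chosen purely for illustration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """Accept the traveler if the live scan is 'similar enough' to the stored image."""
    return cosine_similarity(probe, enrolled) >= threshold

def false_acceptance_rate(impostor_scores: np.ndarray, threshold: float) -> float:
    """FAR: fraction of impostor comparisons the system wrongly accepts."""
    return float(np.mean(impostor_scores >= threshold))

# Toy data: uniform random scores stand in for a real model's impostor similarities.
rng = np.random.default_rng(0)
impostor_scores = rng.uniform(0.0, 1.0, size=10_000)

for threshold in (0.5, 0.6, 0.7):
    print(f"threshold={threshold:.1f}  FAR={false_acceptance_rate(impostor_scores, threshold):.3f}")
```

The trade-off the sketch makes visible: raising the threshold lowers the FAR but also increases the chance of rejecting legitimate travelers, which is why a system tuned for smooth passenger flow can leave room for a skillful imposter to slip through.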

Josh Bohls, founder of secure content capture company Inkscreen LLC, said the system may not have identified anyone using false credentials. “But I suspect the agency considers it a huge success in that it was able to collect a massive trove of images of visitors that can be directly linked to their passports,” he said. “This is the holy grail of artificial intelligence and machine learning system training data and will serve to improve the system dramatically over time.”

Originally published at Silicon Angle