Should Facial Recognition Technology Be Used In Law Enforcement?
Facial recognition technology (FRT) catches the bad guy on virtually every television program and movie about policing. It positively identifies criminals and saves society from further wrongdoing by getting these villains off the streets. In real life, though, the technology isn't nearly so slick, despite the big money it brings to developers. It's worth taking a closer look at the problems associated with using facial recognition technology.
The FBI is the primary organization using FRT to identify and capture persons suspected of wrongdoing; its databases hold well over 400 million images. Additionally, more and more local police departments are turning to this technology, with big cities like Chicago, New York City, and Orlando leading the way.
Potential for Misuse
Citizens, politicians, and even law enforcement officers have plenty of concerns. For instance, the fear that surveillance will be misused is very real. In China, people guilty of petty crimes are already being arrested thanks to widespread surveillance and quick identification using this technology. That makes some anxious about potential abuses of civil liberties and basic privacy rights. Accuracy and accountability are essential, particularly where the justice system is involved. Which brings us to the next issue.
Questionable Accuracy
While studies show that under optimal conditions FRT is as reliable as iris scanning, the key phrase here is under optimal conditions. In the real world, where lighting, camera angles, and footage quality vary from one time and place to another, the error rate rises to over nine percent. In fact, while images captured at airport boarding gates achieved accuracy rates of 94.4 percent, images of people walking through a sports venue scored as low as 36 percent, due mostly to camera angles. Another issue is aging: facial recognition errors grow by a factor of ten when photos are compared with faces that have aged over many years. Vendor reliability is also worth considering. Studies indicate that different vendors using the same cameras produced very different accuracy rates, because the underlying technology varies that much from company to company.
It turns out some groups are more susceptible to errors than others. For example, one study of gender-recognition technology found that darker-skinned women were misidentified at a rate of nearly 35 percent, roughly 50 times the error rate for white men. The highest error rates are for Native Americans, followed by Asian and Black women.
A Few Mistakes?
If the statistics don’t terrify you, perhaps drilling down to real instances of error will get your attention. One facial analysis program identified Oprah Winfrey as male (we all know she’s a woman, right?). Another matched members of Congress to criminal mugshots. Still another identified a university student as a suspect in a bombing in Sri Lanka, a mistake that resulted in death threats and real upheaval in the student’s life.
Lobo Law Fights for Justice
Justice must be the bottom line in this country. When unregulated identification systems that lack standards for accuracy are used to arrest citizens, can that really be justice? If you believe your arrest involved misconduct or error, you need aggressive and sometimes creative defenders. At Lobo Law, you’ll get nothing less. Contact our Las Vegas criminal defense lawyers today for a confidential consultation about your circumstances.