The “Mythification” of Facial Recognition Continues
- Michael Terry

- Sep 28, 2019
- 5 min read
Recently, a USA Today article appeared in one of my news feeds, entitled “Researchers call facial recognition ‘imperfect’.” You might think I’m about to herald the merits of a technology, but I am not. Instead, I am calling out the weakness of would-be reporters who write from a press release without doing any actual journalism. This is the era of the “narrative,” after all. More importantly, there is an outcome being challenged here for a variety of reasons and motivations.

The article “exposed” the same factoids found in other pieces, claiming that facial recognition technology is inaccurate, though only as it relates to one vendor’s solution. There are other, similar technologies in use around the world that are being lumped into the same arguments. The same organizations are referenced, attributing the same possibility that our civil liberties will fall to an automated system connecting our images to criminal acts, all based upon a “could be” scenario. Here’s one statement:
“According to Pew Research, data scientists Stefan Wojcik and Emma Remy, ‘These systems can fail in ways that seem difficult to understand and hard to predict – such as showing higher rates of error on the faces of people with darker skin relative to those with lighter skin, or classifying prominent members of Congress as criminals.’”
One system was tested, and a common argument ties the limits of that technology to the labeling of members of Congress as “criminals.” I am not disputing the research, but there is other research and there are other systems, which undercuts generalizing arguments intended to reach one conclusion. Because something “can” happen does not mean it “will” happen. Why? Law enforcement is not an unleashed beast ready to destroy the lives of those in its path. Law enforcement is highly regulated and governed, as is the rest of the judicial system.
I’ve made this point before, and I think it bears repeating. I do believe that if five “guilty” people go free so that one truly innocent person is not convicted, our judicial system works. Now I concede that there are likely innocent people in jail at this very moment, and that our criminal justice system, while imperfect, needs to continually improve. The majority of those working in this arena do care about getting it right.
In application, and this is what really matters, any facial recognition technology used in public safety ONLY produces a lead. Hordes of police aren't going to descend upon you because you look like someone. Law enforcement has a steep climb before investigating a lead which may be associated with a criminal act.

As a biometric, facial recognition is not a fingerprint or DNA. There are degrees of accuracy to be reached before law enforcement would include a person in an investigation. Other attributes would have to be assessed before an officer makes contact with an individual. Because you look like someone doesn’t mean you committed a crime, and law enforcement knows this. Simply put, you are not going to be detained or arrested over a similarity. The police have to do more, and they should do a great deal more. We haven't reached the premise of the film "Minority Report".
The failure of facial recognition technology is a dead horse that has been beaten continually, yet the same narratives repeatedly find their way into our newspapers from journalists who aren’t digging, aren’t investigating, aren’t asking critical questions, aren’t seeking countering arguments; reporters who aren’t conducting any real journalism. This is opinion rather than reporting. The narrative is treated as stronger and more worthy of print than actual facts and alternative viewpoints that would allow readers to form their own opinions. Instead, a reporter serves us their opinion so we may call it our own.
What I am writing is a narrative, though I am not a journalist and am not pretending to be one. What I am trying to suggest, in my opinion, is what a narrative is about and, in this context, why it is wrong. A narrative isn’t really fact-based, though facts can be used to prove a correlation, support a hypothesis, or frame an argument. A narrative is a story a writer wants you to believe. In the arguments surrounding facial recognition, other information is readily available, and one would think that seemingly respected journalistic brands would prefer the higher standard of presenting all sides rather than one.
The message I am hearing from the “anti-facial recognition” narratives is that facial recognition is bad if law enforcement uses it. These same narratives are intended to stoke fear in the minds of readers who aren’t as informed. After all, it is surveillance, an onerous invasion of our privacy by government. Oh, it also is an effective tool for finding people alleged to have committed crimes, because it does work. By the way, have you verified your social media accounts or confirmed a payment using facial recognition yet? Right, but that isn’t an invasion of our privacy.
Here’s an example:
In New York City, someone planted rice cookers in subway stations. I have almost no experience with subways, but I get the point: these are high-traffic areas, and given other terrorist events, a rice cooker is the one thing that likely doesn’t belong in a subway and would reasonably cause fear among those who see it. This event called upon the talents of local law enforcement, who used facial recognition to correctly and quickly identify the alleged suspect. It worked! Facial recognition technology worked as a law enforcement tool, and not one member of Congress was included in the investigation.

For procuring entities considering facial recognition technology, it is within the entity’s authority to arm itself with facts, the totality of research, data, application models, use cases, and examples of policy application before entering a contract. Relying only on conjecture, fear, hyperbole, and narratives isn’t public policy; it is bad public policy. Perhaps after serious deliberation, facial recognition is a horrible idea, which may be a fine outcome for that agency. The deliberation itself should be serious and not primarily influenced by other agendas, and that is the real standard agencies should achieve.
For the vendors and end users, there is a solution.
Let’s come together and form a robust and effective collective to communicate what the technology can and cannot do, and what it is and what it is not. Beyond that, let’s work collaboratively to create the technologies public safety agencies need to best serve the public. Through collective and collaborative action, narratives and myths can remain just that: narratives and myths.
Instead, we can arm others with the truth.
Michael R. Terry is the COO and National Government Relations Director for Government State and Local Partners LLC, an Austin-based government affairs and business-to-government technology ventures firm. www.gslptexas.com
