Move over stop, frisk and identify! The power of the police state has increased exponentially with a new body camera which uses an integrated real-time facial recognition system. Currently, as TFTP reported, police have to take a picture of a person’s face to search a police-friendly company’s database for biographical information. That is no longer necessary. Now they can have identifying information long before they even make contact with the public. What could possibly go wrong?
WOLFCOM calls its new body camera the “Total Solution.” It includes cloud or on-site storage, facial recognition software, and redaction features to blur out faces that any of the roughly 1,500 police departments WOLFCOM serves choose to keep private. The new tech will afford police departments the ability to instantly search databases for arrest warrants, access missing persons reports, or pull up other information the cops might want, such as previous arrest histories.
We knew it would not be long before police departments had such capabilities. Clearview AI has been using artificial intelligence to scrape images from across the web, including both active and deactivated social media accounts, in order to find matches for a suspect’s photo. We use the word “suspect” in this case because Clearview AI is an app that is not accessible by the public; only law enforcement officers and agencies can use it. The way Clearview AI works is that a police officer must already have an image of a person, which is then submitted to the Clearview AI database. From there, the artificial intelligence takes over and uncovers the identity of the suspect.
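To make the mistaken-identity risk concrete, here is how systems of this kind generally work under the hood: each face image is reduced to a numeric “embedding” vector, and the system returns whichever database entry is most similar to the probe image, provided it clears some similarity threshold. The sketch below is a toy illustration of that general technique, not Clearview’s or WOLFCOM’s actual code; all names, vectors, and the threshold are invented for illustration.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.9):
    """Return the (name, score) of the closest database entry,
    or None if nothing clears the match threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# Toy 3-dimensional embeddings; real systems use hundreds of dimensions.
db = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.4],
}
probe = [0.88, 0.12, 0.21]  # a new face captured by a camera
print(best_match(probe, db))
```

Note that the match is always the *closest* entry, not a certain identification: two strangers whose embeddings happen to land near each other can clear the threshold, which is exactly the failure mode in the misidentification case discussed below.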
Wolfcom has apparently improved on the process by pairing body cameras with facial-recognition software to immediately identify suspects, long before a person has decided to consent to providing identifying information. In other words, the surveillance state has gained a new tool, although some police departments have been using facial recognition software for decades.
In Florida, for example, law enforcement agencies search facial recognition databases at a rate of over 4,000 times per month. While Florida has achieved a certain measure of success in identifying those who did not want to be identified, the use of the technology has come at the price of personal liberties. Not only do suspects not get a chance to consent, their lawyers are often not privy to how the police identified their client. And that information is often withheld during discovery, the process by which lawyers are supposed to be given all the relevant evidence against their clients.
In January of this year, the New York Times reported one Florida man may have lost his freedom in a case of mistaken identity after the state’s facial recognition program fingered him for selling cocaine. He claims he’s innocent. The Times writes:
Willie Allen Lynch was accused in 2015 of selling $50 worth of crack cocaine, after the Pinellas facial recognition system suggested him as a likely match. Mr. Lynch, who claimed he had been misidentified, sought the images of the other possible matches; a Florida appeals court ruled against it. He is serving an eight-year prison sentence.
According to OneZero, which used the Freedom of Information Act to gather material for its investigative report on Wolfcom, the president of the company proudly touted the company’s achievements in facial recognition technology. Founder Peter Austin Onruang reportedly told the Noble Police Department in Oklahoma:
With Realtime Facial Recognition, WOLFCOM hopes to give our friends in Law Enforcement tools that will help them identify if the person they are talking to is a wanted suspect, a missing child or adult, or a person of interest.
This push for facial recognition comes in spite of the fact that other body camera companies, like Axon, have come out against using the technology in their equipment, citing "serious ethical concerns." NEC, another facial recognition company operating around the world, has refused to sell live facial recognition technology to law enforcement for the same reason.
But the police state couldn't care less about ethical concerns. Law enforcement agencies (LEAs) will likely welcome such an advancement in technology, but we at TFTP want to ask a simple question. What could possibly go wrong? If LEAs never framed suspects, carried out assassinations, or harbored pedophiles with badges, then there would not be any cause for concern. Police state apologists like to repeat their twisted beliefs that if “you’re not breaking any laws then you have nothing to worry about” and that “there’s a bad apple in every bunch.” We, on the other hand, constantly beg to differ.
Not only do we write about entire police departments being arrested for corruption, we also cover stories about cops abusing their badge and oath to stalk and rape women. Now, those same police officers, who sometimes get corrupted by their own natures, will have a whole host of personal information at their fingertips.
More than 1,000 police officers in California have already been investigated for accessing databases when they were not legally allowed to do so. And that’s just in California. If it’s already going on there, it’s common sense to assume the practice is taking place nationwide, by police officers in every state. If and when Wolfcom’s body cameras become standard issue nationwide, people will have to kiss privacy goodbye. Search warrants and the 4th and 14th Amendments be damned!