Have you noticed that progressives never like new technology that makes policing more effective? Surveillance cameras, drones, license plate readers—every time a great new tool comes along to fight crime, the wokes denounce it.
The latest example? Facial recognition technology (FRT), which can do some pretty amazing things. Forbes Magazine identifies 14 uses ranging from medical care and banking security to a replacement for passwords and theater tickets. Beyond that is the obvious value of FRT in law enforcement. The U.S. Government Accountability Office says that at least six Federal agencies “reported using FRT to generate leads in criminal investigations, such as identifying a person of interest, by comparing their image against mugshots.”
You’d think that, with public safety on everyone’s mind these days, everybody except the criminals would love FRT. But some people hate it. Let’s start with the American Civil Liberties Union. Their opposition can be summed up in this inflammatory, paranoid statement from their website: “If police are authorized to deploy invasive face surveillance technologies against our communities, these technologies will unquestionably be used to target Black and Brown people merely for existing.”
“Invasive”! “Targeting”! That’s gaslighting for you. Getting mugged is invasive. Getting shot is invasive. Having cops use FRT to solve crime is not.
Having a Gestapo out there, exploiting our images, sounds pretty horrible, akin to Hitler targeting Jews “merely for existing.” But is FRT really the same thing as the Holocaust? Come on. The ACLU also argues that FRT “can be racially biased.” Seems that the FRT algorithms can have high rates of error for Black people, especially women. “One false match,” the ACLU warns, “can lead to a wrongful arrest, a lengthy detention, and even deadly police violence.”
Of course, no one wants that. But if such a problem exists, there are readily available mitigations that should be considered before FRT is tossed onto the scrapheap, as the ACLU and others want. Algorithm accuracy can be greatly improved, reducing both error rates and racial bias. The neural network architectures and deep learning models on which FRT depends can be, and are being, refined, and artificial intelligence (AI) will be central to those improvements. Moreover, no law enforcement agency, court or jury will ever depend exclusively on FRT in indicting, prosecuting or sentencing defendants. Any defense lawyer worth her salt could successfully challenge a conviction based on FRT alone, which is why additional evidence would be required; FRT would be just one rather small part of a package of condemnatory evidence. And law enforcement agencies like INTERPOL make sure “that its officers always carry out a manual check of the conclusions of computer systems.”
Americans have mixed feelings about FRT. Black people are the most distrustful, which perhaps is understandable. According to the Pew Research Center, when asked if FRT will make policing fairer, 34% of all U.S. adults agreed, 40% of Hispanics and 36% of Whites agreed, but only 22% of Blacks did. Far fewer Blacks than Whites or Hispanics believe that FRT can be made more acceptable than it currently is, while far more Blacks think FRT would be used much more often in Black neighborhoods than in others. These fears, which are unfounded, are inspired by a paranoia among people of color that is constantly fanned by anti-police activists and the media. The New York Times, for example, ran an article headlined “A Case for Banning Facial Recognition,” while Sky News termed FRT “Orwellian” and the San Francisco Chronicle reported, in 2019, on how San Francisco “became the first city in the country to ban city use of facial recognition surveillance technology,” further feeding into minority fears. That legislation, by the way, was written by Supervisor Aaron Peskin, whose candidacy for mayor is flailing because he’s too woke even for San Francisco.
Facial recognition technology has the capacity to help the police in ways we haven’t even dreamed of. Yes, more work needs to be done to make it safer and more accurate, but that’s what Silicon Valley does: make technology better. In Oakland, where FRT has been banned since 2019 due to Rebecca Kaplan’s efforts, the first thing the new City Council should do when it takes office early next year is revoke that ban and give OPD the funds it needs to use facial recognition. The second thing it should do is eliminate the Privacy Advisory Commission and rededicate that money to the police. Nobody in Oakland government, least of all the police, is trying to invade anyone’s privacy. What we’re trying to do—most of us, anyway—is stop criminals, and we’ll use any technology we can, including FRT, to help.
Steve Heimoff