Facial Recognition Now a Routine Policing Tool in the "Surveillance State" that Incorrectly Identifies Non-White People More Frequently than Whites

From [HERE] In August 2017, a woman contacted the Arapahoe County Sheriff’s Office in Colorado with what seemed like a simple case: After a date at a bowling alley, she’d discovered $400 missing from her purse and asked the manager to review the surveillance footage, which showed her companion snatching the cash while she bowled a frame.

But despite the clear evidence, the search for the bowling companion floundered. The woman knew only his first name. He’d removed his profile from the dating site on which they’d met. His number, now disconnected, was linked to a hard-to-trace “burner” phone. Security video captured his car in the parking lot, but not its license plate.

The investigator, Tara Young, set the case aside to work on others. It sat on a shelf until early 2018, when she ran into a colleague who was testing out the department’s new facial recognition system.

Young gave the officer a picture of the bowling companion taken from the victim’s cellphone. He plugged it into the software and up popped a mugshot of a man who looked a lot like the date thief.

It was Young’s first experience with facial recognition, one of the most powerful and controversial technological innovations of the 21st century. It gave her dormant case new life and showed her the technology’s potential to transform policing.

Her investigation “would have been at a dead end without the facial recognition,” Young said. “It’s huge.”

A disputed tool goes mainstream

The technology-driven revolution in policing is unfolding in big cities and small communities around the country, as more police departments purchase facial recognition software. The government “facial biometrics” market — which includes federal, state and local law enforcement — is expected to soar from $136.9 million in 2018 to $375 million by 2025, according to an estimate by market research firm Grand View Research. Driven by artificial intelligence, facial recognition allows officers to submit images of people’s faces, taken in the field or lifted from photos or video, and instantaneously compare them to photos in government databases — mugshots, jail booking records, driver’s licenses.
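At a very high level, the matching these systems perform can be illustrated with a short sketch: a neural network reduces each face photo to a numeric "embedding" vector, and the probe embedding is compared against every embedding enrolled from mugshots, booking photos or licenses to return the closest candidates. The 128-dimension embeddings, the cosine-similarity measure and the threshold below are assumptions for illustration only, not details of any vendor's product or any police system described in this article.

```python
import numpy as np

def rank_candidates(probe, gallery, ids, threshold=0.5, top_k=5):
    """Return the gallery identities most similar to the probe embedding.

    probe:   1-D embedding vector for the unknown face (e.g. 128 floats).
    gallery: 2-D array, one embedding per enrolled photo.
    ids:     identity label for each gallery row.
    Matches are ranked by cosine similarity; only scores at or above
    `threshold` are returned, best first.
    """
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    sims = gallery @ probe                      # cosine similarity per row
    order = np.argsort(sims)[::-1][:top_k]      # best candidates first
    return [(ids[i], float(sims[i])) for i in order if sims[i] >= threshold]

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))                 # 1,000 enrolled photos
ids = [f"booking_{i}" for i in range(1000)]
probe = gallery[42] + rng.normal(scale=0.1, size=128)  # noisy copy of one photo
print(rank_candidates(probe, gallery, ids))
```

In practice the gallery can hold millions of photos and the candidates are reviewed by an analyst rather than treated as an automatic identification; the sketch only shows the ranking step.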

Unlike DNA evidence, which is costly and can take a laboratory days to produce, facial recognition requires little overhead once a system is installed. The relative ease of operation allows officers to make the technology part of their daily work. Rather than reserve it for serious or high-profile cases, they are using it to solve routine crimes and to quickly identify people they see as suspicious.

But these systems are proliferating amid growing concern that facial recognition remains prone to errors and allows the government to expand surveillance of the public without much oversight. Artificial-intelligence and privacy researchers have found that the algorithms behind some systems incorrectly identify women and people with dark skin more frequently than white men. While some agencies have policies on how facial recognition is used, there are few laws or regulations governing what databases the systems can tap into, who is included in those databases, the circumstances in which police can scan people’s photos, how accurate the systems are, and how much the government should share with the public about its use of the technology.
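The disparity researchers describe is typically measured by running the same algorithm over trial comparisons drawn from different demographic groups and comparing the false-match rates, i.e. how often two photos of different people are wrongly declared a match. The sketch below shows that bookkeeping in its simplest form; the group labels, threshold and synthetic trial data are illustrative assumptions, not figures from any published audit.

```python
import numpy as np

def false_match_rate(scores, same_person, threshold):
    """Fraction of different-person comparisons the system wrongly accepts.

    scores:      similarity score for each trial comparison.
    same_person: True where the two photos really show the same person.
    threshold:   score at or above which the system declares a match.
    """
    impostor = ~same_person                    # different-person trials
    if impostor.sum() == 0:
        return float("nan")
    return float(((scores >= threshold) & impostor).sum() / impostor.sum())

def fmr_by_group(scores, same_person, groups, threshold=0.6):
    """Compute the false-match rate separately for each demographic group."""
    return {
        g: false_match_rate(scores[groups == g], same_person[groups == g], threshold)
        for g in np.unique(groups)
    }

# Synthetic trials, purely to show the calculation.
rng = np.random.default_rng(1)
n = 10_000
groups = rng.choice(["group_a", "group_b"], size=n)
same_person = rng.random(n) < 0.1
scores = np.where(same_person, rng.normal(0.8, 0.1, n), rng.normal(0.4, 0.15, n))
print(fmr_by_group(scores, same_person, groups))
```

A system is considered equitable on this measure only when the false-match rates are comparable across groups at the same threshold; the studies cited by researchers found they often are not.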

Police praise the technology’s power to improve investigations, but many agencies also try to keep their methods secret. In New York, the police department has resisted attempts by defense attorneys and privacy advocates to reveal how its facial recognition system operates. In Jacksonville, Florida, authorities have refused to share details of their facial recognition searches with a man fighting his conviction for selling $50 of crack. Sometimes people arrested with the help of facial recognition aren’t aware that it was used against them.

Because police don’t treat facial recognition as evidence for presentation in court, the technique does not often turn up in public documents and has not been the subject of many judicial rulings. Its use and spread are difficult to track.

The companies that build the technology are also grappling with the implications of its use. Amazon has given its facial recognition system to police departments to try out, sparking protests from employees, shareholders and artificial intelligence researchers. Microsoft says it has resisted requests to sell its products to police, and has called for government regulation. Axon, the largest maker of body cameras in the United States, has taken out patents for facial recognition applications but says it is not pursuing them as it consults with an artificial-intelligence ethics board.

At the same time, companies are creating even more advanced systems that will allow police to identify people from live video footage, such as body cameras, rather than just still images. It is only a matter of time before such technology is available for police to buy. [MORE]