The Detroit police chief said he’s setting new policies on the use of facial-recognition technology after a woman who was eight months pregnant said she was wrongly charged with robbery and carjacking in a case that was ultimately dismissed by prosecutors.

The technology, which was used on images taken from gas station video, produced leads in the case but was followed by “very poor” police work, Chief James White said.

“We want to ensure that nothing like this happens again,” White said Wednesday.

His comments came two days after the American Civil Liberties Union of Michigan urged Detroit police to stop using the technology. The city was sued last week by Porcha Woodruff, a 32-year-old Black woman, who was arrested in February while trying to get children ready for school. There have been two similar lawsuits against Detroit.

Woodruff was identified as a suspect in a January robbery and carjacking through facial-recognition technology. She denied any role. The Wayne County prosecutor’s office said the charges were later dropped because the victim did not appear in court.

White said his officers will not be allowed “to use facial-recognition-derived images in a photographic lineup. Period.”

According to ABC News, White said two captains must review arrest warrants when facial-recognition technology is used in a case, among other changes. The new policies will be presented to the Detroit Police Board of Commissioners.

White said there must be other evidence, outside the technology, for police to believe a suspect had the “means, ability and opportunity to commit the crime.”

Woodruff spent hours in jail following her arrest.

“It’s particularly difficult when you’re talking about someone who was eight months pregnant, so we empathize with that,” White said. “We recognize we have to do better and there will be accountability on this mistake.”

Beyond Detroit, Research Suggests Police Facial Recognition Technology Can’t Tell Black People Apart

According to Scientific American, research supports fears that facial recognition technology (FRT) can worsen racial inequities in policing. Researchers found that law enforcement agencies that use automated facial recognition disproportionately arrest Black people.

The researchers suggest this results from factors including the lack of Black faces in the algorithms’ training data sets, a belief that these programs are infallible, and a tendency for officers’ own biases to magnify these issues.

Recognizing the threat to our civil liberties, cities like San Francisco and Boston banned or restricted government use of this technology.

At the federal level, President Biden’s administration released the “Blueprint for an AI Bill of Rights” in 2022.

While intended to incorporate practices that protect our civil rights in the design and use of AI technologies, the blueprint’s principles are nonbinding.

In addition, earlier this year, congressional Democrats reintroduced the Facial Recognition and Biometric Technology Moratorium Act. The bill would pause law enforcement’s use of FRT until policymakers can create regulations and standards that balance constitutional concerns and public safety.

In the U.S., most software developers are White men.

For companies, creating reliable facial recognition software begins with balanced representation among designers.

Research shows the software is much better at identifying members of the programmer’s race. Experts attribute such findings largely to engineers’ unconscious transmittal of “own-race bias” into algorithms.
