Detroit police chief says 'poor investigative work' led to arrest of Black mom who claims facial recognition technology played a role
2023-08-10 12:27
Detroit's police chief on Wednesday blamed "poor investigative work," not the use of facial recognition technology, for the arrest of a Black mother who claims in a lawsuit that she was falsely arrested earlier this year while eight months pregnant.

Porcha Woodruff, 32, was home one morning in February helping her 6- and 12-year-olds get ready for school when six Detroit police officers arrived at her door with an arrest warrant for carjacking and robbery, according to a federal lawsuit she filed last week. She was handcuffed, taken to jail and booked, the lawsuit says.

On Wednesday, Detroit Police Chief James White told reporters that it was his department's inadequate work on the case that led to the arrest.

"I have no reason to conclude at this time that there have been any violations of the DPD facial recognition policy," White said during a news conference on Wednesday. "However, I have concluded that there has been a number of policy violations by the lead investigator in this case."

White held the news conference to address the allegations Woodruff made in her filing last week stemming from the February 16 arrest.

Woodruff said she learned she was implicated in the alleged incident through a facial recognition software hit, as well as the carjacking victim's alleged identification of her in a lineup of six photos that included her mugshot from a 2015 arrest, the complaint states.

The lawsuit also alleges Detroit police engaged "in a pattern of racial discrimination of (Woodruff) and other Black citizens by using facial recognition technology practices proven to misidentify Black citizens at a higher rate than others in violation of the equal protection guaranteed by" Michigan's 1976 civil rights act.

The police chief said a detective involved in the case incorrectly presented an image generated by facial recognition technology to the victim, which violated the department's policy of not using facial recognition photos in lineups.

The facial recognition software, White said, gave the investigator dozens of possible matches for the suspect and was meant to serve as the "launch of the investigative point" for detectives to determine who should be presented in a photo lineup.

"What this is, is very, very poor investigative work that led to a number of inappropriate decisions being made along the lines of the investigation, and that's something this team is committed to not only correcting, having accountability, having transparency with this community, and in building policy immediately to ensure regardless of the tool being used, this never happens," White said.

The Detroit Police Department plans to implement three reforms in response to the incident, White said. Those include requiring an independent reason for believing that a suspect presented in a lineup could have committed the crime.

The lawsuit comes as facial recognition technology and its potential risks face growing scrutiny, with experts warning about AI's tendency toward errors and bias and the dangers of inaccurate identifications.

In recent years, researchers have cautioned against the widespread use of technologies like facial recognition that may lead to race or gender discrimination.

A 2019 study conducted by the United States government found many facial recognition algorithms were far more likely to misidentify racial minorities than White people. Native American, Black and Asian people were all disproportionately more likely to be affected, according to the study by the National Institute of Standards and Technology.

Testing found that some algorithms were up to 100 times more likely to confuse two different non-White people than two different White people, the agency said at the time. In the face of those concerns, several cities nationwide, including San Francisco and Somerville, Massachusetts, have banned the use of facial recognition by city officials.