The Detroit Board of Police Commissioners approved a policy Sept. 19 outlining the police department's use of facial recognition technology. The policy includes guidelines for how officers who abuse the technology will be disciplined and a prohibition on sharing the photos with private companies. 

Detroit police have used the software since July 2017, when the Detroit City Council approved its purchase. In June 2019, Detroit Police Chief James Craig asked the City Council to approve its permanent use. 

Controversy surrounds the technology because of issues with the systems misidentifying people with darker skin. In a test conducted by the ACLU, Amazon’s facial recognition tool Rekognition falsely matched 28 members of Congress with mugshots of people who had been arrested. The false matches were disproportionately people of color, including six members of the Congressional Black Caucus. 

Craig told The Detroit News after the approval that the decision had been discussed for long enough and that the technology's purpose is to help the police department protect the community. 

“This is about the victims,” Craig said. “We took the community’s concerns to heart. I know some have felt we were not transparent during this process, but when we purchased this $1 million software, we had a conversation with City Council … so there was nothing secret about it.”

LSA senior Hannah Agnew, president of the Student Executive Committee of the Prison Creative Arts Project, said the continued use of facial recognition technology will only increase the divide between civilians and police.

“With a study the ACLU did, there were many issues with misidentifying Black folks and women,” Agnew said. “And with the way we already over-police people of color, adding more surveillance is not going to help … People fear it and it creates distrust in the police system. We could be investing that money in services that let people prosper and that raises them up instead.” 

U.S. Rep. Rashida Tlaib, D-District 13, told The Detroit News she believed the software should be analyzed only by African Americans to avoid further misidentification of people of color.  

“Analysts need to be African Americans, not people that are not,” Tlaib said in a Detroit News video. “It happens all the time, it’s true — I think non-African Americans think African Americans all look the same. I’ve seen it even on the House floor, people calling Elijah Cummings ‘John Lewis,’ and John Lewis ‘Elijah Cummings,’ and they’re totally different people.” 

Craig told The Detroit News he believes Tlaib’s comments were insulting, adding that all department officers and civilian employees receive bias training and should not be barred from jobs involving the software. 

“That’s something we train for, and it’s valuable training, but to say people should be barred from working somewhere because of their skin color?” Craig said. “That’s racist.”

Law student Michael Goodyear, editor-in-chief of the Michigan Technology Law Review, said that though the controversy surrounding the technology is valid, the focus should be on the software itself rather than on those analyzing its results. 

“I think she (Tlaib) has a valid point, for sure, that visualization technology can be helpful, but it does have shortcomings that are particularly related to recognizing people of color,” Goodyear said. “That said, it is from a more technical standpoint, so it should mean reworking the algorithm, making sure the algorithm correctly identifies individuals, which obviously can have some biases from the coders themselves. But I think it is a maybe another separate move from what Representative Tlaib had said in those comments. So it’s not necessarily the person who’s using the algorithm, but the algorithm itself.” 

Facial recognition technology is used in everyday life through Facebook photo tagging, airport security and even online dating applications. Goodyear pointed out it is also being used by other governments for broader surveillance purposes, but emphasized a need for regulation similar to Detroit’s in those situations. 

“The Chinese government has been using facial recognition technology to track the movements of certain Uighur groups in western China … that’s kind of the far, not great side of this technology,” Goodyear said. “The Chinese and Hong Kong governments have been using it in Hong Kong to actually track jaywalking in Hong Kong, which is maybe a little bit ‘big brother’. So it is important to — kind of like in what this ordinance is doing now — to draw boundaries of what’s acceptable behavior and what’s not acceptable.”

If facial recognition technology were to be implemented in Ann Arbor, Agnew said it would have similarly detrimental effects on the community.

“Anywhere you implement this it won’t have a good outcome because of over-policing and over-surveilling,” Agnew said. “Ann Arbor likes to think of itself as a very liberal city, and this would be a way to police the already small numbers of people of color we have here.”

If the software were to be implemented in Ann Arbor, Goodyear said he hoped it would come with clear regulations and limitations developed in part through community engagement. He pointed out Ann Arbor’s smaller size and said the roles of the Ann Arbor Police Department and the University police would change how such technology could be implemented compared to Detroit. 

“I think they’d definitely implement it in a different way,” Goodyear said. “But hopefully they’d do it in a similar way to Detroit, where they have public forums that allow things to be discussed publicly and actually create regulations to limit any sort of things might be going on like in China, or Hong Kong for example.”

Deputy Chief of Police Melissa Overton of the University’s Division of Public Safety & Security said the department has not discussed using facial recognition technology yet.

“We are always reviewing the latest technology that would assist law enforcement in solving crime, however, we have not discussed facial recognition at this time,” she wrote in an email to the Daily.

The Ann Arbor Police Department did not respond to requests for comment in time for publication.

With regard to the primary controversy surrounding the software, Goodyear said he believes the technology still needs improvement. However, he emphasized that facial recognition technology could not be used on its own to indict or imprison a suspect. 

“They should be going through and making sure that it’s absolutely accurate,” Goodyear said. “But … no one’s going to be arrested and thrown in jail, indicted, based purely on their picture. Facial recognition technology so far is one factor. So under standard evidence, you need to have a variety of different things that kind of show that someone did something. In this case, it would be good evidence, but in and of itself, it’s not enough.”

Agnew argued the continued use of the technology would have no positive outcomes for either victims or those incarcerated. 

“With the criminal justice system we tend to disproportionately target people of color,” she said. “And that often doesn’t happen in a way that helps victims. They’re saying, ‘We’re going to use this system of surveillance to help victims.’ But how is it going to help victims? It’s not providing support or help to victims to deal with what happened, it’s just another way to police people.”
