Re: The Detroit News’ Oct. 3 editorial, “Rashida Tlaib should wake up”: My eyes are wide open. I’m not asleep at the wheel as I continue to represent the 13th Congressional District.
I’m going to call out every injustice I see. It’s probably what makes most people uncomfortable when I speak the truth. My comments weren’t racist, out of order, or “inappropriate.” It is inappropriate to implement a broken, flawed and racist technology that doesn’t recognize black and brown faces in a city that is over 80% black.
I’m not going to mince words when my residents are threatened. The Detroit News should not take Detroit Police Chief James Craig’s bait and help him to distract from the fact that broken technology is being used to lock up black and brown people in Detroit. While the continued fascination with my every word is flattering, it will not get me to back away from the truth.
The people who will be wrongly identified and arrested by this broken technology will suffer far greater harm. If it makes things easier to digest, next time I’ll lead with the studies — which you’ve apparently dismissed — that illustrate the fact that people of one race are less accurate in recognizing and identifying faces of people of other races. If it makes you feel better, this applies to people of all races — white people have a hard time recognizing differences between black faces, and black people have a hard time recognizing differences between white faces, for instance.
Craig’s department spent millions of dollars on a system that is flawed and jeopardizes our civil liberties, and it was implemented with no public input. Calling out racism isn’t racist — but this technology is.
Facial recognition technology is so bad that when the ACLU did a test with members of Congress, it incorrectly identified 28 congresspeople as other people who have been arrested for a crime. On my tour I watched as 178 matches came up for a single male suspect, including a woman. It takes someone looking through all the matches to determine who is the best fit, with major consequences — these matches are used to help issue arrest warrants, which can change people’s lives forever.
Craig has admitted that this technology isn’t perfect and that the human element plays a major role. I hope that instead of attacking me in the media, he will read the studies I have provided to his department.
I was elected to serve my residents, and I cannot in good conscience sit by while inaccurate facial recognition technology is deployed in ways that risk false arrests and over-policing. Facial recognition technology will produce racist results, and relying on human analysts for intervention is inadequate. We need to ban facial recognition.