Imagine getting arrested and spending time in jail for a crime you never committed. That’s exactly what happened to Robert Williams of Detroit earlier this year, when facial recognition technology misidentified him as the suspect accused of stealing watches from a Shinola store.
In the past decade, Artificial Intelligence (AI) has surged in popularity, improving the accuracy and efficiency of facial recognition technology. Used by both the private and public sectors, it’s no surprise that this technology is increasingly being used to tackle crime as well.
On January 9, 2020, Robert Williams was wrongfully arrested in front of his wife and two young daughters, without being told the reason for his arrest. He spent a total of 30 hours behind bars. Unfortunately, Williams’ experience is not an isolated case; similar misidentifications have been reported elsewhere across the country.
The Detroit Police Department commented on the incident, saying that several checks and balances are in place to ensure the ethical use of facial identification. However, research conducted by Harvard University shows that facial recognition programs can produce biased results, particularly against people of color.
Even though Williams’ case was dismissed two weeks later, he is still paying the price for the wrongful arrest. The American Civil Liberties Union of Michigan has since taken on his case, and Williams is now suing the Detroit Police Department.
The tragic incident has raised awareness of the potential flaws of facial recognition technology, underscoring the importance of scrutinizing its ethical implications. To that end, the Facial Recognition and Biometric Technology Moratorium Act was proposed earlier this year to provide more oversight of the government’s use of this technology.
Robert Williams’ story is a sobering reminder of the potential misuse of AI technology and the importance of ethical business practices that prioritize human integrity and justice over any other interests.