Facial recognition technology (FRT) has come under scrutiny once again for its alarming pattern of misidentifying Black individuals, sparking calls for reform. A recent report published by Scientific American supports fears that FRT can exacerbate racial inequities in policing: law enforcement agencies that use automated facial recognition arrest Black people at disproportionately high rates.
The report attributes these disparities to several factors, including a lack of diversity in the algorithms' training data sets, officers' belief that the programs are infallible, and the amplification of officers' own biases. The consequences of these misidentifications can be devastating, as the story of Harvey Eugene Murphy Jr. shows. The 61-year-old grandfather is now suing Sunglass Hut's parent company.
Murphy was mistakenly identified as a robber by the store's facial recognition technology and arrested; while in jail, he was sexually assaulted. The Harris County District Attorney's office later determined that Murphy was not involved in the robbery, but by then the damage had been done. His case exemplifies the harm a technology can cause when, whatever reliability is claimed for it, it falls short in practice.
The report's authors emphasize that the algorithms used by law enforcement are typically developed by companies such as Amazon, Clearview AI, and Microsoft, which build their systems for different environments. Federal testing, however, has shown that most facial recognition algorithms perform poorly at accurately identifying anyone who is not a white man.
The Federal Trade Commission (FTC) has taken action against companies misusing FRT. In December 2023, the agency barred Rite Aid from using facial recognition technology for five years after the retailer's systems falsely accused customers of shoplifting; in one incident, an 11-year-old girl was stopped and searched on the basis of a false match. In a separate case, the Detroit Police Department was sued by a pregnant woman who had been misidentified as a carjacking suspect.
The FTC acknowledges that FRT often misidentifies people of color. Because white men are overrepresented in training images, the algorithms are skewed: Black faces are disproportionately flagged as matches for criminal suspects, leading to the targeting and arrest of innocent Black individuals. Responsibility for addressing these failures lies not only with the companies that develop the products but also with the police forces that deploy them, which must critically examine their methods and ensure diversity in both their staff and their image data sets to avoid deepening racial disparities and violating people's rights.
In conclusion, the misidentification of Black individuals by facial recognition technology is a pressing concern that demands urgent reform. As studies have shown, the algorithms behind these systems are often biased and disproportionately harm minority groups. Companies and law enforcement agencies alike must take responsibility for rectifying these failures and work toward a fairer, more equitable use of facial recognition technology.