False Arrests Expose Racial Bias in Facial Recognition: Massachusetts Bill Aims to Prevent Injustice
In January 2020, Robert Williams, a Black man from Detroit, experienced the harrowing consequences of a false arrest driven by facial recognition technology. Williams was wrongfully accused of shoplifting several watches from a Michigan store after a facial recognition search falsely identified him as the suspect. His case, along with several others across the country, highlights the racial biases built into these systems.
According to the American Civil Liberties Union (ACLU), many individuals, particularly Black Americans, have been falsely arrested because of errors in facial recognition software. Part of the problem lies in the images used for identification, which are drawn largely from criminal databases in which Black and Brown people are disproportionately represented. That overrepresentation increases the likelihood of misidentification and, with it, false arrest.
To address this issue, a Massachusetts bill is being proposed to centralize all facial recognition software within the state police department. The bill, filed by state Senator Cindy Creem and Representatives Dave Rogers and Orlando Ramos, is based on the recommendations of the Special Commission to Evaluate Government Use of Facial Recognition Technology in the Commonwealth. These recommendations include limiting the use of facial recognition software to serious criminal investigations with a warrant and prohibiting its use for constant tracking of individuals throughout the day.
While the recommendations were passed by the House, the bill did not reach the Senate before the previous legislative session ended. Advocates, however, are optimistic about its chances in the current session under Governor Maura Healey. Centralizing facial recognition data with the Massachusetts State Police would reduce the potential for misuse or abuse of the technology by individual departments across the state.
The unanimous support of the commission, including its law enforcement representatives, underscores the need for stricter regulation and control of facial recognition technology. It is essential to acknowledge the fallibility and racial biases of these systems, which can inflict significant harm on people who are wrongfully arrested. Police departments, too, risk reputational damage and legal consequences when the technology is misused.
One of the central challenges with facial recognition software is its accuracy, which varies with the conditions in which it is used. Testing on well-lit, high-quality photos yields high accuracy rates, while dim or grainy images significantly reduce them. Claims of high accuracy should therefore be treated with skepticism and caution.
The proposed bill is a step toward rectifying the injustices caused by false arrests rooted in racial bias in facial recognition technology. Centralizing the technology under the state police creates greater opportunity for accountability, oversight, and regulation, which can help prevent further injustices and protect the rights of individuals, particularly those from marginalized communities who are disproportionately affected by these biases.
Ultimately, the flaws and biases within facial recognition systems must be critically examined and addressed to ensure a fair and just society. By pursuing comprehensive legislation and regulation, Massachusetts is taking a significant step toward combating racial bias and preventing the wrongful arrests that have upended the lives of people like Robert Williams. The fight for justice and equality continues, and vigilance remains essential as emerging technologies threaten to perpetuate systemic biases and discriminatory practices.