Chinese surveillance equipment manufacturer Dahua is facing backlash over a skin colour analytics feature that has raised human rights concerns. According to a report by IPVM (Internet Protocol Video Market), Dahua is selling cameras with this feature in Europe. The analytics claim to identify a subject's skin colour and sort it into three categories: yellow, black, and white. While Dahua defends the feature as a basic aspect of smart security solutions, critics argue that it enables racial profiling and threatens privacy and equality.
The IPVM report also highlights Dahua’s previous involvement in providing a video surveillance system with real-time Uyghur warnings to the Chinese police. This system included identifying features such as eyebrow size, skin colour, and ethnicity. Dahua, along with another Chinese video surveillance company called Hikvision, has reportedly won contracts worth $1 billion from China’s Xinjiang province, a region with a significant Uyghur population.
Dahua’s inclusion of skin colour detection in its surveillance cameras has drawn particular attention in three European countries (Germany, France, and the Netherlands) where racial tensions already exist. The development raises concerns about racial profiling and discrimination at scale. Skin colour detection in surveillance technology has long been contentious and ethically fraught because of its potential for errors and biased outcomes.
Human rights advocates argue that surveillance software with skin tone analytics infringes upon the right to equality and non-discrimination. They believe that allowing camera owners and operators to racially profile people at scale without their knowledge violates privacy rights. Companies, including Dahua, have a responsibility to respect human rights and mitigate any risks that may arise from their products or actions.
Dahua denies that its surveillance products are designed to enable racial identification. The company insists that its skin tone analysis is a general feature of surveillance technology and does not target specific ethnic groups. However, critics emphasize the potential dangers associated with this technology, including false arrests, discrimination, and biases against individuals based on their skin colour.
The European Parliament recently adopted a proposal, as part of the draft AI Act, to ban the use of real-time facial recognition systems in public places, citing concerns about mass surveillance and privacy infringement. The US government has also taken steps to restrict facial recognition services offered by technology companies for law enforcement purposes.
It is crucial to address the ethical, privacy, and human rights implications of technologies like skin colour analytics in surveillance. As discussions continue, stakeholders must strive to strike a balance between security concerns and the protection of individual rights and freedoms.