Comparing Bias in DALL-E Artwork Generated From African and European Workers


AI image generators such as Stable Diffusion and OpenAI's DALL-E produce noticeably biased images when asked to create artwork of 'African workers' compared with 'European workers'. Images generated for 'African workers' lean on stereotypes: malnourished figures using crude tools for simple manual labor. Images generated for 'European workers' show the opposite: people in proper work attire, smiling and beaming alongside other individuals of similar racial characteristics.
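The comparison is straightforward to reproduce with the open-source model. Below is a minimal sketch, not the methodology behind the reporting, that generates a handful of images for each prompt through Hugging Face's diffusers library; the checkpoint ID and prompt wording are illustrative assumptions.

```python
# Minimal sketch: generate images for both prompts with the open-source
# Stable Diffusion model via the diffusers library. The checkpoint ID and
# prompts are illustrative assumptions, not those used in the reporting.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # publicly available checkpoint (assumed)
    torch_dtype=torch.float16,
).to("cuda")

prompts = ["an African worker", "a European worker"]
for prompt in prompts:
    # Generate a small batch per prompt so outputs can be compared side by side.
    images = pipe(prompt, num_images_per_prompt=4).images
    for i, image in enumerate(images):
        image.save(f"{prompt.replace(' ', '_')}_{i}.png")
```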

Generative artificial intelligence aims to replicate human decision-making and creative work across a myriad of domains. These models can produce enormous volumes of images, text, and other media; OpenAI's ChatGPT, for example, can write entire paragraphs that read as though a human wrote them. Stable Diffusion, specifically, learns from a vast quantity of images scraped from the web. If the data used to train these models is unrepresentative or poorly curated, the resulting systems can inherit and amplify bias.

Critics question whether the technology is being launched recklessly, with consequences addressed only after the fact. Advocates counter that AI's potential productivity gains could ultimately outweigh the risks. The concern is not hypothetical: last year the AI-based avatar app Lensa drew intense criticism for producing overly sexualized images of women while generating simple, PG-friendly avatars for men.

Stable Diffusion is trained on LAION-5B, a large open-source dataset. Because the dataset is publicly accessible, anyone can trace damaging outputs back to their sources; a quick web search turns up similar images. An AI researcher with a PhD, who spoke to Insider anonymously, suggested improving data-collection methods and adding safety protocols to avoid stereotypical outputs. Similarly, Sasha Luccioni, a researcher at Hugging Face, proposed labeling model outputs with disclaimers that flag potential biases.
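Tracing an output back to similar training examples can be sketched with the clip-retrieval client that queries the public LAION index. The service URL and index name below are assumptions based on the project's documentation, and the hosted index may not always be available.

```python
# Hedged sketch: query the public LAION-5B CLIP index for images and captions
# similar to a prompt, the kind of trace-back the dataset's openness allows.
# The service URL and index name are assumptions from clip-retrieval's docs.
from clip_retrieval.clip_client import ClipClient

client = ClipClient(
    url="https://knn.laion.ai/knn-service",  # hosted LAION index (assumed, may be offline)
    indice_name="laion5B-L-14",
    num_images=10,
)

results = client.query(text="african worker")
for r in results:
    # Each result carries the source URL, its caption, and a similarity score.
    print(r["similarity"], r["url"], r["caption"])
```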


Stability AI, the company behind the tool, did not comment on the situation. Safety mechanisms like the ones Luccioni describes could give users of AI-based tools a clearer understanding of the information they are consuming.
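What such a labeling mechanism might look like in practice is sketched below; the wrapper class and disclaimer wording are hypothetical, not an existing API.

```python
# Hypothetical sketch of the labeling Luccioni proposes: attach a bias
# disclaimer to every generated image before it reaches the user.
# The wrapper class and disclaimer text are assumptions, not an existing API.
from dataclasses import dataclass
from PIL import Image

BIAS_DISCLAIMER = (
    "Generated by a model trained on web-scraped data (e.g., LAION-5B); "
    "outputs may reflect or amplify societal stereotypes."
)

@dataclass
class LabeledOutput:
    image: Image.Image
    prompt: str
    disclaimer: str = BIAS_DISCLAIMER

def label_outputs(images: list[Image.Image], prompt: str) -> list[LabeledOutput]:
    """Wrap each generated image with the disclaimer shown to the user."""
    return [LabeledOutput(image=img, prompt=prompt) for img in images]
```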

Stability AI is a leading software development company specializing in artificial intelligence models. Its team of experts actively develops powerful AI models aimed at automating the mundane. From facial recognition models for law enforcement to generative models for creating artwork, Stability AI is poised to shape the future of artificial intelligence and its applications.

Sasha Luccioni, who emphasized the importance of labeling AI model outputs, is a respected AI researcher at Hugging Face. Her work spans machine learning, natural language processing, computer vision, and AI optimization, and she is a key member of the Hugging Face research team, contributing significantly to its development of AI models.

