OpenAI’s use of a voice similar to Scarlett Johansson’s in ChatGPT has raised concerns about gender stereotyping in technology. Johansson expressed her frustration with OpenAI for using a voice resembling hers without her consent. The voice, named Sky and offered to users since September 2023, was strikingly similar to Johansson’s, especially with the launch of the updated model, GPT-4o.
Although OpenAI denied intentionally mimicking Johansson’s voice, CEO Sam Altman’s one-word tweet, “her,” on the day of GPT-4o’s launch alluded to the connection — a reference to the 2013 film in which Johansson voiced an AI assistant. The company explained that Sky’s voice was provided by a different actress but acknowledged the ethical concerns surrounding AI voice cloning.
Historically, performers like Bette Midler and Tom Waits have taken legal action against unauthorized use of their voices in advertisements, setting legal precedents in voice mimicry cases. In light of these events, legislators are navigating the challenges presented by the advancement of AI, with recent FCC rulings addressing AI-generated voices in robocalls.
Furthermore, OpenAI’s choice to assign a female voice to Sky continues the long-standing tradition of feminized personas in AI technologies, perpetuating gender stereotypes. Many scholars have criticized this practice, arguing that it reinforces submissive tropes and distracts users from the data extraction and surveillance carried out by voice-enabled assistants like ChatGPT.
Scarlett Johansson’s call for transparency and legislation to protect vocal likeness and identity underscores the need for clearer regulations around the use of AI-generated voices. As AI continues to advance, safeguarding individuals’ voices from unauthorized use by AI systems like ChatGPT becomes increasingly crucial for privacy and identity protection.