Actress Scarlett Johansson has taken legal action against OpenAI over an artificial intelligence voice she says closely resembles her own. Johansson revealed that she felt compelled to hire legal counsel after the AI company released the voice, named Sky.
Johansson had initially been approached by OpenAI CEO Sam Altman to voice a feature for the company's ChatGPT chatbot, but after careful consideration she declined for personal reasons. Despite turning down the project, she was shocked to discover how similar the AI voice OpenAI released sounded to her own.
Altman clarified that the voice in question, Sky, was not intended to mimic Johansson and was voiced by a different actress, who had been cast before OpenAI approached Johansson about the project. He expressed regret over the miscommunication.
In response to Johansson’s concerns, OpenAI promptly pulled the Sky voice and replaced it with one called Juniper. The company emphasized that its AI voices are not designed to imitate any specific individual, celebrities like Johansson included.
The incident highlights AI's growing significance in Hollywood, particularly as deepfake technology becomes better at replicating human voices and faces. It also underscores the importance of clear communication and consent when AI is used to simulate or replicate a person's voice.
Johansson’s legal action against OpenAI reflects a broader wave of challenges AI companies face over data privacy and intellectual property rights. With litigation ongoing in multiple cases against AI companies, the intersection of technology, entertainment, and legal frameworks continues to evolve.
This development follows Johansson’s earlier legal dispute with Walt Disney over the release of her film Black Widow, underscoring her willingness to defend her artistic contributions. Her case adds to the complex landscape of legal and ethical questions surrounding AI, deepfake technology, and intellectual property rights.