Meta’s AI-Powered Stickers Draw Concerns About Safety Measures
Meta, formerly known as Facebook, recently launched AI-powered stickers on its apps, including Instagram and Messenger. The new feature lets users generate unique stickers from text prompts within seconds. However, the stickers have quickly drawn ridicule online and raised concerns about safety measures. Users have discovered ways to create wildly inappropriate images with the technology, such as a child holding a gun or a lewd image of Canadian Prime Minister Justin Trudeau.
The issue gained attention when a user shared examples of these controversial stickers, which quickly garnered millions of views on social media. Other users followed suit, generating stickers of popular characters engaging in inappropriate or violent behavior. Although the feature blocks certain words, users found ways around these restrictions with typos or creative prompts.
Meta claims to have implemented safeguards for its AI-generated stickers, powered by Llama 2, to ensure responsible usage. However, critics have questioned the effectiveness of these safeguards. Tama Leaver, a professor of internet studies at Curtin University, pointed out that if the term “naked” is blocked but the AI still produces naked figures, the safeguards are clearly inadequate.
It is worth noting that Meta is not the only company facing issues with problematic AI-generated content. In December, the AI avatar generator app Lensa faced accusations of sexualizing and racializing user avatars, despite rules against adult content.
Representatives for Meta, as well as Disney, Nintendo, Sega, and Trudeau, have not yet responded to requests for comment on the matter.
Overall, the launch of Meta’s AI-powered stickers has sparked concerns about safety measures and highlighted how easily problematic AI-generated content can slip through. It serves as a reminder for tech companies to prioritize responsible AI development and ensure proper safeguards are in place to avoid unintended consequences.