Snapchat’s artificial intelligence chatbot could face a shutdown in the UK after the country’s privacy watchdog, the Information Commissioner’s Office (ICO), raised concerns over the company’s failure to assess the risks the product poses to children. According to the ICO’s provisional findings, Snap, the parent company of Snapchat, may be required to stop offering the chatbot in Britain, where the app has a user base of 22 million. The crackdown marks a significant step by the UK in regulating services built on large language models, such as ChatGPT and Google Bard.
The ICO’s investigation revealed flaws in the data protection assessment Snap conducted before launching the My AI chatbot. The regulator emphasized the importance of a thorough risk assessment, particularly for novel technologies that process the personal data of children aged 13 to 17. John Edwards, the Information Commissioner, warned AI companies against rushing products to market without proper compliance measures and urged the sector to avoid becoming a “Wild West”.
Snapchat has faced criticism over My AI, which is built on ChatGPT technology but incorporates additional safeguards for children. The company has come under fire for allegedly misleading practices, including collecting location data without user consent and promoting unsafe diets. Unlike ChatGPT, the chatbot also inserts advertisements into conversations.
In response to the ICO’s provisional decision, Snapchat expressed its commitment to protecting user privacy and said it will work closely with the regulator. Snap stated that My AI underwent a thorough legal and privacy review before its public launch, and that the company will continue to collaborate with the ICO to address any concerns about its risk assessment procedures.
The ICO has repeatedly issued warnings about generative AI services like ChatGPT, cautioning office workers that incorporating personal data into emails drafted with the technology could violate data protection laws. It has also put AI companies on notice that they could face fines for scraping individuals’ personal data at scale.
In conclusion, the ICO’s preliminary findings could lead to a forced shutdown of Snapchat’s AI chatbot in the UK over its inadequate risk assessment, particularly regarding the protection of children’s data. The regulator’s action underscores the need for AI companies to prioritize compliance and responsible development. As the sector continues to evolve rapidly, a robust regulatory framework remains essential to safeguarding user privacy and guarding against potential harms.