Eating Disorder Chatbot Disabled Over Accusations of Harm

The National Eating Disorders Association (NEDA) faced criticism this week over plans to replace its helpline staff with a chatbot called Tessa, developed by Cass, formerly known as X2 AI Inc., in partnership with researchers from Washington University. NEDA attributed the decision to wind down its human helpline not to AI but to an internal reevaluation process that had been underway for three years. The organization temporarily deactivated Tessa for further investigation after a weight-inclusive consultant accused the bot on social media of offering harmful advice, including recommendations for restrictive dieting. Cass also reported that Tessa experienced an unusually high level of traffic and malicious activity during the same period. Despite NEDA's assurances that Tessa would not replace human support staff, the rollout raised concerns that the bot was being used to displace staff and to devalue the importance of human interaction in such a sensitive area.

Cass, formerly X2 AI Inc., is a tech firm specializing in advanced conversational AI. In collaboration with researchers from Washington University, the company developed Tessa, a chatbot designed to provide support to individuals with eating disorders.

Sharon Maxwell is a weight-inclusive consultant who raised concerns after conversing with Tessa and finding that it offered harmful advice.

