Lawyer apologizes for citing fabricated legal cases generated by an AI-powered chatbot in a lawsuit against an airline. The incident raises ethical concerns about the use of AI in the legal industry. Repercussions are yet to be determined.
OpenAI's AI language platform ChatGPT is at the center of another lawsuit, this time filed by a radio host who claims the bot falsely accused him of embezzlement. The case raises questions about whether AI systems can be held accountable for generating false information. Libel claims over AI output may be viable in principle, but this particular case could prove an exception. OpenAI has not yet commented on the accusations.
OpenAI, creator of ChatGPT, is being sued for defamation by radio host Mark Walters after ChatGPT generated false claims that Walters embezzled funds from a non-profit organization. Incidents like this have prompted a disclaimer on ChatGPT's website warning users that the AI occasionally produces false information. The case underscores the importance of accurate, reliable output, as more professionals could be harmed by fabricated claims. OpenAI will need to improve ChatGPT's responses to curb these problems.
A radio host in Georgia is suing OpenAI for defamation after ChatGPT generated a false legal document about him, in what appears to be the first defamation case arising from an AI hallucination. Experts warn that chatbots can produce false responses. Neither OpenAI nor the radio host has commented.
A lawyer was misled by an AI tool after citing fabricated legal cases in court. ChatGPT had invented bogus judicial decisions, complete with quoted text. A judge questioned how the lawyer failed to notice, describing the filings as legal gibberish; the hearing ended without a ruling.
Explore the evolution of tech policy from Obama's optimism to Harris's vision at the Democratic National Convention. What's next for Democrats in tech?