OpenAI is facing a defamation lawsuit from a radio host who claims the company's language-generating tool, ChatGPT, included him in a summary of a lawsuit that he had nothing to do with. Mark Walters, founder of Armed American Radio, filed the lawsuit in June, alleging that ChatGPT identified him as a key player in the Second Amendment Foundation (SAF) v. Ferguson lawsuit, even though he was not named in it. According to the suit, ChatGPT's output falsely accused Walters of defrauding and embezzling funds from the SAF.
Specifically, ChatGPT generated the summary for AmmoLand.com editor Fred Riehl, who had asked it to summarize the lawsuit filed by the gun rights organization against Washington State Attorney General Bob Ferguson, which accuses Ferguson of unfairly targeting the SAF over its political beliefs. Walters' lawsuit alleges that ChatGPT falsely stated he was being sued by the SAF and accused him of embezzling funds, manipulating financial records, and failing to provide accurate financial reports.
The lawsuit claims that every statement of fact regarding Walters in the ChatGPT summary is false and that the allegations were malicious and libelous, damaging his reputation and exposing him to public hatred, contempt, or ridicule. Walters is seeking unspecified monetary damages. OpenAI has yet to comment publicly on the lawsuit.
The case highlights growing concern over the use of artificial intelligence to generate written language, with critics warning that such tools can produce false information about real people or be misused to defame them. ChatGPT, which is designed to mimic human writing, is already used in a range of applications, including content creation and customer service. As the technology continues to develop, safeguards will be needed to prevent this kind of misuse.