Chaudhri, a well-known lawyer, is facing a tough time in court after it came to light that legal opinions generated with ChatGPT may be entirely fabricated.
The issue surfaced during the recent case involving Mr. Schwartz, in which ChatGPT was found to have invented cases that had never been heard in any courtroom. Judge Castel even questioned whether some of the briefs amounted to legal gibberish.
Using AI to produce substantive content, rather than doing one's own research, can put ongoing employment in doubt. While AI may be suited to minor administrative tasks, relying on it for substantive work is a race to the bottom.
Clients hire lawyers for their expertise and judgment, and the likelihood of being retained again drops dramatically if a paying client learns that a lawyer billing at a high hourly rate is using AI to do the work. This raises questions about the value clients actually get from lawyers and employees who lean on AI tools like ChatGPT.
Employees using ChatGPT or any similar AI tool at work are walking a tightrope: AI can be helpful for minor administrative tasks, but using it for substantive content invites doubts about job security. The legal profession trades on expertise, and clients expect their lawyers to deliver exactly that; lawyers who hand substantive work to AI risk being perceived as lazy or unproductive.
In conclusion, while AI tools like ChatGPT can be helpful for some tasks, the legal profession demands expertise and judgment. Using AI for substantive content can undermine both, and lawyers and law firms need to re-evaluate their use of these tools to ensure they deliver the kind of service clients expect.