Technology has advanced at an incredibly rapid rate, and various forms of artificial intelligence (AI) have emerged as a result. One AI application that has gained traction in recent times is ChatGPT, an AI chatbot. While ChatGPT can serve as a guide when researching and offers many benefits, there are risks associated with relying on it. The legal profession, in particular, should be aware of such risks, as overreliance on ChatGPT could result in serious legal repercussions, including professional negligence.
It is important to note that much of the information provided by ChatGPT, like the findings of an internet search, can already be accessed easily through more traditional methods. Lawyers must still conduct extensive legal research and consult authoritative sources of law such as the constitution and case law. The limitation of ChatGPT in this regard is that it frequently gets citations wrong, and its explanations for the answers it gives are often incorrect. As such, it remains the lawyer's job to make sure that citations are properly listed and that all legal and factual grounds are properly explained. Moreover, because the law changes quickly, ChatGPT tends to be outdated in certain areas, causing a professional to miss recent developments that could be crucial to their client's case.
If a lawyer places too much reliance on ChatGPT, they could be found guilty of professional negligence. As a result, the lawyer could face severe consequences, such as being reported to the Legal Practitioners Council and losing the trust of clients and colleagues. Depending on the gravity of the case, the lawyer's firm may even be forced to close down.
This is not to say that technological advances should not be welcomed in the legal profession, but rather that AI chatbots should be used appropriately and sparingly. Until AI applications are advanced enough to truly alleviate the burdens on legal professionals, lawyers should not forget the importance of conducting thorough research and fulfilling the basic duties of their job.