Recent studies have suggested that artificial intelligence (AI) chatbots like ChatGPT may provide better care and empathy than human doctors. In a study published by JAMA Network, ChatGPT's responses were rated significantly higher for quality and empathy than responses from verified doctors. The chatbot also provided more complete and accurate answers to questions about smoking cessation, sexual health, and mental health.
In another JAMA Network study, ChatGPT's responses to public health queries were evaluated by licensed healthcare professionals. The chatbot provided evidence-based answers 91% of the time, a far higher rate than voice assistants such as Amazon Alexa and Apple Siri. However, it referred users to specific resources, such as the National Suicide Prevention Lifeline or Alcoholics Anonymous, only 22% of the time.
Although AI chatbots like ChatGPT have undeniable abilities, many doctors remain hesitant to give them too much credit too soon. Doctors worry about the garbage-in, garbage-out problem and the risk of amplifying misinformation. A particular concern with ChatGPT is that its fluent, confident tone can instill trust in users even when its answers do not warrant it.
The creators and leaders of emerging AI technologies need to address these concerns and ensure that users have the option to connect with a human expert through an appropriate referral. While AI chatbots like ChatGPT may be helpful for providing information and support, they should not replace the care and expertise of human doctors.