A recent study from the University of California, San Diego suggests that people who turn to AI assistants such as ChatGPT for medical advice may be misguided. The rise of generative artificial intelligence has led more people to seek medical information from chatbots instead of a medical professional. The UC San Diego study found that while ChatGPT gave evidence-based responses to 91% of the questions posed, only 22% of its answers included referrals to specific resources that could help the questioner.
When a chatbot is out of its depth, it should offer the would-be patient the option of connecting with a human, the UC San Diego team said. The leaders of these emerging technologies must step up and ensure that users can reach a human expert through an appropriate referral.
It’s worth noting that consulting “Doctor Google” was already a common prelude to a GP visit before the pandemic. During the pandemic, more than 60% of people typed their symptoms into the search engine before, or instead of, getting a professional opinion, according to a British survey.
The UC San Diego study was published in the American Medical Association’s journal JAMA Network Open on June 7. The findings add to concerns about the limitations of AI assistants in healthcare and the importance of keeping human experts in the mix. In fact, the US National Eating Disorders Association reportedly disabled its Tessa chatbot after it was found to be offering harmful advice.
People should still seek medical advice from a licensed professional. While AI assistants can provide helpful information, they often fail to point users to the specific resources that could get them the help they need. Users of these bots must therefore be able to connect with human experts through an appropriate referral, and it is time for the leaders of these emerging technologies to ensure that people are not relying solely on AI assistance when it comes to their health.
Frequently Asked Questions (FAQs) Related to the Above News
What is the recent study conducted by the University of California, San Diego?
The recent study conducted by the University of California, San Diego suggests that people who turn to AI assistants, like ChatGPT, for medical advice may be misguided.
What percentage of questions were answered with evidence-based responses by ChatGPT?
According to the UC San Diego study, ChatGPT gave evidence-based responses to 91% of all questions.
What percentage of ChatGPT's answers made referrals to specific resources to help the questioner?
Only 22% of ChatGPT's answers included referrals to specific resources to help the questioner, according to the UC San Diego study.
What do the leaders of these emerging technologies need to do to ensure users can connect with a human expert?
The leaders of these emerging technologies need to step up and ensure that users can reach a human expert through an appropriate referral.
Has consulting Doctor Google become more common during the pandemic?
Yes. According to a British survey, more than 60% of people typed their symptoms into the search engine during the pandemic before, or instead of, getting a professional opinion.
What are the concerns surrounding the limitations of AI assistants in the medical industry?
The concerns center on the limitations of AI assistants in healthcare, such as giving answers without referring users to specific resources, and on the importance of keeping human experts in the mix.
What happened to the Tessa chatbot of the US National Eating Disorder Association?
According to reports, the US National Eating Disorders Association disabled its Tessa chatbot after it was found to be offering harmful advice.
Should people rely solely on AI assistance when it comes to their health?
No. People should still seek medical advice from a licensed professional. While AI assistants can provide helpful information, they often fail to point users to the specific resources that could get them the help they need, which is why users of these bots must be able to connect with human experts through an appropriate referral.