Despite being a popular source of information, ChatGPT falls short in addressing public health issues such as addiction and domestic violence, according to a study published in JAMA. While ChatGPT's responses are generally well-informed, they frequently fail to refer users to the resources they need. The researchers suggest AI companies prioritize connecting users with human support, and that they partner with public health leaders to build databases of vetted resources and fine-tune models for specialized medical guidance. Users should understand ChatGPT's limitations when responding to health emergencies.
Recent studies suggest that responses from AI chatbots like ChatGPT may be rated as higher quality and more empathetic than those of human doctors. A JAMA Network study found ChatGPT's answers rated higher for both quality and empathy, and largely accurate on health questions. While doctors remain cautious, chatbots can help provide information and support, but they should not replace human care and expertise. AI creators must address these concerns by letting users reach human experts when needed. ChatGPT's responses were evidence-based 91% of the time, but its rate of referrals to resources lagged well behind.
A study finds that AI assistants like ChatGPT deliver evidence-based responses to 91% of public health questions. The researchers recommend that AI assistants also direct people to helpful resources such as helplines.
Artificial intelligence (AI) platforms like ChatGPT may soon be able to assist people with physical and mental health questions. However, a study found that AI systems often omit the referrals to specific resources that successful treatment depends on. Connecting individuals to trained professionals should be a priority for AI systems seeking to improve public health outcomes. While turning to AI for health information is increasingly common, trained professionals remain the best source for treatment.
Explore the evolution of tech policy from Obama's optimism to Harris's vision at the Democratic National Convention. What's next for Democrats in tech?