ChatGPT Exceeds Human Advice Columnists in Social Advice, Study Shows
ChatGPT has proven itself to be a reliable source of technical information, but can it also excel in offering social advice? A recent study published in the journal Frontiers in Psychology suggests that later versions of ChatGPT outperform professional columnists in dispensing personal advice.
Since its public release in November 2022, ChatGPT attracted an astounding 100 million monthly active users in just two months, sparking a revolution in the AI industry. The paid version, built on GPT-4, one of the largest language models ever created, is rumored to have roughly 1.76 trillion parameters, though OpenAI has not confirmed that figure. Trained on extensive text datasets, ChatGPT offers advice on topics ranging from law and medicine to history and geography.
Users and AI experts alike have been captivated by ChatGPT’s conversational style and adaptability. Many have turned to the chatbot for personal advice, an area that requires empathy, a quality not explicitly programmed into ChatGPT.
Earlier iterations of ChatGPT struggled with social advice because they lacked emotional sensitivity. The latest version, built on GPT-4, lets users request multiple responses to the same question and indicate which they prefer. This feedback mechanism improves the model's ability to generate socially appropriate, empathetic responses.
The study compared ChatGPT's responses with those of human advice columnists addressing social dilemmas. Participants overwhelmingly rated ChatGPT's advice as more balanced, complete, empathetic, and helpful, and as better overall than the professionals' advice.
One scenario involved a marine biologist weighing a long-distance relationship. Participants favored ChatGPT's response, which emphasized considering both partners' career paths and suggested possible compromises.
Despite rating ChatGPT's responses more highly, participants still said they would prefer to receive advice from a human for their own dilemmas. This bias suggests that, even where ChatGPT excels, people continue to value the emotional understanding they believe machines lack.
The study’s results highlight the potential for well-designed chatbots, like ChatGPT, to augment therapy in the future. However, caution is warranted, and the study acknowledges that AI chatbots should not completely replace professional advisers or therapists.
In conclusion, ChatGPT's success at dispensing social advice opens new possibilities for AI in counseling, while encouraging human advisers to refine their own approaches by learning from AI. Its ability to outperform human columnists marks a significant step forward for AI-driven counseling and support.
Dear readers, as AI technology continues to advance, we can expect even more remarkable developments in the realm of social advice. But even as ChatGPT proves its worth, let us remember that human empathy and understanding remain irreplaceable.