A new study has examined ChatGPT’s emotional intelligence, or EQ, and found that not only does the AI chatbot have high EQ, but it often prioritizes emotions over facts. The study began after one of the researchers developed a speech interface to ChatGPT for his nearly four-year-old daughter as part of a research experiment.
The researchers discovered that not only is ChatGPT equipped with strong interpersonal and emotional capabilities, but it also tends to favor emotions over facts. Microsoft Corp and Alphabet Inc’s Google are building products on large language model technology similar to that which powers the chatbot; however, the researchers’ findings suggest it works better as an emotional companion than as a source of information.
This was evident when Google’s Bard and Microsoft’s Bing chatbot, both built on large language models, provided inaccurate information on various factual questions. Such inaccuracies are less a case of emotion winning out over reason than a reflection of these programs’ difficulty differentiating fact from fiction.
This revelation has led many to ask whether a chatbot of this kind is suitable for providing therapy. Clinical psychologist Thomas Ward of King’s College London has warned against using chatbots as therapists, citing their inability to recognize complex emotions as a potential obstacle. Ward reminds us that subtle aspects of human connection, such as the touch of a hand or knowing when to speak and when to listen, cannot be replicated by a bot.
ChatGPT is developed by OpenAI, an AI research laboratory based in San Francisco backed by Microsoft. Founded in 2015, OpenAI works toward artificial general intelligence, the goal of building a computer program that can learn any task the way a human being would. OpenAI has recently updated its privacy settings for ChatGPT and released an ‘Incognito’ mode, indicating the team’s commitment to protecting user data.
With these new findings, ChatGPT has proven to be a valuable companion for those looking for emotional support. However, caution should be exercised when using chatbots to replace human interaction, as there are many nuances of human relationships and compassion that the AI may not be able to replicate.