Beware: Oxford Professor Warns Against Sharing Personal Info with ChatGPT

An Oxford University expert is warning users against sharing their deepest secrets and personal information with ChatGPT, a large language model. According to Professor Michael Wooldridge, while ChatGPT may seem sympathetic at times, it lacks empathy and has no real-life experience. Users should also be cautious about where their confidential conversations end up, since their inputs could be used to train future versions of ChatGPT. Echoing earlier concerns about platforms like Facebook, Wooldridge advises against holding personal conversations, expressing political opinions, or complaining about work on ChatGPT.

OpenAI, the company behind ChatGPT, introduced an option to disable chat history after instances of accidental data exposure. Even so, user data is still retained for 30 days. Experts remain concerned about the risks posed by this limited protection of user data. A security researcher recently highlighted an unresolved data exfiltration vulnerability in ChatGPT, raising further privacy concerns. OpenAI is working to address the issue, but a complete fix has not yet been developed. In the meantime, users are urged to exercise caution and refrain from sharing sensitive information with large language models like ChatGPT.