If you have ever used an AI tool such as ChatGPT or Bing Chat, you may be wondering: What happens to all that information you enter into these tools? For instance, would you be comfortable with a stranger listening in on your private conversations with your doctor or loved ones, or even selling snippets of what they heard to someone else? This is essentially what third-party trackers do: they watch and record your online activities and then sell that data to someone else who can use it to their benefit.
With the rising adoption of AI tools such as ChatGPT, Bing Chat, and Google Bard, questions of safety and ethics come into play. When millions of people visit a website to ask questions, generate cover letters, or debug code, you have to ask: what exactly happens to the data entered into these AI tools, and what do data brokers have to do with it?
Jean-Paul Schmetz, Ghostery CEO and a privacy expert, spoke with ZDNET about the privacy concerns surrounding generative AI tools like ChatGPT. He stated, “OpenAI is refreshingly simple in the sense that it is safe. They don’t have anyone looking over your shoulder while you are chatting with ChatGPT.” However, OpenAI does collect data to train its models, as stated in its privacy policy; users can opt out by filling out a Google Form.
His advice is to be careful with the information you share with AI systems, especially since research on GPT-2 uncovered vulnerabilities known as data extraction attacks. Through these attacks, malicious actors can recover previously entered private information, such as a user’s name, address, and phone number. It is not yet known whether GPT-4 is affected by the same vulnerabilities.
To ensure the safe use of ChatGPT, OpenAI President Greg Brockman released a statement this week emphasizing the company’s commitment to safety with its AI models. He stated, “We believe that powerful training runs should be reported to governments, be accompanied by increasingly-sophisticated predictions of their capability and impact, and require best practices such as dangerous capability testing.”
Ghostery is best known for its browser extensions, including ad and tracker blockers, private browsing and search, and tracker analytics. Schmetz compares third-party trackers to a stranger standing in the back of a doctor’s office or a shop, listening to everything said there, something that should clearly be avoided.
Data privacy is a legitimate concern when using AI tools like ChatGPT, but being mindful of the data you share, and using a browser-based privacy tool such as Ghostery, can help keep your information safe.