ChatGPT is a large language model (LLM) tool that simplifies workflows for software engineers and developers, and it came into the spotlight in 2023. Although LLM tools such as ChatGPT can boost productivity, the platform carries security risks that organizations should be aware of, even when they do not use it directly.
GitHub, a platform popular among developers with over 100 million users, exposed 10 million secrets in public repositories in 2022 alone, according to GitGuardian's 2023 State of Secrets Sprawl report. In one incident, a Toyota contractor published database credentials for a Toyota mobile application in a public GitHub repository, demonstrating that even organizations that do not use GitHub directly can still be affected. Similarly, sensitive information is likely to end up in ChatGPT: even when an organization has not adopted the tool, developers face strong pressure within their community to use it to avoid feeling left behind.
Cyberhaven has reported blocking requests from employees at its client companies to paste data into ChatGPT because of the risk of leaking confidential information, regulated data, or source code. The GitGuardian 2023 State of Secrets Sprawl report also shows significant growth in leaked OpenAI API keys starting in late 2022, confirming that developers are actively using the tool.
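One way organizations catch secrets before they leak into repositories or LLM prompts is pattern-based scanning. The sketch below is illustrative only: the regexes are simplified stand-ins, and production scanners such as GitGuardian or gitleaks use hundreds of validated detectors with far lower false-positive rates.

```python
import re

# Simplified, illustrative patterns -- real secret scanners use
# validated detectors and entropy checks, not just regexes.
SECRET_PATTERNS = {
    "openai_api_key": re.compile(r"sk-[A-Za-z0-9]{20,}"),
    "aws_access_key_id": re.compile(r"AKIA[0-9A-Z]{16}"),
    "hardcoded_password": re.compile(
        r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE
    ),
}

def find_secrets(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, matched_text) pairs found in the given text."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits

# Example: scan a snippet a developer might otherwise paste into ChatGPT.
snippet = 'conn = connect(host="db.internal", password="hunter2")'
print(find_secrets(snippet))
```

Running such a check in a pre-commit hook, or in a data-loss-prevention proxy of the kind Cyberhaven provides, flags the snippet before it leaves the developer's machine.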
Although ChatGPT is programmed to refuse requests that explicitly ask for sensitive information, those guardrails are easy to trick, as the examples above suggest. Worse, the platform keeps a complete history of code requests and queries, and because that data is unencrypted and lacks strict access control and access logging, it is highly attractive to attackers. Anything entered into the platform, including through personal accounts, is therefore vulnerable.
Because the platform is so easy to access, and because of the bias toward trusting AI-generated answers, it is important for organizations to educate developers about ChatGPT, its associated risks, and the limitations of the technology. Additionally, secrets should always be stored securely, with encryption, access control, and logging.
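In practice, the simplest step toward secure secret storage is keeping credentials out of source code entirely, so there is nothing sensitive to paste into a prompt in the first place. A minimal sketch, assuming secrets are injected through environment variables (in production they would typically come from a dedicated secrets manager such as Vault or AWS Secrets Manager, which adds the encryption, access control, and audit logging mentioned above):

```python
import os

def get_secret(name: str) -> str:
    """Read a secret from the environment; fail loudly if it is missing."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Required secret {name!r} is not set")
    return value

# Demo only: in a real deployment the variable is set by the runtime
# environment or injected from a secrets manager, never hardcoded.
os.environ.setdefault("DEMO_API_KEY", "example-value-for-demo")
api_key = get_secret("DEMO_API_KEY")
```

Code written this way can be shared, reviewed, or even pasted into an LLM without ever exposing the credential itself.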
ChatGPT is a powerful tool, but like anything else, it needs to be managed and controlled so it does not become an attack vector against your organization. Educate developers, recognize and secure secrets, and manage the platform with a heightened sense of security to protect your company and its data.