Security Risks of Using ChatGPT for Your Organization

ChatGPT is a large language model (LLM) tool that simplifies workflows for software engineers and developers, and it came into the spotlight in 2023. Although large language model tools such as ChatGPT can boost productivity, the platform carries security risks that organizations should be aware of, even when they are not using it directly.

GitHub, a platform popular among developers with over 100 million users, saw 10 million secrets exposed in its public repositories in 2022 alone, according to GitGuardian's 2023 State of Secrets Sprawl report. In one case, a Toyota contractor published database credentials for a Toyota mobile application in a public GitHub repository, showing that even organizations that do not use GitHub themselves can be affected. Sensitive information is likely to end up in ChatGPT in the same way: even if an organization has not adopted the tool, developers feel strong pressure within their community to use it rather than be left behind.

Cyberhaven has reported blocking requests from workers at its client companies to input data into ChatGPT because of the risk of leaking confidential information, regulated data, or source code. The GitGuardian 2023 State of Secrets Sprawl report also recorded significant growth in leaked OpenAI API keys starting in late 2022, confirming that developers are actively using the tool.
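
To make the risk concrete, the sketch below shows one way an organization could screen outgoing prompts for secret-like strings before they ever reach ChatGPT. This is a minimal illustration, not Cyberhaven's actual tooling: the patterns, function name, and example prompt are all assumptions chosen for demonstration.

```python
import re

# Illustrative patterns only (an assumption for this sketch); real data-loss
# prevention tools use far broader and more reliable detection.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),        # OpenAI-style API keys
    re.compile(r"AKIA[0-9A-Z]{16}"),           # AWS access key IDs
    re.compile(r"(?i)password\s*[:=]\s*\S+"),  # inline passwords
]

def redact_before_sending(prompt: str) -> str:
    """Replace anything that looks like a secret before the prompt
    leaves the organization's boundary."""
    for pattern in SECRET_PATTERNS:
        prompt = pattern.sub("[REDACTED]", prompt)
    return prompt

if __name__ == "__main__":
    risky = "Debug this: conn = connect(password=hunter2, api_key='sk-abc123def456ghi789jkl012')"
    print(redact_before_sending(risky))
    # -> Debug this: conn = connect([REDACTED] api_key='[REDACTED]')
```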

Although ChatGPT is programmed to refuse requests that explicitly ask for sensitive information, its “intuitive” answers are easy to trick, as the examples above show. Worse, because the platform does not offer encryption of submitted data, strict access control, or access logs, it is highly attractive to attackers. Anything entered into the platform, including through personal accounts, is vulnerable, since ChatGPT retains a complete history of the code snippets and queries submitted to it.

Because the platform is so easy to access, and because people tend to trust AI-generated answers, organizations should educate developers about ChatGPT, the risks it carries, and the limitations of the technology. Secrets should also always be stored securely, with encryption, access control, and logging.
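
As a starting point, credentials can be read from the environment (or a secrets vault) at runtime instead of being hardcoded in source that might later be pasted into ChatGPT or pushed to a public repository. The snippet below is a minimal sketch; the variable names and helper function are hypothetical.

```python
import os

def get_db_credentials() -> dict:
    """Load database credentials from environment variables (hypothetical
    names) so they never appear in source code, prompts, or commits."""
    missing = [k for k in ("DB_USER", "DB_PASSWORD") if k not in os.environ]
    if missing:
        raise RuntimeError(f"Missing environment variables: {missing}")
    return {
        "user": os.environ["DB_USER"],
        "password": os.environ["DB_PASSWORD"],
    }
```

Pairing an approach like this with a secrets manager that enforces access control and keeps audit logs covers the encryption, access-control, and logging recommendations above.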

ChatGPT is a powerful tool, but like any tool it needs to be managed and controlled so that it does not become an avenue of attack against your organization. Educate developers, identify and secure secrets, and treat the platform with a heightened sense of security to protect your company and its data.
