OpenAI CEO Believes Making Models Bigger is a Thing of the Past

OpenAI has been one of the most discussed organizations in the tech world, with its research and products showcasing notable applications of artificial intelligence (AI). Most recently, CEO Sam Altman oversaw the release of GPT-4, the long-awaited upgrade to the company's large language models (LLMs). The model has been so successful that a group of prominent researchers and tech leaders, including Elon Musk, signed an open letter calling for a moratorium on training AI systems more powerful than GPT-4.

Given the success of GPT-4 and OpenAI's research, one might expect the company to keep pursuing ever-larger models to produce more impressive results. However, Altman surprised many in the tech world by saying that the practice of simply increasing model size to gain improvements has run its course. Speaking at an MIT event, Altman said, "[We] are at the end of the era where it's going to be these, like, giant, giant models. We'll make them better in other ways."

OpenAI has a long history of working on large language models. The run began with GPT-2, its first landmark model, released in 2019 with 1.5 billion parameters (the adjustable variables a model tunes as it "learns" from training data). The following year brought an enormous leap in size and power with GPT-3, estimated at 175 billion parameters. GPT-4 is widely believed to be larger still, with some estimates putting it at around one trillion parameters. While OpenAI has not revealed GPT-4's true size, its impact on the AI industry makes clear that a substantial increase in scale and capability was achieved.

Nonetheless, OpenAI's own technical report suggests that further increases in model size may yield diminishing returns. As Altman put it, the situation resembles the "gigahertz race in chips in the 1990s and 2000s, where everybody was trying to point to a big number." Notably, Altman conceded that parameter counts may still grow, but his primary focus has shifted away from simply scaling up models. Instead, he said, OpenAI's goal is "to deliver the most capable, useful and safe models."
