Title: Could AMD Emerge as the Next Promising AI Stock? A Closer Look
With Nvidia dominating the market for data center graphics processing units (GPUs), many investors wonder whether Advanced Micro Devices (AMD) has what it takes to challenge the industry giant. While Nvidia’s stock has nearly tripled this year, driven by surging demand for artificial intelligence (AI) and its impact on revenue and earnings, AMD’s gains have been fueled primarily by the broader semiconductor rally rather than by its financial performance. Even so, the evolving landscape suggests it’s too early to dismiss AMD from the race.
Although Nvidia currently holds a massive 95% market share in GPUs used for machine learning, the market for AI GPUs is vast enough to accommodate more players. AMD is making progress by sampling its data center GPUs, hinting that they could provide an alternative to Nvidia’s chips.
Graphics cards play a crucial role in accelerating AI workloads in data centers, such as training large language models (LLMs) and running inference. Nvidia has been the go-to supplier of these GPUs because its chips are built on advanced manufacturing nodes and deliver high computational power, enabling efficient data processing.
By comparison, AMD’s current-generation Instinct MI250 data center accelerators, built on a 6nm process, have shown promising performance. AI software start-up MosaicML reports that these accelerators can deliver 80% of the performance of Nvidia’s A100 chip. That is noteworthy considering that leading AI models such as ChatGPT were trained on thousands of Nvidia A100 GPUs. Furthermore, MosaicML believes that software optimization by AMD could allow the MI250 to match the A100’s performance.
During its May earnings conference call, AMD said the MI250 data center GPU is gaining traction among customers. Notably, it has been deployed in a supercomputer in Finland for training LLMs, setting a record for the largest Finnish language model trained to date. AMD’s next-generation Instinct MI300 GPUs for AI training and inference have also drawn significant interest, taking direct aim at Nvidia’s flagship H100 data center GPU by offering more high-bandwidth memory capacity and greater memory bandwidth.
The AI chip market is projected to be worth $304 billion by 2030, growing at an annual rate of 29%. A market that large suggests AMD’s later entry should not be viewed as disqualifying: if the company can deliver a powerful, competitively priced chip, it could still attract a considerable number of customers. However, investors should exercise caution and wait for AMD’s data center chips to actually gain market share, as the company currently struggles on both revenue and earnings.
AMD recently forecast a 20% year-over-year decline in revenue for Q2 2023 due to weak PC sales. In contrast, Nvidia is expected to post a remarkable 64% year-over-year revenue surge to $11 billion in the ongoing quarter, with adjusted earnings more than doubling. Given the PC market’s challenging conditions, AMD’s prospects hinge on adoption of and demand for its data center chips. Although AMD’s stock trades at an expensive 609 times trailing earnings, a rich multiple of the kind Nvidia once carried, a meaningful ramp in demand for AMD’s chips could grow the company into that valuation and give the stock a significant boost.
As investors await AMD’s second-quarter results, expected within the next month, any sign of a turnaround driven by increased demand for its data center chips could fuel a rally in the stock.
In conclusion, while Nvidia currently dominates the AI chip market, AMD’s ongoing hardware and software optimization efforts could make it a formidable contender. With the AI chip market poised for substantial growth, AMD’s success depends on delivering competitive chips and capturing a meaningful share of that market. Investors should closely monitor AMD’s execution in the data center chip segment to make informed decisions about the stock.