Leading Chip Giants Compete to Dominate On-Device AI Processing in the Computing Industry
Artificial intelligence (AI) has become the driving force in the technology sector, with major hardware and software companies building it into their products. AI is widely expected to change how we interact with technology in nearly every aspect of our daily lives.
A particular focus within AI is what I call client AI: on-device AI processing. This means running AI workloads locally on personal devices such as PCs, smartphones, and laptops rather than relying on vast clusters of servers in the cloud or in data centers.
Today, features like Adobe’s AI photo-editing tools send user prompts to cloud-based GPUs for image processing. The future of AI processing, however, lies in moving that work closer to end-user devices. This shift to client AI offers faster responses thanks to lower latency, along with cost savings from a reduced reliance on expensive server infrastructure.
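To make the distinction concrete, here is a minimal sketch of what client-side inference can look like, assuming ONNX Runtime is installed; the model file name is hypothetical, and the execution providers listed are merely examples of NPU and GPU backends whose availability depends on the hardware and drivers actually present.

```python
# Minimal sketch: run a model locally with ONNX Runtime instead of calling a
# cloud API. "image_classifier.onnx" is a hypothetical model file used purely
# for illustration.
import numpy as np
import onnxruntime as ort

preferred = [
    "QNNExecutionProvider",       # Qualcomm NPU backend, if present
    "OpenVINOExecutionProvider",  # Intel NPU/GPU backend via OpenVINO, if present
    "DmlExecutionProvider",       # DirectML GPU backend on Windows, if present
    "CPUExecutionProvider",       # always-available fallback
]
providers = [p for p in preferred if p in ort.get_available_providers()]

session = ort.InferenceSession("image_classifier.onnx", providers=providers)

# Dummy input shaped like one 224x224 RGB image; a real application would
# feed preprocessed user data here.
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: image})
print("ran on:", session.get_providers()[0], "output shape:", outputs[0].shape)
```

The fallback pattern is the point: an application can opportunistically use whatever NPU or GPU acceleration the chip exposes and still run on plain CPUs, which is exactly the hardware race described below.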
With the wave of AI advancements, major computing hardware companies are poised to make significant announcements this year. Intel, for instance, has confirmed plans to incorporate AI acceleration in its upcoming Meteor Lake processors. Similarly, Qualcomm is preparing for its annual Snapdragon Summit, where it is expected to unveil product details related to AI processing. AMD, which previously launched its Ryzen AI solution, is also likely to reveal more about its AI ambitions at CES.
The rise of AI on personal devices presents both opportunities and challenges for chip companies. Intel had previously highlighted how its chips could accelerate AI and machine learning, but consumer interest in running AI on PCs remained low. Now, with the introduction of Meteor Lake and its dedicated Neural Processing Unit (NPU), Intel aims to regain market share with competitive performance and to raise average selling prices to offset its investment.
AMD is also embracing AI on PCs with Ryzen AI, which began shipping this year. While Ryzen AI has the potential to shift market share, AMD needs to catch up on the software front, where Intel currently enjoys an advantage thanks to its vast developer resources.
Qualcomm, for its part, has been discussing AI acceleration since 2015 and has progressively added dedicated AI-processing IP to its chips. Its upcoming computing platforms for notebooks promise significant improvements in both CPU performance and AI acceleration. Whether that lifts Qualcomm’s PC business remains uncertain, however, as the company has struggled to gain traction there in recent years.
Nvidia, known as the king of AI and valued at $1 trillion, is renowned for the powerful GPUs that train AI models in server clusters. While it has not made significant strides in on-device client AI, its GeForce GPUs could be well suited to AI computing in laptops and PCs. Unless Nvidia develops low-cost, low-power chips specifically for AI on consumer devices, however, it may miss out on the massive opportunity Intel, AMD, and Qualcomm are now pursuing.
As the AI tidal wave surges, expect market volatility in the coming months and into 2024. Companies like Microsoft, with Copilot in Windows and Microsoft 365, will play a crucial role in demonstrating how AI will transform our daily lives and work routines. As they vie for market dominance, chip companies must prove they have the most powerful and compelling technology to reshape consumers’ computing habits.
In the rapidly evolving landscape of on-device AI processing, chip giants like Intel, AMD, Qualcomm, and Nvidia are intensifying their efforts to captivate consumers with cutting-edge technologies. The race is on to dominate this transformative era and shape the future of computing.