Nvidia’s GTC Will Highlight a Major AI Chip Shift: Why CPUs Are Suddenly in Focus

Nvidia is expected to reveal more details about its next-generation Grace and Vera processors at the upcoming Nvidia GTC, highlighting the growing importance of CPUs in modern AI infrastructure. While GPUs remain critical for training and running AI models, the rise of agent-based AI systems has increased demand for CPUs that manage data movement, workflows and coordination between models. Nvidia’s strategy aims to pair its CPUs with its dominant GPUs to build more efficient AI data centers. The shift also signals rising competition with established CPU leaders like Intel and AMD as the global data-center CPU market is projected to grow from about $27 billion in 2025 to around $60 billion by 2030.

Key Highlights

  • Nvidia is preparing to reveal new details about its next-generation CPUs at the upcoming Nvidia GTC.
  • The shift reflects a growing problem in AI infrastructure: CPUs are becoming the bottleneck as agent-based AI workloads expand.
  • Nvidia’s upcoming Grace and Vera processors aim to complement its dominant GPUs in AI data centers.
  • The CPU market could grow rapidly, with estimates suggesting it may more than double from about $27 billion in 2025 to $60 billion by 2030.
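To put the projection above in perspective, the quoted figures (about $27 billion in 2025 to roughly $60 billion by 2030) imply a compound annual growth rate of around 17%. A quick sketch of that calculation (the dollar figures come from the article; the CAGR formula is standard):

```python
# Implied compound annual growth rate (CAGR) for the data-center CPU market,
# using the estimates cited in the article.
start_value = 27.0   # market size in 2025, in billions of USD (article estimate)
end_value = 60.0     # projected market size in 2030, in billions of USD
years = 5            # 2025 -> 2030

# CAGR = (end / start)^(1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 17% per year
```

At roughly 17% annual growth, the market more than doubles over the five-year span, consistent with the "more than double" framing in the highlight.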

The world of artificial intelligence chips has long been dominated by graphics processing units (GPUs). But Nvidia is now preparing to shift the spotlight toward a different type of chip: the central processing unit, or CPU.

At its annual GTC conference, the company is expected to unveil new details about its CPU roadmap, signaling that the once-overlooked processor is becoming critical to the next phase of AI computing.

Why CPUs Are Suddenly Important Again

For years, Nvidia’s GPUs powered the rapid rise of AI because they are highly efficient at running thousands of operations simultaneously, making them ideal for training large machine-learning models.

However, the rapid growth of agentic AI systems (software agents that can perform multi-step tasks and coordinate with other agents) has changed the computing landscape.

These systems require significant general-purpose computing power to manage data movement, coordinate workflows and orchestrate interactions among multiple AI models. CPUs are better suited for these kinds of sequential tasks.

As a result, Nvidia executives say the CPU is becoming the limiting factor in modern AI systems. Without faster CPUs, GPUs can end up waiting for instructions and data, reducing overall efficiency.

Nvidia’s CPU Strategy

Nvidia first entered the data-center CPU market in 2021 with its Grace CPU, designed to work closely with Nvidia GPUs in AI servers. The next generation, called Vera, is now moving toward production and is expected to play a central role in future AI infrastructure.

Unlike traditional server CPUs from competitors, Nvidia’s chips are built specifically to support GPU-heavy workloads. The idea is to ensure that expensive GPUs are constantly fed with data and tasks, maximizing performance.

The company has already begun deploying its CPUs at scale. For example, Meta has started using standalone Nvidia processors in its data centers, marking a significant shift from CPUs being used only alongside GPUs.

Competition With Intel and AMD

Despite Nvidia’s push into CPUs, the market remains dominated by established players.

Companies like Intel and Advanced Micro Devices still control most of the data-center CPU market. Together they supply the majority of processors used in servers worldwide.

However, Nvidia’s strategy is different. Instead of competing directly on traditional server workloads, its CPUs are optimized for AI data pipelines and GPU orchestration.

This approach reflects Nvidia’s broader plan to control more of the AI computing stack, from chips and networking to software platforms.

The Bigger Picture: AI Infrastructure Is Changing

The renewed focus on CPUs highlights a broader shift in how AI systems are built. Early AI infrastructure focused mainly on training large models using massive GPU clusters.

Now the focus is moving toward running AI applications at scale, particularly agent-driven systems that perform complex tasks autonomously.

In this new environment, AI data centers must balance different types of chips: GPUs for model computation and CPUs for coordination and orchestration.

For Nvidia, that means the future of AI hardware may no longer be about GPUs alone, but about building entire AI computing platforms.

References

  • Nvidia’s GTC will mark an AI chip pivot. Here’s why the CPU is taking center stage
AI-assisted: This news article was created with AI assistance and may contain errors.