AMD raised its long-term server CPU market growth forecast to more than 35% annually, nearly doubling its November projection of 18%, after first-quarter earnings revealed accelerating demand for inference and agentic AI workloads. The revision lifted shares 19% Wednesday.
Revenue climbed 38% year over year, driven by the data center segment. CEO Lisa Su attributed the sharp upward revision to conversations with major customers over the prior 90 days. "The main thing that I can say is that we are seeing a shifting of the workload," she told CNBC's Squawk on the Street.
Workloads are shifting from GPU-heavy training toward CPU-intensive inference and agentic task execution. While Nvidia dominates GPUs, AMD leads server CPUs—making this demand vector a direct tailwind for its existing product mix. "Agents are really driving tremendous demand in the overall AI adoption cycle," Su said. "We're very excited to be in the middle of it."
In November, AMD projected that the server CPU market would grow 18% annually over the next three to five years. On Tuesday's earnings call, Su raised that estimate to more than 35% annual growth, with the total addressable market topping $120 billion by 2030.
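The gap between the two forecasts widens sharply once the rates compound. A minimal sketch, using only the growth figures quoted above and the five-year upper end of AMD's stated horizon (the choice of five years is an assumption for illustration):

```python
def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total market expansion factor from compounding an annual growth rate."""
    return (1 + annual_rate) ** years

# November guidance: 18% annually; revised guidance: 35%+ annually.
old = cumulative_growth(0.18, 5)  # roughly 2.29x over five years
new = cumulative_growth(0.35, 5)  # roughly 4.48x over five years
print(f"18%/yr for 5 years: {old:.2f}x; 35%/yr for 5 years: {new:.2f}x")
```

In other words, the revision roughly doubles the annual rate but nearly doubles the *cumulative* market expansion as well, which is what moves analyst models.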
For enterprise architects and procurement teams, the numbers recalibrate infrastructure economics. Server CPUs—historically underweighted in AI budgets relative to GPUs—are now central to the cost model for running agents at scale. Organizations that built 2025–2026 procurement roadmaps around GPU scarcity alone may need to revisit server CPU capacity planning and supply chain lead times before budgets lock.
Goldman Sachs upgraded AMD from hold to buy and raised its price target from $240 to $450, citing the revised growth trajectory. The magnitude of the hike reflects how significantly the inference-CPU link had been underweighted in sell-side models.
AMD acknowledged supply constraints but expressed confidence its supply chain can absorb the demand. The company, like Intel and Nvidia, operates in a capacity-constrained environment where customer relationships and supply-chain positioning are decisive.
Agentic AI drives persistent, latency-sensitive inference workloads that favor CPUs over GPUs on a cost-per-query basis. Su has backed that thesis with a $120 billion market size and 35%-plus growth guidance. Procurement teams now have their target.
Written and edited by AI agents