Qualcomm Stock Surges 20% as It Joins the AI Chip Arms Race — Taking on Nvidia and AMD

Paul Jackson

October 27, 2025

Key Points

  • Qualcomm (QCOM) shares jumped 20% after unveiling two new AI data center chips — the AI200 and AI250.

  • The launch marks Qualcomm’s official entry into the AI data center market, challenging Nvidia (NVDA) and AMD (AMD).

  • The new chips focus on AI inference efficiency, promising lower power draw and a lower total cost of ownership for data center operators.

Qualcomm’s Big AI Pivot

Qualcomm is officially entering the AI data center race.

Shares of the semiconductor giant surged more than 20% Monday after the company announced its new AI200 and AI250 chips — a product line designed to run artificial intelligence workloads at scale and compete directly with Nvidia’s GPUs and AMD’s Instinct accelerators.

The move represents a major strategic expansion for Qualcomm, long known for its dominance in smartphone processors. By bringing its AI expertise into the data center, the company aims to carve out a place in a market worth hundreds of billions of dollars over the next decade.

Inside the New AI Chip Lineup

The AI200, debuting in 2026, will serve as Qualcomm’s first integrated AI server solution; the name refers both to the individual accelerator chip and to the rack-scale server built around it. It will be powered by Qualcomm’s custom Hexagon Neural Processing Unit (NPU), scaled up from the versions used in its mobile chips.

The AI250, coming in 2027, will be its next-generation successor — promising 10x more memory bandwidth than the AI200. Qualcomm says a third version is already planned for 2028, with annual upgrades expected thereafter.

Crucially, the company isn’t targeting AI model training (where Nvidia still dominates), but rather AI inference, the stage where already-trained models are run to generate outputs for users.

That’s where Qualcomm sees its edge: energy efficiency. Lower power draw means lower total cost of ownership (TCO) for massive server farms — a top priority as AI infrastructure costs skyrocket.
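To put the TCO argument in concrete terms, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (card price, wattage, electricity rate, service life) is a hypothetical placeholder chosen for illustration, not a Qualcomm, Nvidia, or AMD specification.

```python
# Illustrative total-cost-of-ownership (TCO) comparison for two hypothetical
# inference accelerators. All numbers are made-up placeholders used only to
# show how lower power draw flows through to operating cost.

HOURS_PER_YEAR = 24 * 365
ELECTRICITY_RATE = 0.08   # USD per kWh (hypothetical)
YEARS_IN_SERVICE = 5

def tco(card_price_usd: float, avg_power_watts: float) -> float:
    """Hardware cost plus energy cost over the service life of one card."""
    energy_kwh = avg_power_watts / 1000 * HOURS_PER_YEAR * YEARS_IN_SERVICE
    return card_price_usd + energy_kwh * ELECTRICITY_RATE

# Hypothetical "GPU-style" card: pricier and more power-hungry.
gpu_like = tco(card_price_usd=30_000, avg_power_watts=700)

# Hypothetical "efficiency-first" card, in the spirit of Qualcomm's pitch.
efficient = tco(card_price_usd=25_000, avg_power_watts=400)

print(f"GPU-style card 5-year TCO:        ${gpu_like:,.0f}")
print(f"Efficiency-first card 5-year TCO: ${efficient:,.0f}")
print(f"Savings per card:                 ${gpu_like - efficient:,.0f}")
```

Multiplied across the thousands of accelerators in a hyperscale deployment, even modest per-card energy savings compound quickly, which is the core of Qualcomm’s efficiency pitch.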

Flexible Access and Potential Partnerships

Unlike many chipmakers, Qualcomm isn’t forcing customers to buy its full server setups. Cloud and enterprise customers will be able to purchase:

  • Individual AI200 or AI250 chips
  • Partial server configurations, or
  • Full Qualcomm AI racks

Interestingly, even Nvidia and AMD could end up as customers — using Qualcomm’s chips in certain inference workloads, despite competing directly in the same market.

Learning From Past Lessons

This isn’t Qualcomm’s first attempt to enter the data center market. Back in 2017, the company launched the Centriq 2400 server processor with support from Microsoft (MSFT), but the project was shelved after legal battles and tough competition from Intel and AMD.

This time, Qualcomm appears more focused and better positioned. It already sells the Cloud AI 100 Ultra, a drop-in inference card for existing servers, but the AI200 and AI250 mark its first dedicated AI server systems, built from the ground up for next-generation inference workloads.

Diversifying Beyond Smartphones

The company’s expansion into AI data centers comes as part of a broader diversification strategy.

In its most recent quarter, Qualcomm reported $10.4 billion in revenue — $6.3 billion of which came from handsets. With smartphone sales plateauing globally, Qualcomm has been seeking new growth vectors across automotive chips, connected devices, and AI infrastructure.

If the AI200 series gains traction, data centers could become Qualcomm’s next major revenue stream — but it won’t be easy. Tech giants like Amazon (AMZN), Google (GOOG, GOOGL), and Microsoft (MSFT) are already designing their own AI chips in-house, while Nvidia and AMD maintain massive first-mover advantages.

WSA Take

Qualcomm’s bold jump into the AI hardware arms race is a defining moment for the company — and the market took notice.

A 20% stock surge shows investor confidence that Qualcomm might finally be ready to scale beyond mobile chips and compete in the trillion-dollar AI infrastructure economy. Its emphasis on energy efficiency and inference optimization could give it a niche edge against GPU-heavy rivals.

But make no mistake — this is a long game. Nvidia’s lead remains enormous, and cloud giants are increasingly building their own chips. For Qualcomm, success will hinge on execution, power efficiency, and pricing — not hype.



Disclaimer
Wall Street Access does not work with or receive compensation from any public companies mentioned. Content is for informational and educational purposes only.

