Qualcomm Enters AI Data Centre Chip Wars With Bold Inference Play
The AI chip wars have a new challenger. Qualcomm, best known for powering billions of smartphones, has unveiled its first AI data centre chips in a direct bid to crack the lucrative inference market — a space where Nvidia has long dominated and AMD has carved out a growing share.
On October 28, 2025, Qualcomm introduced two rack-scale systems — the AI200 and AI250 — designed specifically for inference workloads. The announcement sent Qualcomm’s stock soaring 11%, as investors bet that even a modest slice of the booming AI infrastructure market could reshape the company’s future trajectory.
A Strategic Pivot Beyond Smartphones
For decades, Qualcomm has been synonymous with mobile technology. But with smartphone growth stagnating, CEO Cristiano Amon is steering the company toward AI infrastructure. The pivot is backed by a multi-billion-dollar partnership with Humain, a Saudi state-backed AI firm, signaling Qualcomm’s intent to compete head-on with entrenched rivals.
Two Chips, Two Bets
- AI200 (2026 launch): A pragmatic entry point, featuring 768 GB of LPDDR memory per card. This design targets today’s memory-hungry large language models and multimodal AI applications, with a focus on lowering total cost of ownership (TCO).
- AI250 (2027 launch): A more ambitious leap, introducing near-memory computing architecture that promises 10x higher effective memory bandwidth. If successful, this could eliminate one of the biggest bottlenecks in AI inference — memory speed.
“With Qualcomm AI200 and AI250, we’re redefining what’s possible for rack-scale AI inference,” said Durga Malladi, SVP at Qualcomm Technologies.
Economics Over Raw Power
Performance specs matter, but the real battle in AI infrastructure is fought on spreadsheets. Qualcomm’s systems emphasize efficiency:
- 160 kW per rack with direct liquid cooling
- PCIe for scale-up within a rack, Ethernet for scale-out across racks
- Confidential computing baked in for enterprise-grade security
These features aim to reduce operational costs while meeting enterprise demands for flexibility and data protection.
The Saudi Connection
Qualcomm’s partnership with Humain is more than symbolic. Humain has committed to deploying 200 megawatts of Qualcomm AI systems, a deal worth an estimated $2 billion in revenue. While smaller than AMD’s $10 billion Humain deal, it validates Qualcomm’s entry into the market before its products even ship.
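The article’s two headline numbers — 200 megawatts of total deployment and 160 kW per rack — allow a rough sanity check of the deal’s scale. The sketch below is illustrative only: real deployments carry cooling, networking, and facility overhead that this simple division ignores, and the $2 billion figure is the article’s estimate, not a confirmed contract value.

```python
# Back-of-envelope estimate from two figures quoted in the article:
# 200 MW total deployment (Humain deal) and 160 kW per rack (Qualcomm spec).
# Illustrative only; ignores cooling, networking, and facility overhead.

TOTAL_DEPLOYMENT_KW = 200_000   # 200 megawatts, expressed in kilowatts
RACK_POWER_KW = 160             # power draw per rack, per Qualcomm's spec
DEAL_VALUE_USD = 2_000_000_000  # estimated deal revenue from the article

racks = TOTAL_DEPLOYMENT_KW / RACK_POWER_KW
revenue_per_rack = DEAL_VALUE_USD / racks

print(f"Implied rack count: {racks:,.0f}")            # 1,250
print(f"Implied revenue per rack: ${revenue_per_rack:,.0f}")  # $1,600,000
```

On these assumptions the deal works out to roughly 1,250 racks at about $1.6 million of revenue each, which gives a sense of why even a "smaller" deal than AMD's is material.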
Amon framed the collaboration as a global play:
“Together with Humain, we are laying the groundwork for transformative AI-driven innovation that will empower enterprises, government organisations and communities worldwide.”
Software Matters Too
Hardware alone won’t win the war. Qualcomm is also rolling out a developer-friendly software stack:
- AI Inference Suite
- Efficient Transformers Library
- One-click deployment for models from Hugging Face
This approach aims to reduce integration friction and accelerate enterprise adoption.
David vs. Two Goliaths
Qualcomm faces formidable rivals:
- Nvidia, valued at over $4.5 trillion, with a deeply entrenched ecosystem.
- AMD, whose shares doubled in 2025 thanks to aggressive AI infrastructure deals.
Analysts, however, see room for multiple winners. Timothy Arcuri of UBS summed it up:
“The tide is rising so fast, it will lift all boats.”
Outlook
Qualcomm’s late entry means an uphill climb, but its focus on inference optimisation, energy efficiency, and TCO could resonate with enterprises seeking alternatives to the Nvidia-AMD duopoly. With the AI200 set for 2026, the next chapter in the AI chip wars is already being written.