Qualcomm Puts a Hit Out on Nvidia and AMD... Bringing a Full Rack of Trouble to the AI War

Shots fired. Clip unloaded. 

Qualcomm just put a mob hit out on Nvidia and AMD.

This morning, the chipmaker that’s been running your phone since Bill Clinton was “mentoring” interns… decided it was tired of playing second fiddle. 

Qualcomm unveiled its AI200 and AI250 data center chips, built to challenge Jensen Huang’s GPU kingdom and AMD’s “please love us too” AI lineup. And judging by the stock’s 12% bump, investors are here for it.

For decades, Qualcomm’s been the quiet kid in the back of class… acing every test while Apple and Samsung took all the credit. But now it’s ready to play ball in the big leagues… the $6.7 trillion AI data center race.


(Source: The Verge)

Cristiano Amon (Qualcomm’s CEO) wants it made known that these new chips aren’t for your phone… they’re for entire liquid-cooled server racks, each pulling 160 kilowatts of juice (make of that what you will). The AI200 drops in 2026, followed by the AI250 in 2027, with a third chip coming in 2028 (assuming things go well of course).

And Qualcomm’s not shy about the target. These new chips are built for AI inference… the part where models actually run, not train. In plain English: they’re made to do the thinking, not the learning. Translation? Faster, cheaper, and supposedly more efficient than Nvidia’s power-hungry chips. (Keyword: supposedly.)

If this sounds familiar, you’re remembering 2017, when Qualcomm tried to crash the data center scene with its “Centriq” platform. But that plan went the way of Zuck’s metaverse dream… This time, they’ve come prepared (or so they say). The Hexagon NPUs (neural processing units) that power its smartphones are now being scaled up for full-rack AI servers… which, if they play their cards right, could make Qualcomm the first real threat to Nvidia’s 90% market share in AI chips.

Durga Malladi, Qualcomm’s head of data centers, isn’t lacking any confidence: “Once we built our strength elsewhere, it was easy for us to go up a notch into the data center level.” (Someone might need to remind him this is Nvidia they’re dealing with, not Toshiba.)

The funniest part is that Qualcomm might actually sell parts to its enemies. They’re offering “mix and match” server components… meaning Nvidia and AMD could end up buying Qualcomm’s CPUs to stick inside their AI racks. That’s like Nike asking Adidas for shoelaces… but hey, margins come first.

Let’s call it what it is: Qualcomm’s trying to escape its $6.3B handset dependency and grab a piece of the data center boom. With OpenAI, Microsoft, and Google all rolling out custom silicon, every tech giant wants in on the action before the next GPU shortage makes them beg Jensen Huang for scraps.

If Qualcomm’s power-efficiency claims hold up (and that’s a big “if”), it could become the go-to option for cloud providers drowning in Nvidia’s electric bills. For now, Wall Street’s in full “new chip smell” mode… and Qualcomm’s finally getting the love it’s been craving since the iPhone 4.

At the time of publishing this article, Stocks.News holds positions in Apple, Microsoft, Google, and Meta as mentioned in the article.