- Nvidia is selling a huge number of very expensive GPUs.
- Jensen Huang expects these chips and other AI data center upgrades to cost $1 trillion over 4 years.
- Amazon, Google, Microsoft, and Meta will probably pay quite a lot of this bill.
Nvidia is selling boatloads of GPUs. Great!
Someone has to pay for these incredibly expensive chips. Not so good.
Nvidia's CEO just put a price tag on this, and it's a shocker. Late on Wednesday, Jensen Huang predicted that $1 trillion will be spent over the next four years on upgrading data centers for AI (GPUs being a big part of that). He even broke the figure down to an annual cost.
"There's about $1 trillion worth of data centers, call it, a quarter of a trillion dollars of capital spend each year," he said.
A lot of that bill will probably be paid by the leading cloud providers (aka hyperscalers) and other big tech companies, which are falling over themselves to dive into the generative AI race.
As I mentioned in July, tech giants are supposed to be spending less. With Nvidia's sales surging, both things can't be true at the same time.
The hyperscale cloud providers are Amazon, Microsoft, and Google. Then there's Meta, which is leaning hard into generative AI with Llama 2 and other AI models. How much cash do these companies have? Is it enough?
As of June 30, Amazon, Microsoft, Google, and Meta had about $334 billion in cash and cash equivalents. The AI data center bill, by Nvidia's estimate, comes to $250 billion in the next year and another $750 billion over the following three years. That's some tough math. Amazon and Meta look the most exposed here: Amazon has $41 billion in cash and Meta has $64 billion.
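For a rough sense of the gap, here's a back-of-the-envelope sketch in Python using the figures above. The even $250-billion-a-year split is Huang's own framing, and the combined Microsoft and Google figure is simply implied by the $334 billion total rather than broken out in their filings:

```python
# Back-of-the-envelope check: Big Tech cash on hand vs. Nvidia's
# projected AI data center spend. All figures (in billions of dollars)
# come from the article; the Microsoft + Google number is implied,
# not reported separately.

cash_on_hand = {
    "Amazon": 41,
    "Meta": 64,
    "Microsoft + Google (implied)": 334 - 41 - 64,
}

annual_ai_bill = 250     # Huang: "a quarter of a trillion dollars" per year
four_year_ai_bill = 1_000  # Huang: ~$1 trillion over four years

total_cash = sum(cash_on_hand.values())
print(f"Combined cash: ${total_cash}B")
print(f"Year-one bill: ${annual_ai_bill}B "
      f"({annual_ai_bill / total_cash:.0%} of combined cash)")
print(f"Four-year bill: ${four_year_ai_bill}B "
      f"({four_year_ai_bill / total_cash:.1f}x combined cash)")
```

Run it and the year-one bill works out to roughly three-quarters of the four companies' combined cash pile, and the full four-year bill to about three times it.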
I asked these companies on Thursday how much of this $250 billion annual bill they plan to pay. I didn't get any responses.
These big tech companies will generate more cash in the coming years. Still, this concern was on display during Nvidia's conference call on Wednesday.
"Jensen, the question for you is, when we look at the overall hyperscaler spending, that pie is not really growing that much," Vivek Arya , an analyst at Bank of America Merrill Lynch, asked the Nvidia CEO. "So, what is giving you the confidence that they can continue to carve out more of that pie for generative AI? Just give us your sense of how sustainable is this demand as we look over the next one to two years."
The CEO kind of skipped the question. He's enthusiastic. Who can blame him? Here's what he said, according to a transcript of the call from Sentieo:
"The world has something along the lines of about $1 trillion worth of data centers installed in the cloud and enterprise and otherwise. And that trillion dollars of data centers is in the process of transitioning into accelerated computing and generative AI. We're seeing two simultaneous platform shifts at the same time. One is accelerated computing, and the reason for that is because it's the most cost-effective, most energy-effective, and the most performant way of doing computing now. And so, what you're seeing, all of a sudden enabled by generative AI, enabled by accelerated computing, generative AI came along. And this incredible application now gives everyone two reasons to transition, to do a platform shift from general purpose computing, the classical way of doing computing, to this new way of doing computing, accelerated computing.
"There's about $1 trillion worth of data centers, call it, a quarter of a trillion dollars of capital spend each year. You're seeing the data centers around the world are taking that capital spend and focusing it on the two most important trends of computing today, accelerated computing and generative AI. And so, I think this is not a near-term thing. This is a long-term industry transition, and we're seeing these two platform shifts happening at the same time."