Don’t worry about AI demand. It’s robust now and will be for the foreseeable future. And that means the beneficiaries of hyper-scaler spending will be raking it in for at least the next half a decade.
I know all of that because AMD told me so, and they’re as impartial vis-à-vis AI as Bill Pulte is vis-à-vis mortgages. And as Willy Wonka is vis-à-vis chocolate. And as the Keebler elves are vis-à-vis cookies.
“It’s not going to level off,” AMD CEO Lisa Su said, of the capacity buildout, while regaling investors in New York early this week. Su trafficked in Jensenian hyperbole. The AI landscape’s evolving at a pace “beyond anything I’ve ever seen,” she went on, and said of OpenAI’s spending commitments, “I wouldn’t bet against them.”
I wouldn’t either necessarily, but as detailed at some length in “Should Taxpayers Preemptively Bail Out OpenAI?” there’s something undeniably bizarre about a company with $13 billion in sales committing to spend $1.4 trillion.
Anyway, Su’s remarks were good for 8% or so on the stock. That’s ~$30 billion in market cap. Not bad for talking your book. She also managed to rinse away some of the aftertaste from CoreWeave’s rough session.
I realize the AI capex / mega-cap cash flow / buybacks tradeoff theme is a little repetitive by now, but it’s so critical that I feel compelled to highlight every “new” chart and every bit of incremental color that comes my way. With that in mind, here are a couple of visuals from Nomura:
That’s the tradeoff: Accelerating hyper-scaler capex on the left, and plummeting hyper-scaler FCF on the right. Not shown (and not yet known): The impact on buybacks.
As discussed and debated ad nauseam in these pages of late, the hyper-scalers will fund probably ~half of AI spending with debt, and notwithstanding that tech spreads have almost surely seen the tights, the biggest and best blue-chips are still able to borrow at rates that are in some cases as favorable as developed market sovereign borrowing costs.
Still, all of that new supply will invariably push spreads wider (spreads are a price, and like all prices, they're a function of supply and demand), and as Nomura's Charlie McElligott reiterated on Wednesday, "the FCF burn [will] negatively impact cash previously available" for buybacks.
“Over the past 15 years,” that free cash flow was “THE source of funds” for the corporate bid, he went on. The corporate bid was, in turn, the largest source of demand for US equities during most years.


"I wouldn't bet against them (cough cough Michael Burry)," said Su.
Random thoughts: I'm not buying the three-year shelf life of a data center. Sure, Moore's law and all, but won't prices gradually come down as companies find workarounds for cutting-edge GPUs? Maybe I'm oversimplifying, but would it be that hard to swap out some GPUs every few years for faster and cheaper chips?
At some point, maybe someone will come up with a form of AI that fits on a home PC, rendering a centralized mass of GPUs in some data center irrelevant. And maybe crypto will be out of style, exposed for the sham it is.
Of course they will. All products have life cycles, especially in tech. The trick is to decide what to bet on: the technology itself (NVDA) or its hundreds of individual applications. NVDA is profitable and growing for now, but it will be replaced. As for applications, I won't be involved in any of that, not on purpose at least. "…In the room the women come and go, talking of Michelangelo…" None of the small applications I have seen has a business model capable of sustainable profit. NVDA has a model now, but it is wearing a monster target ("replace me") on its back.

I am old, so I make a typo or two every day. I see many more with AI. You can read about this in a book by the late (I believe) Igor Ansoff called Implanting Strategic Management (Prentice Hall, 1984, pp. 40-43). He shows how products and processes in technology divide into two groups: 1) turbulent technologies, i.e., steam turns into diesel, turns into electromotive power, and so on; and 2) fertile technologies, able to be used for numerous short-cycle products during the long evolution of the demand cycle. Betting wrong on this bifurcation can be fatal.
We're getting there, but efficient AI math requires a boatload of RAM and, at its most power-efficient, also requires specialized chips and specialized interconnects to make the two talk faster. Meanwhile, foundation models keep pushing the boundaries of resource constraints.
There will come a time when commodity PCs have enough AI acceleration and enough RAM to run older and smaller models. There may even come a time when the hyperscalers need to offload processing to the client and our PCs provide a helping hand to the much larger cloud-based model. (This will especially be true if MoE models continue their dominance.)
To run the latest, greatest frontier model, however, it’s likely that we will continue to need dedicated, purpose-built hardware that’s naturally expensive (Moore’s Law stopped applying to RAM decades ago) and hyperscalers will continue to treat most of their weights and biases – the stuff the LLM is made of – as highly proprietary and not to be transmitted to the client.
So: yes, for hobbyists, local inference is becoming a thing, but I doubt it will change the economics of the market at the margins, where all of the impressive, mass-appeal, take-my-job models hang out.
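The RAM point above is easy to sanity-check with back-of-the-envelope math: inference memory is dominated by the weights, roughly parameter count times bytes per weight. Here's a rough sketch; the 20% overhead factor (for KV cache and activations) and the example model sizes are my own assumptions for illustration, not figures from the thread.

```python
def inference_memory_gb(params_billions: float, bits_per_weight: int,
                        overhead: float = 0.2) -> float:
    """Approximate RAM/VRAM needed to hold a model for inference.

    Weights dominate the footprint; `overhead` is a rough fudge factor
    for the KV cache and activations (an assumption, not a measurement).
    """
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 7B-parameter model quantized to 4 bits fits on a commodity PC:
small = inference_memory_gb(7, 4)        # roughly 4 GB
# A hypothetical 1-trillion-parameter frontier model at 16 bits does not:
frontier = inference_memory_gb(1000, 16)  # roughly 2,400 GB
```

This is why older, smaller, aggressively quantized models run locally today while frontier-scale weights stay in the data center: the gap is three orders of magnitude, not a generation of consumer hardware.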
As for crypto: I admire your optimism (on both counts, actually), but the truth is that there is already a variant available that uses essentially no energy, yet the obsession with Bitcoin persists. This fact has taught me that the mindset of the crypto aficionado is not only distrustful of governments, but of any concentration of power. The illusion of decentralization offered by proof-of-work models will continue to tempt these people for some time, I'm afraid to say.
Interestingly, the hardware required for Bitcoin mining is fairly different from that required for AI. We have the AI craze to thank for soaking up a lot of the specialized-hardware fab capacity out there and at least steepening the capital onramp for crypto miners.
Thanks for your reply!
Well, unlike her more fancied, leather-jacket-wearing cousin, Lisa Su is not known to make bombastic prognostications lightly.
It may be my endowment bias talking as a holder of AMD shares, but I am inclined to give more credence to her view on the outlook.