Some See Dot-Com Ghost In OpenAI-Nvidia Deal

Every week, there’s another AI capex boom story. Usually two or three.

In the eyes of markets smitten with sundry “new industrial revolution” narratives, the prevalence of such news appears to validate the hyperbole which typifies a Jensen Huang press release.

This week was certainly no exception. On Monday, Nvidia and OpenAI announced a $100 billion data center deal aimed at arming Sam Altman with the infrastructure, power and compute he’ll need to train and deploy next-gen models. For its trouble, Nvidia will get equity in OpenAI, which in my judgment will be the most valuable company in the history of the world within five years, if not sooner (it’s already worth half a trillion). And orders. Nvidia will also get orders.

Huang hailed a great leap forward (common noun) while touting the prospective deployment of 10 gigawatts to power the data centers, which Bloomberg helpfully noted is “equivalent to the peak electricity demand of New York City.” Huang was unequivocal and immodest: “This is the biggest AI infrastructure project in history,” he said.

So, not just a big deal. The biggest deal. And yet, it felt a bit recursive to me. I wasn’t quite sure how best to communicate that, so I held off. On Tuesday, Nomura’s Charlie McElligott captured it beautifully, calling the announcement a “further iteration” of what looks, increasingly, like a circular funding strategy.

These entities — the hyper-scalers and the AI companies — are blurring the line between AI revenue and capex, “rehypothecating and shell-gaming one another’s billions back and forth” as McElligott put it, describing the maneuvering as “at least somewhat reminiscent of late-stage dot com-era vendor financing.”

If you’re old enough to remember 2000 — peak TMT bubble vendor financing — you know that’s not a flattering comparison. Vendor financing is exactly what it sounds like: You loan money to your customers so they can buy your equipment. The biggest problem with that from the perspective of investors is that it can, over time, make it very difficult to discern the actual trend in demand. By the time you figure out organic demand’s flagging, it’s too late.
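The masking effect is easy to see with a toy model. The numbers below are entirely hypothetical, invented only to illustrate the mechanism: reported revenue keeps climbing even as cash-paying, organic demand rolls over, because the vendor is increasingly financing its own sales.

```python
# Toy illustration (hypothetical numbers): vendor financing can keep reported
# revenue growing even as organic, cash-paying demand rolls over.

organic_sales = [100, 110, 105, 90, 70]    # demand from customers paying cash
vendor_financed = [0, 5, 20, 45, 80]       # sales funded by loans to customers

# Reported revenue is the sum -- it looks healthy throughout.
reported = [o + v for o, v in zip(organic_sales, vendor_financed)]
print(reported)  # [100, 115, 125, 135, 150]

# The share of revenue the vendor is effectively funding itself.
financed_share = [round(v / r, 2) for v, r in zip(vendor_financed, reported)]
print(financed_share)  # [0.0, 0.04, 0.16, 0.33, 0.53]
```

By the final period, more than half of reported revenue is self-funded, yet the top line alone shows nothing but growth. That's the discernment problem for investors.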

There are, of course, any number of caveats when it comes to drawing a parallel between now and then. McElligott mentioned a few. “The current iteration is fueled primarily by cash, not debt and dot-com VF was [in the service of perpetuating] something without a money-making business model,” he wrote.

Caveats aside, it’s getting harder for the average investor to track the billions here. Have you noticed that? Because I certainly have. This needs to be monitored more closely. If the sum total of AI capex outlays and associated commitments balloons such that it bears no resemblance to the free cash flow generated by the relevant entities, that’ll suggest the “revolution” is being funded in at least some cases by daisy-chained billions, some of which aren’t even real yet (a lot of this is based on assumptions about future revenue).

To be clear: I’m not suggesting — and neither is McElligott — that the current setup’s analogous in a strict sense to dot-com-era VF. All anyone’s saying is that, as Charlie put it Tuesday, “you’d think we’d see at least some increasing discomfort with the ongoing equities melt-up to fresh all-time highs,” given how heavily dependent it is on the AI capex narrative.

Instead, market participants continue to chase the dragon, in part because, as McElligott pointed out, pervasive macro skepticism left the right-tail “too cheap, for too long.” Now, two-dozen new SPX records later, people are finally grabbing for upside in that (formerly) cheap right-tail optionality.

The figure on the right, below, shows you that grab for crash-up / melt-up hedges. Call skew’s damn near 90%ile on a one-year lookback.

The figure on the left, above, suggests we’re venturing into “spot up, vol up” territory, which can be dicey, albeit a lot of fun.

“Within equities index vol, we are seeing SPX options skew absolutely swan dive off a cliff with an impulse in upside demand relative to downside hedges, with call skew knee-jerking to the steepest levels since the start of the year as OTM upside stay[s] bid while put skew rolls over as downside hedges bleed out into this relentless push higher,” McElligott went on.

In a note laying out both the bull and bear case for Nvidia in the context of the OpenAI deal, Vital Knowledge said skeptics are growing more concerned that Nvidia is “paying to prop up its biggest and most prominent customers” with the associated cash outlays destined to “make a round-trip back to Nvidia to purchase chips.”


19 thoughts on “Some See Dot-Com Ghost In OpenAI-Nvidia Deal”

  1. I can’t stop seeing the Cisco/Nvidia parallels. Cisco was the infrastructure for the information superhighway; they were selling equipment until they weren’t, and then they were throwing it away. Nvidia is the infrastructure behind the AI revolution, and at some point, unless they are able to massively improve their offering, the upgrade cost won’t be worth the investment.

    And I also see parallels between the information superhighway and the AI revolution. Both are/were revolutionary, but being revolutionary doesn’t equate to being profitable. The invention of AC power generated practically no profit, and yet it was also revolutionary. With so much competition, and with the ability of foreign competitors to challenge domestic producers, it seems highly unlikely that any of this investment will ever see ROI.

    That is, unless we (the government) allow consolidation into a monopoly and then a ban on all foreign LLM providers, which might artificially lift prices via the abolition of free markets. Now that doesn’t seem like something the GOP would do, does it?

  2. NVIDIA promises to invest $10 billion per gigawatt of capacity built, starting 2H next year. Each gigawatt of capacity built using NVIDIA Rubin servers will generate more than $10 billion for NVIDIA in revenue and profit. NVIDIA’s balance sheet is so strong that a $10 billion per gigawatt investment is peanuts for them. Also look at RPO for Coreweave, Nebius, NScale, and Oracle Cloud Infrastructure, which demonstrates a growing shortage of compute capacity. So those still floundering in clever Cisco comparisons continue to miss the life-changing generational investment opportunities that we are blessed with. NVIDIA forward P/E is just below 30. Overthink and U miss it.

  3. Whereas it’s normally the common man fighting with his peers to get a leg up on the other, we now see FOMO catch hold among the greatest of the greatest: Big tech CEOs.

    Fireworks ensue, with mind boggling capex for this and that project, projects being torn down mid-construction for an even bigger version of the original. 100s of millions paid to individual engineers as recruitment incentives, not for a job well done, hoping to gain an edge on the rest.

    Good times for all, including Big tech CEOs who secretly enjoy splurging on hobbies they are quite sure won’t pay off — but the competition does the same, and they definitely don’t want to be lame old Tim Cook, or pay dividends to shareholders (boring).

    In a few years these will still be leading businesses, just much more indebted and with a bit more D&A and impairments in the P&L.

  4. Paging JL!

    I cannot recall the “eyewatering” (Good one, Deal Leader?) gap between what OpenAI will owe to Oracle to pay for datacenter capacity and their ability to pay cash money for it. They will owe ORCL multiples of their forecast profits? Yes, but make that multiples of current annual revenues. I’ve seen estimates that ORCL will be lucky to be paid 33% of the total.

    When CSCO ruled the earth, Wired Mag (I think) quoted someone saying that the sales growth implied by forward estimates of Cisco’s revenue growth would mean that “by 2010, every man, woman and child on the planet will be employed by Cisco.”

    “Instead, market participants continue to chase the dragon.” Ah, a throwback Tuesday reference to the Vietnam war?

  5. Seems similar to MSFT investing in OpenAI and OpenAI returning the money as revenue to MSFT, though – partly answering @derek – some of NVDA’s money presumably goes to ORCL on its way back to NVDA. I still think MSFT got the better deal, what with the exclusivity and revenue share.

  6. The broader problem is: are you building capacity for compute that is not needed if models become more efficient? NVIDIA can get away with that, but the buyer of the capacity has to sell it to someone to make money.

    Remaining Performance Obligation (RPO) is a measure of booked but not yet delivered revenue, which also carries a cost. So OpenAI is betting that the revenue booked will materialize at some point in the future to justify the upfront cost now.

    1. Great point. Also, OpenAI was first to market, but they are not innovating the technology. Model Context Protocol (MCP), one of two standards Anthropic created, enables LLMs to use tooling in a standardized way. As the early adopter of this solution, Anthropic has been able to leapfrog OpenAI and provide more capable LLMs to their customers. MCP makes LLMs more capable (think “go book me a hotel at X site”) but also more efficient at executing very repeatable things.

      It’s only a matter of time before the first-mover advantage is overtaken by tooling that out-innovates rather than merely hunting for cost optimizations.

  7. Woke up this morning thinking there has to be a name for this type of, for lack of a better word scam. Circular funding strategy doesn’t sound quite right, how about “two companies one cup”?

  8. I seem to lack the imagination to understand how this great AI revolution is going to make our lives so much better. How are they going to make all this money? At least with the dot com boom it was clear there was going to be a new, easier way to sell us stuff. But what is AI going to get us? Better fake videos and pictures? It will do all our writing for us? Medical improvements that most of us won’t be able to afford?
    How are they going to sell all this ‘intelligence’ back to us, and what kind of work are we going to be doing in that world to pay them for it?

    1. Yep. But making our lives better is not the point of AI. The whole premise of most LLM pitches is that it will enable companies to expand margins by reducing staff numbers, either by “rightsizing” (the ghost of Al Dunlap resurrected!) or by reducing new hiring. This is America! Everyone and EVERYTHING has to pay its own way. End of story.
