Bubble Vision

Admittedly, I'm exhausted with bubble talk. "Are we witnessing a dot-com redux?" "Is the 'Magnificent 7' a divine heptad?" "If I sacrifice a goat to Nvidia, do I secure immortality?" "Should I buy those damn Apple goggles?" "Is this time different?!" I don't have the answers. Neither does anyone else, contrary to what's implicit in the sundry pontification we euphemistically call "analysis." As BofA's Michael Hartnett put it in the latest installment of his popular weekly "Flow Show" series, "


10 thoughts on “Bubble Vision”

  1. When a bubble is inflating, I think it’s best to focus on 1) how to ride and/or survive it, 2) what might pop it and what the warning signs might be, and 3) your risk control for when the popping process starts.

    1) is pretty simple, at least to an unimaginative sort like me. You must have some exposure to the bubble trend. What and how much depends on your mandate.

    2) is the fun part to think about. Some watch the price action and technicals, others track the chatter and newsflow, others dig into the technology and the economics.

    3) is partly the reverse of 1), but it’s also about guessing the knock-on effects. When the AI bubble pops, what does that do to semicap, networking, fab construction, Bay Area real estate, electricity consumption, investor and consumer sentiment, capital markets names, tax revenues, money flows among asset classes, etc.?

    Myself, I’m trying to puzzle out the installed base of AI accelerator hardware. If the installed base of computing capacity is growing at X Gflops a quarter, and it takes Y Gflops × so many months to train a big LLM, and there are Z entities who will want to create big LLMs, when does the installed-base capacity start approaching the demand? What growth in inference demand is necessary to keep total training + inference demand well ahead of installed capacity? Where will that growth come from and who pays for it? How fungible is installed capacity: can it simply shift from training to inference, or will there be surplus capacity in one and a continued shortage in the other? What will be the effect of changes in Gflop/$ and Gflop/LLM, manufacturing capacity for AI accelerator hardware (die, packaging), etc.? It’s all a lot of fun to think about. I expect the realistic goal isn’t to actually predict the inflection point, either positive or negative, but to figure out a realistic range of expectations so that when the inflection arrives, you have a chance of recognizing it. (A rough sketch of this arithmetic follows the thread below.)

    1. JL – great to read someone acknowledge that LLMs and inference-based modeling have differences.

      I’m wondering: once you run an LLM to suss out the factors that matter most, will there be an ongoing need to analyze the whole data set when some of the most important factors change? Or will analysis of a much smaller data set be more than sufficient? (My guess is it’s the latter, but it has been a long time since I ran that kind of data analysis.)

      But if it is the latter, will you need the max power offered by the Nvidia chips? Or will chips from AMD and Intel suffice?

      I look forward to hearing what you dig up there.

    2. Excellent framework! Crypto had ASICs, and dedicated hardware for AI would dramatically reduce the cost of both training and inference – though more available compute should just allow for more development and usage.
      I’m mostly wondering who gets the recurring revenue and, more importantly, a sticky business.

    3. I really love watching you work. You are much better than most at finding your way to the end. I always told my students that whenever they found a proposition they believed to be true, they should ask two questions: why? and so what? I love to watch you attack those so-whats. It’s not an easy row to hoe. And with bubbles, it’s often not too pleasant. The biggest bubble on the planet is China, the country creating infrastructure for a population of 1.4 bil when the population is believed to be headed to about 0.6 bil or lower. What a mess.
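A minimal sketch of the back-of-envelope capacity-versus-demand arithmetic described in the first comment above. Every number, the quarterly time step, and the function name are hypothetical placeholders rather than estimates; the point is only the structure of the question: installed capacity growing each quarter versus training demand from some number of entities plus growing inference demand.

```python
# Illustrative sketch of the installed-base arithmetic from the comment above.
# All figures below are made-up placeholders, not estimates.

def quarters_until_capacity_meets_demand(
    installed_base_eflops: float,        # current installed accelerator capacity (exaFLOPs, sustained)
    capacity_growth_per_quarter: float,  # new capacity added each quarter ("X per quarter")
    training_runs_per_quarter: float,    # "Z entities" kicking off a big training run each quarter
    eflops_per_training_run: float,      # sustained capacity one big run occupies ("Y")
    inference_demand_eflops: float,      # current inference demand (exaFLOPs, sustained)
    inference_growth_rate: float,        # quarterly growth rate of inference demand
    max_quarters: int = 40,
) -> int | None:
    """Return the first quarter in which installed capacity exceeds total
    (training + inference) demand, or None if it never does within the horizon."""
    capacity = installed_base_eflops
    inference = inference_demand_eflops
    for quarter in range(1, max_quarters + 1):
        capacity += capacity_growth_per_quarter
        inference *= 1.0 + inference_growth_rate
        demand = training_runs_per_quarter * eflops_per_training_run + inference
        if capacity > demand:
            return quarter
    return None

# Placeholder inputs, chosen only to make the mechanics visible.
print(quarters_until_capacity_meets_demand(
    installed_base_eflops=10.0,
    capacity_growth_per_quarter=2.0,
    training_runs_per_quarter=5.0,
    eflops_per_training_run=1.5,
    inference_demand_eflops=4.0,
    inference_growth_rate=0.15,
))
```

Swapping in different assumptions shows how sensitive the crossover quarter is to the inference-growth term, which is the commenter’s point: the realistic goal is a range of expectations around the inflection, not a point forecast.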

  2. The possible AI bubble feels a bit like crypto, in that a computer technology is sold as an important tool that will change the way the world works. As regards crypto, we’re 10 years in and counting, waiting for the change to appear. AI is more computer technology that, in theory, will change ‘everything’. Will it? When?

  3. There probably is a bubble in the names that are tied to building out the hardware and infrastructure. The infrastructure will catch up, either through supply increasing via competitors, increased capacity, etc., or through decreased demand via more efficient models. Given that, I wouldn’t be buying Nvidia at these levels.

    I also don’t think proprietary data will be the moat that some people expect. Data is going to become even harder to put behind walls once it gets ingested by the blob.

    I’d look for names that either benefit from network effects or become part of the plumbing for generative AI. For example, Salesforce is what it is because it’s built an ecosystem of users and is now part of the plumbing for many of them. I’m not saying Salesforce is the right AI play, but I’d be looking for companies that can be the central platform companies build their AI structure around, similar to how Salesforce has done that for sales and support functions and become nearly impossible to rip out once you’ve gone through the implementation.

    As a side note, I’d still expect the Magnificent 7 to keep expanding profit margins. Engineering talent will become cheaper and cheaper. It’s no different than automating away factory jobs, and that talent is a big part of their costs. You still need some people to build and maintain the equipment, but the days of abundant, high-paying engineering jobs outside of a few specialty areas are going to be a thing of the past. Competition for startups will be cutthroat when everyone has access to cheap engineering talent (i.e., AI engineers).
