Is The A.I. Bubble In Its ‘Later Innings’?

There's palpable tension in Wall Street research when it comes to discussions around 2023's A.I. frenzy. On one hand, nobody wants to dismiss the technology's potential. On the other, there's considerable froth in the market, and it feels a lot like a bubble on some days. You can make the case (SocGen has) that were it not for the A.I. hype, US equities would be flat for 2023, at best. In a new note, Morgan Stanley's Edward Stanley and Matias Ovrum tried to strike a balance. "We have demonstra


11 thoughts on “Is The A.I. Bubble In Its ‘Later Innings’?”

  1. Now every company claims to be an “AI company” on its earnings call. Reminds me of earlier narratives about being a “cloud” or “blockchain” company. Examples of companies actually using AI, as opposed to just looking at it, are scarce. That will follow, some day.

    The bubble may be Nvidia and the whole Large Language Model (LLM) concept. (The computing power and energy required to run many LLMs eclipse even crypto mining!) The question is: just how many end users actually need an LLM? Does a call center looking to automate client interactions really need a model trained on rainfall data over the past 100 years in Bosnia or monthly sales of legumes in India?

    More focused “edge AI” apps and models will probably be sufficient for most actual users. Intel is already offering specialized edge AI chips, and AMD has promised its own line of edge AI chips by the end of this year.

    You don’t need 5,000 horsepower to run a simple automated call center model when 50 horsepower is sufficient (a rough sketch of the idea follows below). The notion that Nvidia has a wide, insurmountable moat is open to question.
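
    For illustration only, a rough sketch of that “small model” idea: a compact, off-the-shelf classifier routing call-center requests on ordinary hardware, with no frontier-scale LLM or specialized accelerator involved. The model name and intent labels below are hypothetical examples, not anything endorsed in this thread or offered by the companies mentioned.

      # Hypothetical sketch: route call-center requests with a compact zero-shot
      # classifier instead of a frontier-scale LLM.
      from transformers import pipeline

      # A few-hundred-million-parameter model that runs acceptably on a CPU.
      classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

      INTENTS = ["billing question", "cancel service", "technical support", "speak to a human"]

      def route_call(transcript: str) -> str:
          """Return the most likely intent for a caller's request."""
          result = classifier(transcript, candidate_labels=INTENTS)
          return result["labels"][0]  # labels come back sorted by score, highest first

      print(route_call("My last invoice looks twice as high as usual."))  # likely "billing question"

    Whether something this small actually covers a given call center’s needs is an open question, but it is the kind of right-sizing the horsepower analogy points at.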

    1. In my opinion, NVDA doesn’t even have an insurmountable moat in LLMs. By most accounts, AMD’s hardware is competitive, and the industry’s desire for multiple sources is such that software support for AMD is building rapidly. While no one expects NVDA to lose market share leadership, the price/margin implications of 70% share are different from those of 100% share.

      1. Yes, Saint Lisa kept saying that on the earnings call, but folks in the deep tech world are scratching their heads. AMD should be able to reconfigure their GPUs to work with LLMs, but their edge AI offerings are still at the promised stage. Apparently, they first tried a new packaging format which failed, so they had to scramble to replace it. That’s what should start hitting the market late this year, if they can get enough capacity from TSMC.

  2. The AI race is on, and tech companies are spending big bucks on hardware and its implementation. It’s the hardware manufacturers and software enablers that will be the initial beneficiaries, and those spending on AI will bear the costs in this initial phase. There may be a price bubble with respect to Nvidia and some of its suppliers, but AMD and potentially other stocks seem to be trading on reasonable earnings multiples. AI is here to stay for sure and has tremendous productivity benefits that will play out over the long term.

      1. You make a good point regarding productivity. However, AI’s impact on productivity may be analogous to that of computers in the ’80s: there weren’t measurable productivity gains in that decade, but productivity surged in the ’90s, once the technology was commonplace and fully integrated into the value chain. The timescale may be different this time around for AI, but there may still be a lag in how AI shows up in productivity.

        1. Agreed. AI will be a cost center for many users for a while, but I suppose every listed company will have to throw money at it due to FOMO and to satisfy shareholders and analysts.

          But thinking a step further based on that Wired magazine link I posted above, the recent postponement of system & data security outlays looks really stupid, as does the decline in those stocks. What do you think, John Liu?

          1. I think LLMs have great business application potential, not so much on their chatty hallucinating own, but as front-end interfaces to other, user-unfriendly systems that have the relevant information. This will be both public-facing (think call center, helpdesk) and internal corporate (think assistant, data analyst, junior).

            Companies still have information scattered over dozens of databases accessed by many different applications, with different interfaces and data formats. Humans spend a lot of time finding and pulling data from the various applications, including bringing Marge her favorite donuts and wheedling her to please run that report that only she knows how to run. I think LLMs, with the right prompt engineering and connected to all those applications, could go a long way toward making data accessible and useful, internally and externally (a minimal sketch of that pattern follows at the end of this comment).

            In the sea of corporate cubicles filling the typical office tower floor, I imagine at least one out of twenty people can be replaced outright, and five out of twenty can be made more productive so that one of them can be eliminated; that’s two of twenty, or 10% of headcount. Who wouldn’t like to cut the wage bill by even 10%?

            I don’t think it will take decades for that to happen; several years, maybe. By that time, I’m pretty sure those LLMs won’t be gigantic trillion-parameter, supercomputer-devouring models, but slimmer purpose-built models focused on the corporate tasks and data systems actually relevant to their users, rather than the collected works of 4chan.

            I do think those models will mostly run in clouds, simply because no one wants to put too much computing power in every cubicle, and all the incumbent and hopeful cloud providers need to build out that infrastructure. So do the application vendors; before long, the idea of getting trained on some arcane user interface to correctly complete a bunch of fields and parameters is going to be a non-starter. So I think demand for AI accelerators will remain very strong for the time being, even as that demand broadens out beyond just the most powerful hardware.

            Does the average end-user company need to go out and buy a bunch of H100s? No. But they need to think about the security implications of making data easily and rapidly accessible online, to the public and to Marge.

            So, yes, I think cutting the security spend is stupid. I think, and hope, that it’s mostly just CIOs taking a couple of quarters to do that thinking. I’m buying some of those names on the drop.
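
            A minimal, hypothetical sketch of the pattern described above: an LLM used as a natural-language front end routed to existing internal systems. Here call_llm() is a placeholder for whatever hosted or self-hosted model a company uses, and the SQLite table stands in for one of those scattered databases (the report only Marge knows how to run). None of the names come from this thread; they are assumptions for illustration.

              import json
              import sqlite3

              def get_monthly_sales(region: str) -> list[dict]:
                  """Thin, deterministic wrapper around an existing reporting database."""
                  conn = sqlite3.connect("sales.db")  # hypothetical internal database
                  rows = conn.execute(
                      "SELECT month, total FROM sales WHERE region = ?", (region,)
                  ).fetchall()
                  conn.close()
                  return [{"month": m, "total": t} for m, t in rows]

              TOOLS = {"get_monthly_sales": get_monthly_sales}

              def call_llm(prompt: str) -> str:
                  """Placeholder: send a prompt to an LLM and return its reply.
                  In practice this is an API call or a locally hosted model."""
                  raise NotImplementedError

              def answer(question: str) -> str:
                  # 1. Ask the model which internal tool (if any) answers the question.
                  routing = call_llm(
                      'Reply with JSON {"tool": ..., "args": {...}} choosing from '
                      f"{list(TOOLS)} to answer: {question}"
                  )
                  choice = json.loads(routing)
                  # 2. Run the ordinary wrapper against the real system of record.
                  data = TOOLS[choice["tool"]](**choice["args"])
                  # 3. Let the model turn the raw rows into a plain-language answer.
                  return call_llm(
                      f"Question: {question}\nData: {json.dumps(data)}\nAnswer briefly."
                  )

            The point of the sketch is the shape of the thing: the LLM only translates between people and systems, while the actual data access stays in ordinary, auditable code, which is also where the security questions above have to be answered.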

          2. Yes, reducing spend on security is really stupid, but the C-suite likely sees system and data security as a cost center as well, for better or worse, and is probably taking security measures as much for the optics of being prepared as for the actual security. Moreover, as entities, organizations are quite bad at understanding and projecting data and system security threats; despite vigilant security staff, most companies still aren’t prepared to prevent even current “run of the mill” intrusions into their systems. (In a past life, I spent some time interviewing CIOs and data security pros. They were quite focused on, and aware of, the risks and how to mitigate them; selling that to the CEO and board and getting funding, however, was a very different story, one that always boiled down to the business case for the investment.)

            For a tortured metaphor, I’d compare it to the person driving their car with only liability insurance (“I’m a good driver, I won’t crash my car; I don’t need collision insurance”), forgetting about all the other drivers on the road.

          3. Thanks JL –

            Some good ideas and thoughts, as always, which is why I pinged you.

            Two sorta contrary points:

            1) Your narrative about disparate, unstructured databases is the same pre-ChatGPT argument that advocated for SNOW, DDOG and PLTR. (Which should remind us that AI is evolutionary rather than revolutionary.)

            2) Check out the opening remarks at the Black Hat USA Conference, which just kicked off in Las Vegas. https://www.scmagazine.com/editorial/news/black-hat-usa-keynote-in-ai-do-not-trust

  3. IMO the AI narrative is less a causal factor and more a function of price action in search of a narrative, leading the talking heads in the box to come up with the AI myth. AI isn’t some new revelation, and it has minimal immediate new applications that will materially affect P&Ls/profitability on an aggregate scale. Let’s remember that the general public has been using AI ever since Microsoft introduced spellcheck to Word a century ago, or last century, whichever; the point remains that AI has been in mass production for decades. AI is not some new phenomenon that had a miraculous breakthrough earlier this year, and there are meaningful limitations on AI around compute capacity and cost. Current compute is too expensive and too capacity-constrained to meet the demands of the vision all the AI evangelists are spreading. There are some very cool companies pursuing early-stage technologies that could redefine our aggregate compute capacity over the next decade or so. But current compute capacity suppliers and LLM “factories” have already started to limit/throttle access.

    The price action since October, IMO, is more likely late-cycle/late-stage crowding into the seven companies that will theoretically survive and thrive coming out of the current reset. Late-stage crowding happens every time a cycle comes to an end.

    Let’s remember that the GFC cycle peaked and started to reset in 2006 and didn’t resolve until 2009. It was a four-year process, and as late as summer 2008 everyone was convinced we were in for a soft landing, that the worst was behind us and the rocket-ship ride was about to resume. Sound familiar?

    So if March/April 2022 was the peak of this cycle, we have a long way to go before we’ve resolved the challenges in front of us.

    In short, I agree with Wilson and Kolanovic.
