Tradeoffs

Don't worry about AI demand. It's robust now and will be for the foreseeable future. And that means


6 thoughts on “Tradeoffs”

  1. “I wouldn’t bet against them (cough cough Michael Burry)” said Su.

    Random thoughts: I’m not buying the 3-year shelf life of a data center. Sure, Moore’s law and all, but won’t prices gradually come down as companies find workarounds for cutting-edge GPUs? Maybe I’m oversimplifying, but would it be that hard to swap out some GPUs every few years for faster and cheaper chips?

  2. At some point, maybe someone will come up with a form of AI that fits on a home PC, rendering a centralized mass of GPUs in some data center irrelevant. And maybe crypto will be out of style, exposed for the sham it is.

    1. Of course they will. All products have life cycles, especially in tech. The trick is to see what to bet on: the technology itself (NVDA) or its hundreds of individual applications. NVDA is profitable and growing for now, but it will be replaced. As to applications: I won’t be involved in any of that, not on purpose at least. “…In the room the women come and go, talking of Michelangelo … ” None of the small applications I have seen has a business model capable of sustainable profit. NVDA has a model now, but it is wearing a monster “replace me” target on its back. I am old, so I get a typo or two every day. I see many more with AI. You can read about this in a book by the late (I believe) Igor Ansoff called Implanting Strategic Management, Prentice Hall, 1984, pp. 40-43. He shows how products and processes in technology divide themselves into two groups: 1) Turbulent Technologies, i.e., steam turns into diesel, turns into electro-motility, … and 2) Fertile Technologies, able to be used for numerous short-cycle products during the long evolution of the demand cycle. Betting wrong on this bifurcation can be fatal.

    2. We’re getting there, but efficient AI math requires a boatload of RAM and, at its most power efficient, also requires specialized chips and specialized bus stuff to make the former two talk faster. Meanwhile, foundation models keep pushing the boundaries of resource constraints.

      There will come a time when commodity PCs have enough AI acceleration and enough RAM to run older and smaller models. There may even come a time when the hyperscalers need to offload processing to the client and our PCs provide a helping hand to the much larger cloud-based model. (This will especially be true if MoE models continue their dominance.)

      To run the latest, greatest frontier model, however, it’s likely that we will continue to need dedicated, purpose-built hardware that’s naturally expensive (Moore’s Law stopped applying to RAM decades ago) and hyperscalers will continue to treat most of their weights and biases – the stuff the LLM is made of – as highly proprietary and not to be transmitted to the client.

      So: yes, for hobbyists, local inference is becoming a thing, but I doubt it will change the economics of the market at the margins, where all of the impressive, mass-appeal, take-my-job models hang out.

      As for crypto: I admire your optimism (on both counts, actually), but the truth is that there is already a variant available that uses essentially no energy, yet the obsession with Bitcoin persists. This fact has taught me that the mindset of the crypto aficionado is not only distrustful of governments, but of any concentration of power. The illusion of decentralization offered by proof-of-work models will continue to tempt these people for some time, I’m afraid to say.

      Interestingly, the hardware required for Bitcoin mining is fairly different from that required for AI. We have the AI craze to thank for soaking up a lot of the specialized-hardware fab capacity out there and at least steepening the capital onramp for crypto miners.
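      The RAM argument above can be made concrete with a back-of-envelope sketch. This is a minimal, illustrative calculation, not a benchmark: the model sizes and quantization levels below are assumptions, and it counts only the weights, ignoring KV cache and activations.

      ```python
      # Rough RAM needed just to hold model weights for local inference.
      # Illustrative assumptions: parameter counts in billions, weights
      # stored at a uniform bit width (quantization level).

      def weights_gib(n_params_billion: float, bits_per_weight: int) -> float:
          """GiB for the weights alone (no KV cache, no activations)."""
          bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
          return bytes_total / 2**30

      # A 7B model at 4-bit fits comfortably in a commodity desktop's RAM;
      # a 70B model at 16-bit does not.
      for params, bits in [(7, 4), (7, 16), (70, 4), (70, 16)]:
          print(f"{params}B @ {bits}-bit ≈ {weights_gib(params, bits):.1f} GiB")
      ```

      On these assumptions, a 7B model quantized to 4 bits needs only a few GiB, while an unquantized 70B model needs well over 100 GiB — which is roughly why older, smaller models run locally today while frontier models stay on dedicated hardware.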

  3. Well, unlike her more fancied, leather-jacket-wearing cousin, Lisa Su is not known to make bombastic prognostications lightly.

    It may be my endowment bias talking as a holder of AMD shares, but I am inclined to give more credence to her view on the outlook.
