🧠 Neural Dispatch: The AI maths is broken, and a bubble that many pretend not to see is deflating

The biggest AI developments, decoded. November 26, 2025.

Hello! The numbers simply don’t add up. And that is enough to warrant a temporary departure from the standard Neural Dispatch format this week. On November 20, Nvidia reported numbers for the third quarter of fiscal 2026: record revenue of $57.0 billion, up 22% from Q2 and up 62% from a year ago. “Blackwell sales are off the charts, and cloud GPUs are sold out. Compute demand keeps accelerating and compounding across training and inference — each growing exponentially. We’ve entered the virtuous cycle of AI. The AI ecosystem is scaling fast — with more new foundation model makers, more AI startups, across more industries, and in more countries. AI is going everywhere, doing everything, all at once,” says Jensen Huang, founder and CEO of NVIDIA.

If you look at just this summary (as most of you would, with limited attention spans), you’d be impressed. But it was Wall Street that read between the lines, and the Nasdaq as well as the S&P 500 slid… and then kept sliding. A snapshot: the Dow was down 0.8%, the S&P 500 down 1.6%, and the Nasdaq down 2.2% on the day. There was much more to Nvidia’s earnings, which I’ll attempt to summarise to save you time (and to re-emphasise that calling this AI conversation a “bubble” wouldn’t exactly be out of place).
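As a quick sanity check on those growth figures, you can back out the implied prior-quarter numbers from nothing but the percentages quoted above (a sketch; only the $57.0 billion figure and the two growth rates come from Nvidia’s release):

```python
# Back out the implied prior quarters from Nvidia's stated growth rates.
q3_fy26 = 57.0          # Q3 FY2026 revenue, in $ billions
qoq_growth = 0.22       # up 22% from Q2
yoy_growth = 0.62       # up 62% from a year ago

q2_fy26 = q3_fy26 / (1 + qoq_growth)   # implied Q2 FY2026 revenue
q3_fy25 = q3_fy26 / (1 + yoy_growth)   # implied Q3 FY2025 revenue

print(f"Implied Q2 FY2026 revenue: ~${q2_fy26:.1f}B")   # ~$46.7B
print(f"Implied Q3 FY2025 revenue: ~${q3_fy25:.1f}B")   # ~$35.2B
```

In other words, roughly $10 billion of new quarterly revenue appeared in three months, and roughly $22 billion in a year. Which is exactly why the headline looks so good, and why the fine print mattered more.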
Last time on Neural Dispatch: Microsoft’s AI chaos, Perplexity’s moment, and Firefox’s cool approach

The bottom line is that the same pile of dollars is being circulated between different AI companies, each holding the other’s hand in the hope that the bubble isn’t discovered, with the cash counted as revenue at every corporate stop it makes on its journey. American investor and hedge fund manager Michael Burry made a rather blunt post on X after Nvidia’s earnings release. He wrote, “The idea of a useful life of depreciation being longer because chips from more than 3-4 years ago are fully booked confuses physical utilisation with value creation. Just because something is used doesn’t mean it is profitable.” He pointed out that airlines keep old planes around and in service, which come in handy during the festive-period rush, but they are only marginally profitable.

The reality is, Nvidia’s CFO had pushed back on this GPU depreciation critique, saying in a statement that the useful life of Nvidia’s GPUs is a significant total-cost-of-ownership advantage over rivals, and pointing to A100 GPUs shipped six years ago still being utilised at full capacity by customers. But it isn’t that simple. The A100 consumes as much as 3x more power per unit of compute (compute being measured in FLOPS, or floating-point operations per second) than the H100 that followed it, and the H100 in turn is approximately 25x less power efficient than Blackwell-generation chips. A debate is raging: should depreciation be 3 years, 5 years, or 7 years? Compulsion more than choice?

Do check out my other newsletter, Wired Wisdom: Gemini 3 is here, EA’s F1 realignment, and Windows 11’s agentic disaster waiting to happen

THINKING

“We’re doing a 500 megawatts, gigawatts… It’s going to cost eight bazillion trillion dollars.” With Elon Musk, it is difficult to know if he was genuinely confused, or if it was just an artificially induced fog.
But this was Musk, introducing xAI’s planned 500 MW AI data centre partnership with Saudi Arabia. And of course, it is powered by Nvidia, which is why CEO Jensen Huang was almost sweating when he said “stop it” as Musk stumbled between megawatt and gigawatt. Not a casual occasion to stumble, for a man many believe is the saviour of humanity (that mission, of course, will also be powered by Nvidia, but I digress).

The Context: That is the whole AI bubble, condensed into one beautifully unhinged exchange. They thought no one would notice in the cloud of big numbers and excitement. A CEO who’s raising tens of billions for compute doesn’t know (or pretends not to know) the difference between megawatts and gigawatts. And the CEO of the world’s most valuable semiconductor company understandably panics, because the quiet part, that no one really knows where this is going or how much it will cost, has just been said out loud at an event filled with sovereign wealth funds. Has “fake it till you make it” morphed into “build it till the grid collapses and hope the ROI eventually materialises” for the AI era?

A Reality Check: We find ourselves at a moment where AI companies are committing trillions of dollars to data centres without a clear business model beyond “AGI will pay us back at some point.” Power costs worldwide are going up, as is the demand for water. Something simply has to give, at some point. AI companies and startups are being funded with billions to be ready to buy GPUs that don’t exist yet, to train models nobody knows how to monetise, for customers who aren’t sure why they need them. Musk’s quote isn’t a joke: it’s as close as we’ll ever get to an accidental confession from the AI bros. The AI boom today is powered by physics, marketing and spreadsheets that print whatever number keeps the funding round alive.
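Since this whole argument keeps circling back to a handful of numbers, here is the arithmetic in one place. A sketch only: the SI prefix factors are standard, the efficiency multipliers are the ones cited earlier in this issue, and the $30,000 GPU price is a hypothetical round figure for illustration, not a quoted price.

```python
# 1) Megawatts vs gigawatts: the gap Musk blurred past is a factor of 1,000.
MEGAWATT = 1e6   # watts
GIGAWATT = 1e9   # watts
print(f"500 GW / 500 MW = {500 * GIGAWATT / (500 * MEGAWATT):,.0f}x")

# 2) Power per unit of compute, normalised to Blackwell = 1.
# Per the figures above: H100 is ~25x less power efficient than Blackwell,
# and A100 is ~3x worse again, i.e. ~75x worse than Blackwell.
h100_vs_blackwell = 25
a100_vs_blackwell = 3 * h100_vs_blackwell
print(f"A100 burns ~{a100_vs_blackwell}x Blackwell's power for the same FLOPS")

# 3) Straight-line depreciation under each proposed useful life.
# $30,000 per GPU is a hypothetical round figure, not an actual price.
GPU_COST = 30_000
for years in (3, 5, 7):
    print(f"{years}-year schedule: ${GPU_COST / years:,.0f} expensed per year")
```

The third computation is the entire accounting debate in miniature: stretching the schedule from 3 years to 7 more than halves the annual expense hitting the income statement, which is why the choice of useful life is anything but a footnote.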
Nobody has any idea what true power requirements will be (that will, after all, depend on usage, and no one knows that either), how many chips are actually needed, or what the returns look like. And yet everyone keeps buying compute because everyone else is buying compute. This is how bubbles form: through collective delusion wrapped in technical jargon. And you can’t blame Jensen for sweating, because this is a market built on curating expectations, and the worst possible thing is someone admitting they… don’t know what they’re talking about.

Neural Dispatch is your weekly guide to the rapidly evolving landscape of artificial intelligence. Each edition delivers curated insights on breakthrough technologies, practical applications, and strategic implications shaping our digital future. Written and edited by Vishal Mathur. Produced by Shad Hasnain.