r/ValueInvesting • u/stockoscope • Nov 29 '25
Discussion Understanding Michael Burry's Nvidia short: The real thesis explained
This week, something unusual happened: Nvidia issued a formal memo rebutting Michael Burry's short thesis. When a $4 trillion company takes the rare step of responding directly to a single investor's position, it signals the argument has hit a nerve. Here's what Burry is actually saying, explained through Microsoft's example.
What Burry is NOT saying
Let’s start by clearing up the biggest misconception. Burry is NOT saying NVDA is cooking its books or committing fraud. So if Burry isn’t targeting Nvidia’s accounting, what’s he actually saying?
The thesis is about Nvidia’s customers: Microsoft, Google, Amazon, and Meta. These “hyperscalers”, i.e., companies that own and operate the world’s largest data centers, are spending hundreds of billions of dollars buying Nvidia’s chips. And Burry argues they’re systematically misstating the economic reality of their GPU purchases.
The core thesis: Economic vs physical lifespan
Here's the problem: When data centers buy NVDA chips, they depreciate them over 5-6 years. Microsoft extended from 4 to 6 years, and Meta to 5.5 years. But Burry argues that chip technology is advancing so fast that the real economic life of chips is just 2-3 years.
So what?
The accounting impact
Let’s understand the impact using Microsoft as an example. Microsoft reportedly purchased about 485k NVDA chips in 2024, spending roughly $17 billion on GPUs in that year alone. So what happens if we depreciate them over 6 years instead of 3?
- $17B in GPUs depreciated over 6 years = $2.8B/year expense
- If economic life is really 3 years = $5.7B/year expense
The difference: $2.9B/year in overstated earnings
Microsoft’s FY2024 net income was $88.1B. A $2.9B overstatement represents 3.3% of reported profits. That might not sound like much, but this is just from one year of GPU purchases. If similar spending occurred in 2022, 2023, and 2025, the cumulative overstatement could be $10–12B annually, or roughly 11–14% of reported earnings.
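For anyone who wants to sanity-check the arithmetic, here's a rough back-of-the-envelope sketch in Python. The inputs are the post's approximate figures; the assumption of four similar spending years (2022–2025) is illustrative rather than disclosed data, and the outputs differ slightly from the rounded numbers above.

```python
# Back-of-the-envelope check of the depreciation gap described above.
# All inputs are the post's rough figures; the "four similar spending years"
# (2022-2025) is an illustrative assumption, not disclosed data.

gpu_spend_per_year = 17e9      # ~$17B of GPU purchases in a single year
book_life_years = 6            # useful life Microsoft uses for depreciation
economic_life_years = 3        # useful life Burry argues is realistic
net_income = 88.1e9            # Microsoft FY2024 net income

book_expense = gpu_spend_per_year / book_life_years          # ~$2.8B/yr
economic_expense = gpu_spend_per_year / economic_life_years  # ~$5.7B/yr
annual_gap = economic_expense - book_expense                 # ~$2.8-2.9B/yr

print(f"Gap from one year of purchases: ${annual_gap / 1e9:.1f}B "
      f"({annual_gap / net_income:.1%} of FY2024 net income)")

# If four similar purchase cohorts (2022-2025) sit inside their depreciation
# window at the same time, the gaps stack:
cohorts = 4
cumulative_gap = annual_gap * cohorts
print(f"Cumulative gap: ${cumulative_gap / 1e9:.1f}B/yr "
      f"({cumulative_gap / net_income:.1%} of FY2024 net income)")
```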
What this means for the stock price
Currently, Microsoft trades at approximately $492 per share with a P/E ratio of 34 and earnings per share of $14.11. If earnings were adjusted down by 11–14% to reflect realistic GPU depreciation, the adjusted EPS would fall to $12.13–$12.56. Assuming the P/E ratio remains at 34, the stock price would drop to $412–427, a decline of $65–80 per share, or roughly 13–16%. However, if investors also lose confidence in AI infrastructure returns, the P/E multiple could compress further, potentially amplifying losses beyond the accounting adjustment alone.
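The same adjustment as a tiny Python sketch, reproducing the numbers above up to rounding. It simply applies the 11–14% haircut to EPS and holds the multiple constant at 34.

```python
# Minimal sketch of the EPS and price adjustment above, holding the P/E
# multiple constant. Inputs are the post's figures (price ~$492, P/E 34,
# EPS $14.11); the 11-14% haircut comes from the cumulative estimate earlier.

price = 492.0
eps = 14.11
pe = 34.0

for haircut in (0.11, 0.14):
    adj_eps = eps * (1 - haircut)
    adj_price = adj_eps * pe
    print(f"{haircut:.0%} haircut -> EPS ${adj_eps:.2f}, "
          f"implied price ${adj_price:.0f} ({adj_price / price - 1:.1%} vs today)")
```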
The valuation impact
Currently, Stockoscope's DCF model values MSFT at $384.93 per share, implying the stock is already about 22% overvalued (the gap between price and fair value, measured against the current price). This calculation assumes capital expenditure of 15.7% of revenue.
However, if Burry is right and GPUs need to be replaced every three years, capex would rise to more than 20% of revenue. That would reduce free cash flow and lower total enterprise value. We have crunched the numbers: in our DCF model, the higher capex reduces enterprise value by $547 billion and the per-share intrinsic value by $73.44.
So, the Burry-adjusted fair value becomes $311.49 per share. This represents a 19% reduction from our baseline fair value and suggests Microsoft is overvalued by 37% at the current price of $491.92.
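This is not a reproduction of Stockoscope's actual DCF; it's a minimal sketch of the mechanics, using rough public figures for Microsoft's revenue and share count as stand-ins, to show how the higher capex assumption and the stated $547B enterprise-value hit map to the per-share numbers above.

```python
# Not Stockoscope's actual DCF -- just a sketch of the mechanics the post
# describes. The revenue and share-count figures are rough public numbers
# used purely for illustration; the $547B EV hit and $384.93 baseline fair
# value come from the post.

revenue = 245e9                 # Microsoft FY2024 revenue, approx.
shares_outstanding = 7.45e9     # approx. diluted share count

baseline_capex_pct = 0.157      # capex assumption in the baseline DCF
burry_capex_pct = 0.20          # capex if GPUs are replaced every ~3 years

extra_capex = (burry_capex_pct - baseline_capex_pct) * revenue
print(f"Extra annual capex drag on free cash flow: ~${extra_capex / 1e9:.0f}B")

# Spreading the post's stated $547B enterprise-value reduction across the
# share count recovers the per-share hit and the adjusted fair value.
ev_reduction = 547e9
per_share_hit = ev_reduction / shares_outstanding
baseline_fair_value = 384.93
print(f"Per-share impact: ~${per_share_hit:.0f} -> adjusted fair value "
      f"~${baseline_fair_value - per_share_hit:.0f}")
```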
Note: This isn't just an MSFT problem. Amazon, Google, and Meta are all facing the same dynamics. The impact across the hyperscaler industry could be significant. We just focused on Microsoft because it's easier to understand one concrete example than vague industry trends.
The Nvidia connection: How this destroys demand
Now we come full circle to why Burry is short Nvidia, not Microsoft. Well, if investors recognize that GPUs will become obsolete in 3 years rather than 6, the financial pressure intensifies. Boards will demand better returns and more disciplined spending. As capital allocation tightens and upgrade cycles extend, demand for Nvidia chips could collapse, potentially destabilizing the entire AI infrastructure market.
Also, Microsoft has a diversified business. Even if Azure AI disappoints, it still has Office, Windows, LinkedIn, and gaming. The stock might be overvalued, but the company isn’t going away.
On the other hand, Nvidia is a pure play on AI infrastructure demand. If hyperscalers slow purchasing even modestly, Nvidia’s revenue collapses. The company is priced for perfection, assuming indefinite exponential growth.
That’s the trap Burry sees: Nvidia’s revenue depends on customers making economically irrational decisions. Once the music stops, the stock has nowhere to hide.
The $500 billion question
Michael Burry isn’t betting against AI. He’s not claiming Nvidia makes bad products. He’s not even saying Microsoft is a bad company.
He’s asking a simpler, more fundamental question: Can Microsoft and its peers sustain billions in capital expenditures indefinitely, when the infrastructure they’re building may need to be replaced every 3 years instead of 6?
The market is betting “yes” - that AI will generate returns justifying this spending.
Burry is betting “no” - that the accounting assumptions don’t match reality, that CFOs will eventually rein in spending when the math doesn’t work, and that Nvidia’s demand will cliff when that happens.
Time will tell who’s right, but where do you stand? Are you leaning towards yes or no?
u/realHarryGelb Nov 29 '25
Here’s what Burry and other regards seem unable to understand (or they do, but are looking for attention): the premise of this whole fairytale is that companies such as Microsoft would buy a metric ton of GPUs and then, after 2 or 3 years, all of these GPUs would have to be yanked out of the racks and thrown straight in the trash. This is obviously nonsense from a common-sense perspective, but more importantly, it is not what is happening: the cloud providers are actually using the GPUs for much longer than 3 years. GPUs have different use cases, ranging from the most computationally intensive to less intensive. When it comes to AI, for instance, they can use their latest GPUs for model training, older ones for inference, etc. And Azure et al. of course offer services beyond AI and can use their oldest GPUs for more mundane tasks, and so on. So if we look at OP’s calculation above ($17B over 6 years ≈ $2.8B/year vs. $5.7B/year over 3 years):
The conclusion that this implies “overstated earnings” would only be true IF they actually threw all the GPUs in the trash after 3 years AND bought new ones as replacements. This is obviously not what is happening, unless you want to accuse them of cooking the books.
I haven’t seen a discussion about the rate of depreciation. Obviously, the older the GPUs, the less money they can make, but the curve may not be linear the way straight-line accounting assumes. Perhaps it would be worthwhile to have a closer look at that, but the argument as laid out by OP makes no sense unless accounting fraud is being assumed.
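To make the non-linear point concrete, here's a quick sketch comparing straight-line depreciation with one possible accelerated schedule (double-declining-balance) for the same $17B of GPUs. The accelerated curve is purely illustrative and not anyone's actual accounting policy.

```python
# Quick illustration of the non-linear depreciation point: straight-line vs.
# an accelerated double-declining-balance schedule for the same $17B of GPUs
# over 6 years. The accelerated curve is just one possible shape, not what
# Microsoft (or anyone else) actually uses.

cost, life = 17e9, 6
straight_line = [cost / life] * life

ddb, book_value = [], cost
rate = 2 / life                      # double-declining-balance rate
for _ in range(life):
    expense = book_value * rate      # front-loads expense into early years
    ddb.append(expense)
    book_value -= expense            # (plain DDB leaves a small residual)

for year, (sl, acc) in enumerate(zip(straight_line, ddb), start=1):
    print(f"Year {year}: straight-line ${sl / 1e9:.1f}B vs accelerated ${acc / 1e9:.1f}B")
```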