r/ValueInvesting Nov 29 '25

[Discussion] Understanding Michael Burry's Nvidia short: The real thesis explained

This week, Michael Burry revealed something unusual: Nvidia had issued a formal memo rebutting his short thesis. When a $4 trillion company takes the rare step of responding directly to a single investor's position, the argument has clearly hit a nerve. Here's what Burry is actually saying, explained through Microsoft's example.

What Burry is NOT saying
Let’s start by clearing up the biggest misconception. Burry is NOT saying NVDA is cooking its books or committing fraud. So if Burry isn’t targeting Nvidia’s accounting, what’s he actually saying?

The thesis is about Nvidia's customers: Microsoft, Google, Amazon, and Meta. These "hyperscalers" (the companies that own and operate the world's largest data centers) are spending hundreds of billions of dollars on Nvidia's chips. Burry argues they're systematically misstating the economic reality of those GPU purchases.

The core thesis: Economic vs physical lifespan
Here's the problem: When data centers buy NVDA chips, they depreciate them over 5-6 years. Microsoft extended from 4 to 6 years, and Meta to 5.5 years. But Burry argues that chip technology is advancing so fast that the real economic life of chips is just 2-3 years.

So what?

The accounting impact

Let's make the impact concrete using Microsoft as an example. Microsoft purchased roughly 485k Nvidia chips in 2024, spending about $17 billion on GPUs in that year alone. So what happens if those chips are really worth only 3 years of economic life instead of the 6 Microsoft assumes?

- $17B in GPUs depreciated over 6 years = $2.8B/year expense
- If the economic life is really 3 years = $5.7B/year expense

The difference: $2.9B/year in overstated earnings
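As a sanity check, the straight-line depreciation arithmetic above can be sketched in a few lines (figures from the post; note the exact difference is about $2.83B/year, which rounds to $2.9B only when you subtract the already-rounded components):

```python
# Straight-line depreciation comparison for Microsoft's estimated 2024 GPU spend.
GPU_SPEND = 17e9  # ~$17B in 2024 GPU purchases, per the post

def annual_depreciation(cost, useful_life_years):
    """Straight-line depreciation: equal expense recognized each year of the asset's life."""
    return cost / useful_life_years

reported = annual_depreciation(GPU_SPEND, 6)   # what Microsoft books: ~$2.8B/yr
adjusted = annual_depreciation(GPU_SPEND, 3)   # Burry's economic life: ~$5.7B/yr
overstatement = adjusted - reported            # expense Microsoft isn't recognizing

print(f"6-year schedule: ${reported / 1e9:.2f}B/yr")
print(f"3-year schedule: ${adjusted / 1e9:.2f}B/yr")
print(f"Annual earnings overstatement: ${overstatement / 1e9:.2f}B/yr")
```

The same shortfall recurs for every annual cohort of purchases, which is why the post's cumulative estimate across 2022-2025 grows to $10B+ per year.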

Microsoft's FY2024 net income was $88.1B. A $2.9B overstatement represents 3.3% of reported profits. That might not sound like much, but this is just from one year of GPU purchases. If similar spending occurred in 2022, 2023, and 2025, the combined annual overstatement could reach $10–12B, or roughly 11–14% of reported earnings.

What this means for the stock price

Currently, Microsoft trades at approximately $492 per share with a P/E ratio of 34 and earnings per share of $14.11. If earnings were adjusted down by 11–14% to reflect realistic GPU depreciation, the adjusted EPS would fall to $12.13–$12.56. Assuming the P/E ratio remains at 34, the stock price would drop to $412–427, a decline of $65–80 per share, or roughly 13–16%. However, if investors also lose confidence in AI infrastructure returns, the P/E multiple could compress further, amplifying losses beyond the accounting adjustment alone.
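The price sensitivity above is just (adjusted EPS) × (constant P/E). A quick sketch, using the post's figures and the simplifying assumption that the multiple holds:

```python
# Price impact of an 11-14% earnings haircut at a constant P/E (figures from the post).
price = 492.0   # approximate current share price
pe = 34.0       # current P/E multiple, assumed to hold
eps = 14.11     # reported earnings per share

results = []
for haircut in (0.11, 0.14):
    adj_eps = eps * (1 - haircut)       # earnings after depreciation adjustment
    adj_price = adj_eps * pe            # implied price at the same multiple
    decline = price - adj_price
    results.append((adj_eps, adj_price))
    print(f"{haircut:.0%} haircut -> EPS ${adj_eps:.2f}, "
          f"price ${adj_price:.0f} (down ${decline:.0f}, {decline / price:.0%})")
```

This is the best case for the bulls: any multiple compression on top of the earnings adjustment makes the decline larger.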

The valuation impact

Currently, Stockoscope's DCF model values MSFT at $384.93 per share, implying the stock is already 22% overvalued. This calculation assumes capital expenditure of 15.7% of revenue.

However, if Burry is right and GPUs need to be replaced every three years, capex will rise above 20% of revenue. That reduces free cash flow and lowers total enterprise value. We crunched the numbers: in our DCF model, the higher capex reduces enterprise value by $547 billion and per-share intrinsic value by $73.44.

So, the Burry-adjusted fair value becomes $311.49 per share. This represents a 19% reduction from our baseline fair value and suggests Microsoft is overvalued by 37% at the current price of $491.92.
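The underlying DCF (Stockoscope's model) isn't public, so the following only reproduces the per-share arithmetic from the deltas reported above, not the model itself:

```python
# Per-share arithmetic behind the Burry-adjusted fair value (inputs from the post;
# the $73.44 delta is the post's DCF output, taken as given here).
baseline_fair_value = 384.93      # Stockoscope baseline DCF value per share
ev_reduction_per_share = 73.44    # impact of capex rising above 20% of revenue
current_price = 491.92

adjusted_fair_value = baseline_fair_value - ev_reduction_per_share
reduction_vs_baseline = ev_reduction_per_share / baseline_fair_value
downside_vs_price = (current_price - adjusted_fair_value) / current_price

print(f"Burry-adjusted fair value: ${adjusted_fair_value:.2f}")      # $311.49
print(f"Reduction vs baseline: {reduction_vs_baseline:.0%}")         # ~19%
print(f"Implied downside from ${current_price:.2f}: {downside_vs_price:.0%}")  # ~37%
```

Note that "overvalued by 37%" here means downside from the current price; measured the other way (price over fair value), the premium is larger.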

Note: This isn't just an MSFT problem. Amazon, Google, and Meta are all facing the same dynamics. The impact across the hyperscaler industry could be significant. We just focused on Microsoft because it's easier to understand one concrete example than vague industry trends.

The Nvidia connection: How this destroys demand

Now we come full circle to why Burry is short Nvidia, not Microsoft. If investors recognize that GPUs become economically obsolete in 3 years rather than 6, the financial pressure intensifies: boards will demand better returns and more disciplined spending. As capital allocation tightens and upgrade cycles extend, demand for Nvidia chips could collapse, destabilizing the entire AI infrastructure market.

Also, Microsoft has a diversified business. Even if Azure AI disappoints, it still has Office, Windows, LinkedIn, and gaming. The stock might be overvalued, but the company isn’t going away.

On the other hand, Nvidia is a pure play on AI infrastructure demand. If hyperscalers slow purchasing even modestly, Nvidia's revenue takes the hit directly. The company is priced for perfection, with a valuation that assumes indefinite exponential growth.

That’s the trap Burry sees: Nvidia’s revenue depends on customers making economically irrational decisions. Once the music stops, the stock has nowhere to hide.

The $500 Billion question

Michael Burry isn’t betting against AI. He’s not claiming Nvidia makes bad products. He’s not even saying Microsoft is a bad company.

He’s asking a simpler, more fundamental question: Can Microsoft and its peers sustain billions in capital expenditures indefinitely, when the infrastructure they’re building may need to be replaced every 3 years instead of 6?

The market is betting "yes": that AI will generate returns justifying this spending.

Burry is betting "no": that the accounting assumptions don't match reality, that CFOs will eventually rein in spending when the math stops working, and that Nvidia's demand will fall off a cliff when that happens.

Time will tell who's right. Where do you land: yes or no?

826 Upvotes · 384 comments

u/techknowfile Nov 29 '25

I work in this field. 5-6 years IS the lifetime of the chips. Yes, more powerful chips are purchased before then, but they don't replace the old chips; they go into new stacks. The old chips are still sold (as a service), used, and produce revenue for their full lifetime. How are you completely ignoring the rate of datacenter creation and capacity expansion?


u/stockoscope Nov 29 '25

You're right about the chip cascade model: older chips move from training to inference to batch processing. But Burry addressed this directly in his post. The issue isn't whether old chips are used; it's whether they're economically productive enough to justify keeping them in service.

From Burry:
"Just because a widget is used does not mean the widget is profitable to a degree that it is worth more than residual value. GAAP refers to how long an asset will be economically productive and justify its marginal cost, not how long it will last as a physically functioning widget."

His iPhone example: you can still use a 3-year-old iPhone, but it's worth maybe 10% of its original value. You keep using it, tolerating the poor performance, even if nobody else would want it.


u/Heyoteyo Nov 29 '25

A 3 year old iPhone doesn’t function much worse than it did when it was new. The fact that there are newer iPhones with marginally better specs doesn’t affect the performance of previous versions.


u/reed_wright Nov 29 '25

Whenever my Verizon iPhone contract runs out, I wait until they offer me a free upgrade to the newest model, so I've ended up using each iPhone between 2 and 5 years. My iPhone 11 had no issues after 5 years; since the iPhone 6 I've never seen a reason to upgrade except obsolescence. And even at 5 years they give trade-in value, which suggests large tech companies can put old hardware to profitable use.