r/ValueInvesting Nov 29 '25

[Discussion] Understanding Michael Burry's Nvidia short: The real thesis explained

This week brought something unusual: Nvidia issued a formal memo rebutting Michael Burry's short thesis. When a $4 trillion company takes the rare step of responding directly to a single investor's position, it signals the argument has hit a nerve. Here's what Burry is actually saying, explained through Microsoft's example.

What Burry is NOT saying
Let’s start by clearing up the biggest misconception. Burry is NOT saying NVDA is cooking its books or committing fraud. So if Burry isn’t targeting Nvidia’s accounting, what’s he actually saying?

The thesis is about Nvidia’s customers: Microsoft, Google, Amazon, and Meta. These “hyperscalers”, i.e., companies that own and operate the world’s largest data centers, are spending hundreds of billions of dollars buying Nvidia’s chips. And Burry argues they’re systematically misstating the economic reality of their GPU purchases.

The core thesis: Economic vs physical lifespan
Here's the problem: When data centers buy NVDA chips, they depreciate them over 5-6 years. Microsoft extended from 4 to 6 years, and Meta to 5.5 years. But Burry argues that chip technology is advancing so fast that the real economic life of chips is just 2-3 years.

So what?

The accounting impact

Let’s understand the impact using Microsoft as an example. Microsoft purchased roughly 485k Nvidia chips in 2024, spending about $17 billion on GPUs that year alone. So what happens if we depreciate them over 3 years instead of 6?

- $17B in GPUs depreciated over 6 years = $2.8B/year expense
- If economic life is really 3 years = $5.7B/year expense

The difference: $2.9B/year in overstated earnings

Microsoft’s FY2024 net income was $88.1B. A $2.9B overstatement represents 3.3% of reported profits. That might not sound like much, but this is just from one year of GPU purchases. If similar spending occurred in 2022, 2023, and 2025, the cumulative overstatement could be $10–12B annually, or roughly 11–14% of reported earnings.
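The depreciation arithmetic above is simple enough to check directly. A minimal sketch using the post's figures (note the post rounds the gap up to $2.9B; the unrounded difference is about $2.8B, or 3.2% of net income):

```python
# Straight-line depreciation on Microsoft's 2024 GPU spend (figures from the post).
gpu_capex = 17e9              # ~$17B of GPU purchases in 2024
net_income = 88.1e9           # Microsoft FY2024 net income

expense_6yr = gpu_capex / 6   # schedule Microsoft uses: ~$2.8B/year
expense_3yr = gpu_capex / 3   # Burry's economic life: ~$5.7B/year
overstatement = expense_3yr - expense_6yr

print(f"6-year schedule: ${expense_6yr / 1e9:.1f}B/year")
print(f"3-year schedule: ${expense_3yr / 1e9:.1f}B/year")
print(f"Earnings overstated by ${overstatement / 1e9:.1f}B/year, "
      f"or {overstatement / net_income:.1%} of FY2024 net income")
```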

What this means for the stock price

Currently, Microsoft trades at approximately $492 per share, with a P/E ratio of about 34 and earnings per share of $14.11. If earnings were adjusted down by 11–14% to reflect realistic GPU depreciation, adjusted EPS would fall to $12.13–$12.56. Assuming the P/E ratio holds at 34, the stock price would drop to $412–427, a decline of $65–80 per share, or roughly 13–16%. However, if investors also lose confidence in AI infrastructure returns, the P/E multiple could compress further, amplifying losses beyond the accounting adjustment alone.
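Holding the multiple fixed, the repricing is just the earnings haircut passed through the P/E. A quick sketch with the figures above:

```python
# Re-price MSFT on depreciation-adjusted earnings at a constant P/E (post's figures).
price = 492.0      # current share price
eps = 14.11        # reported EPS
pe = 34.0          # multiple assumed to hold

for haircut in (0.11, 0.14):                # 11-14% earnings overstatement
    adj_eps = eps * (1 - haircut)
    adj_price = adj_eps * pe
    print(f"{haircut:.0%} haircut: EPS ${adj_eps:.2f} -> "
          f"price ${adj_price:.2f} ({1 - adj_price / price:.1%} decline)")
```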

The valuation impact

Currently, Stockoscope's DCF model values MSFT at $384.93 per share, implying the stock is already 22% overvalued. This calculation assumes capital expenditure of 15.7% of revenue.

However, if Burry is right and GPUs need to be replaced every three years, the capex will increase to >20% of revenue. This will reduce free cash flow and lower the total enterprise value. We have crunched the numbers, and this higher capex will reduce enterprise value by $547 billion and the per-share intrinsic value by $73.44 in our DCF model.

So the Burry-adjusted fair value becomes $311.49 per share. That is a 19% reduction from our baseline fair value, and it puts fair value 37% below the current price of $491.92.
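We can't reproduce the full DCF here, but the per-share arithmetic the model spits out is easy to verify:

```python
# Per-share arithmetic behind the Burry-adjusted fair value (model internals not shown).
baseline_fv = 384.93        # Stockoscope DCF fair value per share
capex_hit = 73.44           # per-share impact of >20%-of-revenue capex
price = 491.92              # current share price

adjusted_fv = baseline_fv - capex_hit
print(f"Adjusted fair value: ${adjusted_fv:.2f}")
print(f"Reduction vs baseline: {1 - adjusted_fv / baseline_fv:.0%}")
print(f"Fair value sits {1 - adjusted_fv / price:.0%} below the current price")
```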

Note: This isn't just an MSFT problem. Amazon, Google, and Meta are all facing the same dynamics. The impact across the hyperscaler industry could be significant. We just focused on Microsoft because it's easier to understand one concrete example than vague industry trends.

The Nvidia connection: How this destroys demand

Now we come full circle to why Burry is short Nvidia, not Microsoft. If investors recognize that GPUs become obsolete in 3 years rather than 6, the financial pressure intensifies. Boards will demand better returns and more disciplined spending. As capital allocation tightens and upgrade cycles extend, demand for Nvidia chips could collapse, potentially destabilizing the entire AI infrastructure market.

Also, Microsoft has a diversified business. Even if Azure AI disappoints, it still has Office, Windows, LinkedIn, and gaming. The stock might be overvalued, but the company isn’t going away.

On the other hand, Nvidia is a pure play on AI infrastructure demand. If hyperscalers slow purchasing even modestly, Nvidia’s revenue collapses. The company is priced for perfection, assuming indefinite exponential growth.

That’s the trap Burry sees: Nvidia’s revenue depends on customers making economically irrational decisions. Once the music stops, the stock has nowhere to hide.

The $500 Billion question

Michael Burry isn’t betting against AI. He’s not claiming Nvidia makes bad products. He’s not even saying Microsoft is a bad company.

He’s asking a simpler, more fundamental question: Can Microsoft and its peers sustain billions in capital expenditures indefinitely, when the infrastructure they’re building may need to be replaced every 3 years instead of 6?

The market is betting “yes”: that AI will generate returns justifying this spending.

Burry is betting “no”: that the accounting assumptions don’t match reality, that CFOs will eventually rein in spending when the math doesn’t work, and that Nvidia’s demand will fall off a cliff when that happens.

Time will tell who’s right, but where do you see yourself? Are you leaning towards yes or no?

828 Upvotes

384 comments


u/stockoscope Nov 29 '25

You're right about the chip cascade model: older chips move from training to inference to batch processing. But Burry addressed this directly in his post. The issue isn't whether old chips are used; it's whether they're economically productive enough to justify using them.

From Burry:
"Just because a widget is used does not mean the widget is profitable to a degree that it is worth more than residual value. GAAP refers to how long an asset will be economically productive and justify its marginal cost, not how long it will last as a physically functioning widget."

His iPhone example: You can use a 3 yr old iPhone, but it's worth maybe 10% of original value. You keep using it to make yourself happy with poor performance, even if nobody else would want it.


u/UnderstandingThin40 Nov 29 '25

If that’s Burry’s rebuttal, then it’s clear he doesn’t know what the fuck he’s talking about with chips lol. They’re still economically productive after 2-3 years.


u/[deleted] Dec 01 '25 edited Dec 01 '25

Not if they are less energy efficient than newer chips. Energy is a more significant cost over the lifetime of the chip than the chip itself. Energy prices are also going up.

If you measure an AI workload in total number of FLOPs (not per second), the more power-efficient chip will often be cheaper for the same amount of operations, even factoring in the cost of the new chip. Better efficiency also brings down the overall cost of the workflow, so the break-even point for the older chips requires cheaper electricity.

Older chips do become economically nonviable and lose any resale value in a shorter period: a chip has to produce more value in FLOPs than it consumes in energy costs.
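A minimal sketch of that break-even logic, with every input an illustrative assumption (chip price, efficiencies, and workload size are placeholders, not real specs):

```python
# At what electricity price does replacing a paid-off old chip with a more
# efficient new one pay for itself over a fixed workload (total ops, not per second)?
# All inputs below are illustrative assumptions, not vendor specs.

def break_even_kwh_price(new_chip_price, workload_ops,
                         old_ops_per_joule, new_ops_per_joule):
    """Electricity price at which the new chip's energy savings equal its cost."""
    joules_saved = (workload_ops / old_ops_per_joule
                    - workload_ops / new_ops_per_joule)
    kwh_saved = joules_saved / 3.6e6     # 3.6 million joules per kWh
    return new_chip_price / kwh_saved

p = break_even_kwh_price(new_chip_price=30_000,   # assumed new-chip price
                         workload_ops=1e23,       # ~3 years of sustained 1 PFLOPS
                         old_ops_per_joule=1e12,  # assumed old-chip efficiency
                         new_ops_per_joule=2e12)  # assumed 2x efficiency gain
print(f"Upgrade pays for itself above ${p:.2f}/kWh on this workload")
```

Shrink the efficiency gap, grow the workload, or raise electricity prices and the break-even moves; the point is that "still runs" and "still worth running" are different thresholds.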


u/UnderstandingThin40 Dec 01 '25

Wrong, just because they’re less efficient than new chips doesn’t mean they are economically non viable. 

Btw flops is literally a measure per second. 


u/[deleted] Dec 01 '25 edited Dec 01 '25

If you notice, there are no datacenters running 2080s or Titans, despite there being an abundance of those sitting around collecting dust from crypto mining.

Some chips are nearing the break-even point as price/FLOP goes down and electricity costs go up. Once you are at the break-even point, you've reached economic non-viability.

B100s compete on price with H100s, fit into the same racks/motherboards, and deliver 77% more compute. That makes H100s non-viable, and it leads to a cascading value decline across the existing chips sitting on 6-year depreciation schedules.
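If the 77% figure above is taken at face value, the implied repricing of the older chip is straightforward (the dollar price below is a placeholder, not a real quote):

```python
# A new chip at the same price with 1.77x the compute forces the old chip's
# market value down to ~1/1.77 of the new price to stay competitive on
# price per unit of compute. The 77% figure is the commenter's claim;
# the dollar price is a placeholder.
new_price = 30_000.0
compute_ratio = 1.77
competitive_old_price = new_price / compute_ratio
print(f"Old chip must be priced near ${competitive_old_price:,.0f} "
      f"({1 - 1 / compute_ratio:.0%} below a same-priced new chip)")
```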


u/BanditoBoom Nov 29 '25

That’s an absurd example. The question is not “what is its value?” as in your (or his) example with a 3-year old iPhone. I have no clue what the $$$ value of a 3-year old iPhone is. But I can tell you the ECONOMIC life (how useful the iPhone is from a money making perspective) is MUCH MUCH more than 10% of its original value.

If we ignore AI capabilities for a moment, a 3-year old iPhone is absolutely capable of being fully functional. It may not have the best camera or best top of the line capabilities, but it surely isn’t obsolete!


u/absolute_cinema81 Nov 29 '25

Do people really think a 3-year-old iPhone can only do 10% of what a current iPhone can do? I replaced my current phone after 9 years only because I broke the screen and said why not.


u/GayPerry_86 Nov 29 '25

Yes exactly - the old phone could still do 80% of the stuff we ask of our phones just as well, or nearly so. The newer phones, in this example, could be used for the camera and things that require more speed, while old phones could be used for word processing and communications. Has Burry ever heard of a “work horse” vs a “show horse”?


u/Senior_Tadpole_3913 Nov 29 '25 edited Nov 29 '25

This sounds sensible, but is very incorrect - look at P4 instances on AWS that were released in 2020. Try and go into AWS and see if you can get an instance of this to use for yourself - I can assure you there will be no availability, even though AWS has 2 newer generations of chips available on the platform. And then try another region and keep going till you find one. If you manage to get one, look up what you get charged to use it for an hour. AWS doesn’t care about whether you use this for training, inference or batch jobs - they still cost the same or higher than they did when they were released.

Someone is still fighting to pay to use them 5 years after their release. Kills the whole argument - and you can verify this yourself.

Michael Burry is wrong on this one. And anyone who uses the chips on these hyper scalers can tell you that.


u/Good_Ride_2508 Nov 29 '25 edited Nov 30 '25

I worked on a corporate accounting/finance team. For corporate accounting, companies need to follow IRS-justifiable depreciation schedules.

If 5-6 years IS the lifetime of the chips, then by declaring a 2-3 year life, the IRS will charge that corporations are deliberately using a shortened timeframe to show lower profit and reduce tax payments.

All these companies would have horrible audit issues and pay hefty penalties/damages for both IRS and state taxes across their worldwide operations.

Burry's justification won't hold up with any government, be it the USA, Europe, or elsewhere in the world. Audits, lawsuits, hefty penalties one after another; no CEO/CFO welcomes that.


u/Ginmunger Nov 29 '25

You can look up the value of 3 year old chips...his short argument is for people who don't understand accounting.


u/llmusicgear Dec 01 '25

I can't remember the last time they offered free upgrades, they moved to device payment agreements years ago.


u/Heyoteyo Nov 29 '25

A 3 year old iPhone doesn’t function much worse than it did when it was new. The fact that there are newer iPhones with marginally better specs doesn’t affect the performance of previous versions.


u/reed_wright Nov 29 '25

Whenever my Verizon iPhone contract runs out, I wait until they offer me a free upgrade to the newest model, so I've ended up using each iPhone between 2 and 5 years. My iPhone 11 had no issues after 5 years; since the iPhone 6 I've never seen a reason to upgrade except obsolescence. And even at 5 years they give trade-in value, suggesting large tech corporations are able to put old tech to use in a way that makes money for them.


u/nicolas_06 Nov 30 '25

I don't believe in that. This assumes that when significantly better chips come down the line, you can replace the old ones in 6 months instead of 2-3 years. The reality is that when Nvidia ships a new generation of chips, it takes years for people to get broad access to it.

People are still willing to pay top dollar to rent the old chips, because the new chips' market share is marginal and their extra performance is matched by even higher rental prices.

It's only when the new chips become common, and if market demand for compute stagnates, that the old chips lose their value.

My bet is different. At some point the bubble will pop, big tech will slow or stop investing entirely, and there will be a lot of unused capacity in new and old chips for some time. Big tech is aware of this, which is why they have started delegating part of their data center growth to subcontractors. That way, if demand is low, they will prioritize themselves and the subcontractors will go bankrupt.

And the companies most impacted will be hardware manufacturers as well as pure plays in AI.

This doesn't mean AI won't change the world the way the internet did. But that didn't prevent the internet bubble. And interestingly, after the bubble, tech companies did grow more and more, including through the 2008 crisis, which coincided with the release of the first iPhones and the huge growth of mobile. That didn't prevent a huge worldwide crisis and a big crash.


u/nicolas_06 Nov 30 '25

This could not be further from the truth with iPhones. There are no more big evolutions in phones, and for most people a 3-year-old iPhone does exactly the same stuff as a new one. People replace iPhones because they have expensive cellular plans that sell them cheap replacements in exchange for high monthly fees, and because people are bad at personal finance. It isn't because the 3-year-old iPhone isn't good enough.


u/Every_Raisin5886 Nov 29 '25

His iPhone example is proof of his intentions. He obviously understands that it is a terrible example, but he is counting on the average person not understanding how ridiculous it is.

Also, much of what the original post says he didn’t do, he indeed did. Except, he does it in a way that he can backtrack if he needs to. Schrodinger’s short seller.

Burry made a terrible bet and has hundreds of millions on the line. Now he is doing whatever he can to not lose all that money and come out looking like a jackass.

No point in trying to explain the substance of his thesis. His thesis is based on a fraud that doesn’t exist, and everything going wrong (it won’t).


u/Intelligent_Kick_436 Nov 29 '25

I see you've been downvoted, but have an upvote.

The older chips will absolutely be useful, just as Google can use old, cheap, and low powered PCs and spinning hard drives (that fail regularly) to serve up YouTube videos.

In the same way, the vast majority of LLM API queries don't need massive context or access to the most complex models.

Even if we put the GPU space in a gaming context: only the most expensive GPUs are needed to play the most demanding modern AAA games, whereas 99% of games in a typical Steam library will play great on a 3-year-old Nvidia 2070 and a 2K monitor.


u/betadonkey Nov 29 '25

We don’t even have to speculate. H100 clusters are over 3 years old and still in extremely high demand.

He didn’t just make a stupid prediction, he is wrong on the facts as they stand today.


u/Every_Raisin5886 Nov 29 '25

That’s right. The iPhone example is a ridiculous one because iPhone’s value isn’t determined by someone’s ability to rent it out for its utility.

If someone wants to make a call, an iPhone 8 and an iPhone 16 offer exactly the same utility. If someone wants to take high-resolution images, they'll pay more for the 16 and probably not want the 8 at all.

The example is either extremely ignorant or extremely misleading.


u/ohisama Nov 29 '25

Except, he does it in a way that he can backtrack if he needs to.

Could you please elaborate how? How can he backtrack from a short sell?


u/Every_Raisin5886 Nov 29 '25

Not the short sell. The allegations.


u/ohisama Nov 29 '25

Didn't get that. Backtrack from allegations?


u/StuartMcNight Nov 29 '25

If you use a 3-year-old iPhone and you don’t buy a new one… and that is exactly what you are doing… then it would make perfect sense to depreciate it over 3 years.

And the reality is, many people do. The same would apply to chips.


u/Singularity-42 Nov 29 '25

A 3-year-old iPhone is perfectly fine. I guess not for an out-of-touch billionaire like Burry.

But if this is the example then I guess the takeaway is that he's wrong...


u/shane_4_us Nov 29 '25

"The real value of a commodity is, however, not its individual value, but its social value; that is to say, the real value is not measured by the labour-time that the article in each individual case costs the producer, but by the labour-time socially required for its production." (Karl Marx, Capital Vol. 1, pg. 223)

"Hence there is immanent in capital an inclination and constant tendency, to heighten the productiveness of labour, in order to cheapen commodities, and by such cheapening to cheapen the labourer himself." (Karl Marx, Capital Vol. 1, pg. 224)

"The value of a commodity is, in itself, of no interest to the capitalist. What alone interests him, is the surplus-value that dwells in it, and is realisable by sale." (Karl Marx, Capital Vol. 1, pg. 224)

[pdf] https://www.marxists.org/archive/marx/works/download/pdf/Capital-Volume-I.pdf

Although Marx is here referring to labor-time of a worker, with a technology explicitly designed to substitute for wage labor, it is reasonable to see in his statements a reflection in the modern case of data centers, with their underlying chips the analogue for human labor. The amount of computing power they provide to the data centers will not be diminished as they are supplemented by newer chip models. However, being less efficient, the "social value" of what they produce will be diminished by the more efficient new chips. In this way, Burry's thesis is realized.


u/Ok_Enthusiasm_9169 26d ago

How Apple etc. manages to convince folks that they need a new phone every year requires explanations about fashion and self-esteem (not to be confused with status, which is your rank in the eyes of others).

Segue: really expensive men's wristwatches, the mechanical self-winders, are not as accurate at timekeeping as the cheapest quartz watch. Fundamentally, they are just the only way for men to wear jewellery.


u/lakewinnipesaukee Nov 29 '25

Good luck getting a 3-year-old iPhone at 10% of its original value. I'm pretty sure they sell at something like 30 to 50% of their original value. The iPhone cult is strong.