There’s a debate happening right now about how long GPUs actually last.

- CoreWeave models ~6 years of useful economic life
- Nebius says closer to 4 years
- Google engineers report 1–3 years under heavy datacenter workloads
- And Michael Burry argues the whole market pops long before year 6 ever arrives

The problem, to me, is that these are all averages - and averages hide the real data and real usage scenarios.

GPUs don’t age by time. They age by behavior.

That means two identical B300s can have completely different economic lives.

Result: a fleet’s effective depreciation curve varied by 30–45% across different end-customers, even though the GPUs were identical models. In other words, certain customer workloads drove their hardware to lose value almost twice as fast as others.

And this is exactly where lenders, investors, and operators get blindsided:

- Mispriced salvage assumptions
- Over- or under-collateralized deals
- Loan terms that break halfway through
- Portfolios that suddenly don’t pencil

Telemetry is the new underwriting.
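
To make the “age by behavior” point concrete, here is a minimal sketch in Python of what behavior-weighted depreciation could look like. Every parameter - the utilization weight, the temperature threshold, the cycling penalty, the prices - is a made-up illustration, not anyone’s actual underwriting model. The only point is that two identical cards with the same calendar age can carry very different residual values once telemetry enters the formula.

```python
# Sketch: behavior-weighted depreciation instead of straight-line by calendar age.
# All weights and thresholds below are hypothetical, chosen only for illustration.

from dataclasses import dataclass


@dataclass
class GpuTelemetry:
    powered_hours: float   # wall-clock hours since deployment
    utilization: float     # average utilization, 0.0-1.0
    avg_temp_c: float      # average die temperature, Celsius
    thermal_cycles: int    # count of large hot/cold swings


def effective_age_years(t: GpuTelemetry,
                        util_weight: float = 1.5,
                        temp_ref_c: float = 70.0,
                        temp_factor: float = 0.02,
                        cycle_factor: float = 0.0005) -> float:
    """Convert calendar hours into behavior-adjusted years.

    Heavier utilization, hotter operation, and more thermal cycling all scale
    wall-clock time upward, so a hard-driven GPU "ages" faster than an
    identical, lightly used one.
    """
    stress = (1.0
              + util_weight * t.utilization
              + temp_factor * max(0.0, t.avg_temp_c - temp_ref_c)
              + cycle_factor * t.thermal_cycles)
    return (t.powered_hours * stress) / 8760.0  # 8760 hours in a year


def residual_value(purchase_price: float, t: GpuTelemetry,
                   economic_life_years: float = 6.0) -> float:
    """Straight-line depreciation, but over behavior-adjusted age."""
    remaining = max(0.0, 1.0 - effective_age_years(t) / economic_life_years)
    return purchase_price * remaining


# Two identical cards, one calendar year on the clock, very different books:
light = GpuTelemetry(powered_hours=8760, utilization=0.35,
                     avg_temp_c=65, thermal_cycles=200)
heavy = GpuTelemetry(powered_hours=8760, utilization=0.95,
                     avg_temp_c=82, thermal_cycles=1200)

print(round(residual_value(40_000, light)))  # lightly loaded customer
print(round(residual_value(40_000, heavy)))  # training-heavy customer
```

Run it and the two “identical” cards diverge by roughly 2x in value lost after a single calendar year - the kind of spread that never shows up when the underwriting is done on fleet-wide averages.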