Roula Khalaf, Editor of the FT, selects her favourite stories in this weekly newsletter.
The Law That Made Nvidia the World’s Most Valuable Company Is Starting to Break Down
The computational "law" that made Nvidia the world’s most valuable company is starting to break down. This is not the famous Moore’s Law, the semiconductor-industry maxim that chip performance will increase by doubling transistor density every two years.
The Scaling Law of Artificial Intelligence
For many in Silicon Valley, Moore’s Law has been displaced as the dominant predictor of technological progress by a new concept: the "scaling law" of artificial intelligence. This posits that putting more data into a bigger AI model — in turn, requiring more computing power — delivers smarter systems.
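The diminishing returns at the centre of this debate can be seen in a toy sketch. Scaling laws are typically expressed as a power law, where a model’s error falls as compute rises, but each successive increase in compute buys a smaller absolute improvement. The constants and exponent below are invented for illustration, not figures from any published research:

```python
# Illustrative sketch only: the constant "a" and exponent "b" are
# made-up numbers, not measurements from any real scaling-law study.
def model_loss(compute: float, a: float = 10.0, b: float = 0.05) -> float:
    """Toy power-law scaling curve: loss falls as compute grows."""
    return a * compute ** -b

# Each 10x increase in compute yields a smaller absolute gain --
# the "diminishing returns" at the heart of the scaling-law debate.
for exp in range(1, 5):
    c = 10 ** exp
    print(f"compute = 1e{exp}: loss = {model_loss(c):.3f}")
```

Under this (hypothetical) curve, the jump from 1e1 to 1e2 units of compute improves the model more than the far costlier jump from 1e3 to 1e4, which is why flattening returns matter so much to chip buyers.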
The Breakdown
The scaling law had its coming-out moment with the launch of ChatGPT. The breakneck pace of improvement in AI systems in the two years since then seemed to suggest the rule might hold true right until we reach some kind of "super intelligence", perhaps within this decade. Over the past month, however, industry rumblings have grown louder that the latest models from the likes of OpenAI, Google and Anthropic have not shown the expected improvements in line with the scaling law’s projections.
Consequences
Some of the scaling law’s earliest adherents, such as Microsoft chief Satya Nadella, have attempted to recast its definition. It doesn’t matter if pre-training yields shrinking returns, defenders argue, because models can now "reason" when asked a complex question. "We are seeing the emergence of a new scaling law," Nadella said recently, referring to OpenAI’s new o1 model. But this kind of fudging should make Nvidia’s investors nervous.
Conclusion
The scaling law debate underlines just how much Nvidia’s future depends on Big Tech getting tangible returns on its huge AI investments. While training has soaked up most of Nvidia’s chips so far, demand for computing power for "inference" — how models respond to each individual query — is expected to grow rapidly as more AI applications emerge.
FAQs
Q: What is the scaling law of artificial intelligence?
A: The scaling law posits that putting more data into a bigger AI model — in turn, requiring more computing power — delivers smarter systems.
Q: Why is the scaling law breaking down?
A: Industry rumblings have grown louder that the latest models from the likes of OpenAI, Google and Anthropic have not shown the expected improvements in line with the scaling law’s projections.
Q: What does this mean for Nvidia’s future?
A: Nvidia’s future depends on Big Tech getting tangible returns on those huge investments in AI. While training has soaked up most of Nvidia’s chips so far, demand for computing power for "inference" is expected to grow rapidly as more AI applications emerge.
Q: What is the solution to this issue?
A: The solution will require even more of Nvidia’s chips: so-called "test time scaling", as AI systems like OpenAI’s o1 have to "think" for longer to come up with smarter responses.
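Why "thinking" for longer means more chip demand can be sketched with a rough rule of thumb: generating one token costs on the order of two floating-point operations per model parameter, so a model that produces a long reasoning trace multiplies its inference bill roughly in proportion to the extra tokens. The parameter count and token figures below are hypothetical examples, not measurements of any particular model:

```python
# Back-of-envelope sketch. The ~2 FLOPs per parameter per generated
# token figure is a common rough approximation for a forward pass;
# the model size and token counts are hypothetical.
def inference_flops(params: float, tokens: int) -> float:
    """Approximate compute cost of generating `tokens` tokens."""
    return 2 * params * tokens

short_answer = inference_flops(params=70e9, tokens=500)      # direct reply
long_reasoning = inference_flops(params=70e9, tokens=10_000) # extended "thinking"

# The same query costs ~20x more compute when the model reasons at length.
print(long_reasoning / short_answer)
```

In this sketch, answering one question with a long reasoning trace consumes as much compute as twenty ordinary answers — the mechanism behind the claim that test-time scaling will require even more of Nvidia’s chips.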