
Nvidia Says ‘The More You Buy, The More Revenue You Get’—But Is That Really True?

So the big tech news is the announcements from Nvidia’s GTC 2025. Nvidia has announced its new Blackwell Ultra and Vera Rubin artificial intelligence chips. The Blackwell Ultra chips are expected to ship in the second half of 2025, promising significantly higher performance and allowing cloud providers to potentially earn up to fifty times more revenue compared to the previous generation. The company also introduced Vera, its first custom central processing unit, alongside the next-generation graphics processing unit, Rubin, slated for release in 2026. Nvidia aims to maintain an annual release schedule for new chip families, a shift from its previous practice of releasing new architectures every two years.

Nvidia CEO Jensen Huang highlighted the rapid advancements in artificial intelligence and the company’s ambitious plans for the future. He introduced the Blackwell architecture, which aims to revolutionize AI computing, promising one exaflop of computing power in a single rack. Huang revealed that the new AI operating system, Dynamo, could deliver 40 times better performance, underscoring the shift from traditional data centers to AI factories designed for large-scale intelligence generation. He projected that by the end of the year, all of Nvidia’s operations will be AI-assisted, with ten billion digital AI agents set to enter the workforce. Huang emphasized that the transition to AI-driven infrastructure is not just a trend but an economic necessity, stating, “The more you buy, the more revenue you get.”

For those who want to go very deep on the technical analysis here, I’ve included a link to SemiAnalysis, which has those details.

Why do we care?

This announcement from Nvidia’s GTC 2025 underscores the company’s relentless push to solidify its dominance in AI infrastructure, but the implications go far beyond just chip improvements.

Huang’s statement that data centers are evolving into AI factories is a direct appeal to enterprise and cloud customers. AI isn’t just an enhancement to existing IT infrastructure—it’s being positioned as a core economic driver. The claim that “the more you buy, the more revenue you get” is a bold reframing of AI investments from a cost center to a revenue generator. That said, I’m skeptical. Annual upgrade investments? 50x revenue claims? That’s speculative. And open-source AI models are reducing reliance on proprietary Nvidia software stacks, with the cost of intelligence going down.

It’s a huge bet that usage will scale to this degree. The idea that AI investment is an “economic necessity” conveniently aligns with Nvidia’s interests.

For IT services firms, the real play is helping customers navigate this landscape without becoming captive to Nvidia’s aggressive upgrade cycle. AI infrastructure isn’t a one-size-fits-all game, and the winners will be those who provide flexibility, efficiency, and strategic AI adoption pathways.