In the world of tech few events are as keenly awaited as Jensen Huang’s speech at Nvidia’s annual developer conference. And at this year’s gathering in San Jose on March 16th his talk did not disappoint. Over two hours, the boss of the world’s most valuable company unveiled new chips, artificial-intelligence models and systems for everything from space-based data centres to self-driving cars. He went on to claim that this array of new products will help Nvidia sell over $1trn-worth of AI-related hardware in the coming years.
Among engineers, the reaction was enthusiastic. Among investors, it was guarded.
Doubts have grown about the durability of the AI boom. And Nvidia, the biggest beneficiary of the spending surge, has become a lightning rod for those concerns. On February 25th the firm reported record quarterly profits and forecast strong growth. Yet its share price fell the next day. Since peaking in October it has dropped by about 13%, even as an index of American chipmakers has risen by around 6%.
Such bearishness marks a change in Nvidia’s fortunes. The company’s graphics processing units (GPUs), the workhorse semiconductors used by AI models, account for over two-thirds of the total processing power available on the world’s AI chips. In the year to January the firm generated $216bn in revenue, eight times what it made three years earlier. It took nearly three decades for Nvidia to reach a market value of $1trn; it vaulted to $4trn barely two years later. Four months after that it briefly surpassed $5trn.
How high can Nvidia climb? Much higher, if Mr Huang is to be believed. He has claimed that the hundreds of billions of dollars spent so far on AI infrastructure are just the start and that “trillions” more will follow. What is more, Nvidia has the resources to exploit the opportunity. Its free cashflow is greater than that of any other tech giant (see chart 1). The firm holds more than $62bn in cash, a third of it generated in the past year.
Mr Huang therefore plans to change Nvidia into a “foundational company” on which the AI economy rests. That means selling different types of chips and hardware, bundling products into complete AI systems and embedding Nvidia’s technology more deeply into different industries. In short, Nvidia is becoming much more than an AI chipmaker.
The transformation is needed partly because Nvidia’s success has attracted competitors. Some are conventional rivals, such as AMD, an American chipmaker that has released decent alternatives to Nvidia’s GPUs. Others are startups spying opportunities. New chip designs are becoming commercially viable because the need for inference (AI models answering queries) is growing, and the process places a different set of demands on chips from training. According to PitchBook, a data firm, young chip firms raised $17bn in 2025, more than in the previous two years combined.
But the most formidable challengers are Nvidia’s customers. The hyperscalers—Alphabet, Amazon, Microsoft and Meta—which all rely on vast numbers of data centres to run their businesses, buy huge quantities of its chips. In the latest financial year just three of these hyperscalers accounted for over half of Nvidia’s receivables, money owed but not yet paid. Yet these same firms are also designing their own processors. This can slash the cost of AI chips by more than half, while improving performance by tailoring hardware to the software that runs on it.
Souring geopolitics has encouraged rivals abroad. Since October 2022 America’s government has barred Nvidia from selling its most advanced chips to China. Sales have slowed dramatically. Bernstein, a broker, says local suppliers such as Huawei, Cambricon and MetaX could grow from less than a fifth of China’s AI-chip market in 2023 to more than nine-tenths by 2027. Jay Goldberg of Seaport Research Partners, a firm of analysts, notes that the threat may extend beyond China. The new rivals may not produce chips as powerful as Nvidia’s, but in some markets “good enough” could prove good enough.
Nvidia’s response is to expand in all directions. Mr Huang has compared the AI industry to a “five-layer cake”: energy, chips, networking infrastructure, models and applications. Nvidia intends to take bites out of three of the five layers.
Having conquered the market for GPUs, the firm plans to sell different types of chips. In December Nvidia paid $20bn to license technology and hire engineers from Groq, a startup specialising in inference chips. On March 16th the company unveiled a new chip using the startup’s knowhow. It is also pushing into central processing units (CPUs), a type of general-purpose chip. This is an area long dominated by Intel, a beleaguered giant. Nvidia already builds CPUs using designs from Arm, a British firm, which are used in its AI servers. Now it plans to sell them more broadly. In February Nvidia struck a deal with Meta to supply CPU-only servers.
Nvidia is also investing in other layers. As AI systems scale, moving data between processors has become as important as the processors themselves. The firm is betting heavily on networking equipment, the technology that links chips together. In its most recent quarter this business generated $11bn in revenue, making Nvidia one of the largest players in the field.
Model-making is the third layer. Nvidia has released several families of open-source AI models, each specialised for a particular industry: Alpamayo for self-driving cars, GR00T for robotics and BioNeMo for biomedical research. They often rank highly on open-source AI leaderboards. Nvidia plans to invest billions to expand its capabilities in this layer of the stack.
One reason for owning the “full stack”, as Silicon Valley calls vertical integration, is that it makes it easier to co-ordinate the different layers. By tightly linking chips, data-centre equipment and models, the company says it can extract better performance than if each part were designed separately. Mr Huang has compared building AI systems without integration to connecting “too many cats and dogs”.
It also means Nvidia can sell its hardware in bundles. Increasingly the company describes its products not as chips but as components of “AI factories”, its term for specialised AI data centres. Some of these factories are being sold directly to governments under the banner of “sovereign AI”, the label for state-led efforts to build domestic AI infrastructure. Revenue from sovereign AI tripled last fiscal year to more than $30bn, about 15% of Nvidia’s AI sales.
The company is also trying to rely less on the hyperscalers that dominate its customer list. One approach is to push deeper into industry. In carmaking, Mercedes-Benz will soon ship vehicles equipped with Nvidia’s self-driving systems. In pharmaceuticals, Eli Lilly uses Nvidia’s infrastructure and models to accelerate drug discovery. Dion Harris, an Nvidia executive, says the aim is to work more closely with end customers, such as Lilly and Mercedes, to understand their needs and shape the next wave of AI. But Nvidia is not the only one to say it is working closely with clients. Such moves put the firm on a collision course with the hyperscalers, which offer similar services.
Another approach is to create demand through its investments. Nvidia-backed firms, the idea goes, are more likely to buy its chips. Thus the firm is now one of Silicon Valley’s most prolific investors. Since 2020 it has made some 200 investments, committing over $65bn (see chart 2). That includes such big bets as a $30bn investment in OpenAI, and small ones on firms in robotics, software and AI applications.
The firm’s investments also help to secure its supply chain. This March Nvidia put more than $4bn into companies developing optical interconnects, which use light to transfer data rather than wires. Most AI data centres still rely on copper cables to link their equipment. Nvidia’s bet suggests it expects optical connections to become increasingly important. Ben Bajarin of Creative Strategies, a consultancy, compares the strategy to Apple’s early moves to corner components for the iPod.
Nvidia is using its cash pile to strengthen other parts of its supply chain. The semiconductor industry is prone to shortages when demand surges. Supplies of advanced memory—critical for AI chips—are already sold out for this year and for much of next. Nvidia bought most of the memory it will need this year, and part of next, well in advance.
None of this ensures Nvidia’s continued dominance. Rivals may erode its margins. The industry’s shift from training models to running them may favour chips from other vendors. And if AI spending cools, sales could slow sharply. But for now, the champion of the AI age remains dominant—and seems intent on expanding its empire. ■