Nvidia is facing a deeper challenge from the same cloud giants that helped make it dominant. Amazon and Google are pressing ahead with in-house AI chips, and both companies now have enough scale to make that effort matter, even as Nvidia forecasts $1 trillion in sales from its Blackwell and Vera Rubin architectures across 2026 and 2027.
Amazon said its chip business grew 40% sequentially in the first quarter of 2026 and is now running at more than $20 billion a year, or closer to $50 billion once internal sales to its own AWS data centers are counted. The company also said capacity for Trainium is fully booked, with customers including Anthropic, OpenAI, Uber and Meta Platforms using Amazon's custom AI processors. Amazon has $225 billion in purchase commitments for Trainium chips, a sign that the market for alternatives to Nvidia is no longer theoretical.
Nvidia remains the clear leader. It controls an estimated 81% of the AI data center chip market, according to IDC, and its GPUs are still the default choice for training large language models across the industry. Amazon launched Trainium in December 2020, while Google introduced the first version of its Tensor Processing Unit in 2015, years before the current AI boom turned semiconductors into the most coveted hardware in tech.
That history matters because the latest push is not coming from start-ups trying to chip away at Nvidia’s business. It is coming from the biggest buyers in the market, the same hyperscalers — Amazon, Microsoft, Meta Platforms and Alphabet’s Google — that have long relied on Nvidia hardware to train AI models. Their incentive is straightforward: the high cost of Nvidia graphics cards and persistent supply constraints have pushed them to build more of the stack themselves.
Google, too, has widened the effort. The company has sizable deals with Meta Platforms and Anthropic to deploy its TPUs, giving the chip race a commercial footprint that extends well beyond internal workloads. Meta also uses Amazon's Graviton central processing units to support agentic AI applications, another sign that the big cloud and social platforms are hedging their bets.
The question now is not whether Nvidia will keep selling chips. It will. The issue is how much of the next wave of AI infrastructure spending can be captured by the companies that used to be its biggest customers — and whether their custom silicon can keep gaining ground before Nvidia’s next architecture cycle resets the contest again.