SOPA Images | LightRocket | Getty Images
Nvidia has established itself as the undisputed leader in artificial intelligence chips, selling massive quantities of silicon to many of the world's biggest tech companies en route to a $4.5 trillion market cap.
One of Nvidia's key customers is Google, which has been loading up on the chipmaker's graphics processing units, or GPUs, to try to keep pace with soaring demand for AI compute power in the cloud.
While there's no sign that Google will be slowing its purchases of Nvidia GPUs, the internet giant is increasingly showing that it isn't just a buyer of high-powered silicon. It's also a developer.
On Thursday, Google announced that its most powerful chip yet, called Ironwood, is becoming widely available in the coming weeks. It's the seventh generation of Google's Tensor Processing Unit, or TPU, the company's custom silicon that's been in the works for more than a decade.
TPUs are application-specific integrated circuits, or ASICs, which play a critical role in AI by providing highly specialized and efficient hardware for particular tasks. Google says Ironwood is designed to handle the heaviest AI workloads, from training large models to powering real-time chatbots and AI agents, and is more than four times faster than its predecessor. AI startup Anthropic plans to use up to 1 million of them to run its Claude model.
For Google, TPUs offer a competitive edge at a time when all the hyperscalers are rushing to build mammoth data centers, and AI processors can't get manufactured fast enough to meet demand. Other cloud companies are taking a similar approach, but are well behind in their efforts.
Amazon Web Services made its first cloud AI chip, Inferentia, available to customers in 2019, followed by Trainium three years later. Microsoft didn't announce its first custom AI chip, Maia, until the end of 2023.
"Of the ASIC players, Google's the only one that's really deployed these things in big volumes," said Stacy Rasgon, an analyst covering semiconductors at Bernstein. "For other big players, it takes a long time and a lot of effort and a lot of money. They're the furthest along among the other hyperscalers."
Initially developed for internal workloads, Google's TPUs have been available to cloud customers since 2018. Of late, Nvidia has shown some degree of concern. When OpenAI signed its first cloud contract with Google earlier this year, the announcement spurred Nvidia CEO Jensen Huang to initiate further talks with the AI startup and its CEO, Sam Altman, according to reporting by The Wall Street Journal.
Unlike Nvidia, Google isn't selling its chips as hardware, but rather providing access to TPUs as a service through its cloud, which has emerged as one of the company's big growth drivers. In its third-quarter earnings report last week, Google parent Alphabet said cloud revenue increased 34% from a year earlier to $15.15 billion, beating analyst estimates. The company ended the quarter with a business backlog of $155 billion.
"We are seeing substantial demand for our AI infrastructure products, including TPU-based and GPU-based solutions," CEO Sundar Pichai said on the earnings call. "It is one of the key drivers of our growth over the past year, and I think on a going-forward basis, I think we continue to see very strong demand, and we're investing to meet that."
Google doesn't break out the size of its TPU business within its cloud segment. Analysts at D.A. Davidson estimated in September that a "standalone" business consisting of TPUs and Google's DeepMind AI division could be valued at about $900 billion, up from an estimate of $717 billion in January. Alphabet's current market cap is more than $3.4 trillion.
A Google spokesperson said in a statement that the company's cloud business is seeing accelerating demand for TPUs as well as Nvidia's processors, and has expanded its consumption of GPUs "to meet substantial customer demand."
"Our approach is one of choice and synergy, not replacement," the spokesperson said.
'Tightly targeted' chips
Customization is a major differentiator for Google. One significant advantage, analysts say, is the efficiency TPUs offer customers relative to competing products and services.
"They're really making chips that are very tightly targeted for the workloads that they expect to have," said James Sanders, an analyst at Tech Insights.
Rasgon said that efficiency is going to become increasingly important because, with all the infrastructure that's being built, the "likely bottleneck probably isn't chip supply, it's probably power."
On Tuesday, Google announced Project Suncatcher, which explores "how an interconnected network of solar-powered satellites, equipped with our Tensor Processing Unit (TPU) AI chips, could harness the full power of the Sun."
As part of the project, Google said it plans to launch two prototype solar-powered satellites carrying TPUs by early 2027.
"This approach would have tremendous potential for scale, and also minimizes impact on terrestrial resources," the company said in the announcement. "That will test our hardware in orbit, laying the groundwork for a future era of massively-scaled computation in space."
Dario Amodei, co-founder and chief executive officer of Anthropic, at the World Economic Forum in 2025.
Stefan Wermuth | Bloomberg | Getty Images
Google's largest TPU deal on record landed late last month, when the company announced a massive expansion of its agreement with OpenAI rival Anthropic valued in the tens of billions of dollars. With the partnership, Google is expected to bring well over a gigawatt of AI compute capacity online in 2026.
"Anthropic's choice to significantly expand its usage of TPUs reflects the strong price-performance and efficiency its teams have seen with TPUs for several years," Google Cloud CEO Thomas Kurian said at the time of the announcement.
Google has invested $3 billion in Anthropic. And while Amazon remains Anthropic's most deeply embedded cloud partner, Google is now providing the core infrastructure to support the next generation of Claude models.
"There's such demand for our models that I think the only way we could have been able to operate as much as we have been able to this year is this multi-chip strategy," Anthropic Chief Product Officer Mike Krieger told CNBC.
That strategy spans TPUs, Amazon Trainium and Nvidia GPUs, allowing the company to optimize for cost, performance and redundancy. Krieger said Anthropic did a lot of up-front work to make sure its models can run equally well across the silicon providers.
"I've seen that investment pay off now that we're able to come online with these huge data centers and meet customers where they are," Krieger said.
Hefty spending is coming
Two months before the Anthropic deal, Google forged a six-year cloud agreement with Meta worth more than $10 billion, though it's not clear how much of the arrangement includes use of TPUs. And while OpenAI said it will start using Google's cloud as it diversifies away from Microsoft, the company told Reuters it isn't deploying TPUs.
Alphabet CFO Anat Ashkenazi attributed Google's cloud momentum in the latest quarter to growing enterprise demand for Google's full AI stack. The company said it signed more billion-dollar cloud deals in the first nine months of 2025 than in the previous two years combined.
"In GCP, we see strong demand for enterprise AI infrastructure, including TPUs and GPUs," Ashkenazi said, adding that customers are also flocking to the company's latest Gemini offerings as well as services "such as cybersecurity and data analytics."

Amazon, which reported 20% growth in its market-leading cloud infrastructure business last quarter, is expressing similar sentiment.
AWS CEO Matt Garman told CNBC in a recent interview that the company's Trainium chip series is gaining momentum. He said "every Trainium 2 chip we land in our data centers today is getting sold and used," and he promised further performance gains and efficiency improvements with Trainium 3.
Shareholders have shown a willingness to stomach hefty investments.
Google just raised the high end of its capital expenditures forecast for the year to $93 billion, up from prior guidance of $85 billion, with an even steeper ramp expected in 2026. The stock price soared 38% in the third quarter, its best performance for any period in 20 years, and is up another 17% in the fourth quarter.
Mizuho recently pointed to Google's distinct cost and performance advantage with TPUs, noting that while the chips were initially built for internal use, Google is now winning external customers and bigger workloads.
Morgan Stanley analysts wrote in a June report that while Nvidia will likely remain the dominant AI chip supplier, growing developer familiarity with TPUs could become a significant driver of Google Cloud growth.
And analysts at D.A. Davidson said in September that they see so much demand for TPUs that Google should consider selling the systems "externally to customers," including frontier AI labs.
"We continue to believe that Google's TPUs remain the best alternative to Nvidia, with the gap between the two closing significantly over the past 9-12 months," they wrote. "During this time, we have seen growing positive sentiment around TPUs."