Google takes aim at Nvidia, rolls out its most powerful AI chip

Google is making its most advanced chip to date, the seventh-generation Tensor Processing Unit (TPU) known as Ironwood, widely available to customers in the coming weeks, a move aimed at drawing more artificial intelligence developers to its cloud platform. First unveiled in April for testing and limited deployment, Ironwood marks Google’s latest push to win business from AI companies by offering custom-built silicon designed to train and run massive machine-learning models.

Built in-house, Ironwood is designed for both training large AI models and powering real-time applications such as chatbots and digital agents. Google said the chip can connect up to 9,216 TPUs in a single pod, effectively eliminating “data bottlenecks for the most demanding models” and allowing users “to run and scale the largest, most data-intensive models in existence.”

The company said Ironwood delivers more than four times the performance of its previous TPU generation. AI startup Anthropic plans to use as many as 1 million Ironwood chips to power its Claude language model, Google added.

The rollout comes amid an intensifying race between Google, Microsoft, Amazon, and Meta to dominate AI infrastructure. While most large language models currently rely on Nvidia’s GPUs, Google’s TPUs are part of a growing wave of custom silicon designed to improve efficiency, performance, and cost for specialized AI workloads.

Alongside the new chip, Google announced upgrades across its cloud computing platform, aimed at making it “cheaper, faster, and more flexible.” The company is seeking to close the gap with larger rivals Amazon Web Services and Microsoft Azure, both of which continue to lead the cloud market.

Google Cloud posted $15.15 billion in third-quarter revenue, a 34% year-over-year increase, compared with 40% growth for Microsoft Azure and 20% for AWS. Google also said it has signed more billion-dollar cloud contracts in the first nine months of 2025 than in the previous two years combined.

To meet growing AI infrastructure demand, Google raised its capital expenditure forecast for 2025 to $93 billion, up from an earlier estimate of $85 billion.

“We are seeing substantial demand for our AI infrastructure products, including TPU-based and GPU-based solutions,” CEO Sundar Pichai said on a recent earnings call. “It’s one of the key drivers of our growth over the past year, and we continue to see very strong demand as we invest to meet it.”