This development is part of Google's long-term investment in AI technology and presents a viable alternative to Nvidia's dominant chips in the market. Google's tensor processing units (TPUs), accessible through the company's cloud services, provide a competitive edge by streamlining AI model development and reducing operational costs.
The Ironwood chip, introduced at a recent cloud conference, is optimized for inference, the task of running trained AI models, and can operate in clusters of up to 9,216 chips. The design consolidates capabilities from earlier TPU generations while increasing memory capacity, suiting it to modern AI workloads.
Amin Vahdat, Google's Vice President, emphasized that inference computing is rapidly growing in importance. Ironwood chips offer twice the performance efficiency of last year's Trillium chips. Google has not disclosed who manufactures the chips, but Ironwood's role in powering Google's Gemini AI models is notable.