Accelerating AI: Google Introduces The Ironwood Chip

In a significant leap for artificial intelligence, Alphabet (GOOGL.O) has unveiled its innovative seventh-generation AI chip, the Ironwood. This new processor is set to enhance the speed and efficiency of AI applications, such as those powered by OpenAI’s ChatGPT, by performing high-speed data crunching known as ‘inference’ computing.

This development is part of Google’s long-term investment in AI technology, presenting a viable alternative to Nvidia’s dominant chips in the market. Google’s tensor processing units (TPUs), accessible through the company’s cloud services, provide a competitive edge by streamlining AI model development and lowering operating costs.

The Ironwood chip, introduced at a recent cloud conference, is optimized for inference tasks — running already-trained AI models — and can operate in massive clusters of up to 9,216 chips. The design consolidates previous chip generations while increasing memory capacity, making it well suited to modern AI workloads.

Amin Vahdat, Google’s Vice President, emphasized that the importance of inference computing is rapidly increasing. Ironwood chips offer twice the performance efficiency of last year’s Trillium chips. While the specific manufacturer of these chips remains undisclosed, Google notably plans to use the Ironwood chip to run its Gemini AI models.

Nvidia Paves The Way For Orbital Data Centers In Space Computing Revolution

Nvidia introduced computing platforms designed for orbital data centers during its GTC 2026 conference. The systems are intended to support artificial intelligence workloads in space-based environments. CEO Jensen Huang said the development reflects a shift toward processing data closer to where it is generated, including in orbit.

Redefining The Final Frontier Of Computing

During the keynote, Huang said satellite networks are expanding rapidly, increasing the need for computing infrastructure beyond Earth. He stated that AI systems may need to operate directly within space-based data environments. These developments are linked to the growth of satellite constellations and space-based data collection.

Innovative Modules And Strategic Partnerships

Nvidia introduced the Vera Rubin Space-1 module, which combines IGX Thor and Jetson Orin processors adapted for space conditions. The hardware is designed to operate within constraints related to size, weight and power.

The company said it is working with partners including Axiom Space, Planet Labs and Starcloud on related initiatives.

Overcoming Engineering Challenges

Huang noted that cooling systems remain a key technical challenge in space environments. Heat dissipation differs from Earth-based systems, as cooling relies on radiation rather than convection. These constraints require adjustments in hardware design for orbital use.
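For context on why radiative cooling is such a constraint, the power a spacecraft radiator can reject is bounded by the Stefan–Boltzmann law (this is a standard back-of-the-envelope illustration, not Nvidia’s published thermal model):

```latex
P_{\mathrm{rad}} = \varepsilon \, \sigma \, A \left( T^{4} - T_{\mathrm{env}}^{4} \right)
```

Here $\varepsilon$ is the radiator’s emissivity, $\sigma \approx 5.67 \times 10^{-8}\,\mathrm{W\,m^{-2}\,K^{-4}}$ is the Stefan–Boltzmann constant, $A$ is the radiator area, $T$ is the radiator temperature, and $T_{\mathrm{env}}$ is the effective temperature of the surroundings. For example, a $1\,\mathrm{m}^2$ radiator with $\varepsilon = 0.9$ at $300\,\mathrm{K}$ facing deep space rejects only about $410\,\mathrm{W}$ — far less per unit area than convective or liquid cooling achieves on Earth, which is why hardware must be redesigned around large radiators and tight power budgets.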

Expanding The Scope Of AI And Data Centers

The initiative comes as energy consumption and operating costs rise for terrestrial data centers. Space-based systems could rely on solar energy, which is more consistently available in orbit. Companies including Google and SpaceX are also exploring concepts related to space-based infrastructure and AI systems.

Looking Ahead

As orbital data centers inch closer to reality, the integration of space computing into AI infrastructure represents a transformative leap for technology. Nvidia’s bold vision underscores an industry-wide shift, promising to expand the capabilities of digital infrastructure even beyond the confines of Earth.
