Google Unveils 8th-Gen TPU with Separate Training and Inference Chips

Google CEO Sundar Pichai announced the launch of the eighth-generation Tensor Processing Unit (TPU) at Cloud Next 2026, introducing separate chips for training and inference for the first time. The TPU 8t, designed for training, can connect 9,600 chips in a super-node, delivering 121 ExaFlops of computing power and 2 PB of shared memory, tripling the performance of the previous Ironwood generation. The TPU 8i, focused on inference, connects 1,152 chips per pod, significantly increasing memory capacity and reducing latency with the new Boardfly network topology. Both chips will be available on Google Cloud AI Hypercomputer later in 2026.
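For scale, the quoted super-node figures can be broken down per chip. The sketch below is back-of-the-envelope arithmetic derived from the numbers in the article (121 ExaFlops and 2 PB across 9,600 chips); the per-chip values are illustrative estimates, not official specifications.

```python
# Back-of-the-envelope per-chip estimates from the article's TPU 8t figures.
# These derived numbers are illustrative, not official Google specs.

EXA = 1e18
PETA = 1e15
GIGA = 1e9

pod_flops = 121 * EXA          # 121 ExaFlops across the super-node
pod_memory_bytes = 2 * PETA    # 2 PB of shared memory
num_chips = 9_600              # chips per super-node

per_chip_pflops = pod_flops / num_chips / PETA
per_chip_mem_gb = pod_memory_bytes / num_chips / GIGA

print(f"~{per_chip_pflops:.1f} PFLOPs and ~{per_chip_mem_gb:.0f} GB per chip")
```

On these assumptions, each chip contributes roughly 12.6 PFLOPs of compute and about 208 GB of the shared memory pool.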
Disclaimer: The content provided on Phemex News is for informational purposes only. We do not guarantee the quality, accuracy, or completeness of the information sourced from third-party articles. The content on this page does not constitute financial or investment advice. We strongly encourage you to conduct your own research and consult with a qualified financial advisor before making any investment decisions.
