Google Unveils TPU 8 Chips with Enhanced AI Capabilities

Global AI Watch · 3 min read · Tom's Hardware

Google announced its eighth-generation Tensor Processing Units (TPUs) on April 22, introducing two distinct chip designs: TPU 8t for large-scale model training and TPU 8i for low-latency inference. Both chips are fabricated on TSMC's N3 process with HBM3E memory and will be available to Google Cloud customers later this year. Notably, the launch marks a shift in Google's strategy: the company is ending Broadcom's exclusive role in TPU development by bringing MediaTek on board as a silicon design partner.

The dual-chip strategy is intended to strengthen Google's competitive position in AI by tailoring silicon to specific workload requirements, while offering cloud customers an alternative to Nvidia and AMD products. The TPU 8 uses a new Boardfly interconnect designed for efficient data transfer between chips, in line with Google's goal of supporting large training workloads that prioritize sustained throughput over peak performance figures. The move could also diversify the chip options available in cloud environments, reducing the industry's dependence on Nvidia's technologies.
