
Meta Unveils Four MTIA Chips for Accelerated AI Inference

Global AI Watch · Editorial Team · 4 min read · Tom's Hardware

Meta has announced its Meta Training and Inference Accelerator (MTIA) chips, developed in collaboration with Broadcom. The four chips (MTIA 300, 400, 450, and 500) will roll out over the next two years, with the first already in production for recommendations training. From MTIA 300 to MTIA 500, HBM bandwidth increases 4.5x and compute performance 25x, targeting heavy AI inference workloads.

Strategically, the MTIA line is designed to modularize AI deployment across Meta's data centers, sharply reducing changeover time when new hardware is introduced. By prioritizing HBM bandwidth improvements over raw compute FLOPs, the chips are positioned to challenge market leaders such as Nvidia's H100. The move fits Meta's inference-first strategy built on industry standards, and it suggests a potential shift in the competitive AI-hardware landscape toward more domestic capability and less reliance on foreign technology.
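Why prioritize HBM bandwidth over raw FLOPs? A roofline-style sketch makes the trade-off concrete. The baseline figures below are purely hypothetical; only the 4.5x bandwidth and 25x compute scaling ratios come from the article, and the arithmetic-intensity value is illustrative.

```python
# Roofline-style sketch: for memory-bound inference kernels, attainable
# throughput is capped by bandwidth, so the 4.5x HBM improvement (not the
# 25x compute jump) determines the real gain. Baselines are hypothetical.

BASE_TFLOPS = 100.0  # hypothetical MTIA 300 peak compute (TFLOP/s)
BASE_BW_TBS = 1.0    # hypothetical MTIA 300 HBM bandwidth (TB/s)

def attainable_tflops(arith_intensity, peak_tflops, bw_tbs):
    """Roofline model: achievable throughput is the lesser of the
    compute ceiling and the bandwidth ceiling.
    arith_intensity is in FLOPs per byte moved from memory."""
    return min(peak_tflops, arith_intensity * bw_tbs)

# Scale by the article's MTIA 300 -> MTIA 500 ratios.
gen500_tflops = BASE_TFLOPS * 25
gen500_bw_tbs = BASE_BW_TBS * 4.5

# A low-arithmetic-intensity (memory-bound) inference kernel:
AI = 50.0  # FLOPs/byte, illustrative

for name, peak, bw in [
    ("MTIA 300 (hypothetical)", BASE_TFLOPS, BASE_BW_TBS),
    ("MTIA 500 (hypothetical)", gen500_tflops, gen500_bw_tbs),
]:
    print(f"{name}: {attainable_tflops(AI, peak, bw):.1f} attainable TFLOP/s")
```

Under these illustrative numbers, the kernel stays memory-bound on both generations, so attainable throughput rises by the bandwidth factor (4.5x) rather than the compute factor (25x), which is why bandwidth-first designs favor inference workloads.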

Source: Tom's Hardware
