Micron Unveils 256GB LPDRAM for AI Data Centers

Micron Technology has announced shipments of its 256GB SOCAMM2 LPDRAM module, which the company describes as the highest-capacity memory module for AI data centers. The release follows last year's 128GB SOCAMM module, co-developed with Nvidia. The 256GB SOCAMM2 doubles the capacity of the previous model, enabling 2TB of LPDRAM on an 8-channel CPU, and improves power efficiency and rack density compared with conventional RDIMMs. Micron's internal testing indicates notable performance and efficiency gains for AI workloads with the new module.
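The per-CPU capacity figure follows directly from the module size and channel count given above. A minimal sketch of the arithmetic, assuming one 256GB module per channel on an 8-channel CPU:

```python
# Back-of-the-envelope capacity math from the announcement.
# Assumption: one SOCAMM2 module populates each of the CPU's 8 memory channels.
MODULE_CAPACITY_GB = 256
CHANNELS_PER_CPU = 8

total_gb = MODULE_CAPACITY_GB * CHANNELS_PER_CPU
total_tb = total_gb / 1024  # using binary TB (1 TB = 1024 GB)

print(f"{total_gb} GB = {total_tb} TB of LPDRAM per CPU")  # 2048 GB = 2.0 TB
```

This also makes the capacity jump explicit: 256GB is exactly twice the prior 128GB module, which is what lifts a fully populated 8-channel CPU from 1TB to 2TB.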
The 256GB SOCAMM2 not only pushes technological boundaries but also reflects sustained co-design between Micron and Nvidia aimed at meeting the rising demands of AI data infrastructure. These memory advances promise to accelerate processing for complex inference tasks and position Micron as a key contributor to U.S. AI infrastructure. By strengthening domestic manufacturing capabilities in the AI memory sector, the development may also reduce reliance on foreign technology providers.