SK hynix Initiates Mass Production of AI Server Memory

Global AI Watch · 3 min read · Korea Herald / Yonhap / Chosun (GDELT)

SK hynix Inc. has begun mass production of its 192GB SOCAMM2 memory module, designed specifically for AI servers, as the company moves to solidify its standing in the AI infrastructure sector. The next-generation module is built on sixth-generation 10-nanometer-class LPDDR5X low-power DRAM and is engineered for higher bandwidth and greater power efficiency. Optimized for Nvidia's Vera Rubin AI platform, SOCAMM2 targets the memory-performance demands of large-scale AI models with hundreds of billions of parameters.

The SOCAMM2 module positions SK hynix to help relieve the memory bottlenecks that arise during training and inference in AI applications. Delivering more than double the bandwidth and over a 75% improvement in power efficiency compared with conventional RDIMMs, the product sets a new benchmark for AI memory solutions. These advances not only raise overall system performance but also mark a step toward reducing reliance on conventional server memory, further supporting the evolution of AI capabilities.

Source: Korea Herald / Yonhap / Chosun (GDELT), https://www.koreaherald.com/article/10720775