October 22, 2025
Micron Technology, Inc. (Nasdaq: MU) today announced it has begun customer sampling of the industry’s highest-capacity 192GB SOCAMM2 module, a breakthrough designed to accelerate the adoption of low-power memory in AI data centers. Building upon its first-to-market LPDRAM SOCAMM, the new module delivers a 50% capacity increase within the same compact form factor, a critical advancement for real-time AI inference.
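The stated 50% capacity increase implies a baseline module size, which can be checked with simple arithmetic (a back-of-envelope sketch; the 128GB baseline is inferred, not stated in this release):

```python
# Back-of-envelope check: if 192GB is a 50% increase over the
# first-generation SOCAMM module, the implied baseline is 128GB.
# (Assumption: "50% capacity increase" is measured per module.)
socamm2_gb = 192
claimed_increase = 0.50

implied_baseline_gb = socamm2_gb / (1 + claimed_increase)
print(implied_baseline_gb)  # 128.0
```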
This significant boost in capacity can reduce the time to first token (TTFT) by more than 80% in inference workloads, drastically speeding up AI response times. The 192GB SOCAMM2 leverages Micron's most advanced 1-gamma DRAM process node to achieve greater than 20% improvement in power efficiency. This enhanced efficiency is pivotal for optimizing power design in large-scale data center clusters, where a single AI server rack can utilize over 40 terabytes of CPU-attached low-power DRAM.
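The per-rack figure can likewise be translated into an approximate module count (an illustrative estimate; the binary terabyte convention and the assumption that the rack is populated entirely with 192GB modules are ours, not Micron's):

```python
# Rough estimate: number of 192GB SOCAMM2 modules needed to reach
# "over 40 terabytes" of CPU-attached low-power DRAM in one AI rack.
# Assumptions: 1 TB = 1024 GB, and the rack uses only 192GB modules.
rack_capacity_gb = 40 * 1024   # 40 TB expressed in GB
module_capacity_gb = 192

modules_needed = rack_capacity_gb / module_capacity_gb
print(round(modules_needed))   # roughly 213 modules
```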
The module's design not only improves serviceability but also establishes a foundation for future capacity expansion. This innovation is the result of a five-year collaboration with NVIDIA, through which Micron pioneered the use of low-power server memory for data centers. The SOCAMM2 brings the inherent benefits of LPDDR5X—exceptionally low power consumption and high bandwidth—to the main memory of AI systems, setting a new standard for both AI training and inference.
"As AI workloads become more complex and demanding, data center servers must achieve increased efficiency, delivering more tokens for every watt of power," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Micron’s proven leadership in low-power DRAM ensures our SOCAMM2 modules provide the data throughput, energy efficiency, capacity and data center-class quality essential to powering the next generation of AI data center servers."
Through specialized design and rigorous testing, Micron has transformed low-power DRAM, originally intended for mobile devices, into a robust, data center-class solution. Compared to conventional RDIMMs of similar capacity, the SOCAMM2 uses more than two-thirds less power and delivers its performance in a module one-third the size. This compact design optimizes the physical footprint in data centers and aids in the design of advanced, liquid-cooled servers.
Micron has been an active contributor to the JEDEC SOCAMM2 specification and is collaborating with industry partners to drive standards that will accelerate the adoption of low-power memory across the AI industry. Customer samples of the 192GB SOCAMM2, with speeds up to 9.6 Gbps, are now shipping. High-volume production will be aligned with customer launch schedules.
SOURCE: HPCwire