SK hynix to showcase 16-layer HBM3E, 122TB enterprise SSD, LPCAMM2, and more at CES
Alongside its latest HBM technology, the company will also unveil new enterprise storage solutions and AI-focused components.
Leading South Korean memory manufacturer SK hynix announced that it will showcase a suite of advanced memory solutions tailored for artificial intelligence (AI) applications at this year's Consumer Electronics Show (CES) in Las Vegas.
Building upon its 12-layer High Bandwidth Memory (HBM) technology, the company will display samples of its latest 16-layer HBM3E products, officially announced in November 2024. The new stacks employ SK hynix's Advanced MR-MUF (Mass Reflow-Molded Underfill) process to enhance thermal performance and mitigate chip warping, achieving industry-leading results.
With a capacity of 48GB per stack (16 layers of 3GB dies), the increased density will allow AI accelerators to pack up to 384GB of HBM3E memory in an eight-stack configuration. SK hynix says the 16-layer HBM3E boosts AI training performance by up to 18% and inference performance by up to 32% compared to the 12-layer version.
Nvidia's next-generation Rubin chips are slated for mass production late next year, so HBM3E's reign could be short-lived, as the upcoming Nvidia chips will be based on HBM4. That shouldn't be a concern for SK hynix, though, as reports indicate the company reached its HBM4 tape-out phase in October 2024.
Addressing the escalating demand for high-capacity storage in AI data centers, SK hynix will also introduce new SSD solutions for enterprise users, including the 122TB 'D5-P5336' enterprise SSD, developed by its subsidiary Solidigm. This model is said to boast the highest capacity currently available in its category and is poised to set new standards in data storage solutions.
The memory and storage manufacturer will also discuss Compute Express Link (CXL) and Processing-In-Memory (PIM) technologies, which are said to be pivotal to the next generation of data center infrastructure. Modularized solutions such as the CMM-Ax and AiMX will be featured; SK hynix hails the CMM-Ax as a groundbreaking product that combines the scalability of CXL with computational capabilities, boosting performance and energy efficiency for next-generation server platforms.
With on-device AI becoming a popular trend, SK hynix also plans to showcase 'LPCAMM2' and 'ZUFS 4.0,' designed to improve data processing speed and power efficiency in edge devices such as PCs and smartphones. These innovations aim to bring AI capabilities directly into consumer electronics, broadening the scope of AI applications.
The company announced last year that it was also working on a range of other products, including PCIe 6.0 SSDs, high-capacity QLC (Quad Level Cell) eSSDs made specifically for AI servers, and UFS 5.0 for mobile devices. SK hynix is also working on an LPCAMM2 module and soldered LPDDR5/6 memory built on its 1c nm node to power laptops and handheld consoles.
Kunal Khullar is a contributing writer at Tom’s Hardware. He is a longtime technology journalist and reviewer specializing in PC components and peripherals, and welcomes any and every question around building a PC.