February 04, 2024 – SK Hynix Anticipates Significant Growth in Generative AI Market and Unveils Plans for HBM4 Production
SK Hynix has announced its outlook for the generative AI market, projecting a substantial annual growth rate of 35%. The company also revealed plans to begin mass production of next-generation High Bandwidth Memory 4 (HBM4) by 2026.
The evolution of HBM products has progressed through several generations, including HBM (first-generation), HBM2 (second-generation), HBM2E (third-generation), HBM3 (fourth-generation), and HBM3E (fifth-generation). Among these, HBM3E represents an extended version of HBM3, paving the way for HBM4 as the sixth iteration in the product line.
The most notable change in HBM4 is its interface width. Every HBM generation since the standard's market debut in 2015 has used a 1024-bit interface; HBM4 will move to a 2048-bit interface, doubling the bit width in what is the most significant alteration to HBM technology since its inception.
Currently, a single HBM3E stack delivers a data transfer rate of 9.6 GT/s, for a theoretical peak bandwidth of roughly 1.2 TB/s. Combining six stacks into one memory subsystem raises the theoretical figure to about 7.2 TB/s. In practice, however, reliability and power-consumption constraints keep real products below the theoretical maximum: NVIDIA's H200, for instance, tops out at 4.8 TB/s of memory bandwidth.
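The arithmetic behind these figures is simply data rate times bus width. A minimal sketch, using the article's numbers (the function name and the assumption that HBM4 keeps the same per-pin rate are mine, not announced specs):

```python
def peak_bandwidth_tbs(rate_gts: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in TB/s for one HBM stack.

    rate_gts: data rate in gigatransfers per second (per pin)
    bus_width_bits: interface width of the stack
    """
    bytes_per_transfer = bus_width_bits / 8
    return rate_gts * bytes_per_transfer / 1000  # GB/s -> TB/s

# HBM3E: 9.6 GT/s over a 1024-bit interface -> ~1.2 TB/s per stack
hbm3e = peak_bandwidth_tbs(9.6, 1024)

# Six stacks side by side -> ~7.2 TB/s theoretical (6 x 1.2 after rounding;
# the unrounded value is ~7.37 TB/s)
subsystem = 6 * hbm3e

# HBM4 doubles the interface to 2048 bits; if the per-pin rate stayed at
# 9.6 GT/s (an assumption, not an announced spec), one stack would reach
# ~2.4 TB/s
hbm4_assumed = peak_bandwidth_tbs(9.6, 2048)

print(f"HBM3E per stack:  {hbm3e:.2f} TB/s")
print(f"Six HBM3E stacks: {subsystem:.2f} TB/s")
print(f"HBM4 (assumed):   {hbm4_assumed:.2f} TB/s")
```

Note that the article's round numbers (1.2 and 7.2 TB/s) come from truncating the exact per-stack value of 1.2288 TB/s.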
HBM4 will also come in different stack heights. The initial offering will stack 12 DRAM layers vertically, and memory manufacturers are expected to introduce 16-layer stacks by 2027, further increasing capacity per stack.