Micron Begins Mass Production of HBM3E Memory, Promises 30% Lower Power Consumption

February 27, 2024 – Micron Technology has announced the start of mass production of its highly anticipated HBM3E high-bandwidth memory. According to the company, HBM3E's power consumption is 30% lower than that of competing products on the market.

Micron's HBM3E will reportedly be featured in NVIDIA's next-generation AI chip, the H200 Tensor Core GPU. NVIDIA previously sourced its HBM exclusively from SK Hynix, but Micron and Samsung are now set to join the supply chain.

HBM (High Bandwidth Memory) is a type of stacked DRAM, distinct from graphics DDR (GDDR), that vertically stacks multiple DRAM dies using advanced packaging techniques such as Through-Silicon Via (TSV) technology. It connects to the GPU through a silicon interposer, providing exceptional bandwidth and power efficiency.

HBM's key advantage is its ability to relieve memory bandwidth and power-consumption bottlenecks. CPUs handle a wider, more random mix of tasks and are more sensitive to access latency; GPU workloads, by contrast, stream large volumes of data in parallel, and HBM's very wide interface makes it ideal for such intensive data processing and computation. NVIDIA's new generation of AI chips are all equipped with HBM.
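To illustrate where HBM's bandwidth advantage comes from, the back-of-the-envelope arithmetic below is a sketch: peak per-stack bandwidth is roughly the per-pin data rate times the interface width. The 1024-bit interface width and the nominal pin rates (6.4 Gb/s for HBM3, 9.2 Gb/s for HBM3E) are assumptions based on published specifications, not figures from this article.

```python
# Sketch: peak per-stack HBM bandwidth from pin rate and bus width.
# Assumed figures (not from the article): 1024-bit interface per stack,
# nominal pin rates of 6.4 Gb/s (HBM3) and 9.2 Gb/s (HBM3E).

def stack_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth in GB/s: pin rate (Gb/s) x bus width (bits) / 8 bits/byte."""
    return pin_rate_gbps * bus_width_bits / 8

print(f"HBM3  @ 6.4 Gb/s/pin: {stack_bandwidth_gbs(6.4):.1f} GB/s")
print(f"HBM3E @ 9.2 Gb/s/pin: {stack_bandwidth_gbs(9.2):.1f} GB/s")
```

The wide bus is why HBM reaches around 1.2 TB/s per stack at pin speeds far below those of GDDR, which in turn is what allows its lower power per bit transferred.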

Since its inception, HBM technology has evolved through four generations: HBM, HBM2, HBM2E, and HBM3. With the fifth-generation HBM3E now in mass production, it is being heralded as the DRAM for the age of artificial intelligence.