The HBM3E will consume 30% less power than rival offerings and could help tap into soaring demand for chips that power generative AI applications.
Micron Technology, Inc., a memory and storage solutions provider, has commenced volume production of its High Bandwidth Memory 3E (HBM3E). Micron’s 24GB 8H HBM3E is set to be integrated into NVIDIA H200 Tensor Core GPUs, which are expected to ship in the second quarter of 2024.
Sumit Sadana, executive vice president and chief business officer at Micron, highlighted the importance of memory bandwidth and capacity for AI workloads and expressed confidence in Micron’s position to support the anticipated AI growth.
By providing swift data access for AI accelerators, the HBM3E aims to meet the rapidly growing demands of artificial intelligence (AI) applications, supercomputers, and data centres.