Micron Launches 12-Hi HBM3E Memory for Advanced AI GPUs

Monday, 9 September 2024, 11:19

Micron has launched its latest 12-Hi HBM3E memory chips, tailored for the next generation of AI GPUs. These high-performance stacks provide up to 36 GB each and deliver per-pin data rates exceeding 9.2 GT/s, setting a new bar for AI and high-performance computing (HPC) applications.

Revolutionizing GPU Performance with HBM3E

Micron's newly announced 12-Hi HBM3E memory stacks are now production-ready, promising to empower the latest AI and HPC workloads. With a stunning capacity of 36 GB per stack and speeds that surpass 9.2 GT/s, these chips are engineered to meet the demands of cutting-edge processors like Nvidia’s H200.

Key Features of Micron's 12-Hi HBM3E Chips

  • Up to 36 GB capacity per stack
  • Per-pin data rates exceeding 9.2 GT/s
  • Designed for AI and HPC applications
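The per-pin data rate translates into aggregate stack bandwidth once the interface width is factored in. As a rough sketch (the 1024-bit interface width is the standard HBM width and is an assumption here, not a figure from the announcement), the peak bandwidth per stack works out to roughly 1.2 TB/s:

```python
# Back-of-envelope peak bandwidth for one HBM3E stack.
# Assumptions: 1024-bit interface (standard HBM width, not stated in the article);
# 9.2 GT/s per pin, as cited above.
interface_bits = 1024      # HBM interface width in bits
rate_gtps = 9.2            # transfers per second per pin, in GT/s

# Each transfer moves one bit per pin; divide by 8 to convert bits to bytes.
bandwidth_gbs = interface_bits * rate_gtps / 8
print(f"Peak bandwidth per stack: {bandwidth_gbs:.1f} GB/s")  # ~1177.6 GB/s, i.e. >1.2 TB/s (decimal)
```

This matches the ballpark bandwidth figures typically quoted for HBM3E-class memory, though actual sustained throughput depends on the host GPU's memory controller.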

Implications for AI and HPC

The advancement of memory technology is crucial in enhancing the performance and efficiency of AI systems. Micron’s 12-Hi HBM3E chips are set to significantly push the boundaries of what's possible in the field.



