Samsung, SK Hynix and Micron Battle for HBM3e AI Memory

High Bandwidth Memory (HBM) is a type of DRAM technology that provides a variety of benefits:

Lower voltages – HBM is designed to run at lower voltages, which means it generates less heat.

Greater capacity – HBM can store and concurrently process more data than previous generations.

Faster training times – HBM3 Gen2 delivers over a 2.5x performance-per-watt improvement, which benefits AI and HPC workloads.

Lower cost – HBM is a special type of DRAM technology that provides extremely high bandwidth at a significantly lower cost than SRAM.

Greater density – HBM can be packaged to provide much greater densities than are available with SRAM.

HBM3E provides increased performance per watt for AI and HPC workloads. Micron designed an energy-efficient data path that reduces thermal impedance, enabling a greater than 2.5x improvement in performance per watt compared to the previous generation. Micron anticipates that by 2025, around half of all cloud infrastructure servers will be AI servers, requiring a sixfold increase in DRAM. The NVIDIA H100 AI GPU is a 7-die package built with TSMC’s Chip-on-Wafer-on-Substrate (CoWoS) packaging architecture, with the core GPU compute die at the center surrounded by six HBM stacks.
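
For a rough sense of scale, per-stack bandwidth can be estimated as pin speed times interface width. A minimal sketch, assuming a 1024-bit interface per stack and illustrative pin speeds of roughly 6.4 Gb/s for HBM3 and 9.2 Gb/s for HBM3E (assumptions for illustration, not vendor specifications; real packages may not have every stack site active):

```python
# Back-of-the-envelope HBM bandwidth estimate.
# Pin speeds below are illustrative assumptions, not vendor specifications.

BUS_WIDTH_BITS = 1024  # each HBM stack exposes a 1024-bit interface

def stack_bandwidth_gbs(pin_speed_gbps: float) -> float:
    """Per-stack bandwidth in GB/s, given per-pin speed in Gb/s."""
    return pin_speed_gbps * BUS_WIDTH_BITS / 8  # bits -> bytes

# (generation, assumed pin speed in Gb/s, stacks in the package)
for label, pin_speed, stacks in [("HBM3", 6.4, 6), ("HBM3E", 9.2, 6)]:
    per_stack = stack_bandwidth_gbs(pin_speed)
    print(f"{label}: {per_stack:.0f} GB/s per stack, "
          f"{per_stack * stacks / 1000:.1f} TB/s across {stacks} stacks")
```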

AMD’s MI300 AI accelerator, which AMD claims is faster than the Nvidia H100, has 8 HBM memory stacks in each unit, with 12 vertically stacked DRAM dies connected by through-silicon vias (TSVs) on a base logic die.
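
Those figures imply a large total capacity. A minimal sketch, assuming 2 GB (16 Gb) per DRAM die, which is an illustrative density rather than a confirmed specification:

```python
# Rough capacity estimate for an accelerator using 12-high HBM stacks.
stacks = 8           # HBM stacks per accelerator (from the article)
dies_per_stack = 12  # vertically stacked DRAM dies per stack
gb_per_die = 2       # assumed density: 16 Gb = 2 GB per die

total_gb = stacks * dies_per_stack * gb_per_die
print(f"{total_gb} GB total HBM capacity")  # prints: 192 GB total HBM capacity
```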

Emerging memory technologies, such as MRAM, RRAM, and CBRAM, have the potential to either complement or provide alternatives to conventional DRAM.

Samsung has about 40% of the HBM market share.
SK Hynix has about 30%.
Micron has about 26%.
Nanya has about 2%.

Micron is ranked third worldwide in HBM memory. Micron announced its HBM3E memory on July 26, 2023, and plans to begin shipping it in high volume in early 2024. The initial release is expected to be 24 GB 8-high HBM3E modules in early 2024, followed by 36 GB 12-high modules in 2025.
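
Both module capacities point to the same underlying die. A quick check, assuming capacity divides evenly across the dies in a stack:

```python
# Implied per-die density for the announced HBM3E modules.
for capacity_gb, dies in [(24, 8), (36, 12)]:
    per_die_gb = capacity_gb / dies
    print(f"{capacity_gb} GB / {dies}-high stack -> "
          f"{per_die_gb:.0f} GB ({per_die_gb * 8:.0f} Gb) per die")
# Both configurations imply 3 GB (24 Gb) dies.
```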

All three leaders will release HBM3E products in Q2-Q3 2024. Micron should narrowly beat SK Hynix to release, and Samsung will lag by a couple of months.

They will each be racing toward HBM4 around 2026.

Here is a link to a 14-page product information guide for HBM2E memory, which began mass production in 2020.

