At its Memory Tech Day event, Samsung unveiled its new HBM3E memory, which promises to set new records for bandwidth and capacity. Clearly aimed at high-performance systems, HBM3E sets a new standard for the sector, a reference point that other manufacturers will soon look to.
Designed to meet the growing needs of high-end processors, the new Samsung HBM3E Shinebolt memory is offered in a maximum capacity of 36 GB, a 50% increase over the equivalent 24 GB HBM3 (Icebolt). The maximum bandwidth per pin rises to 9.8 Gbps, more than 50% higher than the previous HBM3. This means a single HBM3E module can handle more than 1.2 TB/s (terabytes per second).
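As a rough sanity check, the per-stack figure can be derived from the per-pin data rate and the interface width. The sketch below assumes a 1024-bit interface per stack, which is typical of HBM generations up to HBM3E but is our assumption rather than a figure stated in Samsung's announcement.

```python
# Rough estimate of per-stack HBM3E bandwidth from the per-pin data rate.
# Assumption: a 1024-bit interface per stack (typical for HBM up to HBM3E).
PIN_RATE_GBPS = 9.8          # Gbit/s per pin (Samsung's stated figure)
INTERFACE_WIDTH_BITS = 1024  # assumed bus width per stack

bandwidth_gbit_s = PIN_RATE_GBPS * INTERFACE_WIDTH_BITS  # Gbit/s per stack
bandwidth_tb_s = bandwidth_gbit_s / 8 / 1000             # convert to TByte/s

print(f"{bandwidth_tb_s:.2f} TB/s per stack")  # ~1.25 TB/s, in line with "more than 1.2 TB/s"
```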
Stacking of HBM3E memories
The term stack refers to a group of memory chips stacked vertically in a single module. These memory chips are arranged in overlapping layers within the same package: they sit physically one on top of the other rather than side by side, as happens in many other types of memory. This design allows for greater memory density in a smaller physical space.
In the case of Samsung’s HBM3E Shinebolt, the South Korean company mentions 8-Hi and 12-Hi stacks. An 8-Hi stack contains eight stacked memory chips, while a 12-Hi stack contains twelve.
Stacking multiple memory chips in a single stack increases the overall capacity of the memory module, allowing more data to be stored in the same physical footprint. This is especially useful in high-performance applications where bandwidth and memory capacity are critical, such as high-end processors and artificial intelligence workloads.
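To make the relationship between stack height and capacity concrete, here is a minimal sketch. The 3 GB-per-die figure is inferred from the 36 GB 12-Hi Shinebolt stack mentioned above and is an assumption, not a value stated by Samsung.

```python
# Capacity of an HBM stack as a function of stack height.
# Assumption: 3 GB per die, inferred from the 36 GB 12-Hi Shinebolt stack.
GB_PER_DIE = 3

def stack_capacity_gb(height: int) -> int:
    """Total capacity of a stack with `height` vertically stacked dies."""
    return height * GB_PER_DIE

for height in (8, 12):
    print(f"{height}-Hi stack: {stack_capacity_gb(height)} GB")
# 8-Hi stack: 24 GB
# 12-Hi stack: 36 GB
```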
Each HBM3E stack offers a minimum of 1 TB/s of bandwidth, up to a maximum of 1.225 TB/s. Taking a high-end GPU (e.g. the NVIDIA H100) as an example, a chip with six stacks could offer up to 216 GB of memory with an aggregate memory bandwidth of up to 7.35 TB/s.
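The aggregate figures follow directly from the per-stack numbers; the short calculation below simply multiplies them out. The six-stack layout mirrors the H100-style example above and is illustrative, not a product specification.

```python
# Aggregate memory and bandwidth for a hypothetical GPU with 6 HBM3E stacks.
STACKS = 6
CAPACITY_PER_STACK_GB = 36        # 12-Hi Shinebolt stack
BANDWIDTH_PER_STACK_TBS = 1.225   # peak per-stack bandwidth

total_capacity_gb = STACKS * CAPACITY_PER_STACK_GB       # 216 GB
total_bandwidth_tbs = STACKS * BANDWIDTH_PER_STACK_TBS   # 7.35 TB/s

print(f"{total_capacity_gb} GB total, {total_bandwidth_tbs:.2f} TB/s aggregate")
```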
The other features of the new Samsung memories
As for energy consumption, the HBM3E Shinebolt is described as more energy efficient, drawing 10% less power than HBM3. However, given the significant increase in the amount of data transferred as operating frequencies rise, the power consumption of HBM memory is expected to grow with the next generation.
Samsung has in fact hinted at its development plans for HBM4 memory: it will use a wider 2048-bit memory interface. This is a technical change needed to avoid an excessive increase in power consumption from a further rise in clock frequency.
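The rationale is that bandwidth scales with both the interface width and the per-pin speed, so doubling the width reaches the same throughput at roughly half the pin rate, and therefore with less switching power. The numbers below are illustrative assumptions only, not announced HBM4 specifications.

```python
# Illustrative only: how a wider interface trades pin speed for bandwidth.
# None of these numbers are announced HBM4 specifications.
def stack_bandwidth_tbs(pin_rate_gbps: float, width_bits: int) -> float:
    """Per-stack bandwidth in TB/s from per-pin rate (Gbit/s) and bus width (bits)."""
    return pin_rate_gbps * width_bits / 8 / 1000

hbm3e = stack_bandwidth_tbs(9.8, 1024)      # ~1.25 TB/s on a 1024-bit interface
hbm4_like = stack_bandwidth_tbs(4.9, 2048)  # same ~1.25 TB/s at half the pin rate

print(f"1024-bit @ 9.8 Gbps: {hbm3e:.2f} TB/s")
print(f"2048-bit @ 4.9 Gbps: {hbm4_like:.2f} TB/s")
```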
Company spokespersons confirm that the new HBM3E memory will go into mass production during 2024, together with GDDR7, which will strengthen the performance of next-generation GPUs.