Artificial intelligence requires enormous computing power to function properly. In addition to powerful GPUs, such as those from NVIDIA, AI workloads need memory with very high bandwidth. SK Hynix has just announced that its HBM4 memory will be produced in collaboration with TSMC, although mass production is not expected to begin before 2026.
HBM is simply the acronym for High Bandwidth Memory. As the name suggests, this type of VRAM offers enormous bandwidth, which allows the GPU to access the data stored in it much faster.
AI-centric memories
The agreement between SK Hynix and TSMC is quite interesting, since it will enable the development of new capabilities for HBM memory. The partnership is expected to improve the performance of these memories through a three-way collaboration between product design, foundry, and memory supplier.
Initially, they will focus on improving the performance of the base die on which the rest of the layers are mounted. An HBM package is essentially a stack of DRAM dies placed on top of a base die and interconnected with through-silicon vias (TSVs). The TSVs let the stacked DRAM dies communicate vertically with the base die, which handles control and management and is the part that communicates directly with the GPU.
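To make the layering a bit more concrete, here is a minimal sketch in Python of how such a stack is organized. The class names, layer count, and per-die capacity are hypothetical illustrations, not figures from SK Hynix or JEDEC.

```python
from dataclasses import dataclass, field

# Illustrative model of the stacked structure described above.
# Names and numbers are hypothetical, not from any official specification.

@dataclass
class DramDie:
    """One DRAM layer in the stack."""
    capacity_gb: int

@dataclass
class HbmStack:
    """An HBM stack: DRAM dies sitting on a base (logic) die, linked by TSVs."""
    base_die: str                      # e.g. proprietary design or TSMC logic process
    dram_dies: list[DramDie] = field(default_factory=list)

    def total_capacity_gb(self) -> int:
        # The GPU only ever talks to the base die; the base die reaches
        # each DRAM layer vertically through the TSVs.
        return sum(die.capacity_gb for die in self.dram_dies)

# Example: a hypothetical 8-high stack of 8 GB dies gives 64 GB per stack.
stack = HbmStack(base_die="TSMC logic process", dram_dies=[DramDie(8)] * 8)
print(stack.total_capacity_gb())  # 64
```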
For its HBM3E memory, SK Hynix used a base die of its own proprietary design. For HBM4, it now wants to adopt an advanced logic process from TSMC for the base die, with the aim of packing additional functions into an extremely limited space. This will also allow SK Hynix to produce customized HBM for its customers, tailored to each customer's particular performance and energy-efficiency needs.
This collaboration agreement will also allow the integration of CoWoS packaging technology (proprietary to TSMC) with the upcoming HBM4 memory. Through the partnership, the two companies aim to offer joint solutions covering all the needs of customers who request these memories.
It is an effort that will mainly focus on offering solutions for artificial intelligence, a field that demands high-bandwidth memory to improve performance and capabilities.
HBM4 Features
JEDEC is the body that sets memory chip specifications, including HBM4. Although the specification is not yet finished, it is at a fairly advanced stage, which means that most of the important characteristics are already defined.
We know that each HBM4 stack can offer up to 64 GB of capacity, a considerable amount. Each stack will use a 2,048-bit data bus and an initial speed of 6 GHz, although 10 GHz is expected to be reachable in the future.
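From those figures one can estimate the peak bandwidth per stack. This is a rough sketch that assumes the quoted 6 GHz and 10 GHz correspond to the effective per-pin data rate in Gb/s, an assumption on our part since the announcement does not specify it.

```python
# Rough peak-bandwidth estimate per HBM4 stack.
# Assumption: the 6 GHz / 10 GHz figures are treated as effective per-pin
# data rates in Gb/s, which is not confirmed by the announcement.
BUS_WIDTH_BITS = 2048  # data bus width per stack

def stack_bandwidth_gbs(pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) * per-pin rate (Gb/s) / 8."""
    return BUS_WIDTH_BITS * pin_rate_gbps / 8

print(stack_bandwidth_gbs(6))   # 1536.0 GB/s, roughly 1.5 TB/s at launch speeds
print(stack_bandwidth_gbs(10))  # 2560.0 GB/s, roughly 2.5 TB/s at the projected ceiling
```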
These memories are aimed at artificial intelligence, which requires both high capacity and high bandwidth. GDDR memory, despite its large capacities, fails to satisfy the needs of this segment. GDDR has long been reserved for gaming, while HBM, largely because of its high manufacturing cost, has been reserved for the high-capacity computing segment.