Nvidia CEO Jensen Huang described HBM as a "technology miracle." / Yonhap

When Intel CEO Pat Gelsinger said semiconductors are “critical to every aspect of human existence” last year, he was not exaggerating. Semiconductors, or chips, have become the heart of modern economies, powering everything digital, from smartphones, automobiles and data centers to military applications. Experts consider chips the modern equivalent of crude oil: an essential but limited resource.

Chips come in various sizes and shapes, and one particular technology is garnering attention with the rise of artificial intelligence: high-bandwidth memory (HBM). Samsung Electronics and SK Hynix, the world’s two largest memory chipmakers, plan to boost HBM investment and double production. U.S. chipmaker Micron has joined the fray by ramping up production of next-generation HBM chips.

Tech giant Nvidia has been procuring HBM chips in bulk. “We are spending a lot of money on HBM,” said Nvidia CEO Jensen Huang earlier this year. “HBM memory is very complicated and the value added is very high.”

SK Hynix's fifth-generation high-bandwidth memory, the 'HBM3E' / News1

But what exactly is HBM?

HBM is an advanced, high-performance memory chip. It is a crucial component of Nvidia’s graphics processing units (GPUs), which power generative AI systems such as OpenAI’s ChatGPT. HBM moves data faster than any other type of memory chip, making it particularly well suited to large AI workloads.

HBM chips achieve “high bandwidth,” or the ability to move large amounts of data per second between components, by stacking multiple layers of the most advanced dynamic random access memory (DRAM). Stacking DRAM dies vertically allows for much wider data paths between the memory and the processor, which increases overall bandwidth and enables faster data transfers. Most HBM stacks today contain eight DRAM dies, and chipmakers have been working to increase the number of dies per stack.
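As a rough illustration of how stacking and interface width translate into bandwidth, the sketch below computes peak transfer rates from bus width and per-pin data rate. The figures are illustrative assumptions rather than vendor specifications: HBM standards define a 1,024-bit interface per stack, while a conventional DDR5 channel is 64 bits wide.

```python
# Peak bandwidth (GB/s) = interface width in bytes * per-pin data rate (GT/s).
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gtps: float) -> float:
    """Peak transfer rate in gigabytes per second."""
    return bus_width_bits / 8 * pin_rate_gtps

# One HBM stack: 1,024-bit interface; ~10 GT/s per pin is an HBM3E-class rate.
print(peak_bandwidth_gbs(1024, 10.0))  # -> 1280.0 GB/s
# One DDR5-6400 channel: 64-bit interface at 6.4 GT/s per pin.
print(peak_bandwidth_gbs(64, 6.4))     # -> 51.2 GB/s
```

On those assumptions, a single stack reaches the 1,280 gigabytes per second cited for HBM3E below, roughly 25 times the throughput of a standard DDR5 channel; it is the width of the stacked interface, not the speed of any single pin, that does the work.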

The technical complexity involved in manufacturing HBM makes it five to ten times more expensive than conventional DRAM. But because HBM offers distinct performance advantages for AI applications, and because few companies are capable of mass-producing it, demand currently far outpaces supply.

“HBM supply is already sold out for 2024 and for most of next year,” said SK Hynix CEO Kwak Noh-Jung during a press conference in May. SK Hynix has been the sole supplier of fourth-generation HBM3 chips to Nvidia. Micron began supplying HBM3E chips to Nvidia for its next-generation H200 graphics processing units in the second quarter of this year. Samsung Electronics has yet to pass Nvidia’s qualification tests.

When SK Hynix produced the first HBM chip in 2013, it was a relatively unknown, niche technology, initially used in gaming graphics cards. The technology has since evolved from the first generation (HBM) through the second (HBM2), third (HBM2E) and fourth (HBM3) to the current fifth generation (HBM3E). The most advanced HBM chip to date, the HBM3E, stacks 12 DRAM dies and delivers up to 1,280 gigabytes per second, nearly double the bandwidth of the previous generation.

The Chosunilbo

Who’s leading the HBM chip race?

SK Hynix led the HBM market with a 53% share as of last year, followed by Samsung Electronics at 38% and Micron at 9%, according to market research firm TrendForce. Together, the two Korean chipmakers control around 90% of the global HBM market. Micron, a late entrant, is expanding HBM production at full throttle, aiming to roughly triple its market share to around 25% by next year.

The three chipmakers are now locked in an HBM technology race that is expected to intensify as explosive AI growth fuels the demand for AI chips. A massive amount of HBM will be required to support large language models underpinning generative AI, including advanced AI chatbots like ChatGPT, and enable more energy-efficient AI training. TrendForce estimates that ChatGPT may need as many as 30,000 Nvidia A100 GPUs in the near future, which means HBM sales are set to surge along with GPU demand.
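To get a sense of what that estimate means for memory makers, the back-of-the-envelope calculation below translates GPUs into HBM stacks. The per-GPU figures are assumptions for illustration, based on the 80-gigabyte A100 variant, which carries five active 16 GB HBM2E stacks.

```python
# Rough scale of HBM demand implied by TrendForce's 30,000-GPU estimate.
# Assumed configuration: 80 GB A100, i.e. five active 16 GB HBM2E stacks.
gpus = 30_000
stacks_per_gpu = 5
gb_per_stack = 16

total_stacks = gpus * stacks_per_gpu
total_memory_tb = total_stacks * gb_per_stack / 1024

print(f"{total_stacks:,} HBM stacks, roughly {total_memory_tb:,.0f} TB of memory")
# -> 150,000 HBM stacks, roughly 2,344 TB of memory
```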

While HBM made up only 9% of the global DRAM market last year, it is projected to account for more than 18% this year, according to market research firm Omdia. Goldman Sachs predicts that the global HBM market will grow at a compound annual growth rate of approximately 100% between 2023 and 2026, reaching $30 billion in 2026.
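Those two figures also imply a starting point. A quick check, assuming the roughly 100% growth rate compounds over the three years from 2023 to 2026:

```python
# A ~100% CAGR doubles the market each year, so a $30B market in 2026
# implies a 2023 base of 30 / 2**3 = $3.75B.
target_2026_bn = 30.0
cagr = 1.0   # approximately 100% annual growth
years = 3    # 2023 -> 2026

implied_2023_bn = target_2026_bn / (1 + cagr) ** years
print(f"Implied 2023 HBM market: ${implied_2023_bn:.2f}B")  # -> $3.75B
```

In other words, the projection has the HBM market growing roughly eightfold in three years.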