Samsung Electronics’ fifth-generation high-bandwidth memory (HBM), or HBM3E, has reportedly passed Nvidia’s quality tests.
Reuters reported on Aug. 7 that the South Korean chipmaker’s eight-layer HBM3E memory recently passed these tests, and a supply agreement is expected soon. HBM, a high-performance memory semiconductor that stacks multiple DRAM dies to increase speed and reduce power consumption, is essential for artificial intelligence (AI) semiconductors. The HBM3E chips will be used in Nvidia’s latest AI accelerator, the H200, and the next-generation B200.
According to Reuters, Samsung redesigned the HBM3E chips to pass the verification tests. The news agency had previously reported that the company’s HBM3E faced difficulties passing the tests due to heat and power consumption issues.
Nvidia’s approval of Samsung’s HBM3E chips comes amid a surge in demand for AI accelerators driven by the generative AI boom. The U.S. tech giant’s AI accelerators are in short supply, with SK Hynix being the primary supplier of HBM for these components. SK Hynix began supplying HBM3E chips to Nvidia in February.
Market researcher TrendForce predicts that HBM3E chips will become the mainstream HBM product in the market in the second half of this year. Last month, Samsung Electronics said in its second-quarter earnings call, “HBM sales grew by 50% compared to the previous quarter,” and that “the fifth-generation HBM3E will enter mass production in the third quarter.”
However, some industry experts believe the timing of Nvidia’s approval may be awkward. Bloomberg and other outlets recently reported that a design flaw in Nvidia’s next-generation AI accelerator lineup, the Blackwell series, will delay delivery of its top-tier product, the GB200, to customers until the first quarter of next year. Blackwell chips will each carry eight HBM3E modules, totaling 192GB of memory.
Samsung has announced that it is conducting verification tests with major customers.