
Samsung Electronics is betting on Processing-in-Memory (PIM) as its next-generation memory chip for artificial intelligence (AI) applications, in hopes that PIM can overtake high-bandwidth memory (HBM). PIM is a new type of semiconductor that combines data storage and processing capabilities on a single chip.
During the ‘Samsung AI Forum 2024,’ held over two days starting Nov. 4, Samsung shared its AI strategy with global experts, including Professor Yoshua Bengio from the University of Montreal.
“AI is transforming our lives at an unprecedented rate, and the question of how to use AI more responsibly is becoming increasingly important,” said Samsung Electronics CEO Han Jong-hee in his opening remarks. “Samsung Electronics is committed to fostering a more efficient and sustainable AI ecosystem.”
HBM is an essential component of data center AI accelerators, but Samsung has lagged behind rival SK Hynix in this area. Samsung seeks to reassert itself by focusing on PIM, which significantly reduces power consumption by processing data where it is stored rather than shuttling it to a separate processor. The company developed the first HBM-PIM semiconductor in 2021, integrating an AI processor within HBM chips for improved efficiency.
During the event, Samsung highlighted its partnership with American fabless semiconductor company AMD, Nvidia’s rival in the AI chip market. Samsung is reportedly supplying AMD with its fifth-generation HBM, the HBM3E.
The company also announced plans to build an AI ecosystem by equipping home appliances and mobile devices with on-device (embedded) AI technology. In its third-quarter earnings call last month, Samsung presented its “Multi-Device AI Strategy,” built on its SmartThings platform, which has 360 million users. Samsung aims to connect devices seamlessly and provide consumers with personalized AI experiences under a vision the company calls “everyday AI.”