Samsung may power Nvidia's next AI leap with HBM4 memory chips
Samsung Electronics confirmed that it is in “close discussion” with Nvidia to supply its next-generation HBM4 (High Bandwidth Memory) chips, signaling a major strategic move in the global race to power artificial intelligence infrastructure. As reported by Reuters, the announcement was made during the Asia-Pacific Economic Cooperation (APEC) CEO Summit in Gyeongju, where Samsung and Nvidia executives shared the stage to highlight deepening tech collaboration.

HBM4 chips are critical components in advanced AI processors, offering the ultra-fast memory bandwidth essential for training large language models and powering generative AI applications. Samsung’s potential deal with Nvidia could help the South Korean tech giant close the gap with rival SK Hynix, which currently leads the market in supplying HBM chips to Nvidia.
  
Samsung’s AI ambitions
  
Samsung plans to begin marketing its HBM4 chips in 2026, though it has not yet confirmed a shipping timeline. The company is also building an AI-powered semiconductor factory equipped with 50,000 Nvidia GPUs, further cementing its commitment to AI-driven manufacturing and chip design.
“In addition to our ongoing collaborations, Samsung and Nvidia are also working together on HBM4,” the company said in a statement.
  
The chips are expected to be used in Nvidia’s next-generation Blackwell GPUs, which are being deployed across South Korea’s AI infrastructure initiatives.
Nvidia’s expanding supply chain
  
Nvidia, which recently announced partnerships with several South Korean firms including Samsung, SK Group, and Hyundai Motor Group, confirmed it is in “key supply collaboration for HBM3E and HBM4.” The company did not elaborate on the scope or timing of the deal.
  
SK Hynix, Nvidia’s current top supplier of HBM chips, said earlier this week it plans to begin shipping its own HBM4 chips in Q4 2025 and expand sales in 2026.
Samsung’s entry into the HBM4 supply chain could reshape the competitive dynamics of the AI memory market. While SK Hynix has maintained a lead in HBM technology, Samsung’s scale and manufacturing capabilities position it as a formidable challenger.