Samsung's $73B AI chip blitz to ease supply crunch for Nvidia, AMD

Korean tech giant targets SK Hynix's HBM dominance with record investment in high-bandwidth memory

Samsung Electronics plans to invest a record $73 billion in semiconductor facilities and research this year, a massive capital commitment aimed at challenging SK Hynix's dominance in the high-bandwidth memory market that could ease supply constraints for Nvidia and AMD.

The South Korean technology giant announced plans to spend more than 110 trillion Korean won ($73.3 billion) on facilities and R&D in 2026, surpassing Taiwan Semiconductor's budget. The investment focuses heavily on advanced memory markets, including HBM4—next-generation memory critical for artificial intelligence accelerators.

The move comes as the global HBM market grapples with what industry analysts describe as a "supercycle" driven by surging AI demand. SK Hynix currently commands 62% of HBM shipments, with Samsung trailing at approximately 17%. The shortage has become so acute that production is nearly sold out through 2025, and all major suppliers have already committed capacity through 2026.

For Nvidia, which relies on HBM for its AI GPUs, Samsung's expansion could provide critical supply diversification. The chipmaker has purchase obligations reaching $45.8 billion to secure HBM supply, primarily from SK Hynix, which is expected to provide approximately two-thirds of the HBM4 chips for Nvidia's upcoming Vera Rubin AI platform. Samsung's increased capacity could reduce Nvidia's dependence on a single supplier.

AMD has already moved to secure Samsung's support. The company partnered with Samsung to acquire HBM4 for its next-generation Instinct MI455X AI accelerators, positioning Samsung as a key HBM4 supplier for AMD's future AI GPUs.

Samsung's strategy involves rapidly scaling production capacity. The company plans to boost HBM output to approximately 250,000 wafers per month by the end of 2026, a 47% increase from its capacity at the end of 2025. It has also begun commercial shipments of its latest HBM4 chips, which it bills as the industry's first commercial HBM4 for AI computing.

The HBM market is expected to remain tight through the end of the decade. SK Group Chairman Chey Tae-won has forecast that the memory chip shortage will last until 2030, underscoring the strategic importance of Samsung's capacity expansion.

However, Samsung still faces significant challenges in the foundry market, where Taiwan Semiconductor holds nearly 70% market share compared to Samsung's approximately 7%. The $73 billion investment represents Samsung's most aggressive push yet to narrow that gap and reclaim leadership in the AI semiconductor era.

For Nvidia and AMD, the competitive dynamics in HBM supply could prove crucial as they scale AI infrastructure production. Diversified supply reduces single-source risk and supports the massive production increases needed to meet global demand for AI computing power.