Summary: Samsung and OpenAI signed letters of intent in Seoul covering memory supply, targeting up to 900,000 DRAM wafers a month, and co-development of floating data centers for OpenAI’s $500B Stargate build-out. SK hynix is a parallel partner. Expect Korean data-center projects and seawater-cooled designs to be explored.
What was signed in Seoul and why it matters
On Oct 1, 2025, Samsung and OpenAI announced LOIs covering four Samsung units (Samsung Electronics, Samsung SDS, Samsung C&T, and Samsung Heavy Industries) to support next-gen AI infrastructure. The deal spans memory supply, data-center design/operations, and R&D into floating facilities. SK hynix signed a separate, parallel LOI with OpenAI the same day.
The memory pledge: up to 900k wafers/month
OpenAI’s demand signal is eye-watering: up to 900,000 DRAM wafer starts per month at full ramp, a figure that, depending on mix and yields, could represent a hefty slice of global DRAM output. Industry coverage pegs the requirement at as much as ~40% of worldwide DRAM capacity. Translation: AI is now the memory market.
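As a back-of-envelope check, the ~40% figure and the 900k ceiling together imply a global baseline of roughly 2.25 million wafer starts per month; the sketch below only rearranges those two reported numbers and introduces no outside data:

```python
# Back-of-envelope: what does 900k DRAM wafer starts/month imply?
# Inputs are the two figures cited above; the "global" number is derived
# from them, not taken from an official capacity survey.
openai_demand_wpm = 900_000   # wafer starts per month at full ramp (per the LOIs)
reported_share = 0.40         # ~40% of worldwide DRAM output (industry coverage)

implied_global_wpm = openai_demand_wpm / reported_share
print(f"Implied global DRAM capacity: ~{implied_global_wpm / 1e6:.2f}M wafers/month")
# -> ~2.25M wafers/month: OpenAI's ceiling alone would rival a large
#    fraction of today's entire DRAM wafer base.
```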
Floating data centers: what they solve and what they risk
Why float? Near-city sites are scarce, land is expensive, and cooling is the new bottleneck. Floating designs promise abundant siting, cheap seawater heat rejection, and lower emissions on paper. But complexity and permitting are real hurdles. Microsoft’s Project Natick showed subsea concepts can be reliable, yet long-term ops and environmental approvals remain the big question marks.
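To see why seawater heat rejection is attractive, here is a minimal sketch of the required cooling flow, assuming a hypothetical 100 MW IT load and a 5 K allowable seawater temperature rise (both illustrative assumptions, not figures from the announcement):

```python
# Rough seawater cooling estimate using Q = m_dot * c_p * dT.
# All inputs are illustrative assumptions, not figures from the LOIs.
it_load_w = 100e6       # hypothetical 100 MW of IT heat to reject
cp_seawater = 3990.0    # J/(kg*K), approximate specific heat of seawater
delta_t = 5.0           # K, allowable seawater temperature rise
rho_seawater = 1025.0   # kg/m^3, approximate seawater density

mass_flow = it_load_w / (cp_seawater * delta_t)   # kg/s
volume_flow = mass_flow / rho_seawater            # m^3/s

print(f"Required seawater flow: ~{mass_flow:,.0f} kg/s (~{volume_flow:.1f} m^3/s)")
# -> roughly 5,000 kg/s (~4.9 m^3/s): a large but routine duty for marine
#    pumps, which is the core appeal of offshore heat rejection.
```

The same load cooled on land typically needs extra power for chillers; direct seawater exchange trades that energy cost for marine engineering and permitting work.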
Korea’s AI ambitions and the Stargate timeline
OpenAI says Stargate’s five new U.S. sites push planned capacity to nearly 7 GW, with more than $400B committed and a path to $500B/10 GW by year-end. In Korea, the parties will evaluate additional sites, with Samsung SDS acting as an enterprise partner and reseller for ChatGPT-class services.
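For scale, those round numbers imply a capital intensity of roughly $50–57B per gigawatt; the arithmetic below simply divides the cited figures:

```python
# Implied capital intensity of Stargate, using the publicly cited round numbers.
committed_usd = 400e9   # ">$400B committed" across announced U.S. sites
planned_gw = 7          # "nearly 7 GW" of planned capacity
target_usd = 500e9      # stated year-end goal
target_gw = 10

print(f"Announced so far: ~${committed_usd / planned_gw / 1e9:.0f}B per GW")  # ~$57B/GW
print(f"Year-end target:  ~${target_usd / target_gw / 1e9:.0f}B per GW")      # ~$50B/GW
```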
Comparison Table: Floating vs Land Data Centers
| Factor | Floating DCs | Land-based DCs |
|---|---|---|
| Siting near cities | Easier offshore siting | Hard in dense metros |
| Cooling | Seawater heat rejection (efficient) | Air/liquid cooling (costly) |
| Emissions | Potentially lower | Depends on power/cooling mix |
| Complexity & permits | High (maritime + environmental) | Lower, but land/power constraints |
| Maintenance | Harder access | Easier access |
The Bottom Line
The LOIs are not final build contracts, but they set direction: massive memory scaling and experimentation with floating infrastructure. If the partners can thread the needle on engineering, regulation, and power sourcing, Korea becomes a pivotal node in OpenAI’s compute backbone. If not, the fallback is conventional land-based builds, just pricier to cool.
Frequently Asked Questions (FAQs)
Is SK hynix part of the deal?
Yes. SK hynix signed a separate LOI to support the DRAM ramp for Stargate.
Will Korea get a Stargate site?
The partners will evaluate Korean sites; Samsung SDS will help design/operate and resell OpenAI services.
What does “900k wafers/month” imply?
It signals an extreme DRAM ramp; some analyses equate it to a large portion of global output. Exact impact depends on process mix and yields.
Are floating data centers proven?
Trials like Microsoft’s Project Natick showed reliability benefits, but broad deployment faces engineering and regulatory tests.
When do first builds go live?
LOIs outline intent; specific site timelines will be announced later as feasibility work concludes.
How does this affect HBM supply?
It tightens alignment between AI demand and DRAM/HBM roadmaps from Samsung and SK hynix.
Featured Answer Boxes
What did Samsung and OpenAI agree to?
They signed LOIs to supply huge volumes of DRAM (targeting up to 900k wafers/month) and to explore floating data centers with Samsung C&T and Samsung Heavy Industries, plus enterprise services via Samsung SDS. SK hynix also signed a parallel memory LOI.
Why floating data centers?
To overcome land scarcity near cities and cut cooling costs/emissions using seawater. The approach is promising but complex, with permitting and maintenance challenges still unresolved at scale.
How big is OpenAI’s Stargate?
OpenAI cites nearly 7 GW of planned capacity and over $400B committed so far, aiming for $500B and 10 GW by end-2025 across multiple sites.
Source: Samsung Newsroom | OpenAI
