There's one number the entire tech industry should be paying attention to: in 2025, SK Hynix made more money than Samsung. Not in a single quarter. For the full year. It's the first time this has ever happened in the history of the semiconductor industry, and the reason comes down to three letters: HBM.
SK Hynix posted an operating profit of 47.2 trillion won ($33.1 billion), up 101% from 2024. Samsung came in at 43.6 trillion won ($30.5 billion). The percentage gap might look narrow, but the significance is enormous: we're talking about a company that dominated memory for decades, dethroned by a rival that bet everything on a type of chip that almost nobody knew about five years ago.
My verdict is clear: this isn't a fluke or a temporary blip. It's the result of 15 years of strategic investment by SK Hynix and a series of Samsung missteps that the AI revolution exposed with brutal clarity.
What HBM Memory Is and Why It Dominates the Industry
Before diving into the numbers, it's important to understand what HBM (High Bandwidth Memory) is. It's a type of 3D-stacked DRAM designed to process massive amounts of data at speeds that conventional memory can't come close to matching.
Think of it as an eight-lane highway versus a single-lane road. That's HBM versus conventional DDR memory. AI models like GPT-5.2 or Claude need to process billions of parameters in milliseconds. Without HBM, that simply isn't possible.
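To make the highway analogy concrete, here's a back-of-the-envelope sketch in Python of why generating tokens with a large model is bounded by memory bandwidth. All numbers are illustrative assumptions (a hypothetical 70B-parameter model, single-stack bandwidth figures), not vendor specs:

```python
# Rough upper bound on LLM decode speed: every generated token must
# stream all model weights from memory at least once, so throughput
# is capped by bandwidth / model size. Numbers are illustrative.

def max_tokens_per_second(params_billion: float, bytes_per_param: float,
                          bandwidth_tb_s: float) -> float:
    weight_bytes = params_billion * 1e9 * bytes_per_param
    return (bandwidth_tb_s * 1e12) / weight_bytes

# Hypothetical 70B-parameter model stored in FP16 (2 bytes per parameter)
hbm = max_tokens_per_second(70, 2, 1.2)    # one HBM3E-class stack, 1.2 TB/s
ddr = max_tokens_per_second(70, 2, 0.064)  # one DDR5 channel, ~64 GB/s
print(f"HBM3E: ~{hbm:.1f} tok/s   DDR5: ~{ddr:.2f} tok/s")
```

Real GPUs aggregate several stacks in parallel (an H100 combines multiple HBM3 stacks for over 3 TB/s total), but the ratio is the point: conventional DRAM is more than an order of magnitude too slow to feed these models.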
HBM Evolution in Numbers
| Generation | Bandwidth | Capacity per Stack | Launch |
|---|---|---|---|
| HBM2 | 307 GB/s | 8 GB | 2016 |
| HBM3 | 665 GB/s | 24 GB | 2022 |
| HBM3E | 1,200 GB/s | 36 GB | 2024 |
| HBM4 | 2,000 GB/s | 48 GB | 2025 |
Each generation roughly doubles the performance of the previous one. And here's where SK Hynix pulled ahead: they reached mass production first in each recent generation (HBM3, HBM3E, and HBM4), while Samsung consistently arrived late.
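You can sanity-check the "roughly doubles" claim directly from the table above. A quick Python sketch, using only the per-stack figures already listed:

```python
# Generation-over-generation scaling, computed from the table above.
hbm = {
    "HBM2":  {"bw_gb_s": 307,  "cap_gb": 8},
    "HBM3":  {"bw_gb_s": 665,  "cap_gb": 24},
    "HBM3E": {"bw_gb_s": 1200, "cap_gb": 36},
    "HBM4":  {"bw_gb_s": 2000, "cap_gb": 48},
}

gens = list(hbm)
for prev, cur in zip(gens, gens[1:]):
    ratio = hbm[cur]["bw_gb_s"] / hbm[prev]["bw_gb_s"]
    print(f"{prev} -> {cur}: {ratio:.2f}x bandwidth")
```

The step-ups range from about 1.7x to 2.2x per generation, so "roughly doubles" holds as a rule of thumb, even though the HBM3E-to-HBM4 jump is the smallest of the three.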
The Numbers That Explain Everything
Read side by side, both companies' quarterly results tell a compelling story.
SK Hynix: The Year of Records
| Metric | 2024 | 2025 | Change |
|---|---|---|---|
| Revenue | 66.19T KRW | 97.1T KRW | +47% |
| Operating Profit | 23.5T KRW | 47.2T KRW | +101% |
| Net Profit | 19.7T KRW | 42.9T KRW | +117% |
| HBM Revenue | — | More than doubled vs 2024 | >100% |
HBM revenue more than doubled in 2025. In Q4, SK Hynix posted an operating profit of 19.17 trillion won, up 137% year over year. The company is effectively printing money.
Samsung: The Crisis and Late Recovery
Samsung's 2025 story is one of a company that stumbled, fell, and is trying to get back up.
| Quarter | Operating Profit | Context |
|---|---|---|
| Q1 2025 | 1.1T KRW | 42% YoY decline |
| Q2 2025 | 4.6T KRW | -56% YoY, chip division -94% |
| Q3 2025 | 12.2T KRW | Recovery, 2x vs Q2 |
| Q4 2025 | ~20T KRW | Record quarter (recovery) |
Q2 2025 was the low point: Samsung's chip division lost 94% of its operating profit compared to the prior year. I won't sugarcoat it: it was a catastrophe that exposed years of strategic inertia.
Why Samsung Fell: 18 Months Behind on HBM3E
Samsung's fall wasn't a market problem. It was an execution problem.
The Nvidia Validation Fiasco
Nvidia is the most important customer in the HBM market. Their H100, H200, and Blackwell GPUs need the fastest, most reliable memory available. And Nvidia is extremely demanding on quality.
Samsung tried to pass Nvidia's validation tests for its 12-layer HBM3E chips and failed three consecutive times. The issues:
- Excessive heat: Chips ran hotter than Nvidia's thermal limits allowed
- High power consumption: Exceeded Nvidia's power limits
- Insufficient speed: Didn't meet performance requirements
Samsung had to completely redesign the DRAM core in March 2025 to address these thermal issues. They finally passed in September 2025, but by then they were the third supplier to be approved, after SK Hynix and Micron.
18 months of delay. In an industry where every quarter counts, that's an eternity.
HBM Market Share: The Historic Shift
| Period | SK Hynix | Samsung | Micron |
|---|---|---|---|
| Q2 2025 | 62% | 17% | 21% |
| Q3 2025 | 57% | 22% | 21% |
In Q2 2025, Micron actually surpassed Samsung for the first time, pushing the Korean giant down to third place in a market it should have been leading. Samsung regained some ground in Q3, but the gap with SK Hynix remains massive.
The SK Hynix-Nvidia Alliance: A Money-Printing Machine
If you ask me directly why SK Hynix won this battle, the answer is simple: Nvidia.
SK Hynix is the primary HBM memory supplier for Nvidia's entire GPU lineup:
- H100: HBM3 supplied by SK Hynix
- H200: HBM3E with roughly 50% more bandwidth than the H100
- Blackwell (B200/GB200): SK Hynix as primary HBM3E supplier
- Rubin (next generation): SK Hynix secures 70% of HBM4 supply, according to UBS
The relationship is so tight that in 2025, Nvidia and SK Group announced they would jointly build an AI factory in South Korea. This isn't just a supplier-customer relationship: it's a strategic alliance.
SK Hynix confirmed that its DRAM, NAND, and HBM production capacity is completely sold out through 2026. Every chip they make has a buyer before it leaves the factory. That's the definition of market dominance.
The Price Impact: Prepare Your Wallet
SK Hynix's HBM dominance has direct consequences for consumers and businesses alike.
Memory Prices in 2026
| Product | Expected Increase | Context |
|---|---|---|
| HBM3E | +20% in 2026 | Samsung and SK Hynix raising prices |
| Server DRAM | +70% in H1 2026 | Insatiable AI demand |
| General DRAM | +40-50% in H1 2026 | Spillover effect |
| Smartphones | +6.9% average price | Double previous forecast |
| PCs | +4-6% average price | Moderate scenario |
TrendForce calls the Q1 2026 price increase "unprecedented": over 50% compared to Q4 2025. The reason is structural: memory manufacturers are reallocating advanced production capacity to high-margin HBM chips for AI servers, squeezing supply for consumer devices.
In other words: AI is making your next phone and laptop more expensive. And this isn't changing anytime soon. The memory shortage could extend into 2027-2028, when new fab capacity comes online.
Samsung's Counterattack: Too Little, Too Late?
Samsung isn't standing still. Their 2026 recovery plan is aggressive:
- +50% HBM capacity in 2026 (from 170,000 to 250,000 wafers/month)
- HBM4 certification with Nvidia in progress, possible entry in Q2 2026
- Massive investment continuing from 2025's 40.9 trillion won
- Factory expansion in Pyeongtaek (South Korea) and Taylor (Texas)
Samsung co-CEO Jun Young-hyun stated that customers have said "Samsung is back" regarding HBM4. Morgan Stanley projects Samsung's earnings per share will grow over 150% in 2026, with total earnings potentially exceeding $60 billion.
But if you ask me directly, Samsung is playing catch-up, not leading. Recovering market share from 17% to 30% is feasible. Overtaking SK Hynix in HBM again is an entirely different matter.
The Geopolitical Chip War
There's a factor most outlets barely mention: geographic concentration. The vast majority of global HBM production is located in South Korea. This makes SK Hynix and Samsung strategic assets in the U.S.-China tech war.
U.S. export restrictions have already forced both companies to limit their advanced chip sales to China. The revocation of VEU (Validated End-User) licenses forced Samsung and SK Hynix to recalibrate their Chinese operations.
Micron, the third player with ~21% share, is positioning itself as the "geopolitically safer" supplier for customers worried about South Korean concentration. That's an advantage that shouldn't be underestimated in a world where supply chains are foreign policy weapons.
For companies that rely on AI infrastructure like what Meta is building or the servers powering GPT-5.2, this geographic concentration is a real risk that belongs in any supply chain analysis.
What's Next in 2026: HBM4 and the Battle for Nvidia Rubin
The next battleground is already defined: HBM4, the fourth generation of high-bandwidth memory.
HBM4 vs HBM3E
| Specification | HBM3E | HBM4 |
|---|---|---|
| Interface | 1,024 bits | 2,048 bits (2x) |
| Bandwidth | 1.2 TB/s | 2 TB/s |
| Max Capacity | 36 GB | 64 GB |
| Channels | 16 | 32 |
HBM4 doesn't rely on higher clock speeds. Instead, it doubles the interface width from 1,024 to 2,048 bits, achieving greater bandwidth with better power efficiency. It's an elegant design shift that proves innovation isn't always about brute force.
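The arithmetic behind that shift is simple: peak bandwidth is interface width times per-pin data rate. A minimal sketch, where the per-pin rates are illustrative assumptions (only the interface widths come from the table above):

```python
# Peak bandwidth (GB/s) = interface width (bits) * per-pin rate (Gb/s) / 8.
# Pin rates below are illustrative; the table above only fixes the widths.
def peak_bw_gb_s(width_bits: int, pin_rate_gbps: float) -> float:
    return width_bits * pin_rate_gbps / 8

hbm3e = peak_bw_gb_s(1024, 9.6)  # ~1.2 TB/s via a fast, narrow interface
hbm4  = peak_bw_gb_s(2048, 8.0)  # 2 TB/s via a SLOWER but twice-as-wide one
print(hbm3e, hbm4)
```

Hitting 2 TB/s at a lower per-pin rate is what buys the power efficiency: slower signaling on each pin means less energy spent per bit transferred.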
SK Hynix was the first company to complete HBM4 development and ready it for mass production (September 2025), and in January 2026 demonstrated 16-layer HBM4 chips (48 GB at 10 GT/s) at CES. They also announced a new $13 billion advanced packaging mega-fab to consolidate their lead.
UBS estimates SK Hynix will secure 70% of HBM4 supply for Nvidia's Rubin platform, the next generation of AI data center GPUs. If that forecast holds, SK Hynix's dominance isn't temporary: it's structural.
Pros and Cons of the HBM Memory Market
Pros
- Unstoppable AI demand: Every data center from Meta, Google, Microsoft, and Amazon needs more HBM
- Rising prices: 20-50% increases in 2026 improve margins
- SK Hynix sold out: Capacity reserved through 2026
- HBM4 doubles performance: Innovation cycle guarantees upgrade demand
- Solid Nvidia alliance: 70% of Rubin secures future revenue
Cons
- Geographic concentration: Nearly all production is in South Korea
- Geopolitical risk: U.S.-China tensions affect exports
- Samsung comeback possible: 50% capacity increase threatens share
- Memory cycle risk: The sector is historically cyclical and volatile
- HBM4 competition: If Samsung certifies with Nvidia, the advantage narrows
- Demanding valuation: SK Hynix stock has already risen 384% from lows
Frequently Asked Questions
Why did SK Hynix surpass Samsung in annual profit?
SK Hynix accumulated 15 years of investment in HBM (High Bandwidth Memory), the type of chip that Nvidia's GPUs need for training and running AI models. Their position as Nvidia's primary HBM supplier for the H100, H200, and Blackwell GPUs, combined with Samsung's quality problems with HBM3E (18 months of certification delays), gave them a decisive advantage. In 2025, SK Hynix posted 47.2 trillion won in operating profit versus Samsung's 43.6 trillion.
What is HBM memory and why is it so important for AI?
HBM (High Bandwidth Memory) is 3D-stacked DRAM that delivers up to 10 times the bandwidth of conventional DDR memory. Large AI models like GPT-5.2, Claude, and Gemini need to process billions of parameters at extreme speeds, and only HBM can feed data fast enough. The current generation (HBM3E) reaches 1.2 TB/s, and HBM4 will double that to 2 TB/s.
How does this affect smartphone and laptop prices?
Memory manufacturers are reallocating advanced production capacity to high-margin HBM chips for AI servers, squeezing supply for consumer devices. TrendForce estimates DRAM price increases of 40-50% in the first half of 2026. As a result, average smartphone prices will rise 6.9% and PC prices will increase 4-6% in 2026.
Can Samsung regain the lead in 2026?
Samsung plans to increase HBM capacity by 50% in 2026 and is working on HBM4 certification with Nvidia. Morgan Stanley projects their earnings will grow over 150% in 2026. However, SK Hynix has already secured 70% of HBM4 supply for Nvidia's Rubin platform. The most likely scenario is Samsung recovering share to around 30%, but not overtaking SK Hynix in the near term.
What geopolitical risks exist in the memory market?
Nearly all global HBM production is concentrated in South Korea (SK Hynix + Samsung). U.S. export restrictions to China have already limited advanced chip sales. Micron, with 21% share and U.S.-based production, is positioning as a geopolitically safer alternative. South Korean concentration is a real risk for global AI supply chains.