AI Daily
Hardware • Monday, March 16, 2026

AI Is Creating a Memory Chip Shortage. Your Next Phone Will Pay for It.

By AI Daily Editorial • Monday, March 16, 2026

The AI infrastructure buildout has a component problem that gets less attention than GPU supply: memory. High-bandwidth memory — HBM, the specialised stacked-chip architecture that sits alongside AI accelerators and is essential to their performance — is in acute shortage, and that shortage is distorting markets well beyond the data centre. Bloomberg's February investigation found that AI demand for memory is fuelling a chip crisis rippling through consumer electronics: prices for standard DRAM and NAND flash are rising because the fabs that make them are redirecting capacity toward the more profitable HBM products that hyperscalers are willing to pay a premium for.

The economics are straightforward and slightly uncomfortable. Micron, Samsung, and SK Hynix — the three companies that dominate HBM production — are allocating an increasing share of their manufacturing capacity to HBM at the expense of the commodity memory that goes into phones, laptops, and cars. From the manufacturers' perspective, this is rational: HBM commands significantly higher margins, and the demand from Nvidia, AMD, and their hyperscaler customers is both large and relatively price-insensitive. The consequence is a supply constraint in standard memory that is pushing prices up for everyone else.

CNBC reported in January that AI memory was effectively sold out through the near term, citing Micron as operating at full HBM capacity with demand still exceeding supply. The shortage is partly a production capacity problem — HBM requires different and more complex manufacturing processes than standard DRAM — and partly a timing problem, since expanding HBM capacity requires significant capital investment with multi-year lead times. Samsung and SK Hynix are both building new HBM production lines, but meaningful additional supply is 18–24 months away at best.

Bloomberg's consumer-facing analysis traces the downstream effect: the AI boom is expected to make phones, cars, and other electronics meaningfully more expensive over the next two years. That cost falls on ordinary consumers who have no direct relationship with the AI data centres driving the shortage. It's a different version of the same dynamic we noted yesterday in the energy story — the infrastructure costs of the AI buildout are being socialised in ways that rarely feature in the announcements about hundred-billion-dollar investment plans.

The longer-term picture is more optimistic. New HBM capacity will come online, alternative memory architectures are being developed, and the efficiency improvements in AI models — particularly the trend toward smaller, more efficient models for inference — should reduce per-query memory requirements over time. But the near-term constraint is real, and for a consumer about to buy a new device, the connection between Jensen Huang's keynote at GTC today and the price tag on their next phone is more direct than it might appear.
