EVA DAILY

TECHNOLOGY | Tuesday, February 17, 2026 at 6:28 AM

AI Is Eating Memory — And It Could Bankrupt Your Next Electronics Purchase

The CEO of Phison, a leading NAND flash controller manufacturer, warns that many consumer electronics companies will go bankrupt or abandon product lines by end of 2026 as AI data centers consume DRAM and NAND at a pace that leaves phone and laptop makers unable to secure supply at viable prices. The memory crisis is the less-discussed bottleneck that could reshape what devices are available — and at what cost.

Aisha Patel

4 min read


Photo: Unsplash / Umberto

Everyone talks about the AI compute crisis. GPUs, data center power, chip fab capacity. What almost nobody is talking about is the memory crisis — and according to the CEO of one of the world's most important semiconductor companies, it is about to get very ugly for the devices you buy.

K.S. Pua, CEO of Phison — one of the world's leading NAND flash controller manufacturers — has reportedly warned that many consumer electronics companies will go bankrupt or exit entire product lines by the end of 2026 due to what he is calling an AI memory crisis. His summary: "This mess is going to get a lot worse before it gets better."

This is not a startup CEO looking for press attention. Phison is genuinely embedded in the global storage supply chain: its controllers sit inside SSDs sold under many of the industry's biggest brands. When the CEO of that company says bankruptcy is coming for his customers, that is a supply chain signal worth taking seriously.

To understand what is happening, you need to understand the memory hierarchy. At the top is HBM (High Bandwidth Memory), the extremely fast, extremely expensive memory packaged right alongside AI accelerator chips like Nvidia's H100 and H200. Then comes conventional DRAM, the system memory in your laptop and server. Below that is NAND flash, the storage in your SSD and phone.
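
For a rough sense of how different these tiers are, here is a small sketch that prints ballpark bandwidth figures for each one. The numbers are order-of-magnitude estimates for recent hardware, not vendor specifications, and the roles are simplified.

```python
# Ballpark comparison of the three memory tiers discussed above.
# Bandwidth figures are rough, illustrative estimates, not vendor specifications.
MEMORY_TIERS = [
    # (tier, rough peak bandwidth, simplified role)
    ("HBM",  "~2,000-3,000+ GB/s per accelerator", "feeds AI chips during training and inference"),
    ("DRAM", "~50-100 GB/s per CPU socket",        "system memory in servers, laptops, and phones"),
    ("NAND", "~5-14 GB/s per NVMe SSD",            "persistent storage for data, checkpoints, and apps"),
]

for tier, bandwidth, role in MEMORY_TIERS:
    print(f"{tier:<5} {bandwidth:<38} {role}")
```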

AI data centers need all of it, and they need it in staggering quantities. Training a large language model requires enormous amounts of fast DRAM. Inference — actually running the model to answer your question — requires keeping billions of model parameters in memory at low latency. And storing training data, model checkpoints, and output logs requires massive NAND capacity.
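
To make "billions of parameters in memory" concrete, here is a back-of-envelope sketch. The model sizes and the 16-bit-per-weight assumption are illustrative, not figures cited by Phison, and real deployments need additional memory on top of the weights for caches and activations.

```python
# Back-of-envelope: memory needed just to hold a model's weights for inference.
# Model sizes and the 2-byte (16-bit) weight format are illustrative assumptions.

def weights_memory_gb(params_billions: float, bytes_per_param: float = 2.0) -> float:
    """GB required to hold the weights alone, ignoring caches and activations."""
    return params_billions * bytes_per_param  # billions of params * bytes each = GB

for params in (7, 70, 400):  # hypothetical model sizes, in billions of parameters
    print(f"{params}B parameters -> ~{weights_memory_gb(params):,.0f} GB of memory, before any overhead")
```

And that is one copy of one model. Serving it to many users typically means running many replicas, each with its own memory footprint.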

The problem is that the fabs making DRAM and NAND have limited capacity, and they are rational, profit-maximizing businesses. AI data centers represent higher-margin, more predictable, larger-volume customers than consumer electronics manufacturers. When a hyperscaler calls wanting to buy a year's worth of memory production, the fab answers that call before the one from a laptop company.

The allocation squeeze has consequences that run downstream fast. A smartphone manufacturer that cannot get DRAM at the quantity and price it planned for has three options: delay the product, raise the price, or cancel the product line. A laptop maker facing similar constraints either eats the margin hit or passes it on to consumers.

Phison's CEO is suggesting that for smaller consumer electronics manufacturers — companies without the leverage of Apple or Samsung to lock in supply contracts — the squeeze will be fatal. They will not be able to source the memory they need at prices that allow profitable products. Some will exit. Some will fail.

The DeepSeek panic last year focused on compute efficiency: could AI models be trained with less GPU power? This story is about the other side of the constraint. Even if models get more efficient, inference at scale still requires enormous memory capacity and bandwidth. Every new AI assistant, every AI agent, every AI-augmented application adds to the demand.
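
One way to see why, under common assumptions about how text generation works: when a model produces output one token at a time for a single user, each new token requires streaming the model's weights through memory, so memory bandwidth puts a hard ceiling on speed. The sketch below uses hypothetical round numbers to illustrate that ceiling; batching and other serving tricks raise overall throughput, but they also raise total memory demand.

```python
# Illustrative ceiling on single-user generation speed when decoding is
# memory-bandwidth-bound: each new token streams the full set of weights once.
# Both numbers below are hypothetical, round figures.

def max_tokens_per_second(model_size_gb: float, memory_bandwidth_gb_s: float) -> float:
    """Upper bound on tokens/sec for one stream, ignoring batching and caches."""
    return memory_bandwidth_gb_s / model_size_gb

# A 140 GB model (roughly 70B parameters at 16 bits) on ~3,000 GB/s of HBM:
print(f"~{max_tokens_per_second(140, 3000):.0f} tokens/sec ceiling per user stream")
```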

Micron, Samsung, and SK Hynix are all investing in capacity expansion. But semiconductor fabs are not quick to build. Lead times from investment decision to production output are measured in years, not months. "Just build more fabs" is a 2028 answer to a 2026 problem.

For consumers, this means the device you want to buy in late 2026 may cost significantly more than you expect — or may not exist at all because its manufacturer could not make the economics work. The AI boom has real costs. Some of them show up in data center power bills. Some of them show up at the electronics store.
