AI memory crunch will make smartphones more expensive
Your next smartphone or PC could cost more, run slower, and ship with less memory because of AI data centers
AI data centers are draining the global memory supply, making the Dynamic Random Access Memory (DRAM) and NAND flash used in smartphones and PCs scarcer and more expensive.
This lack of supply will change devices, apps, and the mobile ecosystem through 2027.
When SK Hynix said its supply of AI memory chips for 2026 was already sold out, phone makers understood the message: the AI data centers would eat first, and everyone else would wait.
CEO of MEF (Mobile Ecosystem Forum).
Inside device design teams the conversation has changed. Engineers used to ask how much RAM users wanted. Now they ask how much they can afford.
Memory manufacturers are shifting their production toward high-bandwidth chips used in artificial intelligence systems. That leaves less of the conventional DRAM and NAND for phones and PCs.
Prices rise as a result: some devices will cost more, while others will ship with less memory.
Consumers may not notice the change at first, but the mobile industry does. And so do the people building the apps that run on those phones.
The Zero-Sum Chip
“Every wafer allocated to an HBM stack for an Nvidia GPU is a wafer denied to a smartphone or laptop,” says IDC analyst Tom Mainelli. Artificial intelligence runs on memory, not processors. And it needs a lot of memory.
Large Language Models move enormous volumes of data. That requires high-bandwidth memory, or HBM. It is fast, it is complex, and it is expensive.
So, predictably, the three companies that dominate the memory market—Samsung Electronics, SK Hynix, and Micron Technology—are building more of it. That is where their focus is, not on the memory needed for smartphones, computers, and workstations.
The problem is simple: a chip factory cannot produce everything at once. Every wafer used for AI memory is one that cannot become smartphone DRAM. The result is a squeeze where chipmakers enjoy record profits and device makers get the bill.
The price of smarts
The effect of the memory crunch is that devices cost more and margins shrink.
PC makers have already raised prices, and smartphones will follow. Analysts expect phone materials costs to jump 15 percent or more. Some mid-range models may lose RAM. Entry-level devices may shrink specs. Premium models may plateau instead of climbing.
The irony bites. AI promises smarter phones, but building the brains that power AI drives the cost of the body—the device—higher.
Even Apple cannot escape completely. It buys huge amounts of memory and signs long-term contracts, which means it can absorb spikes better than most. But it still depends on the same chip makers, and eventually even Apple will hit the wall.
Smaller Android makers cannot hide; they will raise prices or ship weaker devices. Some may skip low-margin models altogether.
AI arms race
There is already an AI arms race, with Samsung pushing Galaxy AI, Google embedding generative models in Android, and Chinese firms experimenting with local assistants. All these features need more memory: more RAM, faster chips, and more bandwidth.
Competition adds cost. Every AI feature is a little more expensive to run on limited memory. Every new function forces trade-offs. The richer the AI, the pricier the phone.
DRAM, HBM, and the squeeze
DRAM is what most phones use. It is standard memory and (relatively) easy to produce—until chip makers redirect capacity.
HBM is stacked, high-bandwidth memory that sits close to GPUs (Graphics Processing Units). It is hard to make, which makes it expensive. It is also required for AI training and inference.
Memory manufacturers are chasing profit, and HBM carries far higher margins than standard DRAM. So the manufacturing shift is structural rather than temporary, and analysts already expect the shortage to last through 2027.
Even if factories expand, HBM is slow to scale, so the increase in supply won’t be felt quickly. Precision stacking fails easily, and a single defect ruins an entire stack, which keeps yields low.
Impact on the mobile ecosystem
The entire mobile ecosystem is affected, not just device manufacturers. For example, developers must rethink assumptions; phones may not get faster every year, memory may be capped and storage may shrink in some models.
Cloud-based AI will remain critical. On-device intelligence is attractive but expensive, so most users will rely on cloud inference (running a trained AI model on powerful remote servers to make predictions on new data). APIs, messaging platforms, AI services—all of these remain central.
As platform power consolidates, Apple and Samsung will control the majority of both hardware and supply, meaning smaller device makers will struggle.
The AI boom is redrawing the hardware map and mobile ecosystem needs to be ready.
Lessons for entrepreneurs
For entrepreneurs, the lesson is clear: build for platforms, not just devices. Assume devices may plateau in capability so plan services that tolerate memory limits. And optimize for cloud-first execution.
1. Expect scarcity – Components may be limited. Design around what is available.
2. Price is elastic – Users may tolerate slightly higher costs if features feel essential.
3. Optimize software – Heavy apps may require cloud offloading. Memory limits matter.
4. Watch platforms – Ecosystem control matters more than device specs.
5. Plan for delays – Product cycles may lengthen. Low-margin devices may disappear.
AI is a new tax on smartphones, laptops and consoles. Developers must build knowing that the devices running their software are not getting cheaper or more powerful automatically.
The future will be smarter, but the devices that deliver it will very likely be a little thinner on RAM.
This article was produced as part of TechRadar Pro Perspectives, our channel to feature the best and brightest minds in the technology industry today.
The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/pro/perspectives-how-to-submit