Global Memory Shortage Crisis: Market Analysis
Key topics
A heated debate erupted over the potential impact of a global memory shortage on the tech industry, with some commenters veering off to dissect the real-world implications of AI-generated content on social media. While some blamed boomers for devouring "AI slop," others pointed out that millennials are just as guilty, with one millennial commenter noting that their generation is actually repulsed by it. As the discussion unfolded, a consensus emerged that the true issue lies not with age, but with the increasingly sophisticated methods of manipulating consumer behavior. Meanwhile, others focused on the market analysis, with some seeing a potential opportunity for Apple to gain ground as Android specs stagnate.
Snapshot generated from the HN discussion
Discussion Activity
- Very active discussion
- First comment: 4h after posting
- Peak period: 56 comments in the 0-6h window
- Average per period: 17.5
- Based on 105 loaded comments
Key moments
- Story posted: Dec 28, 2025 at 10:51 AM EST (5d ago)
- First comment: Dec 28, 2025 at 2:31 PM EST (4h after posting)
- Peak activity: 56 comments in the 0-6h window, the hottest period of the conversation
- Latest activity: Jan 1, 2026 at 3:26 PM EST (1d ago)
I find it very odd when people proudly proclaim they used, say, Grok to answer a question. Their identity is so tied up in it that if you start talking about the quality of the information they get incredibly defensive. In contrast: I never felt protective of my Google search results.
They also don’t care about the communities they are impacting in the slightest. https://lailluminator.com/2025/11/22/meta-data-center-crashe...
It is, though. We're just in the part leading up to WWIII.
You want to be born into the utopia, not before.
Just wait until the next great collapse, a disaster big enough to force change. Hopefully we'll have the right ideas lying around at the time to restructure our social communication system.
Until then, it's slow decline. Embrace it.
https://news.ycombinator.com/item?id=46413716
Boomers might be out there consuming those AI YouTube videos that are just a TikTok voiceover with a generated slide show, but Millennials think that because they can identify this as slop, they are not affected. That is incorrect, and just as bad.
It's shocking how quickly my family normalized consuming obvious AI slop short-form videos, one after the other, for hours. It's horrifying.
And if you think that somebody buys an iPhone because they compare the specs with Android :)))))
"What do you mean my status flagship iPhone costs only half as much as a flagship Android???"
Nah. The marginal utility of more smartphone RAM is near zero at this point. The vast majority of people wouldn't even notice if the memory in their phone tripled overnight.
How scarce does memory have to get before it makes health care half as expensive?
1. https://en.wikipedia.org/wiki/Baumol_effect
Every functionality will be subscription-based. You'll own nothing and you'll be happy.
Or consuming 2 GB of RAM to have Teams running in the background doing nothing?
Yeah, if we got rid of that as a result of RAM shortages, that’d be great.
The economy says nothing about requiring humans to exist.
The wafers are not DRAM. This is more likely burning oil wells so your enemy can't use them.
https://www.mooreslawisdead.com/post/sam-altman-s-dirty-dram...
There are plenty of workloads where I’d love to double the memory and halve the cores compared to what the memory-optimised R instances offer, or where I could further double the cores and halve the RAM from what the compute-optimised C instances can do.
“Serverless” options can provide that to an extent, but it’s no free lunch, especially in situations where performance is a large consideration. I’ve found some use cases where it was better to avoid AWS entirely and opt for dedicated options elsewhere. AWS is remarkably uncompetitive in some use cases.
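To make those ratios concrete, here is a rough sketch; the GiB-per-vCPU figures are illustrative assumptions based on typical R- and C-family shapes, not numbers from the thread.

```python
# Rough sketch of the memory-to-vCPU ratios being discussed.
# The vCPU and GiB figures below are illustrative assumptions
# (roughly 8 GiB/vCPU for R-family, 2 GiB/vCPU for C-family);
# check current instance specs before relying on them.

shapes = {
    "R-family (memory-optimised)":      {"vcpu": 4, "gib": 32},
    "C-family (compute-optimised)":     {"vcpu": 4, "gib": 8},
    "wanted: 2x RAM, half the R cores": {"vcpu": 2, "gib": 64},
    "wanted: 2x cores, half the C RAM": {"vcpu": 8, "gib": 4},
}

for name, s in shapes.items():
    print(f"{name:34s} {s['gib'] / s['vcpu']:5.1f} GiB per vCPU")
```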
Well, except IBM. Maybe Yahoo.
I wonder if this will result in writing more memory-efficient software? The trend for the last couple of decades has been that nearly all consumer software outside of gaming has moved to browsers or browser-based runtimes like Electron. There's been a vicious cycle of heavier software -> more RAM -> heavier software but if this RAM shortage is permanent, the cycle can't continue.
Apple and Google seemed to be working on local AI models as well. Will they have to scale that back due to lack of RAM on the devices? Or perhaps they think users will pay the premium for more RAM if it means they get AI?
Or is this all a temporary problem due to OpenAI's buying something like 40% of the wafers?
(Source: I maintain an app integrated with llama.cpp. In practice, no one likes the 1 tkn/s generation times you get from swapping, and honestly MoE makes the RAM situation worse, because model developers have servers, batch inference, and multiple GPUs wired together. They are more than happy to increase the resting RAM budget and use even more parameters; limiting the active experts is about inference speed from that lens, not anything else.)
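A back-of-envelope sketch of that point; the parameter counts, quantization size, and bandwidth figures are assumed for illustration, not measurements of any particular model or device.

```python
# Back-of-envelope for why MoE doesn't shrink the *resident* RAM budget:
# all experts must stay loaded even though only a few are active per token.
# Every number below is an illustrative assumption.

BYTES_PER_PARAM = 0.55          # ~4.4 bits/param, a typical quantized weight size
total_params   = 30e9           # hypothetical MoE: 30B total parameters
active_params  = 3e9            # ...but only ~3B active per token

resident_ram_gb = total_params * BYTES_PER_PARAM / 1e9   # must fit in memory
active_read_gb  = active_params * BYTES_PER_PARAM / 1e9  # read per generated token

ram_bandwidth_gbs = 50          # assumed RAM bandwidth (GB/s)
ssd_bandwidth_gbs = 3           # assumed swap/SSD bandwidth (GB/s)

# Generation is roughly bandwidth-bound: one full pass over the active
# weights per token, from wherever those weights happen to live.
print(f"resident footprint: ~{resident_ram_gb:.0f} GB")
print(f"tokens/s if active weights sit in RAM: ~{ram_bandwidth_gbs / active_read_gb:.1f}")
print(f"tokens/s if they spill to swap on SSD: ~{ssd_bandwidth_gbs / active_read_gb:.1f}")
```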
However, the customers do not care and will not pay more so the business cannot justify it most of the time.
Who will pay twice (or five times) as much for software written in C instead of Python? Not many.
I do not think it is surprising that there is a Jevons paradox-like phenomenon with computer memory, and as with other instances of it, it does not necessarily follow that this must be the result of a corresponding decline in resource-usage efficiency.
What do you mean it can't continue? You'll just have to deal with worse performance is all.
Revolutionary consumer-side performance gains like multi-core CPUs and switching to SSDs will be a thing of distant past. Enjoy your 2 second animations, peasant.
Ideally, LLMs should be able to translate code from memory-inefficient languages to memory-efficient ones, and maybe even optimize the underlying algorithms' memory use along the way.
But I'm not going to hold my breath
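As a small illustration of the kind of gap involved (a sketch; exact figures vary by Python version and platform), here is the same million integers stored as boxed Python objects versus a packed C-style array:

```python
# What "memory inefficient" can mean in practice: boxed Python ints
# versus a contiguous array of 64-bit integers.
import sys
from array import array

n = 1_000_000
boxed  = list(range(n))          # list of individual Python int objects
packed = array("q", range(n))    # contiguous 64-bit signed integers

boxed_bytes  = sys.getsizeof(boxed) + sum(sys.getsizeof(i) for i in boxed)
packed_bytes = sys.getsizeof(packed)

print(f"list of ints : ~{boxed_bytes / 1e6:.1f} MB")
print(f"array('q')   : ~{packed_bytes / 1e6:.1f} MB")
```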
From what I see in other comments, if you can confidently assert "AI bubble, no one will want GPUs soon," it makes sense, but the COVID stuff is a head-scratcher.
> As a result, IDC expects 2026 DRAM and NAND supply growth to be below historical norms at 16% year-on-year and 17% year-on-year, respectively.
This is an odd claim. It’s like saying that car companies historically produced more coupes than sedans, but suddenly there are new enormous orders for millions of sedans. All cars get massively more expensive as a result — car makers charge 50-200% more than before. Sure, they need to retool a little bit and buy more doors, but somehow the article claims that “limited … capital expenditure” means that overall production will grow more slowly than historical rates?
This only makes sense either on extremely short timescales (as retooling distracts from expansion) or if the car makers decide not to try to compete with each other. Otherwise some of those immediately available profits would turn into increased capital expenditure and more RAM would be produced. (Heck, if RAM makers think the new demand is sustainable, they should be happy to increase production to sell more units at current prices.)
I mean, the lack of affordable consumer hardware may end up further reducing the need for AI.
On the other hand, it may end up shifting workloads to the cloud instead.
Heck, time will tell.
Perhaps Apple will just delay it, or maybe Apple has a special memory deal negotiated years in advance and can undercut the whole market (I wish).
DRAM is a notoriously cyclical market, though, and wise investors are leery of jumping into a frothy top. So, it’ll take a while before anyone decides the price is right to stand up a new competitor.
> PC market to contract by 4.9% compared with a 2.4% year-on-year decline in the November forecast. Under a more pessimistic scenario, the decline could deepen to 8.9%.
Imagine if China comes to the rescue and supplies the world with affordable RAM and open sources the technology, like how they did with DeepSeek.