As AI Gobbles Up Chips, Prices for Devices May Rise
Key topics
The AI chip frenzy is driving up demand for memory chips, sparking concerns that device prices may skyrocket. Commenters chimed in, pointing out that prices are already inflated, with some noting that major manufacturers like Micron, Samsung, and SK Hynix dominate the market. A heated debate ensued over whether companies like Asus are genuinely ramping up production or just making empty claims to manipulate the market, with some calling for government intervention to stabilize prices. As the discussion unfolded, a surprising tangent emerged, with some commenters advocating for a more mindful approach to consumerism and even sharing tips on disabling JavaScript to avoid manipulative marketing tactics.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 17m after posting
Peak period: 63 comments in 6-12h
Avg / period: 14.5 comments
Based on 160 loaded comments
Key moments
1. Story posted: Dec 28, 2025 at 5:52 PM EST (4d ago)
2. First comment: Dec 28, 2025 at 6:09 PM EST (17m after posting)
3. Peak activity: 63 comments in the 6-12h window (the hottest stretch of the conversation)
4. Latest activity: Jan 1, 2026 at 3:26 AM EST (1d ago)
Prices are already through the roof...
https://www.tomsguide.com/news/live/ram-price-crisis-updates
So let's see if they might "save us"
And a couple of smaller ones: CXMT (if you’re not afraid of the sanctions), Nanya, and a few others with older technology
If I recall correctly, RAM is even more niche and specialized than the (already quite specialized) general chip manufacturing. The structure is super-duper regular, just a big grid of cells, so it is super-duper optimized.
You’re correct that DRAM is a very specialized process. The bit cell capacitors are a trench type that is uncommon in the general industry, so the major logic fabs would have a fairly uphill battle to become competitive (they also have no desire to enter the DRAM market in general).
https://www.tomshardware.com/pc-components/dram/no-asus-isnt...
My bad
Governments need to intervene here. This is a mafia scheme now.
I purchased about three semi-cheap computers in the last ~5 years or so. Looking at the RAM prices, the very same units I bought (!) now cost 2.5x as much as before (here I refer to my latest computer model, from 2 years ago). This is a mafia now. I also think these AI companies should be extra taxed because they cause us economic harm here.
2028 is another story, depending on whether this frenzy continues and whether new fabs get built (I don't know whether they are as hard to build as CPU fabs).
I highly recommend disabling javascript in your browser.
Yes, it makes many sites "look funny", or maybe you have to scroll past a bunch of screen-sized "faceplant", "twitverse", and "instamonetize" icons, but there are far fewer ads (like none).
And of course some sites won't work at all. That's OK too; I just don't read them. If it's a news article, it's almost always available on another site that doesn't require javascript.
Life online without javascript is just better. I've noticed an increase in sites that are useful (readable) with javascript disabled. Better than 10 years ago, when broken sites were rampant. Though there are still the lazy ones that are just blank pages without their javascript crutch.
Maybe the hardware/resource austerity that seems to be upon us now will result in people and projects refactoring, losing some glitter and glam, getting lean. We can resolve to slim down, drop a few megs of bloat, use less ram and bandwidth. It's not a problem; it's an opportunity!
In any case, Happy New Year! [alpha preview release]
But I use NoScript and it is definitely a big help.
E.g. IDEs could continue to demand lots of CPU/RAM, and cloud providers are able to deliver that cheaper than a mostly idle desktop.
If that happens, more and more of its functionality will come to rely on having low datacenter latencies, making use on desktops less viable.
Who will realistically be optimising build times for use cases that don't have sub-ms access to build caches? And when those build caches are available, what will stop the median program from having even larger dependency graphs?
This will only serve to increase the power of big players who can afford higher component prices (and who, thanks to their oligopoly status, can effectively set the market price for everyone else), while individuals and smaller institutions are forced to either spend more or work with less computing resources.
The optimistic take is that this will force software vendors into shipping more efficient software, but I also agree with this pessimistic take, that companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can’t afford tech at inflated prices.
I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.
These big companies are competing with each other, and they're willing and able to spend much more for compute/RAM than we are.
> I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.
A few ideas:
* Use/develop/optimise local tooling
* Pool resources with friends/communities towards shared compute.
I hope prices drop before dev tools all move to the cloud.
A dad comes home and tells his kid, “Hey, vodka’s more expensive now.” “So you’re gonna drink less?” “Nope. You’re gonna eat less.”
Next stage is paving everything with solar panels.
The RAM situation looks like market cornering. Probably something OpenAI should be prosecuted for if they end up profiting from it.
Looks like the frame.work desktop with Ryzen 128GB is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios.
Are snapdragon chips the same way?
The bigger the company, the longer the contract.
However, it will eventually catch up even to Apple.
It is not prices alone but the redirection of manufacturing from something like LPDDR in iPhones to HBM and the like for servers and GPUs.
https://www.indiatoday.in/technology/news/story/ram-shortage...
SoCs with on-die memory (which is, these days, exclusively SRAM, since I don't think IBM's eDRAM process for mixing DRAM with logic is still in production) will not be effected. SiPs with on-package DRAM, including Apple's A and M series SiPs and Qualcomm's Snapdragon, will be effected -- they use the same DRAM dice as everyone else.
To answer the original question: the Framework Desktop is indeed still at the (pretty inflated) price, but for example the Bosgame mini PC with the same chip has gone up in price.
It's funky, I get it wrong all the time. Effect and Affect both have noun and verb uses.
"dice" is the plural for the object used as a source of randomness, but "dies" is the plural for other noun uses of "die".
Note that the memory is on the board for Ryzen AI Max, not on the package (as it is for Intel’s Lunar Lake and Apple’s M-series processors) or on die (which would be SRAM). As noted in another comment, whether the memory is on the board, on a module, or on the processor package, they are all still coming from the same extremely constrained three memory die suppliers, so costs are going up for all of them.
All your competitors are in the same boat, so consumers won't have options. It's much better to minimize the risk of blowing up by sticking as closely to spot as possible. That's the whole idea of lean. Consumers and governments were mad about supply chains during the pandemic, but companies survived because they were lean.
In a sense this is the opposite risk profile of futures contracts in trading/portfolio management, even though they share some superficial similarities. Manufacturing businesses are fundamentally different from trading.
They certainly have contracts in place that cover goods already sold. They do a ton of preorders which is great since they get paid before they have to pay their suppliers. Just like airlines trade energy futures because they’ve sold the tickets long before they have to buy the jet fuel.
The risk is that such longer contracts would then lock you into a higher-cost component for longer if the price drops. Longer contracts only look good in hindsight if RAM prices increased (unexpectedly).
If demand exceeds supply, either prices rise or supply falls, causing shortages. Directly controlling sellers (prices) or buyers (rationing) results in black markets unless enforcement has enough strength and integrity. The required strength and integrity seems to scale exponentially with the value of the good, so it's typically effectively impossible to prevent out-of-spec behavior for anything not cheap.
If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.
The alternative is decreasing demand. Governments could hold bounty and incentive programs for building electronics that last a long time or are repairable or recyclable, but it's entirely possible the market will eventually do that.
If there is already demand at this inflated price, shouldn’t we ask why more capacity is not coming online naturally first?
...and why it has been consistently the case for a long while.
https://www.merriam-webster.com/dictionary/oligopoly
e.g., the Phoebus cartel https://en.wikipedia.org/wiki/Phoebus_cartel
The non financial parts, which include mandated restructuring and penalties to directors including incarceration however, are not tokenistic. They'd be appealed and delayed, but at some point the shareholders would seek redress from the board. Ignoring judicial mandated instructions isn't really a good idea, current WH behaviour aside. If the defence here is "courts don't matter any more" that's very unhelpful, if true. At some point, a country which cannot enforce judicial outcomes has stopped being civil society.
My personal hope that the EU tears holes in FAANG aside, the collusive pricing of chips has been a problem for some time. The cost/price disjunction here is strong.
Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.
Also, even if no Micron RAM ever ended up in consumer hands, it would still reduce prices for consumers by increasing the supply to other segments of the market.
The industry standard should have become 16GB RAM for PCs and 8GB for mobiles years ago, but instead it is as if the computing/IT industry is regressing.
New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6, UFS2.2). Meanwhile, features that were being offered in budget phones, e.g., wireless charging, NFC, UFS3.1 have silently been moved to the premium mobile segment.
Meanwhile the OSes and software are becoming more and more complex, bloated and more unstable (bugs) and insecure (security loopholes ready for exploits).
It is as if the industry has decided to focus on AI and nothing else.
And this will be a huge setback for humanity, especially the students and scientific communities.
The problem is coming up with a viable business model for providing hardware and software that respect users’ ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don’t want to be funded by Big Tech?
Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.)
I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse.
This wasn’t mentioned, but it’s a new thing for everyone to experience, since the general trend of computer hardware is it gets cheaper and more powerful over time. Maybe not exponentially any more, but at least linearly cheaper and more powerful.
A $999 MacBook Air today is vastly better than the same $999 MacBook Air 5 years ago (and even more so once you count inflation).
I sometimes feel M$ is deliberately making its Windows OS clunkier, so it can turn into a SaaS offering with a pricey subscription, like it has already successfully done with its MS-Office suite (Office 365 is the norm in corporates these days, though individuals have to shell out $100 per year for MS Office 365 Personal edition). We can still buy MS Office 2024 as standalone editions, but they are not cheap, because Micro$oft knows the alternatives on the market aren't good enough to be a serious threat.
Modern software is fine for the most part. People look at browsers using tens of gigabytes on systems with 64GB and complain about waste rather than being thrilled that it's doing a fantastic job caching stuff to run quickly.
But you cannot consider this in isolation.
The developed markets have vastly higher-spending consumers, which means companies cater to those higher-spending customers proportionately more (as profits demand it). The implication is that lower-spending markets get less investment and are catered to less; after all, R&D spending is still a limiting factor.
If the entirety of the market is running on lower powered devices, then it would get catered for - because there'd be no (or not enough) customers with high powered devices to profit off.
Neither ran great on "average consumer hardware".
But I’ll admit this is cherry picking from my side :)
We'll probably end up in an even more bifurcated world where the well off have access to lot of great products and services that most of humanity is increasingly unable to access.
If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs.
There is always uncertainty that can override this, e.g. trade wars, tariffs, etc.
No one is going all in with new capacity.
Apps are optimized for the install base, not for the engineer's own hardware.
That's like 100B+ instructions on a single core of your average superscalar CPU.
I can't wait for maps loading times being measured in percentage of trip time.
https://youtu.be/qqUgl6pFx8Q?si=x3CpsW9Aane7GHHV&t=1875
On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs.
It's just that in the last 5 years integrated GPUs have become good enough even for mid-tier gaming, let alone running a browser and hardware acceleration in a few work apps.
And even before 5 years ago, the majority of dedicated GPUs in relatively cheap laptops were garbage, barely better than the integrated one. Manufacturers mostly put them in there for the marketing value of having e.g. an Nvidia dGPU.
Without dedicated GPUs, we consumers will get only weaker hardware, slower software and the slow death of graphics software market. See the fate of Chromebooks market segment - it is almost dead, and ChromeOS itself got abandoned.
Meanwhile, the same Google which made ChromeOS as a fresh alternative OS to Windows, Mac and Linux, is trying to gobble the AI market. And the AI race is on.
And the result of all this AI focus and veering away from dedicated GPUs (even by market leader Nvidia, which no longer treats GPUs as a priority) is not only skyrocketing hardware prices but also other side effects. E.g., new laptops are being launched with NPUs which are good for AI but bad for gaming and VFX/CAD-CAM work, yet they cost a bomb. As a result, the budget laptop segment has suffered: new budget laptops have just 8GB RAM, a 250GB/500GB SSD, and a poor CPU, hardware so weak that even basic software (MS Office) struggles on it. And yet even such poor laptops cost more these days. This kind of deliberate market crippling affects hundreds of millions of students and middle-class customers who need affordable yet decent-performance PCs.
Your experience is extremely weird
All the popular mass market games work on an iGPU: Fortnite, Roblox, MMOs, arena shooters, battle royales. A good chunk of cross-platform console titles also work just fine.
You can play Cyberpunk or BG3 on a damn Steam Deck. I won't call that low-end.
The number of games that don't run to some extent without a dGPU is limited to heavy AAA titles and niche PC-only genres.
The pattern of lazy almost non existent optimization combined with blaming consumers for having weak hardware, needs to stop.
On my 16GB RAM Lunar Lake budget laptop, CachyOS (Arch) runs so much smoother than Windows.
This is very unscientific, but using htop, while running Chrome/YouTube playing music, 2 browser games, and VS Code with GitHub Copilot reviewing a small project, I was only using 6GB of RAM.
For the most part I suspect I could do normal consumer stuff (filing paperwork and watching cat videos) on an 8GB laptop just fine, assuming I'm using Linux.
All this Windows 11 bloat makes computers slower than they should be. A part of me hopes this pushes Microsoft to at least create a low-RAM mode that runs just the OS and display manager, then lets me use my computer as I see fit instead of constantly doing a million other weird things.
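For anyone wanting to repeat this kind of check without eyeballing htop, here is a minimal sketch of my own (not from the thread; Linux-only, since it reads /proc/meminfo, whose MemTotal and MemAvailable fields are standard kernel fields reported in kB):

```python
import os

def parse_meminfo(text):
    """Parse /proc/meminfo-style text into {field: kB value}."""
    fields = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        parts = rest.split()
        if parts:
            fields[key] = int(parts[0])  # first token is the kB value
    return fields

def used_gb(fields):
    """'Used' as total minus available, converted from kB to GB."""
    return (fields["MemTotal"] - fields["MemAvailable"]) / (1024 ** 2)

if __name__ == "__main__" and os.path.exists("/proc/meminfo"):
    with open("/proc/meminfo") as f:
        info = parse_meminfo(f.read())
    print(f"Using about {used_gb(info):.1f} GB "
          f"of {info['MemTotal'] / 1024 ** 2:.0f} GB")
```

htop computes "used" slightly differently (it treats caches and shared memory its own way), so expect the numbers to be close but not identical.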
We don't *need* more ram. We need better software.
Option A: We do a better job at optimizing software so that good performance requires less RAM than might otherwise be required
Option B: We wish that things were different, and that additional RAM were a viable option like it has been at many times in the past.
Option C: We use our time-benders to hop to a different timeline where this is all sorted more favorably (hopefully one where the Ballchinians are friendly)
---
To evaluate these in no particular order:
Option B doesn't sound very fruitful. I mean: It can be fun to wish, but magical thinking doesn't usually get very far.
Option C sounds fun, but my time-bender got roached after the last jump and the version of Costco we have here doesn't carry them. (Maybe someone else has a working one, but they seem to be pretty rare here.)
That leaves option A: Optimize the software once, and duplicate that optimized software to whomever it is useful using that "Internet" thing that the cool kids were talking about back in the 1980s.
Why on earth do you think RAM uses NAND flash?
Insane that this is seen as "better software". I could do basically the same functionality in 2000 with 512MB. I assume this is because everything runs through Chrome with dozens more layers of abstraction.
512MB in 2000 was like HEDT level (though I'm not sure that acronym existed back then)
64MB = W98SE OK; XP will swap a lot under high load; nixlikes really fast with fvwm/wmaker and the like. KDE3 needs 128MB to run well, so tone it down a bit. No issues with old XFCE releases. Mozilla will crawl; other browsers will run fine.
128MB = W98SE really well; XP will run fine, though SP2-3 will lag. Nixlikes will fly with wmaker/icewm/fvwm/blackbox and the like. Good enough for Mozilla.
192MB = Really decent for a full KDE3 desktop or for Windows XP at real-life speeds.
256MB = Like having 8GB today for Windows 10, GNOME 3 or Plasma 6. Yes, you can run them with 2GB and ZRAM, but realistically, with today's bloated tools, 8GB for a 1080p desktop is mandatory, even with uBlock Origin in the browser. Ditto back in the day: with 256MB, XP and KDE3 flew, and they ran much faster than even Win98 with 192MB of RAM.
Win11, on the other hand, meh..
Win10 will stop getting updates, but M$ is mistaken if it thinks it can force customers to switch to the more expensive, buggy, poorly performing Win11.
That's why I switched to Linux for my old PC (a cute little Sony Vaio), though it worked well with Win10. Especially after I upgraded it with a 1TB SATA SSD (even an old SATA 1.0 port works with newer SATA SSDs, since SATA is backward compatible), some additional RAM (24GB (8+16), with the 16GB repurposed from another PC), and a new battery (from Amazon; it was simply plug and play: eject the old battery from its slot and plug in the new one).
I find it refreshing to see how easy it was to upgrade old PCs. I think manufacturers are deliberately making it harder to repair devices, especially mobile phones. That's why the EU and India were forced to mandate the Right to Repair.
I was being lazy, but optimized I guess I could get down to 4GB of RAM.
You can already do this. For example, I use `systemd-run` to run browsers with CPU quotas applied. Firefox gets 400% CPU (i.e. up to 4 cores), and no more.
Example command: systemd-run --user --scope -p CPUQuota=400% firefox
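Tangentially, there is a rough stdlib-only cousin of this available from Python: `resource.setrlimit` with `RLIMIT_CPU` caps total CPU seconds rather than a percentage share like CPUQuota, so it is a different (cruder) kind of limit. A minimal sketch of my own, Unix-only; the function name is just for illustration:

```python
import resource
import subprocess
import sys

def run_with_cpu_seconds(cmd, seconds):
    """Run cmd, letting the kernel kill it (SIGXCPU) once it has
    consumed `seconds` of CPU time. Unlike systemd's CPUQuota, this
    caps total CPU seconds, not a percentage share of the CPU."""
    def limit():
        # Applied in the child process just before exec.
        resource.setrlimit(resource.RLIMIT_CPU, (seconds, seconds))
    return subprocess.run(cmd, preexec_fn=limit)

if __name__ == "__main__":
    # A busy loop gets killed after about 1 CPU-second.
    result = run_with_cpu_seconds(
        [sys.executable, "-c", "while True: pass"], 1)
    print("exit status:", result.returncode)  # negative: killed by a signal
```

The systemd approach is better for browsers, since a long-running process will always eventually exhaust a fixed CPU-time budget; the rlimit trick suits batch jobs you want to cap outright.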
You can limit CPU usage for a program in Windows by adjusting the "Maximum processor state" in the power options to a lower percentage, such as 80%. Additionally, you can set the program's CPU affinity in Task Manager to restrict it to fewer CPU cores.
You can also use a free tool like Process Lasso or BES to limit the CPU for a Windows application. And you can use free tools like HWiNFO or Sysinternals (ProcMon, Sysmon, ProcDump) to monitor CPU usage, especially to investigate CPU spikes caused by rogue (malware or poorly performing) apps.
It was less than 10 years ago that a high-end PC would have this level of RAM. I think the last decade of cheap RAM and increasing core counts (and clock speeds) has spoiled a lot of people.
We are just returning to trend. Maybe software will be written better now that you cannot expect the average low-budget PC to have 32GB of RAM and 8 cores.
A budget PC today at a certain price point (say, $500) would certainly have a lot more powerful CPU and faster disk storage and faster RAM than a similarly priced PC just 5 years ago.
But as OSes and programs get more bloated and more complex, I feel 1TB+ SSDs are required these days in PCs. Windows OS and programs themselves can take up hundreds of GBs of space.
But you will notice that budget PCs are being launched these days with lower-grade CPUs (i3 or equivalent), less RAM (8GB), and less storage (a 256GB or 500GB SSD).
We seem to be regressing in specs for PCs and even mobiles (good luck finding features like NFC and wireless charging in budget phones these days, though these features were available in the same budget segment a few years ago).
And it is not just due to AI; it is due to hardware manufacturers and assemblers thinking they can hoodwink consumers and sell us less value for a higher price.
Of course, some performance-focused software (e.g. Zed) does start near-instantly on my MacBook, and it makes other software feel sluggish in comparison. But this is the exception, not the rule.
Even as specs regress, I don't think most people in software will care about performance. In my experience, product managers never act on the occasional "[X part of an app] feels clunky" feedback from clients. I don't expect that to change in the near future.
Imagine Ford, upon the invention of push-button climate controls, just layered those buttons on top of the legacy sliders, using arms and actuators so pressing "Heat Up" moved an actuating arm that moved that underlying legacy "Heat" slider. Then when touch screens came about, they just put a tablet over those buttons, so selecting "Heat Up" fired a solenoid that pressed the "Heat Up" button that moved the arm to slide the "Heat Up" slider.
Ford, or anyone else, would never implement this, for a long obvious list of reasons.
But in software? That's just Thursday. Hence software has seemed stuck in time for 30 years while processing speed has done 10,000x. No need to redesign the whole system, just type out a few lines of "actuating arm" code.
If you want a powerful laptop for cheap, get a gaming laptop. The build quality and battery life probably won't be great, but you can't be cheap without making compromises.
Same idea for budget mobiles. A Snapdragon Gen 6 (or something by Mediatek) with UFS2.2 is more than what most people need.
351 more comments available on Hacker News