JEDEC Developing Reduced Pin Count HBM4 Standard to Enable Higher Capacity
Key topics
The tech world is abuzz with JEDEC's new reduced pin count HBM4 standard, sparking debate on its potential impact on consumers and the GPU market. While some commenters lament that advancements like HBM4 primarily benefit AI-focused companies, others point out that Nvidia's dominance and the duopoly in the GPU market may stifle innovation. As one commenter noted, even with more players entering the market, it may not necessarily lead to better outcomes for consumers, citing the memory manufacturers as an example. The discussion also highlights the unlikely prospect of HBM4 trickling down to consumer GPUs anytime soon, with some predicting it won't happen before 2030.
Snapshot generated from the HN discussion
Discussion Activity
- Engagement: moderate
- First comment: 7d after posting
- Peak period: 9 comments in the 168-180h window
- Average per period: 8.5
- Based on 17 loaded comments
Key moments
1. Story posted: Dec 18, 2025 at 11:11 AM EST (16 days ago)
2. First comment: Dec 25, 2025 at 4:13 AM EST (7d after posting)
3. Peak activity: 9 comments in the 168-180h window (the hottest stretch of the conversation)
4. Latest activity: Dec 25, 2025 at 7:36 PM EST (9 days ago)
> Mian Quddus, chairman of the JEDEC Board of Directors, said: “JEDEC members are actively shaping the standards that will define next generation modules for use in AI data centers, driving the future of innovation in infrastructure and performance.”
It's nice to see that there is still progress to be made, given that a lot of modern semiconductor technology is at the edge of what plain physics and chemistry allow... but hell, I can't say I'm happy that, as with low-latency/high-bandwidth communications and HFT, it will again be only the uber-rich who get to enjoy the new and fancy stuff for years. It's not like you can afford an average decent mid/upper-range GPU these days, thanks to the AI bros.
I mean, Nvidia was greedy even before then, and AMD just did “Nvidia − 50 USD” or thereabouts.
Intel Arc tried shaking up the entry level (retailers spit on that MSRP though) but sadly didn’t make that big of a splash despite the daily experience being okay (I have the B580). Who knows, maybe their B770 will provide an okay mid range experience that doesn’t feel like being robbed.
Over here, to get an Nvidia 5060 Ti 16 GB I'd have to pay over 500 EUR which is fucking bullshit, so I don’t.
The bad part is that everyone wants to be on the AI money circle line train (see the various money-flow diagrams available) and thus everything caters to that. At this point I'd rather have Nvidia and AMD quit the GPU business and focus on "AI" only, so that a new competitor can enter the business and cater to niche applications like consumer GPUs.
Nvidia is expected to sell GPU intellectual property at a bargain to the entry-level segment, making it unprofitable for Intel to develop a competitive product range. This way, Intel would lack both the competence and the infrastructure internally to eventually break Nvidia’s market share in the higher segments.
The Intel Arc B60 probably would have made a splash if they had actually produced any of the damn things. 24 GB of VRAM at a low price would have been huge for the AI crowd, and there was a lot of excitement, and then Intel just didn't offer them for sale.
The company is too screwed up to take advantage of any opportunities.
Yeah, maybe in a decade. And the "benefits" will be a metric shit ton of job losses plus a crash that will make the 2000 dotcom bust and the real-estate/euro crises from 2007 onwards combined look harmless...
You are getting 3nm and 2nm along with GAA (gate-all-around) transistors later this year precisely because of AI.
One big issue with HBM is the amount of idle power it consumes. A single MI355 draws roughly 230 W just sitting idle.
But to answer - memory is progressing very slowly. DDR4 to DDR5 was not even a meaningful jump. Even PCIe SSDs are slowly catching up to it which is both funny and sad.
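A quick back-of-envelope sketch of the bandwidth gap the commenter is pointing at. The figures are approximate theoretical peaks under common assumptions (64-bit DDR channel, ~3.94 GB/s per PCIe 5.0 lane after encoding overhead, ~9.6 Gb/s per pin on a 1024-bit HBM3E stack), not numbers taken from the discussion itself:

```python
def ddr_bandwidth_gbs(megatransfers: float, bus_width_bits: int = 64) -> float:
    """Theoretical peak for one DDR channel: MT/s x bus width in bytes."""
    return megatransfers * (bus_width_bits / 8) / 1000

ddr4_3200 = ddr_bandwidth_gbs(3200)     # ~25.6 GB/s per channel
ddr5_6400 = ddr_bandwidth_gbs(6400)     # ~51.2 GB/s per channel

# A consumer NVMe SSD uses 4 lanes of PCIe 5.0 (~3.94 GB/s per lane),
# so its link ceiling is already in the same ballpark as a DDR channel.
pcie5_x4_ssd = 3.94 * 4                 # ~15.8 GB/s

# An HBM3E stack: ~9.6 Gb/s per pin across a 1024-bit interface.
hbm3e_stack = 9.6 * 1024 / 8            # ~1229 GB/s per stack

print(f"DDR4-3200 channel: {ddr4_3200:7.1f} GB/s")
print(f"DDR5-6400 channel: {ddr5_6400:7.1f} GB/s")
print(f"PCIe 5.0 x4 SSD:   {pcie5_x4_ssd:7.1f} GB/s")
print(f"HBM3E stack:       {hbm3e_stack:7.1f} GB/s")
```

The DDR4-to-DDR5 step roughly doubles per-channel bandwidth, while a PCIe 5.0 SSD's link ceiling sits within ~2x of a DDR4 channel, which is the "funny and sad" convergence the comment describes; HBM remains one to two orders of magnitude ahead per stack.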
https://news.ycombinator.com/item?id=46302002
https://morethanmoore.substack.com/p/solving-the-problems-of...