Groq Investor Sounds Alarm on Data Centers
Key topics
A Groq investor's warning about the data center market sparked a lively debate, with some commenters weighing in on the perils of "speculative capacity" and the "build it and they will come" strategy. As the discussion unfolded, a tangent emerged about Axios's allegedly hijacked back button, with some users decrying the behavior as "shabby" and others chiming in with potential workarounds, like browser extensions or a hypothetical "toggle scripts" button. Meanwhile, others dissected the investor's concerns, pointing out that hyperscalers often use complex financial maneuvers to fund their data centers. The thread's dual focus on data center economics and web development frustrations made for a fascinating, if disjointed, conversation.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 15m after posting
Peak period: 68 comments in the 0-12h window
Avg / period: 14.8 comments
Based on 74 loaded comments
Key moments
- Story posted: Dec 30, 2025 at 7:47 AM EST (10 days ago)
- First comment: Dec 30, 2025 at 8:02 AM EST (15m after posting)
- Peak activity: 68 comments in the 0-12h window, the hottest stretch of the conversation
- Latest activity: Jan 7, 2026 at 3:20 PM EST (1d ago)
Back in the 90s, I could hit the Esc key and stop all of the animations, allowing my ADHD brain to regain the focus needed to read the actual article.
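That old Esc behavior can be loosely recreated today with a userscript. A minimal sketch, assuming a userscript manager such as Tampermonkey (the handler below is illustrative, not something from the thread):

```typescript
// Hypothetical userscript sketch: pressing Esc halts pending loads and
// freezes CSS animations/transitions on the current page.
document.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.key !== "Escape") return;

  // Stop any in-flight network activity (images, scripts still loading).
  window.stop();

  // Freeze CSS animations and transitions with a blanket override.
  const style = document.createElement("style");
  style.textContent = `
    *, *::before, *::after {
      animation-play-state: paused !important;
      transition: none !important;
    }
  `;
  document.head.appendChild(style);
});
```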
Is this actually true? I thought that hyperscalers keep datacenters at arm's length, using subsidiaries and outsourcing a lot of things.
They can run other ML stuff but I don't see how the world could absorb the amount of compute/RAM for these other workflows.
Since the hardware itself depreciates quite fast compared to the buildings, the long-lasting physical structure is probably the best bet: it can be repurposed for more general computing, since all the HVAC, electrical, and other expensive infrastructure is already in place.
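To make that depreciation gap concrete, a rough straight-line comparison; the dollar figures and lifespans below are illustrative assumptions, not numbers from the thread:

```typescript
// Straight-line depreciation: spread the purchase cost evenly over the
// asset's useful life.
function annualDepreciation(cost: number, usefulLifeYears: number): number {
  return cost / usefulLifeYears;
}

const gpuFleet = annualDepreciation(500_000_000, 4);   // GPUs: assumed ~4-year life
const building = annualDepreciation(500_000_000, 30);  // shell + HVAC + substation: assumed ~30 years

console.log(`GPU fleet: $${(gpuFleet / 1e6).toFixed(0)}M/year`); // ~$125M/year
console.log(`Building:  $${(building / 1e6).toFixed(0)}M/year`); // ~$17M/year
// Same capital outlay, but the hardware burns value ~7.5x faster,
// which is why the durable infrastructure is the safer residual asset.
```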
So they're sitting on real estate with access to massive amounts of water, electricity, and high bandwidth network connections. Seems like that combination of resources could be useful for a lot of other things beyond just data centers.
Like you could probably run desalination plants, large scale hydroponic farms, semiconductor manufacturing, or chemical processing facilities. Anything that needs the trifecta of heavy power, water infrastructure, and fiber connectivity could slot right in.
I don't see any reasonable path moving forward for these datacenters for the amount of money that they have invested.
And other industries don't really seem to have much overlap with the data center industry beyond needing water access, land, and electricity, and I doubt they would use the facilities enough to justify the costs, especially the cost of the overpriced GPUs, RAM, and other components.
In my opinion, these large data centers are usually a lost cause if the AI bubble bursts, since they were built with such a heavy focus on GPUs and their whole demand model is tied to AI.
If the bubble bursts, I think server hardware might get auctioned off, but I'm skeptical of how much of it would be non-GPU / pure-compute servers, or GPUs actually useful to the average consumer.
Depending on where, and (more importantly) when you last read about this, there have been some developments. The original book that started this had a unit conversion error, and the reported numbers were off by a factor of roughly 4,500 (the author claimed 1,000 times the water consumption of an entire city, while in reality it was estimated at ~22% of that city's usage).
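As a quick sanity check of that correction factor, with both figures expressed relative to the city's total water usage:

```typescript
// The book's claim vs. the corrected estimate, both as multiples of the
// city's total consumption.
const claimed = 1000; // claim: 1000x the city's usage
const actual = 0.22;  // corrected estimate: ~22% of the city's usage

const errorFactor = claimed / actual;
console.log(errorFactor.toFixed(0)); // ~4545, i.e. "off by roughly 4,500x"
```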
The problem is that we're living in the era of rage reporting, and corrections rarely get the same coverage as the initial shock claim.
On top of this, DCs don't make water "disappear", in the same way farming doesn't make it disappear. It re-enters the cycle via evaporation. (also, on the topic of farming, don't look up how much water it takes to grow nuts or avocados. That's an unpopular topic, apparently)
And thirdly, DCs use evaporative cooling because it's more efficient. They could, if push came to shove, not use that. And they do, when placed in areas without adequate water supply, use regular cooling.
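For a rough sense of that tradeoff, a back-of-the-envelope sketch. The WUE (water usage effectiveness, liters of water per kWh of IT load) values here are illustrative assumptions: evaporative designs are often quoted around 1-2 L/kWh, while closed-loop ("regular") cooling consumes far less water at the cost of more electricity:

```typescript
// Daily water consumption for a data center at a given IT load and WUE.
function dailyWaterLiters(itLoadMW: number, wueLitersPerKWh: number): number {
  const kWhPerDay = itLoadMW * 1000 * 24; // MW -> kW, times 24 hours
  return kWhPerDay * wueLitersPerKWh;
}

const evaporative = dailyWaterLiters(100, 1.8); // ~4.3M liters/day at 100 MW (assumed WUE)
const closedLoop = dailyWaterLiters(100, 0.1);  // ~0.24M liters/day (assumed WUE)
console.log({ evaporative, closedLoop });
```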
My point is simple: the utility infrastructure is the hard part. The silicon sitting on raised floors is disposable and will be obsolete in a few years. But the power substations, fiber connections, and water infrastructure? That takes years to permit and build, and that's where the real value is.
Building that infrastructure (trenches for water lines, electrical substations, laying fiber) is the actual constraint and where the long-term value lies. Whether they're running GPUs or something else entirely, industries will pay for access to that utility infrastructure long after today's AI hardware is obsolete.
You're lecturing me about evaporative cooling efficiency while completely missing the point.
Still, I do feel there must be some difference between farming and evaporative cooling, since with farming at least part of the water runs off back into rivers and then seeps back into the groundwater. Again, this depends largely on location.
(I recommend this video by Hank Green on the subject: https://www.youtube.com/watch?v=H_c6MWk7PQc . Water usage of data centers is a complex and quite localized concern, not something that's going to be a constant across every deployment)
I can't be the only one who's noticed that the volume of crypto stuff being pushed slowed as the AI fervor ramped up?
Then the inference mega datacentre investment becomes a losing proposition for those who paid too much for the wrong kind of computers, but AI users still get more compute, lower prices etc.
But I don't really think that focusing on the GPU market is going to work, as I don't see much reason to cater to GPU-intensive workloads unless you are affiliated with AI (for the most part, some GPU compute is fine, but data centers shouldn't invest so heavily in it).
I feel more optimistic about buying stuff (preferably compute) 3-5 years down the road, when things might become cheap.
But that is still a very long time. On one hand I want the AI bubble to burst ASAP to lessen the impact of the financial fallout, but on the other I keep thinking about that fallout and wondering whether there are ways to mitigate the economy's losses so it doesn't become another 2008-style crisis.
The reality, to me, is that the bubble popping feels inevitable; the only question is whether we can do anything to lessen the strain on the average person around the world. I think America will be hit hardest, but I wonder how the geopolitical impact on other countries will compare to the 2008 crisis.
I don't think the American government is doing anything to lessen the strain the bubble could cause, though; if anything, it's promoting things like Stargate and the AI bubble itself.
It's just a sad outlook for the next 3-4 years. Let's hope the economy ends up better and more reasonable than the AI hype.
After "AI": Anticipating a post-LLM science and technology revolution (evalapply.org)
https://news.ycombinator.com/item?id=46419416
https://www.evalapply.org/posts/after-ai/index.html
> I, for one, welcome the coming age of the post-LLM-datacenter-overinvestment-bust-fueled backyard GPU supercomputer revolution.
I'd bet anything that they know people are pissed the market has wildly misallocated capital and are trying to save face by framing it as a cost that directly impacts consumers. In reality, the fear is knowing that our investor class is really too dumb to handle the money we've stuck in a box called "the world's retirement funds" and handed to them.
all it took was a chatbot that says "you're a genius"
What a great analogy. People building too many steam engines was one of the main factors that led to the First World War.
https://en.wikipedia.org/wiki/Railway_Mania
Those popped up all over Europe throughout the entire century, with the last one popping in the 1890s. For some reason, people focus only on the large one in the UK.
Sources on how 19th-century imperialism drew a lot of support from extend-and-pretend infrastructure bubbles are harder to search for.
Routing thick black DP cables and USB3 throughout the house feels a lot less natural than just installing ethernet ports in each room. :(
I once had plans like this...
This glut of finances leaves those few with a problem: what to do with it? And since accumulation of wealth is the ultimate goal, FOMO encourages increasingly foolish decisions (or at least unsustainable ones).
The housing crisis is one of the outcomes. This AI bubble is another. I'm sure if we look around, we can find other examples.
If resources were more evenly distributed, we would see different regions and peoples doing more varied things which would be appropriate for their needs.
But yes, we should also make sure that everybody can benefit from it.
...in the concentration of money.
"The handful of benevolent god-kings will be nice to people" is not a model for a decent future.
How do we typically deal with positive feedback?
I don't have a problem with concentration of money. I have a problem with playing the game that the concentration of money doesn't actually exist yet and shouldn't be taxed.
You need legislation that forces the Banks to lend money for Asset CREATION, not Asset ACCUMULATION. Or at least TAXES asset accumulation heavily.
This is especially true for limited natural resources like water. And it's been true for oil and gas development for a century.
Markets will punish them for poor investments. We have not seen that yet, but it is coming.
He says this and then publishes the letter on LinkedIn? What's the play?