AGI fantasy is a blocker to actual engineering
Mood: thoughtful
Sentiment: mixed
Category: tech
Key topics: AGI, AI engineering, AI research
The article argues that the fantasy surrounding Artificial General Intelligence (AGI) is hindering actual progress in AI engineering.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 52m after posting
Peak period: 159 comments (Day 1)
Avg / period: 80 comments
Based on 160 loaded comments
Key moments
1. Story posted: 11/14/2025, 1:21:24 PM (4d ago)
2. First comment: 11/14/2025, 2:13:18 PM (52m after posting)
3. Peak activity: 159 comments in Day 1 (hottest window of the conversation)
4. Latest activity: 11/18/2025, 6:11:31 PM (15h ago)
Ever since "AI" was named at Dartmouth, there have been very smart people thinking that their idea will be the thing which makes it work this time. Usually, those ideas work really well in-the-small (ELIZA, SHRDLU, Automated Mathematician, etc.), but don't scale to useful problem sizes.
So, unless you've built a full-scale implementation of your ideas, I wouldn't put too much faith in them if I were you.
If you have something that gives a sticky +5% at 250M scale, you might have an actual winner. Almost all new ML ideas fall well short of that.
But I shouldn't have said anything.
For those who've been sniffing this since the early 2010s, it's so blindingly obvious: they've already dropped LLMs on the floor and moved on to deeper alternative research.
For the rest of us, we're still catching Coke bottles from the sky and building places of worship around them.
There should be papers on fundamental limitations of LLMs then. Any pointers? "A single forward LLM pass has TC0 circuit complexity" isn't exactly it, since modern LLMs use CoT. Anything that uses Gödel's incompleteness theorems proves too much (we don't know whether the brain is capable of hypercomputation, and most likely it isn't).
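The closest formal results I'm aware of are from Merrill and Sabharwal's work on transformer expressivity; a hedged, from-memory sketch, worth verifying against the papers before citing:

```latex
% Hedged, from-memory pointer to the Merrill & Sabharwal results:
% a log-precision transformer forward pass sits in uniform TC^0, while
% allowing polynomially many chain-of-thought decoding steps lifts the
% recognizable languages to (reportedly exactly) P.
\[
  \mathrm{Forward}_{\log} \;\subseteq\; \mathsf{uniform}\ \mathsf{TC}^0
  \qquad\text{vs.}\qquad
  \mathrm{CoT}[\mathrm{poly}(n)] \;=\; \mathsf{P}
\]
```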
That is, if you don't build the Torment Nexus from the classic sci-fi novel Don't Create The Torment Nexus, someone else will and you'll be punished for not building it.
It seems we're much more likely to accidentally build something dumb that kills us via an unanticipated side effect. Like actual Weaboo, but for society, not just one business. AI helps Coca-Cola develop a new beverage that initially seems very popular and cheap, so it quickly becomes the world's top-selling drink, and then we realise, too late, that it's actually extremely addictive and withdrawal induces violent rage. Oh dear. That sort of thing.
Won't someone think of the poor simulations??
Now I run it through whisper in a couple minutes, give one quick pass to correct a few small hallucinations and misspellings, and I'm done.
There are big wins in AI. But those don't pump the bubble once they're solved.
And the thing that made Whisper more approachable for me was when someone spent the time to refine a great UI for it (MacWhisper).
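For anyone curious, the local workflow described above looks roughly like this; a minimal sketch using the open-source openai-whisper package, with the model size and file name as placeholder choices:

```python
# Minimal local transcription sketch (pip install openai-whisper).
# "base" and the file path are placeholders; larger models are slower
# but more accurate.
import whisper

model = whisper.load_model("base")          # downloads weights on first run
result = model.transcribe("interview.wav")  # runs on CPU, just slower than GPU
print(result["text"])                       # then one quick proofreading pass
```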
An essay writing machine is cool. A machine that can competently control any robot arm, and make it immediately useful is a world-changing prospect.
Moving and manipulating objects without explicit human coded instructions will absolutely revolutionize so much of our world.
That's really the only value those technologies provide, so if people aren't seeing costs come down, there really is zero value coming from them.
It's better than Whisper, and faster, while running on CPU on my ten year old ThinkPad.
I had Claude make me Python bindings for it and add it to my voice typing app.
We live in the future.
Maybe it could be a little bit more accurate, it would be nice if it ran a little faster, but ultimately it's 95% complete software that can be free forever.
My guess is very many AI tasks are going to end up this way. In 5-10 years we're all going to be walking around with laptops with 100k cores and 1TB of RAM and an LLM that we talk to and it does stuff for us more or less exactly like Star Trek.
Yesterday I heard Cory Doctorow talk about a bunch of pro bono lawyers using LLMs to mine paperwork and help exonerate innocent people. Also a big win.
There's good stuff - engineering - that can be done with the underlying tech without the hyperscaling.
Like it's not even clear if LLMs/Transformers are even theoretically capable of AGI, LeCun is famously sceptical of this.
I think we still lack decades of basic research before we can hope to build an AGI.
On the other hand, extracting usable insights from neuroscience? Not at all easy. Human brain does not yield itself to instrumentation.
If an average human had 1.5 Neuralink implants in his skull, and raw neural data was cheap and easy to source? You bet someone would try to use that for AI tech. As is? We're in the "bitter lesson" regime. We can't extract usable insights out of neuroscience fast enough for it to matter much.
Other energy usage figures, air pollution, gas turbines, CO2 emissions etc are fine - but if you complain about water usage I think it risks discrediting the rest of your argument.
(Aside from that I agree with most of this piece, the "AGI" thing is a huge distraction.)
UPDATE an hour after posting this: I may be making an ass of myself here in that I've been arguing in this thread about comparisons between data center usage and agricultural usage of water, but that comparison doesn't hold as data centers often use potable drinking water that wouldn't be used in agriculture or for many other industrial purposes.
I still think the way these numbers are usually presented - as scary large "gallons of water" figures with no additional context to help people understand what that means - is an anti-pattern.
Golf and datacenters should have to pay for their externalities. And if that means both are uneconomical in arid parts of the country then that's better than bankrupting the public and the environment.
> I asked the farmer if he had noticed any environmental effects from living next to the data centers. The impact on the water supply, he told me, was negligible. "Honestly, we probably use more water than they do," he said. (Training a state-of-the-art A.I. requires less water than is used on a square mile of farmland in a year.) Power is a different story: the farmer said that the local utility was set to hike rates for the third time in three years, with the most recent proposed hike being in the double digits.
The water issue really is a distraction which harms the credibility of people who lean on it. There are plenty of credible reasons to criticize data centers; use those instead!
> Honestly, we probably use more water than they do
This kind of proves my point. Regardless of the actual truth here, it's a terrible argument to make: water availability is becoming a huge problem in a growing number of places, and this statement implies that something which in principle doesn't need water at all uses a comparable amount of water to farming, which strictly depends on it.
Is that really the case? - "Data Centers and Water Consumption" - https://www.eesi.org/articles/view/data-centers-and-water-co...
"...Large data centers can consume up to 5 million gallons per day, equivalent to the water use of a town populated by 10,000 to 50,000 people..."
"I Was Wrong About Data Center Water Consumption" - https://www.construction-physics.com/p/i-was-wrong-about-dat...
"...So to wrap up, I misread the Berkeley Report and significantly underestimated US data center water consumption. If you simply take the Berkeley estimates directly, you get around 628 million gallons of water consumption per day for data centers, much higher than the 66-67 million gallons per day I originally stated..."
> U.S. data centers consume 449 million gallons of water per day and 163.7 billion gallons annually (as of 2021).
Sounds bad! Now let's compare that to agriculture.
USGS 2015 report: https://pubs.usgs.gov/fs/2018/3035/fs20183035.pdf has irrigation at 118 billion gallons per day - that's 43,070 billion gallons per year.
163.7 billion / 43,070 billion * 100 = 0.38 - less than half a percentage point.
It's very easy to present water numbers in a way that looks bad until you start comparing them thoughtfully.
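As a quick sanity check on that arithmetic, using only the figures quoted upthread (a sketch, not an independent estimate):

```python
# Figures quoted in the thread, in gallons.
datacenters_per_year = 163.7e9        # US data centers, 2021 estimate
irrigation_per_day   = 118e9          # USGS 2015 irrigation estimate
irrigation_per_year  = irrigation_per_day * 365   # ~43,070 billion

share = datacenters_per_year / irrigation_per_year * 100
print(f"{share:.2f}%")                # ~0.38%, less than half a percent
```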
I think comparing data center water usage to domestic water usage by people living in towns is actually quite misleading. UPDATE: I may be wrong about this, see following comment: https://news.ycombinator.com/item?id=45926469#45927945
They are not equivalent. Data centers primarily consume potable water, whereas irrigation uses non-potable or agricultural-grade water. Mixing the two leads to misleading conclusions on the impact.
The same person who mentioned potable water being an important distinction also cited a report on data center water consumption that did not make the distinction (where the 628M number came from).
> water evaporation from hydroelectric dam reservoirs in their water use calculations
The fact that data centers are already having a major impact on public water supply systems is evident from the decisions some local governments have been forced to make, if you care to investigate...
https://spectrum.ieee.org/ai-water-usage
"...in some regions where data centers are concentrated—and especially in regions already facing shortages—the strain on local water systems can be significant. Bloomberg News reports that about two-thirds of U.S. data centers built since 2022 are in high water-stress areas.
In Newton County, Georgia, some proposed data centers have reportedly requested more water per day than the entire county uses daily. Officials there now face tough choices: reject new projects, require alternative water-efficient cooling systems, invest in costly infrastructure upgrades, or risk imposing water rationing on residents...."
https://www.bloomberg.com/graphics/2025-ai-impacts-data-cent...
It's fair to be critical of how the ag industry uses that water, but a significant fraction of that activity is effectively essential.
If you're going to minimize people's concern like this, at least compare it to discretionary uses we could ~live without.
The data's about 20 years old, but for example https://www.usga.org/content/dam/usga/pdf/Water%20Resource%2... suggests we were using over 2b gallons a day to water golf courses.
If data center usage meant we didn't have enough water for agriculture I would shout that from the rooftops.
Exploring how it stacks up against an essential use probably won't persuade people who perceive it as wasteful.
If Americans cut their meat consumption by 10%, we would use a lot less water in agriculture and probably also live longer in general
It's a common reasoning error to bundle up many heterogeneous things into a single label ("agriculture!") and then assign value to the label itself.
We could feed the world with far less water consumption if we opted not to eat meat. Instead, we let people make purchasing decisions for themselves. I'm not sure why we should take a different approach when making decisions about compute.
If you look at the data for animals, that’s not really true. See [1] especially page 22 but the short of it is that the vast majority of water used for animals is “green water” used for animal feed - that’s rainwater that isn’t captured but goes into the soil. Most of the plants used for animal feed don’t use irrigation agriculture so we’d be saving very little on water consumption if we cut out all animal products [2]. Our water consumption would even get a lot worse because we’d have to replace that protein with tons of irrigated farmland and we’d lose the productivity of essentially all the pastureland that is too marginal to grow anything on (50% of US farmland, 66% globally).
Animal husbandry has been such a successful strategy on a planetary scale because it’s an efficient use of marginal resources no matter how wealthy or industrialized you are. Replacing all those calories with plants that people want to actually eat is going to take more resources, not less, especially when you’re talking about turning pastureland into productive agricultural land.
[1] https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1...
[2] a lot of feed is also distiller's grains used for ethanol first before being fed to animals, so we wouldn't even cut out most of that
Since you like their work, the authors of your paper answered that question more generally here https://link.springer.com/article/10.1007/s10021-011-9517-8 where they conclude "The water footprint of any animal product is larger than the water footprint of crop products with equivalent nutritional value".
You make some often debunked claims, like we'd have to plant more crops to feed humans directly if we stopped eating meat.
This shouldn't make intuitive sense to you since animals eat feed grown on good cropland (98% of the water footprint of animal ag) that we could eat directly, and we lose 95% of the calories when we route crops through animals.
That paper isn't actually debunking anything that I'm saying. If the water footprint per calorie is 20x for beef but the feed is grown with 90% of its water from rainfall, that's not a 20x bigger footprint in a way that practically matters, because most of that water is unrecoverable anyway. The water that is recoverable just makes its way through the watershed.
Meat is a way to convert land that can't grow things people can or want to eat into things that people will eat. That pastureland and marginal cropland growing animal feed can't just be converted to grow more economically productive crops like fruit and vegetables without Herculean engineering effort and tons of water and fertilizer. Instead the farming would have to stress other fertile ecosystems like the Southwest, which would make the water problems worse, even if their total "footprint" is smaller. The headline that beef uses 20x more water per calorie completely ignores where that water comes from and how useful it actually is to us.
I don’t doubt that we can switch to an all plant diet as a species but people vastly underestimate the ecological and societal cost to do so.
Add a datacenter tax of 3x to water sold to datacenters and use it to improve water infrastructure all around. Water is absolutely a non-issue medium term, and is only a short term issue because we've forgotten how to modestly grow infrastructure in response to rapid changes in demand.
Does it count water use for cooling only, or does it include use for the infrastructure that keeps it running (power generation, maintenance, staff use, etc.)
Is this water evaporated? Or moved from A to B and raised a few degrees.
Water is used in modern datacenters for evaporative cooling, and the reason it's used is to save energy -- it's typically around 10% more energy efficient overall than normal air conditioning. These datacenters often have a PUE of under 1.1, meaning they're over 90% efficient at using power for compute, and evaporative cooling is one of the reasons they're able to achieve such high efficiency.
If governments wanted to, they could mandate that datacenters use air conditioning instead of evaporative cooling, and water usage would drop to near zero (just enough for the restrooms, watering the plants, etc). But nobody would ever seriously suggest doing this because it would be using more of a valuable resource (electricity / CO2 emissions) to save a small amount of a cheap and relatively plentiful resource (water).
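To make the PUE arithmetic concrete, here's a rough sketch; the IT load and the air-cooled PUE are illustrative assumptions, not measured values:

```python
# PUE = total facility power / IT (compute) power.
# A PUE under 1.1 means over 1/1.1 ~ 91% of power goes to compute.
it_power_mw = 100.0        # illustrative IT load, not a real facility

pue_options = [
    ("evaporative", 1.10),  # assumed, typical of evaporative cooling
    ("air-cooled",  1.25),  # assumed, typical of conventional AC
]

for label, pue in pue_options:
    total = it_power_mw * pue
    print(f"{label}: {total:.0f} MW total, {it_power_mw/total:.0%} to compute")
# The delta (~15 MW of electricity here) is what evaporative cooling
# trades away in exchange for its water consumption.
```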
Similarly, if I say "I object to the genocide in Gaza", would you assume that I don't also object to the Uyghur genocide?
This is nothing but whataboutism.
People are allowed to talk about the bad things AI does without adding a 3-page disclaimer explaining that they understand all the other bad things happening in the world at the same time.
If you take a strong argument and throw in an extra weak point, that just makes the whole argument less persuasive (even if that's not rational, it's how people think).
You wouldn't say the "Uyghur genocide is bad because of ... also the disposable plastic crap that those slave factories produce is terrible for the environment."
Plastic waste is bad but it's on such a different level from genocide that it's a terrible argument to make.
Especially since so many anti-crypto people immediately pivoted to anti-AI. That sudden shift in priorities makes it hard to take them seriously.
Beef, I guess, is a popular type of food. I’m under the impression that most of us would be better off eating less meat, maybe we could tax water until beef became a special occasion meal.
You can easily write a law that looks like this: There is now a water usage tax. It applies only to water used for data-centers. It does not apply to residential use, agricultural use, or any other industrial use.
We do preferential pricing and taxing all the time. My home's power rate through the state-owned utility is very different than if I consumed the exact same amount of power as an industrial site. I just checked, and my water rate at home is also different than if I were running a datacenter. So in actuality we already discriminate for power and water based on end use, at least where I live. Most places I have lived have different commercial and residential rates.
In other words, the price of beef can stay the same.
And yet, if you believe the environmentalist argument in the first place, the price of beef should go up for the damages it causes. Hence why a lot of people think the people complaining about AI are wearing an environmentalist mask, rather than having an actual care about the environment.
Pigouvian taxes are fine, but should be applied across all sources of damage.
But the environment doesn't really care whether the water is being used by a datacenter or something else. My point is just that data centers are actually more efficient users of water compared to many less-controversial users.
I can live without another datacenter - I get very little utility from "one more" - but I have to eat, generally every day.
Might as well get rid of all the lawns and football fields while we’re at it.
My perspective from someone who wants to understand this new AI landscape in good faith. The water issue isn't the show stopper it's presented as. It's an externality like you discuss.
And in comparison to other water usage, data centers don't match the doomsday narrative presented. I know when I see it now, I mentally discount or stop reading.
Electricity, though, seems to be real, at least for the area I'm in. I spent some time with ChatGPT last weekend working to model an apples:apples comparison, and my area has seen a +48% increase in electric prices from 2023-2025. I modeled a typical 1,000 kWh/month usage to see what that looked like in dollar terms, and it's an extra $30-40/month (rough arithmetic sketched after this comment).
Is it data centers? Partly yes, straight from the utility co's mouth: "sharply higher demand projections—driven largely by anticipated data center growth"
With FAANG money, that's immaterial. But for those who aren't, that's just one more thing that costs more today than it did yesterday.
Coming full circle, for me being concerned with AI's actual impact on the world, engaging with the facts and understanding them within competing narratives is helpful.
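A back-of-envelope version of that commenter's model; the base rate is an assumption chosen only to be consistent with the quoted numbers:

```python
# Rough check: +48% on a typical 1,000 kWh/month bill.
monthly_kwh = 1000
base_rate   = 0.070               # assumed 2023 $/kWh, picked to match the quote
new_rate    = base_rate * 1.48    # the quoted +48% increase

extra = (new_rate - base_rate) * monthly_kwh
print(f"${extra:.0f}/month extra")  # ~$34/month, within the quoted $30-40 range
```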
https://www.politico.com/news/2025/05/06/elon-musk-xai-memph...
It's not even an externality? They just pay market price for water. You can argue the market price is priced badly (e.g., maybe prices are set by the state), but that doesn't make it an externality. The benefits/costs are still accrued by (and internal to) buyer and seller.
There's lots of promising lower-consumption cooling options, but seems like we are not yet seeing that in a large fraction of data centers globally.
Also a lot less meat in general. A huge part of our agriculture is growing feed for the animals we eat. We need some meat, but the current amount is excessive.
The US will never give up on eating meat. Full stop.
For every vegan/vegetarian in the US there are probably 25 people that feed beef products to their pets on a daily basis.
Also, while I'm vegetarian going on vegan, welfare arguments are obviously not relevant in response to an assertion that Americans aren't going to give up meat, because if animal welfare was relevant then Americans would give up meat.
I don't see any signs that the US is going to give up on AI and data centers, either. (The coming AI winter notwithstanding)
For what it's worth, I've cut back quite a bit on my beef and pork consumption, and now mostly eat chicken. The environmental and ethical arguments finally got to me.
But their point does disarm the suggestion that water consumption for AI is bad because it's just for fun while meat feeds people.
Because when you eat meat, you could have eaten something far less resource intensive like tempeh. But you ate meat for reasons beyond survival. For most of us, it's because we like the taste and we're used to it.
I don't see that as having any stronger of a claim to water consumption than the things we use AI for (fun, getting work done, writing nix/k8s config) much less a claim to many times the amount of water consumption than AI data centers.
The US never gave up eating lobster either, but many here have never had lobster and almost nobody has lobster even once a week. It's a luxury which used to be a staple.
Data centers in the USA use a fraction of a percent of the water that's used for agriculture.
I'll start worrying about competition with water for food production when that value goes up by a multiple of about 1000.
This is more than four times the water used by all data centers in the US combined, counting both cooling and the water used for generating their electricity.
What has more utility: Californian almonds, or all IT infrastructure in the US times 4?
AI has no utility.
Almonds make marzipan.
That is, the topic is not one where I have already picked a side that I'd like to win by any means necessary. It's one where I think there are legitimate tradeoffs, and I want the strongest arguments on both sides to be heard so we get the best possible policies in the end.
The article made a really interesting and coherent argument, for example. That's the kind of discourse around the topic I'd like to see.
AI is at least as useful as marzipan.
Of course surface water availability can also be a serious problem.
Tried buying a GPU lately?
The people advocating for sustainable usage of natural resources have already been comparing the utility of different types of agriculture for years.
Comparatively, tofu is efficient to produce in terms of land use, greenhouse gas emissions, and water use, and can be made shelf-stable.
If water use was such a dire issue that we needed to start cutting down on high uses of it, then we should absolutely cherry pick the high usages of it and start there. (Or we should just apply a pigouvian tax across all water use, which will naturally affect the biggest consumers of it.)
The contention with AI water use is that something like this is currently happening as local water supplies are being diverted for data-centers.
Excellent, that means we can save massive amounts of water by stopping almond production in the western US.
Are US AI data centers producing 80% of the world's IT?
I ask legitimately; I think that would already make it more apples to apples.
Also if you ask me personally, I'd rather have almonds than cloud AI compute. Imagine a future 100 years from now, we killed the almonds, never to be enjoyed ever again by future generations... Or people don't have cloud AI compute. It's personal, but I'd be more sad that I'd never get to experience the taste of an almond and all the cuisine that comes with it.
You've misread it. It's not compared to AI datacenters, it's every type of datacenter, for all types of computing.
In the future scenario you've laid out it wouldn't be cloud AI compute. You wouldn't be able to use HN or send email or pay with a credit card or play video games or stream video.
Example article from a decade ago: https://www.motherjones.com/environment/2015/01/almonds-nuts...
Of course water used up will eventually evaporate and produce rainfall in the water cycle, but unfortunately in many places "fossil" water is used up, or more water is used in an area than the watershed can sustainably support.
This is a constant source of miscommunication about water usage, and about agriculture's as well. It is very different to talk about the water needed to raise a cow in, e.g., Colorado versus the Scottish Highlands, but this is usually removed from the picture.
The same context should be considered for datacenters.
I think it's bad though to be against growth, for reasons I've described in another comment.
https://www.bbc.com/news/articles/cx2ngz7ep1eo
https://www.theguardian.com/technology/2025/nov/10/data-cent...
https://www.reuters.com/article/technology/feature-in-latin-...
> A small data centre using this type of cooling can use around 25.5 million litres of water per year. [...]
> For the fiscal year 2025, [Microsoft's] Querétaro sites used 40 million litres of water, it added.
> That's still a lot of water. And if you look at overall consumption at the biggest data centre owners then the numbers are huge.
That's not credible reporting because it makes no effort at all to help the reader understand the magnitude of those figures.
"40 million litres of water" is NOT "a lot of water". As far as I can tell that's about the same annual water usage as a 24 acre soybean field.
As a businessman, I want to make money. E.g. by automating away technologists and their pesky need for excellence and ethics.
On a less cynical note, I am not sure that selling quality is sustainable in the long term, because then you'd be selling less and earning less. You'd get outcompeted by cheap slop that's acceptable to the general population.
467 more comments available on Hacker News