OpenAI and Nvidia Announce Partnership to Deploy 10 GW of Nvidia Systems
Mood: heated
Sentiment: negative
Category: other
Key topics: OpenAI and Nvidia announce a partnership to deploy 10 GW of Nvidia systems, sparking concerns about energy consumption, environmental impact, and the potential for a bubble in AI investments.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 21m after posting
Peak period: 156 comments (Day 1)
Avg / period: 80
Based on 160 loaded comments
Key moments
- Story posted: Sep 22, 2025 at 12:10 PM EDT (2 months ago)
- First comment: Sep 22, 2025 at 12:30 PM EDT (21m after posting)
- Peak activity: 156 comments in Day 1 (hottest window of the conversation)
- Latest activity: Sep 24, 2025 at 7:24 AM EDT (2 months ago)
https://bsky.app/profile/anthonycr.bsky.social/post/3lz7qtjy...
(pencil in another loop between Nvidia and OpenAI now)
It does seem like Satya believes models will get commoditized, so there's no need to hitch themselves to OpenAI that strongly.
https://www.reuters.com/business/microsoft-use-some-ai-anthr...
(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.
That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.
Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimate 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate breakdown.
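The back-of-the-envelope math above can be sketched in a few lines. All inputs are the commenter's rough estimates (GPU count, user count, 5% active time), not verified figures:

```python
# Rough GPU-per-user estimate using the commenter's assumed figures.
gpus = 1_000_000             # "well over 1 million GPUs" by end of year (estimate)
users = 800_000_000          # ~800 million ChatGPT users (estimate)
active_fraction = 0.05       # assume each user is active ~5% of the day (~1.2 h)

users_per_gpu = users / gpus                            # 800 users per GPU overall
active_users_per_gpu = users_per_gpu * active_fraction  # ~40 concurrently active per GPU

print(f"{users_per_gpu:.0f} users per GPU, ~{active_users_per_gpu:.0f} active at once")
```

This assumes perfectly even usage, as the comment notes; real load is bursty and concentrated in peak hours.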
With that kind of singularity the man-month will no longer be mythical ;)
With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
So at that point a DC replaces them all with ASICs instead?
Or if they just feel like doing that any time.
To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASICs mine.
Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
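The mining estimate above checks out arithmetically. A quick sketch, where the per-GPU hashrate is the commenter's own guess and the ~1 ZH/s network figure is the value implied by the 0.05% claim:

```python
# Verify the combined-hashrate estimate from the comment above.
gpu_hashrate = 10e9          # ~10 GH/s per top-end GPU (commenter's estimate)
gpus_sold = 50e6             # Nvidia's projected 2025 GPU volume
network_hashrate = 1e21      # ~1 ZH/s global Bitcoin capacity (assumed)

combined = gpu_hashrate * gpus_sold   # 5e17 H/s = 500 PH/s
share = combined / network_hashrate   # fraction of the global network

print(f"{combined / 1e15:.0f} PH/s, {share:.2%} of global capacity")
```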
An NVL72 is 72 chips and about 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
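Spelling out that rack math (120 kW compute plus ~25 kW assumed cooling overhead, split across 72 chips):

```python
# Per-chip power for an NVL72 rack, including an assumed cooling overhead.
rack_kw = 120      # rack compute power
cooling_kw = 25    # ~25 kW cooling (commenter's rough figure)
chips = 72

per_chip_kw = (rack_kw + cooling_kw) / chips
print(f"{per_chip_kw:.2f} kW per chip")
```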
Google is pretty useful.
It uses >15 TWh per year.
Theoretically, AI could be more useful than that.
Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.
It could be a short-term crunch to pull-forward (slightly) AI advancements.
Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.
Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.
VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near its current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).
30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.
Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.
[1] https://sustainability.google/reports/google-2025-environmen...
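The conversion behind the comparison above is straightforward: divide annual energy by the hours in a year to get average power. Using the commenter's 30 TWh/yr figure for Google against the deal's 10 GW:

```python
# Convert annual energy (TWh/yr) to average power (GW) and compare to the deal.
hours_per_year = 8760
google_twh = 30          # commenter's figure for Google's total annual energy use
deal_gw = 10             # headline size of the OpenAI/Nvidia deployment

google_avg_gw = google_twh * 1000 / hours_per_year   # TWh -> GWh, then / hours
ratio = deal_gw / google_avg_gw                      # roughly 3x, as the comment says

print(f"Google: {google_avg_gw:.1f} GW average; deal is {ratio:.1f}x")
```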
You over-provision so that you (almost) always have enough compute to meet your customers needs (even at planet scale, your demand is bursty), you're always doing maintenance on some section, spinning up new hardware and turning down old hardware.
So, apples to apples, this would likely not even be 2x at 30 TWh for Google.
More than a "Google" of new compute is of course still a lot, but it's not many Googles' worth.
AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.
All-in, you're looking at a higher footprint, maybe 4-5 kW per GPU blended.
So about 2 million GPUs.
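The sizing estimate above follows from dividing the 10 GW total by a blended per-GPU figure (chip plus host, networking, and cooling overhead, per the commenter's 4-5 kW assumption):

```python
# GPU count implied by 10 GW total at a blended 4-5 kW per GPU.
total_gw = 10
for per_gpu_kw in (4, 5):
    gpus = total_gw * 1e6 / per_gpu_kw   # GW -> kW, then divide by per-GPU power
    print(f"At {per_gpu_kw} kW/GPU: {gpus / 1e6:.1f}M GPUs")
```

At the 5 kW end that gives the "about 2 million GPUs" in the comment; at 4 kW it's 2.5 million.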
Therefore, they are listing in terms of the critical limit: power.
Personally, I expect this to blow up first in the faces of normal people who find they can no longer keep their phones charged or their apartments lit at night, and only then will the current AI investment bubble pop.
I imagine this as a subtractive process starting with the maximum energy window.
Because if some card with more FLOPS comes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y, for no appreciable change in how much you're spending to operate.
(I have no idea if y is actually much larger than x)
> to invest up to
i.e. 0 to something something
I know watts but I really can't quantify this. How much of Nvidia is there in the amount of servers that consume 10 GW? Do they all use the same chip? What if there is a newer chip that consumes less, does the deal imply more servers? Did GPT write this post?
Also, the idea of a newer Nvidia card using less power is très amusant.
So this investment is somewhat structured like the Microsoft investment where equity was traded for Azure compute.
In the actual shady version of this, Company B isn’t the hottest AI investment around, it’s a shell company created by your brother’s cousin that isn’t actually worth what you’re claiming on the balance sheet because it was only created for the round tripping shell game.
Every time Hacker News talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech company is doing is illegal. This is definitively neither illegal nor shady. If Nvidia believes, for example, that OpenAI can use their GPUs to turn a profit, then this is inherently positive-sum economically for both sides: OpenAI gets capital in the form of GPUs, uses them to generate tokens which they sell above the cost of that capital, and then returns some of the excess value to Nvidia. This is done via equity. It's a way for Nvidia to get access to some of the excess value of their product.
Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.
It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.
Two economists are walking in a forest when they come across a pile of shit.
The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.
They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats a pile of shit.
Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."
"That's not true", responded the second economist. "We increased total revenue by $200!"
This kind of corporate behavior is bad and will end up hurting somebody. If we're lucky the fallout will only hurt Nvidia. More likely it will end up hurting most taxpayers.
[1]https://www.corpdev.org/2025/07/23/hp-awarded-945-million-in...
In the end, Nvidia will have OpenAI shares, which are valuable, and OpenAI will have GPUs, which are also valuable. It is not fake revenue, the GPUs will be made, sold at market price, and used, they are not intended to be bought back and sold to another customer. And hopefully, these GPUs will be put to good use by OpenAI so that they can make a profit, which will give Nvidia some return on investment.
It doesn't look so different from a car loan, where the dealer lends you the money so that you can buy their car.
It's not necessarily manipulative but it's also not exactly an arms-length purchase of GPUs on the open market.
Microsoft and Google have been doing it for decades. MS probably started that practice.
... and we've seen this before in previous bubbles ...
https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...
500 is insane.
"Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"
https://www.newstarget.com/2025-08-02-texas-ai-data-centers-...
And before you think that's nonsense, let's not forget these people are accelerationists. Destroying the fabric of society is their goal.
I say this as someone who has been holding NVDA stock since 2016 and can cash out for a large sum of money. To me it's all theoretical money until I actually sell. I don't factor it into financial planning.
You don't see me being a cheerleader for NVDA. Even though I stand to gain a lot. I will still tell you that the current price is way too high and Jensen Huang has gotten high off his own supply and "celebrity status".
After all, we can't all buy NVDA stock and get rich off it. Is it truly possible for all 30,000+ NVDA employees to become multi-millionaires overnight? That's not how capitalism works.
472 more comments available on Hacker News