Not Hacker News Logo

Not

Hacker

News!

Home
Hiring
Products
Companies
Discussion
Q&A
Users
Not Hacker News Logo

Not

Hacker

News!

AI-observed conversations & context

Daily AI-observed summaries, trends, and audience signals pulled from Hacker News so you can see the conversation before it hits your feed.

LiveBeta

Explore

  • Home
  • Hiring
  • Products
  • Companies
  • Discussion
  • Q&A

Resources

  • Visit Hacker News
  • HN API
  • Modal cronjobs
  • Meta Llama

Briefings

Inbox recaps on the loudest debates & under-the-radar launches.

Connect

© 2025 Not Hacker News! — independent Hacker News companion.

Not affiliated with Hacker News or Y Combinator. We simply enrich the public API with analytics.

Last activity 2 months ago · Posted Sep 22, 2025 at 12:10 PM EDT

OpenAI and Nvidia Announce Partnership to Deploy 10GW of Nvidia Systems

meetpateltech
473 points
632 comments

Mood

heated

Sentiment

negative

Category

other

Key topics

AI
Energy Consumption
Nvidia
OpenAI
Debate intensity: 85/100

OpenAI and Nvidia announce a partnership to deploy 10GW of Nvidia systems, sparking concerns about energy consumption, environmental impact, and the potential for a bubble in AI investments.

Snapshot generated from the HN discussion

Discussion Activity

Very active discussion

First comment

21m

Peak period

156

Day 1

Avg / period

80

Comment distribution: 160 data points

Based on 160 loaded comments

Key moments

  1. Story posted: Sep 22, 2025 at 12:10 PM EDT (2 months ago)
  2. First comment: Sep 22, 2025 at 12:30 PM EDT (21m after posting)
  3. Peak activity: 156 comments in Day 1, the hottest window of the conversation
  4. Latest activity: Sep 24, 2025 at 7:24 AM EDT (2 months ago)


Discussion (632 comments)
Showing 160 comments of 632
eagerpace
2 months ago
4 replies
Where is Apple? Even from an investment perspective.
rubyfan
2 months ago
1 reply
Being rationale.
fancyfredbot
2 months ago
2 replies
Rational.
rubyfan
2 months ago
Ha, that too.
newfocogi
2 months ago
Maybe we're not sure if they're being rational or rationalizing.
brcmthrowaway
2 months ago
2 replies
Losing the race
gpm
2 months ago
Right, but is the race to the pot of gold, or the stoplight (in which case by "losing" they save on gas)?
richwater
2 months ago
This is not something that can be won. The LLM architecture has been reaching its limitations slowly but surely. New foundational models are now being tweaked for user engagement rather than productive output.
bertili
2 months ago
Apple doing fine and often spend the same 100B in a year buying back Apple stocks.
threetonesun
2 months ago
My MacBook Pro runs local models better than anything else in the house and I have not yet needed to install a small nuclear reactor to run it, so, I feel like they're doing fine.
me551ah
2 months ago
4 replies
So OpenAI is breaking up with Microsoft and Azure?
freedomben
2 months ago
2 replies
They've been sleeping with Oracle too recently, so I don't think they're breaking up, just dipping a toe in the poly pool
jsheard
2 months ago
It's starting to resemble a Habsburg family tree at this point

https://bsky.app/profile/anthonycr.bsky.social/post/3lz7qtjy...

(pencil in another loop between Nvidia and OpenAI now)

sekai
2 months ago
In true Bay Area fashion?
Handy-Man
2 months ago
It was more like Microsoft refused to build the capacity OpenAI was asking for, so they gave them their blessing to buy additional compute from others.

It does seem like Satya believes models will get commoditized, so there's no need to hitch themselves to OpenAI that strongly.

FinnKuhn
2 months ago
I would say Microsoft cheated on OpenAI first ;)

https://www.reuters.com/business/microsoft-use-some-ai-anthr...

mmmllm
2 months ago
Are Anthropic and Google breaking up with Nvidia?
ddtaylor
2 months ago
9 replies
For someone who doesn't know what a gigawatt worth of Nvidia systems is, how many high-end H100s (or whatever) does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.
thrtythreeforty
2 months ago
1 reply
Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.
cj
2 months ago
2 replies
"GPUs per user" would be an interesting metric.

(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.

That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.

Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimates 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate break down.
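The back-of-the-envelope arithmetic above can be reproduced in a few lines (a sketch only; the GPU count, user count, and 5% activity figure are the commenter's rough assumptions, not measured data):

```python
# Reproducing the comment's GPUs-per-user estimate.
gpus = 1_000_000           # "well over 1 million GPUs" by end of year
users = 800_000_000        # ~800 million ChatGPT users
active_fraction = 0.05     # assume each user is active ~5% of the day (1.2 h)

users_per_gpu = users / gpus                            # 800 people per GPU
active_users_per_gpu = users_per_gpu * active_fraction  # ~40 in active use

print(users_per_gpu, active_users_per_gpu)  # 800.0 40.0
```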

NooneAtAll3
2 months ago
1 reply
I'm kinda scared of "1.2 hours a day of ai use"...
Rudybega
2 months ago
1 reply
Sorry, those figures are skewed by Timelord Georg, who has been using AI for 100 million hours a day, is an outlier, and should have been removed.
fuzzfactor
2 months ago
1 reply
Roger, but I still think with that much energy at its disposal, if AI performs as desired it will work its way up to using each person more than 1.2 hours per day, without them even knowing about it :\
Nevermark
2 months ago
1 reply
When GPUs share people concurrently, they collectively get much more than 24 hours of person per day.
fuzzfactor
2 months ago
1 reply
You're right!

With that kind of singularity the man-month will no longer be mythical ;)

Nevermark
2 months ago
It will be epic!
coder543
2 months ago
A lot of GPUs are allocated for training and research, so dividing the total number by the number of users isn’t particularly useful. Doubly so if you’re trying to account for concurrency.
skhameneh
2 months ago
2 replies
Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...

With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
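A minimal version of that napkin math, assuming a flat 600 W per GPU and, as noted above, ignoring cooling, networking, and conversion losses:

```python
def naive_gpu_count(total_power_w: float, power_per_gpu_w: float) -> float:
    """Upper bound on GPU count from a power budget: ignores all overhead."""
    return total_power_w / power_per_gpu_w

# 10 GW at an assumed 600 W per GPU
print(f"{naive_gpu_count(10e9, 600):,.0f}")  # 16,666,667
```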

kristjansson
2 months ago
1 reply
B200 is 1kW+ TDP ;)
skhameneh
2 months ago
And consists of 8 GPUs
alphabetag675
2 months ago
It's easy to see what it could be by looking at Green500.
iamgopal
2 months ago
2 replies
And how much is that as a percentage of Bitcoin network capacity?
cedws
2 months ago
1 reply
I'm also wondering what kind of threat this could be to PoW blockchains.
typpilol
2 months ago
2 replies
Literally none at all, because ASICs.
cedws
2 months ago
Some chains are designed to be ASIC resistant.
fuzzfactor
2 months ago
What happens if AI doesn't pay off before the GPUs wear out or are in need of replacement?

So at that point a DC replaces them all with ASICs instead?

Or if they just feel like doing that any time.

mrb
2 months ago
Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.

To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.

Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
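That estimate checks out numerically (a sketch; the 10 GH/s per GPU and 50 million unit figures are from the comment, and the ~1,000 EH/s global hashrate is an assumed round number implied by its 0.05% conclusion):

```python
ghash_per_gpu = 10          # commenter's estimate for a modern top-end GPU
gpus = 50_000_000           # Nvidia's projected 2025 unit sales (per comment)
network_phash = 1e6         # ~1,000 EH/s global Bitcoin hashrate (assumed)

total_phash = ghash_per_gpu * gpus / 1e6   # 1 PH/s = 1e6 GH/s
share = total_phash / network_phash        # fraction of global mining capacity

print(total_phash, f"{share:.2%}")  # 500.0 0.05%
```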

ProofHouse
2 months ago
1 reply
How much cable (and what kind) to connect them all? That number would be 100x the number of GPUs. I would think they just clip onto metal racks with no cables, but then I saw the xAI data center with blue wire cables everywhere.
hbarka
2 months ago
1 reply
It was announced last week that Nvidia acqui-hired a company that can connect more than 100,000 GPUs together as a cluster that can effectively serve as a single integrated system.
ddtaylor
2 months ago
2 replies
Do you have a link or info?
hbarka
2 months ago
1 reply
https://www.reuters.com/technology/nvidia-spent-over-900-mil...

https://www.tomshardware.com/tech-industry/nvidia-drops-a-co...

ddtaylor
2 months ago
Thank you
typpilol
2 months ago
I think it's called Enfabrica or something similar
kingstnap
2 months ago
1 reply
It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, it's around 5 million, and 1 to 2 kW is definitely the right ballpark at a system level.

The NVL72 is 72 chips and 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
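The rack-level arithmetic works out as claimed (numbers taken from the comment above):

```python
rack_gpus = 72          # NVL72: 72 GPUs per rack
rack_kw = 120           # rack power
cooling_kw = 25         # rough cooling overhead

kw_per_gpu = (rack_kw + cooling_kw) / rack_gpus   # ~2 kW each
gpu_count = 10e6 / kw_per_gpu                     # 10 GW expressed in kW

print(round(kw_per_gpu, 2), f"{gpu_count:,.0f}")  # 2.01 4,965,517
```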

fuzzfactor
2 months ago
1 reply
So each 2 kW component is like a top-shelf space heater, which the smart money never did want to run unless it was quite cold outside.
willis936
2 months ago
It will be the world's most advanced resistor.
awertjlkjl
2 months ago
3 replies
You could think of it as "as much power as is used by NYC and Chicago combined". Which is fucking insanely wasteful.
onlyrealcuzzo
2 months ago
3 replies
I dunno.

Google is pretty useful.

It uses >15 TWh per year.

Theoretically, AI could be more useful than that.

Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.

It could be a short-term crunch to pull-forward (slightly) AI advancements.

Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.

Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.

VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near its current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).

dns_snek
2 months ago
1 reply
According to Google's latest environmental report[1] that number was 30 TWh per year in 2024, but as far as I can tell that's their total consumption of their datacenters, which would include everything from Google Search, to Gmail, Youtube, to every Google Cloud customer. Is it broken down by product somewhere?

30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.

Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.

[1] https://sustainability.google/reports/google-2025-environmen...
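The conversion behind that 3.4 GW figure is simple dimensional analysis; a sketch using the energy totals cited in this thread:

```python
def twh_per_year_to_avg_gw(twh: float) -> float:
    """Average power implied by an annual energy total."""
    hours_per_year = 365 * 24           # 8,760
    return twh * 1000 / hours_per_year  # TWh -> GWh, divided by hours

print(round(twh_per_year_to_avg_gw(30), 2))  # 3.42  (Google DCs, 2024)
print(round(twh_per_year_to_avg_gw(15), 2))  # 1.71  (the 15 TWh figure)
```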

onlyrealcuzzo
2 months ago
Data centers typically use 60% (or less) on average of their max rating.

You over-provision so that you (almost) always have enough compute to meet your customers' needs (even at planet scale, your demand is bursty); you're always doing maintenance on some section, spinning up new hardware and turning down old hardware.

So, apples to apples, this would likely not even be 2x at 30TWh for Google.

tmiku
2 months ago
1 reply
For other readers: "15 TWh per year" is equivalent to 1.71 GW of average power, 17.1% of the "10GW" number used to describe the deal.
mNovak
2 months ago
This is ignoring the utilization factor though. Both Google and OpenAI have to overprovision servers for worst-case simultaneous usage. So 1.71 GW average doesn't tell us the maximum instantaneous GW capacity of Google; if we pull a 4x out of the hat (i.e. peak usage is 4x above average), it becomes ~7 GW of available compute.

More than a "Google" of new compute is of course still a lot, but it's not many Googles' worth.

Capricorn2481
2 months ago
Does Google not include AI?
jazzyjackson
2 months ago
3 replies
I mean if 10GW of GPUs gets us AGI and we cure cancer then that's cool, but I do get the feeling we're just getting uncannier chatbots and fully automated TikTok influencers.
rebolek
2 months ago
And when it’s built, Sam Altman will say: We are so close, if we get 10TW, AGI will be here next year!
junon
2 months ago
This is also my take. I think a lot of people miss the trees for the forest (intentionally backward).

AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.

yard2010
2 months ago
Current LLMs are just like farms. Instead of tomatoes by the pound, you buy tokens by the pound. So it depends on the customers.
diego_sandoval
2 months ago
Do you think the existence of NYC and Chicago is insanely wasteful?
az226
2 months ago
Vera Rubin will be about 2.5kw and Feynman will be about 4kw.

All-in, you’re looking at a higher footprint maybe 4-5kw per GPU blended.

So about 2 million GPUs.

sandworm101
2 months ago
At this scale, I would suggest that these numbers are for the entire data center rather than a sum of the processor demands. Also, the "infrastructure partnership" language suggests more than just compute. So I would add cooling into the equation, which could be as much as half the power load, or more, depending on where they intend to locate these datacenters.
alphabetag675
2 months ago
Account for around 3 MW for every 1,000 GPUs. So 10 GW is around 3,333 blocks of 1,000 GPUs at 3 MW each, i.e. roughly 3.33M GPUs.
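That rule of thumb can be checked directly (the 3 MW per 1,000 GPUs figure is the commenter's all-in estimate, overhead included):

```python
mw_per_1000_gpus = 3        # commenter's all-in rule of thumb
total_mw = 10 * 1000        # 10 GW in MW

blocks = total_mw / mw_per_1000_gpus   # how many 1,000-GPU blocks fit
print(f"{blocks * 1000:,.0f} GPUs")    # 3,333,333 GPUs
```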
gmm1990
2 months ago
7 replies
Strange unit of measurement. Who would find that more useful than expected compute or even just the number of chips.
leetharris
2 months ago
1 reply
Probably because you can't reliably predict how much compute this will lead to. Power generation is probably the limiting factor in intelligence explosion.
sedawkgrep
2 months ago
That, and compute always goes up.
zozbot234
2 months ago
3 replies
It's a very useful reference point actually because once you hit 1.21 GW the AI model begins to learn at a geometric rate and we finally get to real AGI. Last I've heard this was rumored as a prediction for AI 2027, so we're almost there already.
jsnell
2 months ago
1 reply
1.21GW is an absurd level of precision for this kind of prediction.
leptons
2 months ago
It's from the movie "Back to the Future"
the_70x
2 months ago
Came only here searching for 1.21GW
outside2344
2 months ago
Is this a crafty reference to Back to the Future? If so I applaud you.
ben_w
2 months ago
For a while, it's become increasingly clear that the current AI boom's growth curve rapidly hits the limits of the existing electricity supply.

Therefore, they are listing in terms of the critical limit: power.

Personally, I expect this to blow up first in the faces of normal people who find they can no longer keep their phones charged or their apartments lit at night, and only then will the current AI investment bubble pop.

skhameneh
2 months ago
I wouldn't be surprised if power consumption is a starting point due to things like permitting and initial load planning.

I imagine this as a subtractive process starting with the maximum energy window.

aprdm
2 months ago
At large scales a lot of it is measured on power instead of compute, as power is the limitation
isoprophlex
2 months ago
If a card costs x money, and operating it every year/whatever costs y money in electricity, and y >> x, it makes sense to mostly talk about the amount of electricity you are burning.

Because if some card with more FLOPS comes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y / for no appreciable change in how much you're spending to operate.

(I have no idea if y is actually much larger than x)

credit_guy
2 months ago
A point of reference is that the recently announced OpenAI-Oracle deal mentioned 4.5 GW. So this deal is more than twice as big.
xnx
2 months ago
9 replies
What does this mean? "To support the partnership, NVIDIA intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed."
vlovich123
2 months ago
1 reply
Nvidia is buying their own chips and counting it as a sale. In exchange they’re maybe getting OpenAI stock that will be worth more in the future. Normally this would count as illegally cooking the books I think but if the OpenAI investment pays off no one will care.
toomuchtodo
2 months ago
2 replies
What if it doesn't?
nutjob2
2 months ago
It's a good question since it's probably the 99% case.
vlovich123
2 months ago
Still unlikely they’d get prosecuted because they’re not trying to hide how they’re doing this and there’s no reasonable expectation that OpenAI is likely to fold. I doubt they’d improperly record this in their accounting ledger either.
patapong
2 months ago
Perhaps it means OpenAI will pay for the graphics card in stock? Nvidia would become an investor in OpenAI thereby moving up the AI value chain as well as ensuring demand for GPUs, while OpenAI would get millions of GPUs to scale their infrastructure.
jstummbillig
2 months ago
I am confused as to what the question is.
solarexplorer
2 months ago
That they will invest $10 in OpenAI for each watt of NVIDIA chips that is deployed? EDIT: In steps of 1GW, it seems.
re-thc
2 months ago
> What does this mean?

> to invest up to

i.e. 0 to something something

dsr_
2 months ago
It means this is a bubble, and Nvidia is hoping that their friends in the White House will keep them from being prosecuted, or at least from substantial penalties.
dtech
2 months ago
They're investing in kind. They're paying with chips instead of money
mmmllm
2 months ago
They will transfer the money to buy their own chips right before each chip is purchased
losteric
2 months ago
so nvidia's value supported by the value of AI companies, which nvidia then supports?
isodev
2 months ago
3 replies
> Strategic partnership enables OpenAI to build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems representing millions of GPUs

I know watts but I really can’t quantify this. How much of Nvidia is there in the amount of servers that consume 10GW? Do they all use the same chip? What if there is newer chip that consumes less, does the deal imply more servers? Did GPT write this post?

mr_toad
2 months ago
You don’t need AI to write vague waffly press releases. But to put this in perspective an H100 has a TDP of 700 watts, the newer B100s are 1000 watts I think?

Also, the idea of a newer Nvidia card using less power is très amusant.

az226
2 months ago
$150-200B worth of hardware. About 2 million GPUs.

So this investment is somewhat structured like the Microsoft investment where equity was traded for Azure compute.

nick__m
2 months ago
A 72-GPU NVL72 rack consumes up to 130 kW, so it's a little more than 5,500,000 GPUs
fufxufxutc
2 months ago
11 replies
In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times.
klysm
2 months ago
3 replies
Is it counting revenue multiple times? It's buying your own products really, but not sure how that counts as double counting revenue
fufxufxutc
2 months ago
1 reply
The "investment" came from their revenue, and will be immediately counted in their revenue again.
weego
2 months ago
In this case it seems that if we're being strict here the investment could then also show up as fixed assets on the same balance sheet
rsstack
2 months ago
3 replies
Customer A pays you $100 for goods that cost you $10. You invest $100-$10=$90 in customer B so that they'll pay you $90 for goods that cost you $9. Your reported revenue is now $100+$90=$190, but the only money that entered the system is the original $100.
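The round-tripping arithmetic in this example, spelled out (hypothetical figures from the comment, not a model of the actual Nvidia/OpenAI terms):

```python
revenue_a = 100                      # customer A pays $100
cost_a = 10                          # the goods cost $10 to make
investment = revenue_a - cost_a      # $90 invested in customer B
revenue_b = investment               # B spends it all back with you

reported_revenue = revenue_a + revenue_b   # what the income statement shows
new_cash_in_system = revenue_a             # what actually entered the system

print(reported_revenue, new_cash_in_system)  # 190 100
```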
Aurornis
2 months ago
1 reply
Yes, but you’ve also incurred a $90 expense in purchasing the stock of Company B and that stock is on the balance sheet.

In the actual shady version of this, Company B isn’t the hottest AI investment around, it’s a shell company created by your brother’s cousin that isn’t actually worth what you’re claiming on the balance sheet because it was only created for the round tripping shell game.

mxschumacher
2 months ago
that's what Carvana is doing with the car loans it securitizes : https://hindenburgresearch.com/carvana/
creddit
2 months ago
1 reply
Except that this isn't round-tripping at all. Round-tripping doesn't result in a company actually incurring expenses to create more product. Round-tripping is the term for schemes that enable you to double-count assets/revenue without any economic effects taking place.

Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech is doing is illegal. This is definitively neither illegal nor shady. If Nvidia believes, for example, that OpenAI can use their GPUs to turn a profit, then this is inherently positive sum economically for both sides: OpenAI gets capital in the form of GPUs, uses them to generate tokens which they sell above the cost of that capital and then the return some of the excess value to Nvidia. This is done via equity. It's a way for Nvidia to get access to some of the excess value of their product.

bob1029
2 months ago
1 reply
At some point one might simply argue that the nature and timing of these wildly fantastical press releases is tantamount to a "scheme to defraud".
creddit
2 months ago
“ Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech is doing is illegal.”
FinnKuhn
2 months ago
And your valuation also rises as a consequence of your increased revenue.
lumost
2 months ago
It's real revenue, but you are operating a fractional-reserve revenue operation. If the person you're investing in has trouble, or you have trouble, the whole thing falls over very fast.
Aurornis
2 months ago
1 reply
This is being done out in the open (we’re reading the press announcement) and will be factored into valuations.

Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.

mxschumacher
2 months ago
what is OpenAI's durable, competitive advantage that differentiates it against the numerous other LLM providers? Investing at a $500bn valuation for a company that's losing money and has bad unit economics this seems rather aggressive.
landl0rd
2 months ago
2 replies
Nvidia has consistently done this with Coreweave, Nscale, really most of its balance sheet investments are like this. On the one hand there's a vaguely cogent rationale that they're a strategic investor and it sort of makes sense as an hardware-for-equity swap; on the other, it's obviously goosing revenue numbers. This is a bigger issue when it's $100B than with previous investments.

It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.

yannyu
2 months ago
1 reply
A relevant joke, paraphrased from the internet:

Two economists are walking in a forest when they come across a pile of shit.

The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.

They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats a pile of shit.

Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."

"That's not true", responded the second economist. "We increased total revenue by $200!"

paxys
2 months ago
The punchline is supposed to be GDP, but yeah, same concept.
hoosieree
2 months ago
This should go without saying but unfortunately it really doesn't these days:

This kind of corporate behavior is bad and will end up hurting somebody. If we're lucky the fallout will only hurt Nvidia. More likely it will end up hurting most taxpayers.

rzerowan
2 months ago
1 reply
It's the same loop-de-loop NVIDIA is doing with Coreweave, as I understand it. 'Investing' in Coreweave, which then 'buys' NVIDIA merch for cloud rental, resulting in Coreweave being one of the top 4 customers of NVIDIA chips.
vessenes
2 months ago
1 reply
Wait, why the quotes? NVDA sends cash, and the Coreweave spends it, no? I don’t think quotes are accurate, if they imply these transactions aren’t real, and material. At the end of the day, NVDA owns Coreweave stock, and actual, you know, physical hardware is put into data centers, and cash is wired.
rzerowan
2 months ago
Well, we do have the precedent of HP/Autonomy in the UK [1], which ruled that the process is essentially fraud. Whether there will be a prosecution in the current corporate environment remains to be seen. Essentially, though, the round-trip revenue inflation was already ruled illegal.

[1]https://www.corpdev.org/2025/07/23/hp-awarded-945-million-in...

GuB-42
2 months ago
3 replies
I don't really understand how it is round tripping.

In the end, Nvidia will have OpenAI shares, which are valuable, and OpenAI will have GPUs, which are also valuable. It is not fake revenue, the GPUs will be made, sold at market price, and used, they are not intended to be bought back and sold to another customer. And hopefully, these GPUs will be put to good use by OpenAI so that they can make a profit, which will give Nvidia some return on investment.

It doesn't look so different from a car loan, where the dealer lends you the money so that you can buy their car.

treis
2 months ago
1 reply
If OpenAI doesn't pan out, then Nvidia has worthless OpenAI stock and OpenAI has a pile of mostly useless GPUs.
dwaltrip
2 months ago
That’s still not round tripping?
nmfisher
2 months ago
A dollar is always a dollar, so it's hard to claim that $1 million in revenue is actually worth $10 million. OpenAI shares, on the other hand, aren't publicly traded, so it's much easier to claim they're worth $10 million when no one would actually be willing to pay more than $1 million.

It's not necessarily manipulative but it's also not exactly an arms-length purchase of GPUs on the open market.

udkl
2 months ago
It looks like NVIDIA is looking to move up the value chain to have a stake in the even higher-margin addressable market instead of simply selling the tools.
t0mas88
2 months ago
Oracle also announced a lot of future revenue from AI, while they're part of Stargate Partners that is investing in OpenAI. Similar deal...
Mistletoe
2 months ago
Isn’t our stock market basically propped up on this AI credits etc. house of cards right now?
mandeepj
2 months ago
> this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product.

Microsoft and Google have been doing it for decades. Probably, MS started that practice.

selectodude
2 months ago
This is some Enron shit. Lets see NVDA mark to market these profits. Keep the spice flowing.
rsync
2 months ago
"In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times."

... and we've seen this before in previous bubbles ...

FinnKuhn
2 months ago
They for example did a similar deal with Nscale just last week.

https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...

DebtDeflation
2 months ago
3 replies
Wouldn't Nvidia be better served investing the $100B in expanding GPU manufacturing capacity?
ecshafer
2 months ago
By investing in TSMC? By buying TSMC? I don't think $100B would buy them enough current generation capacity to make a difference from scratch.
paxys
2 months ago
They don't have to pick just one.
vessenes
2 months ago
They’re already spending as much money as they possibly can on growth, and have no further use for cash currently - they’ve been doing share buybacks this year.
TheRealGL
2 months ago
5 replies
Did I miss the part where they mention the 10 large nuclear plants needed to power this new operation? Where's all the power coming from for this?
HDThoreaun
2 months ago
4 replies
Build this thing in the middle of the desert and you would need around 100 square miles of solar panels plus a fuckload of batteries for it to be energy independent. The solar farm would be around $10 billion, which is probably far less than the GPUs cost.
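A rough sanity check on that sizing lands in the same ballpark as the figures in this thread (a sketch; the 25% capacity factor and ~100 MW/km² nameplate density are my assumptions, not from the comment):

```python
avg_load_gw = 10
capacity_factor = 0.25      # assumed for desert solar
mw_per_km2 = 100            # assumed nameplate density for utility-scale PV

nameplate_mw = avg_load_gw * 1000 / capacity_factor   # 40,000 MW of panels
area_km2 = nameplate_mw / mw_per_km2                  # land required
area_sq_mi = area_km2 / 2.59                          # km^2 -> sq mi

print(round(area_km2), round(area_sq_mi))  # 400 154
```

Batteries for overnight operation would add substantially to both the cost and the footprint.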
xnx
2 months ago
Dissipating 10GW of heat is also a challenge in a sunny, hot, dry environment.
newyankee
2 months ago
100 sq km should suffice
udkl
2 months ago
$10 billion is small change compared to the estimated all-inclusive cost of $10 billion for EACH 500MW data center ... $200 billion for 10GW.
boringg
2 months ago
Won't get you the necessary four nines of uptime and energy, sadly. I'm still 100% for this -- but we need another model for energy delivery.
catigula
2 months ago
2 replies
Consumer electric grids.
delfinom
2 months ago
1 reply
Yep. Consumers are screwed and $500/month electric bills are coming for the average consumer within a year or two. We do not have the electricity available for this.
leptons
2 months ago
2 replies
I'm pretty average, living in a small home, and my electric bill is already >$500/mo in the summer, and that's with the A/C set at 76F during the day.
thorncorona
2 months ago
1 reply
Where do you live? How old is your house?

500 is insane.

catigula
2 months ago
I don't expect him to tell you where he lives but my bill EXPLODED recently due to what I now know is data center demand.
t0mas88
2 months ago
How many kWh is that? At those amounts solar panels seem like a no-brainer business case?
davis
2 months ago
Exactly this. This is essentially a new consumer tax in your electrical bill. The buildout of the electrical grid is being put on consumers essentially as a monthly tax with the increase in electrical costs. Everyone in the country is paying for the grid infrastructure to power these data centers owned by trillion dollar companies who aren't paying for their needs.
nitwit005
2 months ago
1 reply
I assumed this headline was not aimed at the public, but at some utility they want to convince to expand capacity. Otherwise, bragging about future power consumption seems a bit perplexing.
Ianjit
2 months ago
Or to assuage investors participating in the OpenAI secondary on the issue of cash burn.
nutjob2
2 months ago
Also, the fact that they announce not how much computing power they are going to deploy but rather how much electricity it's going to use (as if power usage were a useful measure of processing power) is kind of gross.

"Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"

hoosieree
2 months ago
Also water. You will be rationed, OpenAI will not.

https://www.newstarget.com/2025-08-02-texas-ai-data-centers-...

lumenwrites
2 months ago
1 reply
Yaay, one step closer to torment nexus.
nh23423fefe
2 months ago
low effort comment, whose content is a stale reference to other low effort memes
zuInnp
2 months ago
3 replies
Yeah, who cares about the environment... who needs water and energy, if your AI agent can give you a better pep talk?
rlv-dan
2 months ago
Don't forget about better filters for influencers talking about the climate crisis!
novaRom
2 months ago
What's the purpose of having access to smart assistants if it doesn't result in meeting your basic needs or improving your quality of life? Who is spending now? Only high-income households, while the majority are struggling with high utility bills and grocery prices - very basic needs.
tgv
2 months ago
When fucking up the human mind isn't enough. This is really villainous.

And before you think that's nonsense, let's not forget these people are accelerationists. Destroying the fabric of society is their goal.

moduspol
2 months ago
1 reply
Waiting patiently for the Ed Zitron article on this...
nextworddev
2 months ago
2 replies
He single-handedly cost people more than anyone with his bearish takes lol
topaz0
2 months ago
1 reply
Or he saved them more than anyone by limiting their losses when it does finally crash
nextworddev
2 months ago
except he called the top in 2023
mikhmha
2 months ago
1 reply
It's not a good argument against him. I read his articles and he is absolutely correct about the state of things. Predicting the crash is a fool's errand. I don't use that as an argument to discredit what he actually writes regarding the raw economics of the AI industry.

I say this as someone who has been holding NVDA stock since 2016 and can cash out for a large sum of money. To me it's all theoretical money until I actually sell. I don't factor it into financial planning.

You don't see me being a cheerleader for NVDA. Even though I stand to gain a lot, I will still tell you that the current price is way too high and Jensen Huang has gotten high off his own supply and "celebrity status".

After all, we can't all buy NVDA stock and get rich off it. Is it truly possible for all 30,000+ NVDA employees to become multi-millionaires overnight? That's not how capitalism works.

nextworddev
2 months ago
He’s been absolutely wrong on most things but spreading FUD is how he makes money, like Gary Marcus
hooloovoo_zoo
2 months ago
These $ figures based on compute credits or the investor's own hardware seem pretty sketchy.
bertili
2 months ago
What's in it for Nvidia? At the recent $300B valuation, 25% equity?
andreicaayoha
2 months ago
pls
searine
2 months ago
I look forward to subsidizing this effort with my skyrocketing home power bill.

472 more comments available on Hacker News

View full discussion on Hacker News
ID: 45335474 · Type: story · Last synced: 11/20/2025, 8:04:59 PM

Want the full context?

Jump to the original sources

Read the primary article or dive into the live Hacker News thread when you're ready.

Read Article · View on HN