How Much OpenAI Spends on Inference and Its Revenue Share with Microsoft
Mood: heated
Sentiment: negative
Category: business
Key topics: OpenAI, AI costs, Microsoft partnership
The discussion revolves around a report questioning OpenAI's financials, particularly its inference costs and revenue, sparking concerns about the company's financial sustainability and potential 'cooking of the books'.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 12m after posting
Peak period: 21 comments (Day 1)
Avg / period: 11.5
Based on 23 loaded comments
Key moments
- Story posted: 11/12/2025, 4:38:15 PM (6d ago)
- First comment: 11/12/2025, 4:50:42 PM (12m after posting)
- Peak activity: 21 comments in Day 1 (hottest window of the conversation)
- Latest activity: 11/14/2025, 4:51:32 PM (4d ago)
> I also cannot reconcile these numbers with the reporting that OpenAI will have a cash burn of $9 billion in CY2025. On inference alone, OpenAI has already spent $8.67 billion through Q3 CY2025.
This is insane.
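A quick back-of-envelope sketch of why those figures are hard to reconcile. The only inputs are numbers quoted in the thread; the full-year revenue line is an explicit assumption for illustration, not a reported figure, and the burn-equals-spend-minus-revenue simplification ignores financing and prepayments.

```python
# Rough reconciliation check using only figures quoted in the thread.
# The full-year revenue number is an assumption for illustration, not a reported figure.

inference_through_q3 = 8.67   # $bn, inference spend through Q3 CY2025 (per the report)
reported_cash_burn = 9.0      # $bn, widely reported CY2025 cash burn

revenue_h1 = 2.273            # $bn, the report's H1 CY2025 revenue estimate
assumed_full_year_revenue = revenue_h1 * 2  # assume H2 roughly matches H1

# Treating cash burn as roughly total spend minus revenue, the reported burn implies:
implied_total_spend = reported_cash_burn + assumed_full_year_revenue
left_after_inference = implied_total_spend - inference_through_q3

print(f"Implied full-year spend: ${implied_total_spend:.2f}bn")
print(f"Remaining for Q4 inference, training, payroll, etc.: ${left_after_inference:.2f}bn")
```

On those assumptions, only about $4.9bn is left to cover Q4 inference plus all training, payroll, and other costs for the year, which is the tension the commenter is pointing at.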
How are you understanding the word 'balanced'? Do you mean _tone_, or something? Like, this is unusually dry and non-grumpy for Ed, but it's if anything more critical than his normal output.
brb buying puts
Sure, Nvidia is making crazy money right now, but what's going to happen to all those deals when the market blinks and some of those lesser players start falling over?
[0]: https://nymag.com/intelligencer/article/ai-investment-is-sta...
I honestly think it's a great model for incentive alignment and not that sketchy on the surface. For the manufacturer, it's guaranteed revenue with upside convexity. For the startup, it's better terms and priority from the manufacturers since they have a stake in your success.
Compute is a hard resource, inextricably linked to money, time, and energy.
The math doesn't work in Sam's favor, no matter how much smoke he blows up your ass.
It's going to be interesting when all these GPUs are repurposed to mine Bitcoin, and people try to forget falling for the hysteria that somehow you can arrive at AGI from a glorified Markov bot.
No GPU is good for bitcoin mining; that's all been ASICs for a long time. Even before anyone got around to making ASICs for it, FPGA-based designs had displaced GPU mining. Bitcoin mining is very, very simple.
Some altcoins use GPUs.
This, along with the fact that it is easier than ever to switch models today, and the fierce competition from Google and Anthropic.
Cannot wait for the pump and dump that the OpenAI IPO is going to be.
How high are OpenAI's compute costs? Possibly a lot higher than we thought
https://www.ft.com/content/fce77ba4-6231-4920-9e99-693a6c38e...
There's a story here that's in some ways bigger than OpenAI or Anthropic's finances. Someone is leaking very sensitive and private financial information to Ed. They're clearly getting these numbers from somewhere, and given the monthly breakdowns Ed posted previously for Anthropic, they are likely coming from a billing dashboard of some kind inside the big clouds. It's not very likely the leaks are coming from inside the AI labs themselves, given how cloud-specific and incomplete they are.
For a big cloud to have a rogue insider like this is huge. It's really rare for big tech firms to leak private data, and this report suggests MS, or whoever has this problem, hasn't been able to find the leaker, which is amazing. Surely these numbers can't be that widely distributed? If companies like OpenAI aren't safe from leaks then nobody is.
- OpenAI claims to have spent $2.5 billion on inference in H1 of 2025, report claims it actually spent $5.02 billion
- OpenAI claims to have made $3.7 billion in revenues in 2024, report claims it actually made $2.469 billion
- OpenAI claims to have made $4.3 billion in revenues in H1 of 2025, report claims it actually made $2.273 billion
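Purely as arithmetic on the three bullets above, here is a small sketch that prints the gap and ratio for each line item; it uses no figures beyond those quoted.

```python
# Claimed vs. reported figures from the thread, in $bn.
figures = {
    "Inference spend, H1 2025": (2.50, 5.02),   # (OpenAI's claim, report's figure)
    "Revenue, 2024":            (3.70, 2.469),
    "Revenue, H1 2025":         (4.30, 2.273),
}

for label, (claimed, reported) in figures.items():
    delta = reported - claimed
    ratio = reported / claimed
    print(f"{label}: claimed ${claimed}bn vs reported ${reported}bn "
          f"(delta {delta:+.2f}bn, ratio {ratio:.2f}x)")
```

Run as-is, it shows the inference spend figure roughly doubling while both revenue figures shrink by a third to a half.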