Intel Arc Celestial dGPU Seems to Be First Casualty of Nvidia Partnership
Posted 4 months ago · Active 4 months ago
notebookcheck.net · Tech story · High profile
Skeptical · Negative · Debate · 80/100
Key topics
Intel Arc
GPU
Nvidia Partnership
The story reports that Intel's Celestial dGPU may be canceled due to Nvidia's partnership with Intel, but the discussion is dominated by skepticism about the rumor's source and the impact of the partnership.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 49m after posting
Peak period: 39 comments (2-4h)
Avg / period: 9.9
Comment distribution: 89 data points (based on 89 loaded comments)
Key moments
- 01 Story posted: Sep 19, 2025 at 9:51 AM EDT (4 months ago)
- 02 First comment: Sep 19, 2025 at 10:40 AM EDT (49m after posting)
- 03 Peak activity: 39 comments in the 2-4h window, the hottest period of the conversation
- 04 Latest activity: Sep 20, 2025 at 5:51 PM EDT (4 months ago)
Notably, this is about the third time in two years that he's reported that Intel's dGPU efforts are being killed off.
Even on the latest developments the reporting is contradictory, so someone is wrong and I suspect it's him. https://www.techpowerup.com/341149/intel-arc-gpus-remain-in-...
They're a channel focused on leaks, but most of their leaks are just industry insider gossip masked as factual to farm clicks. Their leaks are useless for any sort of predictions, but may be interesting if you'd like to know what insiders are thinking.
A quick Google search also yielded this[1] two-year-old Reddit thread that shows videos they deleted because their predictions were incorrect. There are probably many more. (That subreddit seems to be dedicated to trashing MLID.)
[1] https://www.reddit.com/r/BustedSilicon/comments/yo9l2i/colle...
Instead of invective, could you just say what specific leak of his was inaccurate? Everything he said about the Intel dGPU has happened exactly as he said it would. Have you watched his video about that yourself?
I stopped watching him completely around the time of the Intel dGPU release. He would show leaked roadmaps of Intel's dGPU launch with Celestial and Druid on there, but the video would be him basically repeating the narrative that the division is on the verge of cancellation, has no future, etc. The documents he has leaked almost never match the titles and narratives he pushes. He's not always wrong, but his biases are clear and the titles are more often misleading clickbait than factual.
Tom Petersen (Lead Intel GPU Engineer) has shown up in multiple interviews with LTT and GN, among other tech channels, and has talked at length about the things his team did on the current gen, with heavy implications of what's coming up next (as far as he can go without breaking NDA). His in-depth analyses of GPU architecture are a much better use of my time than listening to a guy who was given two leaked slides from a six-month-old PowerPoint speculate about how it spells doom for whatever company.
If you think logically, it makes zero sense to cancel Celestial now. According to Petersen, Arc's hardware team has been working on Druid for months now, and unless the software team is severely behind with the drivers and support, at the very least Celestial will receive a limited release. They already did a limited release of Battlemage to put more resources on Celestial; it would be a shame to throw all that effort away now.
Thanks, I'll check him out.
> limited release
This is in line with what Tom said about discrete being “effectively cancelled”.
I'm still interested in hearing what Tom has gotten wrong, and I don't mean cases where Intel's plans changed and made his old reporting out of date.
> What has he been wrong about
…
By all accounts I have seen, their single SKU from this second-generation consumer lineup has been well received. Yet the article says "what can only be categorized as a shaky and often rudderless business", without any justification.
Yes, it is worth pondering what the Nvidia investment means for Intel Arc Graphics, but "rudderless"? Really?
When it comes to GPUs, a $4T company probably couldn't care less what their $150B partner does in their spare time, as long as they prioritize the partnership. Especially when the GPUs in question are low-end units, in a segment where Nvidia faces no competition and which isn't even shipping that many units. If they actually asked them to kill it, it would be 100% out of pettiness.
Sometimes I wonder if these articles are written for clicks and these "leakers" are actually just the authors making stuff up and getting it right from time to time.
Intel has so many other GPU-adjacent products, and they will doubtless be continuing most of them even if they don't pursue Arc further: Jaguar Shores, Flex GPUs for VDI, and of course their Xe integrated graphics. I could possibly see Intel not shipping a successor to Flex? Maybe? I cannot see a world where they abandon Xe (first-party laptop graphics) or Jaguar Shores ("rack scale" datacenter "GPUs").
With all of that effort going into GPU-ish designs, is there enough overlap that the output/artifacts from those products support and benefit Arc? Or if Arc only continues to be a mid-tier success, is it thus a waste of fab allocation, a loss of potential profit, and an unnecessary expense in terms of engineers maintaining drivers, and so forth? That is the part I do not know, and why I could see it going either way.
I want to acknowledge that I am speaking out of my depth a bit: I have not read all of Intel's quarterly financials, and have not followed every zig and zag of every product line. Yet while I can see it both ways, in no world do I trust these supposed leaks.
Me too; I just really, really doubt that it would come from Nvidia.
Sufficiently far in the past you might have been able to get away with an integrated GPU that didn't even have meaningful 3D acceleration, but those days are gone. Even web browsers lean on the GPU to render content, which matters for battery life on iGPUs and makes performance per watt the name of the game. And that's the same thing that matters most for large GPUs, because the constraint on performance is power and thermals.
Which is to say, if you're already doing the work to make a competitive iGPU, you've done most of the work to make a competitive discrete GPU.
The thing Intel really needs to do is to get the power efficiency of their own process on par with TSMC. Without that they're dead; they can't even fab their own GPUs.
When he joined only a few months ago, he set the vision of making Intel a worthy participant in the AI space.
Then just a few months later, he announced "we cannot compete".
What happened in the middle? Recent articles came out about the conflict between him and Frank Yeary, the head of the Intel board. He wanted to acquire a hot AI startup, and Frank opposed it. Two factions were formed in the Board, and they lost a lot of time battling it out. While this was going on, a FAANG came in and bought the startup.
I think his announcement that Intel cannot compete was his way of saying "I cannot do it with the current Intel board."
Now whether they fired the right tens of thousands is another matter.
According to Wikipedia, for FY25:
Intel: 102,000 employees
AMD: 28,000 employees
Nvidia: 36,000 employees
I'm pretty sure the latter two have been growing headcount lately, and even then combined they still have fewer employees than Intel.
Now, Intel isn't doing as well on the chip-design side as AMD or Nvidia, nor as well on the fab side as TSMC, but I suspect that's down to leadership thrashing constantly more than anything else.
My read is basically that Intel's board is frustrated they can't part the company out, take their cash, and go home.
I'd also be incredibly frustrated working with a board that seems dead set on actively sabotaging the company for short-term gains.
>"On training, I think it is too late for us,"
Not too late for AI, but too late for training; meanwhile there's an inference opportunity, or something like that.
With their own fabs, at that
Seems like you'd prefer yet another +1 selling AI oil and promises ...
Same as how they messed up the Core Ultra desktop launch, of their own volition, by setting the prices so high that they can't even compete with their own 13th and 14th gen chips, to say nothing of Ryzen CPUs that are mostly better in both absolute terms and price/perf. A sidegrade isn't the end of the world, but a badly overpriced sidegrade is dead on arrival.
Idk what Intel is doing.
In other countries, any place that sold them at near MSRP was out of stock immediately (not that there was much stock to begin with, which is my critique), leaving other vendors to raise prices to around 350 USD, while the greedier ones and scalpers went all the way up to 400 USD. I saw those listings myself:
https://www.tomshardware.com/pc-components/gpus/where-to-buy...
https://www.digitaltrends.com/computing/intel-arc-b580-out-o...
Whether anyone actually bought those is another question entirely, but a GPU that's only available to you at 350 USD when it should have been sold for 250 USD (and performs well at that price) is just plain bad value.
It is nice that the prices have since dropped; however, the hype has already passed, so we're in the long tail of the product's lifecycle, especially with things like the CPU driver overhead being identified and getting lots of negative press. They had a chance to move a lot of units at launch thanks to positive reviews from pretty much everyone... and they fumbled it.
It seems like a bad choice at all times. A product with a 45% margin -- or a 5% margin -- is profitable. It's only the products with negative (net) margins that should be cut.
And meanwhile products with lower margins are often critical to achieving economies of scale and network effects. It's essentially the thing that put Intel in its current straits -- they didn't want to sell low-margin chips for phones and embedded devices which gave that market to ARM and TSMC and funded the competition.
Intel did this for memory in the 80s. Memory was still profitable, and could be more so again (see Micron), but it required much investment.
But Intel might not be in this position, and filling the fabs by itself can definitely be worth it.
But if you don't have the capacity in the new fab, maybe that isn't an issue, so it's hard to say from the outside.
It's usually not about the number of people. If you have two projects and both of them are profitable then you can hire more people and do both, even if one of them is more profitable than the other. The exception would be if that many qualified people don't exist in the world, but that's pretty rare and in those cases you should generally divert your focus to training more of them so you don't have such a small bus factor.
Another common mistake here is the sunk cost fallacy. If you have to invest ten billion dollars to be able to do both X and Y and call this five billion in cost for each, and at the end of that one of them has a 5% net margin and the other a 75% net margin, or even if the first one has a -5% net margin, you may not be right to cancel it if you still have to make the full ten billion dollar investment to do the other one. Because it's only -5% by including a five billion dollar cost that you can't actually avoid by canceling that product, and might be +20% if you only include costs that would actually be eliminated by canceling it.
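A rough sketch of that arithmetic, using the commenter's hypothetical figures plus an assumed revenue number chosen to reproduce them (none of these are real Intel financials):

    # Shared-cost vs. avoidable-cost margin, per the hypothetical numbers above.
    SHARED_INVESTMENT = 10e9            # needed whether you ship one product or both
    ALLOCATED = SHARED_INVESTMENT / 2   # "five billion in cost for each"

    revenue   = 20e9   # assumed revenue for the weaker product
    avoidable = 16e9   # costs that canceling it would actually eliminate

    naive_margin = (revenue - avoidable - ALLOCATED) / revenue   # -5%
    true_margin  = (revenue - avoidable) / revenue               # +20%

    print(f"margin with allocated shared cost: {naive_margin:+.0%}")
    print(f"margin on avoidable costs only:    {true_margin:+.0%}")

The product only looks unprofitable because it is being charged for half of an investment that canceling it would not recover.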
I also don't think the inability to find people is as rare as you suggest; it depends on your location and industry. It takes time to add people. Good people existing somewhere in the world wasn't enough, especially before remote work.
Also, if you can grow your 50%-margin business even a little bit faster by focusing on it over the 5%-margin business, it doesn't take much of a gain for that to be the better choice. So if achieving this 5% margin takes a lot of your best people, shifting them to the higher-margin business might make sense.
But I agree: if you are a mature company with the infrastructure needed to keep that product alive, then doing so makes sense. Especially because in the future it could become a more important, better business.
>"On training, I think it is too late for us,"
Not too late for AI, but too late for training meanwhile there's inference opportunity or something like that
https://morethanmoore.substack.com/p/nvidia-2026-q2-financia...
They've also not bothered investing in software to make the H100 work well with games in their consumer drivers. That doesn't mean it's impossible, and none of that takes away from the fact that the H100 and consumer GPUs are much more similar than they seem and could theoretically be made to run the same workloads at comparable performance.
They aren't just for gaming, there's also high-end workstations, but that's probably even more niche.
The only reason I can imagine for them leaving the money on the table is that they think that the AI boom won't last that much longer and they don't want to kill their reputation in the consumer market. But even in that case, I'm not sure it really makes that much sense.
Maybe if consumer GPUs were literally just datacenter silicon that didn't make the grade or something, it would make sense but I don't think that's the case.
I purposefully compare the AI boom with the dot-com bubble because we all know how important the internet eventually became, yet investments in it were way ahead of their time.
It is a false dichotomy. They can spend the bare minimum to stay in the gaming card market while fabbing AI cards. At this point that is just an insurance premium.
Everything on-die, and with chiplets in-package, is the Intel way.
Default, average integrated graphics will continue to "satisfice" for a greater and greater portion of the market as integrated graphics continue to grow in power.
The smaller the node, the lower the yield; chiplets are a necessity now (or architectural changes like Cerebras's).
But reducing die size will still increase yield, since you can pick and choose the good dies.
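For the intuition behind "pick and choose," here is a minimal sketch using the textbook Poisson yield model (yield ≈ exp(-defect density × die area)); the defect density and die area below are made-up illustrative values, not figures for any real node:

    import math

    # Poisson yield model: probability a die of a given area has zero defects.
    def poisson_yield(area_cm2, defects_per_cm2=0.2):   # 0.2/cm^2 is an assumed value
        return math.exp(-defects_per_cm2 * area_cm2)

    monolithic_area = 6.0                                # cm^2, assumed large GPU die
    big_die_yield = poisson_yield(monolithic_area)       # ~30% of big dies come out good
    chiplet_yield = poisson_yield(monolithic_area / 4)   # quarter-size chiplet: ~74%

    print(f"monolithic die yield: {big_die_yield:.0%}")
    print(f"single chiplet yield: {chiplet_yield:.0%}")

Because good chiplets can be picked from anywhere on the wafer and binned together, usable silicon tracks the per-chiplet yield rather than the much lower monolithic figure.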
Not to mention, vertical integration gave Intel flexibility, customization, and some cost-saving advantages that Nvidia didn't have as much of, Nvidia being a fabless designer and itself a customer of another for-profit fab (TSMC).
If TFA is true, this was an anticompetitive move by Nvidia to preemptively decapitate their biggest competitor in the 2030s datacenter GPU market.
Alchemist - First-gen GPUs. The A310 is the low end, the A770 the high end. Powerful hardware for cheap, very spotty software at release. Got fixed up later.
Battlemage - Second gen (current gen); only the B570 and B580 came out. They said they weren't going to release more Battlemage GPUs after these because they wanted to focus on Celestial, but they probably went back on that seeing how well the B580 was reviewed, and the B770 is due to be released by the end of the year.
Celestial - Next-gen GPUs, expected for release in early 2026. This article claims they were cancelled, but personally I think it's too late to cancel a GPU this far into production, especially when they basically skipped a generation to get it out faster.
I don't think that's fair, at all, for two reasons:
My whole career has been in HW. In the few startups I was in where I was privy to the higher-level decisions, the executives themselves were predicting whether hardware would continue next year. It's normal for things to flip-flop and for decisions to change, especially when a product isn't competitive and you don't have the resources to keep it that way. A leaker can be 100% correct in what they say the current decisions are, and 100% wrong the next week. I've seen it happen, with whole teams fired because they weren't needed anymore.
He makes it very clear when he's speculating and when he's relaying what he's been told, and how confident he is in each source (some leakers are proven trustworthy, some are not). This alone is why he doesn't deserve to be called a moron: morons can't comprehend or communicate uncertainty.
Then all of a sudden Nvidia decides to "invest" in Intel... and all of a sudden it sounds like Arc will probably be getting cancelled.