Nvidia Buys $5B Stake in Intel
Mood
heated
Sentiment
mixed
Category
other
Key topics
Nvidia is investing $5 billion in Intel, representing a 5% ownership stake, and the two companies will collaborate on developing x86 system-on-chips with Nvidia's RTX GPU chiplets, sparking discussions about the implications for the semiconductor industry and potential antitrust concerns.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 15m after posting
Peak period: 156 comments (Day 1)
Avg per period: 53.3 comments
Based on 160 loaded comments
Key moments
- 01. Story posted: Sep 18, 2025 at 7:04 AM EDT (2 months ago)
- 02. First comment: Sep 18, 2025 at 7:19 AM EDT (15m after posting)
- 03. Peak activity: 156 comments in Day 1 (the hottest window of the conversation)
- 04. Latest activity: Sep 21, 2025 at 2:07 PM EDT (2 months ago)
USA, where the federal government is picking winners and losers by making risky stock bets with public money.
This is needlessly divisive and devoid of any factual basis. No gulags will exist and you know it.
What about "Alligator Alcatraz", that has been called "concentration camp" [1] (so comparable with a gulag), or where the Korean detainees from the raid on the Hyundai/LG plant ended up, alleging utterly horrible conditions [2]? And there's bound to be more places like the latter, that was most likely just the tip of the iceberg and we only know about the conditions there because the South Korean government raised a huge stink and got the workers out of there.
Okay, Alcatraz 2.0 did get suspended in August to my knowledge, but that's only temporary. It's bound to get the legal issues cleaned up and then be re-opened - or the case makes its way through to the Supreme Court with the same result to be expected.
[1] https://newrepublic.com/article/197508/alligator-alcatraz-tr...
ICE doesn't reliably make any distinction, not since they hired thugs off of the streets and issued arrest quotas. Doesn't matter if the arrested have to be released later on.
Any denial of due process to any person is a gross violation of our most important right. Without the guarantee of due process to everyone, no one has any rights because those in power can violate rights at a whim.
Ofc I would kind of hope/expect antitrust regulators to object, given that Intel makes both GPUs and CPUs, and Nvidia has dipped its toes into CPU production as well.
Intel still has to go through a lot of reorg (i.e. massive cuts) to get to a happy place, and this is what their succession of CEOs have been procrastinating over.
One wonders just how bad things must have been internally for that to be the state of one of their core IPs in this day and age...
However, I do imagine Intel GPUs, which were never great to start with, might be doomed long term.
Another possibility would be: there goes oneAPI, which I doubt many people would care about, given how many rebrands SYCL has already gone through.
I mean that also applies to Intel and Nvidia. Intel does make GPUs but their market impact is basically zero.
As customer they get better access to Intel Foundry and can offload some capacity from TSMC.
As I understand it the government's shares are non-voting.
Intel's market cap is just 2.5% of NVDA's, so Nvidia could give away just 2.5% of its stock to buy the entirety of Intel. It's bonkers.
It looks like a good deal either way and in any amount. But of course I am no expert.
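That 2.5% claim is easy to sanity-check. A minimal back-of-the-envelope sketch; the market-cap figures are illustrative assumptions roughly matching the period, not numbers from the thread:

```python
# Sanity check of the "2.5% of NVDA" claim; both caps are assumed figures.
nvda_cap = 4.25e12   # Nvidia market cap, ~$4.25T (assumed)
intc_cap = 0.105e12  # Intel market cap, ~$105B (assumed)

ratio = intc_cap / nvda_cap
print(f"Intel is ~{ratio:.1%} of Nvidia's market cap")  # ~2.5%

# In a hypothetical all-stock buyout, Nvidia's existing holders
# would be diluted by only:
dilution = intc_cap / (nvda_cap + intc_cap)
print(f"Dilution: ~{dilution:.1%}")  # ~2.4%
```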
This all ignores the near complete lack of product out of their advanced processes as well.
There have been a lot of mergers where that has not happened.
They basically baked in a massive investment profit into the deal. When you factor in the stock jump since this announcement, Nvidia has already made billions.
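A minimal sketch of the paper-gain arithmetic behind "already made billions". The per-share prices here are assumptions for illustration (the deal price was reported at around $23 per share; the post-announcement price is a guess):

```python
investment = 5e9   # the $5B investment
buy_price = 23.28  # $/share, reported deal price (treat as assumed)
mkt_price = 30.00  # $/share after the announcement pop (assumed)

shares = investment / buy_price
paper_gain = shares * (mkt_price - buy_price)
print(f"Shares acquired: ~{shares / 1e6:.0f}M")  # ~215M
print(f"Paper gain: ~${paper_gain / 1e9:.1f}B")  # ~$1.4B with these inputs
```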
:-)
[0]: <https://www.fudzilla.com/6882-nvidia-continues-comic-campaig...>
Erm, a rather important point to bury deep in the story. The first question on anyone’s lips will be: is this $5bn to build new chip technology, or $5bn for employees to spend on yachts?
> Intel stock experienced dilution because the U.S. government converted CHIPS Act grants into an equity stake, acquiring a significant ownership percentage at a discounted price, which increased the total number of outstanding shares and reduced existing shareholders' ownership percentage, according to The Motley Fool and Investing.com. This led to roughly 11% dilution for existing shareholders
Intel is up 30% pre market on this news so I think the existing shareholders will be fine.
To get money from the outside, you either have to take on debt or you have to give someone a share in the business. In this case, the board of directors concluded the latter is better. I don't understand why you think it is gross.
The business is looking for additional capital. You can only do that by either selling new shares or raising debt.
> in the business like everyone else, not increasing the total share count or causing dilution. They chose not to do this because it would have been more expensive due to properly compensating existing shareholders. So it's spiritually just theft.
Shareholder dilution isn't inherently theft. Specific circumstances, motivations, and terms of issuance have a bearing on whether the dilution is harmful or whether it is necessary for the business.
For instance, it can be harmful if: minority shareholders are oppressed, shares are issued at a deeply discounted price with no legitimate business need or to benefit insiders at the expense of other shareholders, or the raised capital isn't used effectively to grow the company.
Dilution can be beneficial, such as when the raised capital is used for growth, employee compensation via employee stock options, etc.
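For concreteness, a minimal sketch of the dilution arithmetic behind the quoted ~11% figure; the share counts and prices are assumed for illustration, not Intel's actual numbers:

```python
existing_shares = 4.4e9  # pre-deal shares outstanding (assumed)
new_shares = 0.55e9      # newly issued shares (assumed)

total = existing_shares + new_shares
dilution = 1 - existing_shares / total
print(f"Ownership dilution: {dilution:.1%}")  # ~11.1%

# Dilution isn't automatically a loss: an existing holder keeps the same
# share count, so if the price rises ~30% on the news, the stake's dollar
# value rises even though its ownership percentage shrinks.
price_before, price_after = 24.0, 31.0  # $/share, assumed (~30% pop)
held = 0.01 * existing_shares           # a 1%-of-old-float position
print(f"Stake value: ${held * price_before / 1e9:.2f}B -> "
      f"${held * price_after / 1e9:.2f}B")
```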
Looks like using GPU IP to take over other brands' product lines is now officially an nVidia strategy.
I guess the obvious worry here is whether Intel will continue development of their own dGPUs, which have a lovely open driver stack.
So long as the AI craze is hanging in there it feels like having that expertise and IP is going to have high potential upside.
Would be foolish to throw that away now that they're finally getting closer to "a product someone may want to buy" with things like B50 and B60.
They wanted to launch DGX Spark in early summer and it's nowhere to be seen, while Strix Halo is shipping in 30+ SKUs from all major manufacturers.
AMD's actual commitment to open innovation over the past ~20 years has been game changing in a lot of segments. It is the aspect of AMD that makes it so much more appealing than intel from a hacker/consumer perspective.
> Nvidia will also have Intel build custom x86 data center CPUs for its AI products for hyperscale and enterprise customers.
Hell has frozen over at Intel. Actually listening to people that want to buy your stuff, whatever next? Presumably someone over there doesn't want the AI wave to turn into a repeat of their famous success with mobile.
In the event Intel ever does get US-based fabrication semi-competitive again (and the national security motivation for doing so is intense), nVidia will likely have to be a major customer, so this does make sense. I remain doubtful that Intel can pull it off; it may have to come from someone else.
They turned down Acorn about the 286, which led to Acorn creating the Arm, they have turned down various console makers, they turned down Apple on the iPhone, and so on. In all cases they thought the opportunities were beneath them.
Intel has always been too much about what they want to sell you, not what you need. That worked for them when the two aligned over backwards compat.
Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
The problem is, console manufacturers know precisely how much of their product they anticipate selling, and it's usually a lot. The PlayStation 5 is at 80 million units so far.
And at that scale, the console manufacturers want to squeeze every vendor as hard as they can... and Intel didn't see the need to engage in a bidding war with AMD that would have given them a sizable revenue but very little profit margin compared to selling Xeon CPUs to hyperscalers where Intel has much more leverage to command higher prices and thus higher margins.
> they turned down Apple on the iPhone
Intel just was (and frankly, still is) unable to compete on the power envelope with ARM, that's why you never saw x86 take off on Android as well despite quite a few attempts at it.
Apple only chose to go with Intel for its MacBook line because PowerPC was practically dead and offered no way to extract more performance, and they dropped Intel as soon as their own CPUs were competitive. To get Intel CPUs to the same level of power efficiency that M-series CPUs have would require a full rework of the entire CPU infrastructure and external stack, which would require money that even Intel at its best frankly did not have. And getting x86 power-efficient enough for a phone? Just forget it.
> Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
Actually, that is surprising to me as well. NVIDIA's Tegra should easily be powerful enough to run the OS for training or inference workloads. If I were to guess, NVIDIA wants to avoid getting caught too hard on the "selling AI shovels" train.
They did push hard on their UMPC x86 SoCs (Paulsbo and derivatives) to Sony, Nokia, etc. These were never competitive on heat or battery life.
You probably meant Poulsbo (US15W) chipset
80 million in 5 years is a nothing burger as far as volume.
NVDA sold 153 million Tegra units to Nintendo in 8 years, roughly 1.6M units a month. That's entirely comparable.
[1] https://www.servethehome.com/on-ice-lake-intel-xeon-volumes-...
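A quick check of the run rates being compared, using the figures given in the thread:

```python
consoles = [
    ("PS5", 80e6, 5),      # ~80M units in ~5 years
    ("Tegra", 153e6, 8),   # Tegra/Switch: ~153M units in ~8 years
]
for name, units, years in consoles:
    print(f"{name}: ~{units / (years * 12) / 1e6:.2f}M units/month")
# PS5: ~1.33M/month, Tegra: ~1.59M/month -- the same order of magnitude.
```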
And so that gave AMD an opening, and with that opening they got to experiment with designs, tailor a product, get experience and industrial marketshare, and they were able to continue to offer more and better products. Intel didn't just miss a mediocre business opportunity, they missed out on becoming a trusted partner for multiple generations, and they handed market to AMD that AMD used to be a better market competitor.
AMD isn't precisely a market competitor. The server and business compute market is still firmly Intel and there isn't much evidence of that changing unless Apple drops M series SoCs to the wide open market which Apple won't do. Intel could probably release a raging dumpster fire and still go strong, oh wait, that's what they've been doing the last few years.
AMD is only a competitor in the lower end of the market, a market Intel has zero issue handing to AMD outright - partially because a viable AMD keeps the antitrust enforcers from breathing down their neck, but more because it drags down per-unit profit margins to engage in consoles and the lower rungs and niches.
This is not true anymore, as it IS changing, and very rapidly. AMD has shot up to 27.3% of the server market share, which they haven't had since the Opteron days 20 years ago. Five years ago their server market share was very small single digits. They're half of desktops, too. https://www.pcguide.com/news/no-amd-and-intel-arent-50-50-in...
Intel, at one of its lowest lows, still came up with Lunar Lake, which is not as efficient as Apple's M series but is still quite impressive.
I bet if they had focused on mobile when they were at their peak, they could have come up with something similar to Apple's M series.
It leads to mistakes like you mention, where a new market segment or new entrant is not a sure thing. And then it leads to mistakes like Larrabee and Optane where they talk themselves into overconfidence (“obviously this is a great product, we wouldn’t be doing it if it wasn’t guaranteed to make $1B in the first year”).
It is very hard to grow a business with zero risk appetite. You can’t take risky high return bets, and you can’t acknowledge the real risk in “safe” bets.
2010-2011 was also the time that AMD was starting to moan a bit about DX11 and the higher-level APIs not being sufficient to get the most out of GPUs, which led to Mantle/Vulkan/DX12 a few years down the road. Intel did a bit regarding massively parallel software rendering, with the flexibility to run on anything x86 and implement features as you liked; AMD had its efforts toward "fusion" (APU+GPU, after recently acquiring ATi) and HSA, which I seem to recall was about dispatching different types of computing to the best-suited processor(s) in the system. However, I got the impression a lot of development effort was more interested in progressing on what it already had instead of starting in a new direction, and game studios want to ship a finished and stable/predictable product, which is where support from Intel would have helped.
But certainly Intel wasn’t willing to wait for the market. Didn’t make $1 billion instantly; killed.
Sounds like they will someday soon.
There will always be giant, faraway GPU supercomputer clusters to train models. But the future of inference (where the model fits) is local to the CPU.
It's typical corporate venturing and reporting to a CFO. Google is not much better with them cutting their small(er) projects.
This relates to the Intel problem because they see the world the way you just described, and completely failed to grasp the importance of SoC development where you are suddenly free to consider the world without the preexisting buses and peripherals of the PC universe and to imagine something better. CPU cores are a means to an end, and represent an ever shrinking part of modern systems.
This is very likely the new culture that LBT is bringing in. This can only be good.
But with the state of the courts today... who knows..
Intel was well on its way to becoming a considerable threat to NVIDIA with its Arc line of GPUs, which are getting better and cheaper with each generation. Perhaps not in the enterprise and AI markets yet, but certainly on the consumer side.
This news muddies this approach, and I see it as a misstep for both Intel and for consumers. Intel is only helping NVIDIA, which puts them further away from unseating them than they were before.
Competition is always a net positive for consumers, while mergers are always a net negative. This news will only benefit shareholders of both companies, and Intel shareholders only in the short-term. In the long-term, it's making NVIDIA more powerful.
I think this partnership will damage nvidia. It might damage intel, but given they're circling the drain already, it's hard to make matters worse.
It's probably bad for consumers in every dimension.
Or to take the opposite, if nvidia rolled over intel and fired essentially everyone in the management chain and started trying to run the fabs themselves, good chance they'd turn the ship around and become even more powerful than they already are.
How was Intel "circling the drain"?
They have a very competitive offering of CPUs, APUs, and GPUs, and the upcoming Panther Lake and Nova Lake architectures are very promising. Their products compete with AMD, NVIDIA, and ARM SoCs from the likes of Apple.
Intel may have been in a rut years ago, but they've recovered incredibly well.
This is why I'm puzzled by this decision, and as a consumer, I would rather use a fully Intel system than some bastardized version that also involves NVIDIA. We've seen how well that works with Optimus.
Also, their network cards no longer work properly, which is deeply aggravating, as that used to be something I could rely on. I just bought some Realtek ones to work around the Intel ones falling over.
We must live in different universes, then.
Intel's 140V competes with and often outperforms AMD's 890M, at around half the power consumption.[1]
Intel's B580 competes with AMD's RX 7600 and NVIDIA's RTX 4060, at a fraction of the price of the 4060.[2]
They're not doing so well with desktop and laptop CPUs, although their Lunar Lake and Arrow Lake CPUs are still decent performers within their segments. The upcoming Panther Lake architecture is promising to improve this.
If these are not the signs of competitive products, and that they're far from "circling the drain", then I don't know what is.
FWIW, I'm not familiar with the health of their business, and what it takes to produce these products. But from a consumer's standpoint, Intel hasn't been this strong since... the early 00s?
[1]: https://www.notebookcheck.net/Radeon-890M-vs-Arc-140V_12524_...
[2]: https://www.notebookcheck.net/Intel-Arc-B580-Benchmarks-and-...
The GPUs might be competitive on price, but that's about it. It's pretty much a hardware open beta.
Like I said, Intel may not be market leader in some segments, but they certainly have very competitive products. The fact they've managed to penetrate the dGPU duopoly, while also making huge strides with their iGPUs, is remarkable on its own. They're not leaders on desktops and servers, but still have respectable offerings there.
None of this points to a company that's struggling, but to a healthy market where the consumer benefits. News of two rivals collaborating like this is not positive for consumers.
>a company that's struggling, but to a healthy market where the consumer benefits
I would argue that the market is only marginally healthier than, say, 2018. Intel is absolutely struggling. The 13th and 14th generation were marred by degradation issues and the 15th generation is just "eh", with no real reason to pick it over Zen. The tables have simply flipped compared to seven years ago; AMD at least is not forcing consumers to change motherboards every two years.
And Intel doesn't even seem to care too much that they're losing relevance. One thing they could do is enable ECC on consumer chips like AMD did for the entire Ryzen lineup, but instead they prefer to keep their shitty market segmentation. Granted, I don't think it would move too many units, but it would at least be a sign of good will to enthusiasts.
Some would say that's circling the drain.
Intel's foundry costs are probably competitive with nvidia too - nvidia has too much opportunity cost if nothing else.
Nah, nobody cares about that. Even in their heyday, SLI and CrossFire barely made sense technologically. That market is basically non-existent. There's more people now wanting to run multiple GPUs for inference than there ever were who were interested in SLI, and those people can mix and match GPUs as they like.
While it doesn't quite compete at performance and power consumption, it does at price/performance and overall value. It is a $250 card, compared to the $300 of the 4060 at launch. You can still get it at that price, if there's stock, while the 4060 hovers around $400 now. It's also a 12GB card vs the 8GB of the 4060.
So, sure, this is not competitive at the high-end segment, but it's remarkable what they've accomplished in just a few years, compared to the decades that AMD and NVIDIA have on them. It's definitely not far fetched to assume that the gap would only continue to close.
Besides, Intel is not only competing at GPUs, but APUs, and CPUs. Their APU products are more performant and efficient than AMD's (e.g. 140V vs 890M).
Consumers still have AMD as an alternative for very decent and price attractive GPUs (and CPUs).
AMD has always closely followed NVIDIA in crippling its cheap GPUs for any other applications.
After many years of continuously decreasing performance in "consumer" GPUs, only Intel's Battlemage GPUs offer FP64 performance comparable with what could easily be obtained 10 years ago but no longer today.
Therefore, if the Intel GPUs disappear, the choices in GPUs will certainly become much more restricted than they are today. AMD has almost never attempted to compete with NVIDIA on features, but whenever NVIDIA dropped a feature, so did AMD.
Moreover, there were claims that memory errors on the GTX Titan were quite frequent. In graphics applications memory errors seldom matter, but if you have to run a computation twice to be certain that no memory errors affected the results, that removes much of the performance advantage of a GPU.
A cheap GPU ten-plus years ago was $200-300. That GPU either had no FP64 units at all, or had them "crippled" just like today. What happened between then and now is that the $1k+ market segment became the $10k+ market segment (and the $200+ market segment became the $500+ market segment). That sucks, and nVidia and AMD are absolutely milking their customers for all they're worth, but nothing really got newly "crippled" along the way.
Intel isn’t at that point, but the company’s trajectory isn’t looking good. I’d happily sacrifice Arc to keep a duopoly in CPUs.
- a bigger R&D budget for their main competitor in the GPU market
- since Nvidia doesn't have their own CPUs, they risk becoming more dependent on their main competitor for total system performance.
This is why they built the Grace CPU - noting that they're using Arm's Neoverse V2 cores rather than their own design.
Doesn't feel the same, because the 1997 investment was arranged by Apple co-founder Steve Jobs. He had a long personal relationship with Bill Gates, so he could just call him to drop the outstanding lawsuits and get a commitment for future Office versions on the Mac. Basically, Steve Jobs, at the relatively young age of 42, was back at Apple in "founder mode" and made bold moves that the prior CEO, Gil Amelio, couldn't.
Intel doesn't have the same type of leadership. Their new CEO is a career finance/investor type instead of a "new products, new innovation" type of leader. This $5 billion investment feels more like the result of back-channel discussions with the US government, where they "politely" asked NVIDIA to help out Intel in exchange for fewer restrictions on selling chips to China.
Stinks of Mussolini-style Corporatism to me.
Let's assume Trump admin pressured Nvidia to invest in intel.
The CHIPS Act (voted in by Democrats / Biden) gave Intel up to $7.8 billion of YOUR money (taxes) in the form of direct grants.
Was it more of "Mussolini-style corporatism" to you or not?
It isn't the "method of communication". It's legislation vs. coercion (in the speculative scenario from the parent comment).
[0] https://en.wikipedia.org/wiki/Corporatism#Neo-corporatism
[1] https://en.wikipedia.org/wiki/Corporatism#Fascist_corporatis...
There was not that much democracy in the French post-WW2 technocratic establishment, but I agree that they were not technically fascist (nor otherwise).
It also happened under G. W. Bush with banks and auto manufacturers, but the worst offense was under Nixon with his nationalization of passenger rail.
At least with the bank and car manufacturer bailouts the government eventually sold off its stock, and with the Intel investment the government holds non-voting shares. But the government completely controls the National Railroad Passenger Corporation (the NRPC, aka Amtrak), with the board members appointed by the president of the United States.
We lost 20 independent railroads overnight, and created a conglomerate that can barely function.
If you fiddle and concentrate only on the top performers, the bottom falls out. Most of the US economy is still in small companies.
Investing in Apple and Borland was a counter-antitrust legal move: keeping the competitors alive, but on life support. This way they could say to the government, "yes, there is competition".
Google does the same these days by keeping Firefox alive.
https://www.ft.com/content/12adf92d-3e34-428a-8d61-c91695119...
Had Apple failed, Microsoft would probably have been found to have a clear monopolistic position. And Microsoft was already in hot water over Internet Explorer, IIRC.
Apple's demise would've nailed the case.