The Post-GeForce Era: What If Nvidia Abandons PC Gaming?
Key topics
The notion that Nvidia might abandon PC gaming has sparked a lively debate about the potential consequences, with some commenters worrying that it could spell the "slow death of the democratization of tech" [0dayz]. Others, however, are more sanguine, pointing out that AMD could fill the gap, making it "far safer" to bet on their continued success [internet101010]. The discussion also veered into unexpected territory, with some commenters pondering the potential impact on Nintendo consoles and the types of games that might thrive in a post-GeForce world, with one commenter suggesting that indie games with good designs could benefit [pjmlp]. As the conversation unfolded, it became clear that opinions are sharply divided on whether Nvidia's hypothetical exit would harm or barely dent the PC gaming community.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 42m after posting
- Peak period: 79 comments in the 72-84h window
- Average per period: 22.9 comments
Based on 160 loaded comments
Key moments
- Story posted: Dec 20, 2025 at 5:25 AM EST (13 days ago)
- First comment: Dec 20, 2025 at 6:07 AM EST (42m after posting)
- Peak activity: 79 comments in the 72-84h window, the hottest stretch of the conversation
- Latest activity: Dec 27, 2025 at 10:29 PM EST (5d ago)
Let's be real, the twitch FPS CoD players aren't going to give that up and play a boring life simulator.
This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
There's a LOT of games that, in aggregate, compete with massive-budget AAA games: Dwarf Fortress, CS, League, Fortnite. People are still playing ARMA 2, DayZ, Rust, etc. Rainbow Six: Siege still has adherents and even cash-payout tournaments. EVE Online, Ultima Online, RuneScape, still goin'.
These games have like no advertising and are still moneymakers. EVE and UO are like 20 and 30 years old. Heck, Classic WoW!
Many gacha titles now offer amazing PC graphics on Nvidia cards compared to mobile.
I feel like League of Legends has, I honestly haven't checked!
It's gonna be OK.
Can you elaborate a little? What, exactly, is your concern here? That you won't have Nvidia as a choice? That AMD will be the only game in town? That the GPU market will move from a duopoly (for gaming specifically) to a monopoly? I have little to go on, but I don't really want to put words in your mouth based on a minimal post.
Not a locked ecosystem console or a streaming service with lag!
Separately, do you think they won't try to ingratiate themselves to gamers again once the AI market changes?
Do you not think they are part of the cartel anyway (and the DIY market exists despite that)?
<< So DIY market is gone.
How? One use case is gone. Granted, not a small one, and one with an odd type of... fervor, but relatively small nonetheless. At best, the DIY market shifts to local inference machines and whatnot. Unless you specifically refer to the gaming market...
<< That kills lots of companies and creators that rely on the gaming market.
Markets change all the time. EA is king of the mountain. EA is filing for bankruptcy. Circle of life.
<< It’s a doom spiral for a lot of the industry.
How? For AAA? Good. Fuck em. We have been here before and were all better for it.
<< If gaming is just PlayStation and switch and iGPUs there is a lot less innovation in pushing graphics.
Am I reading it right? AMD and Intel are just for consoles?
<< It will kill the hobby.
It is an assertion without any evidence OR a logical cause and effect.
So far, I am not buying it.
Firmly in old-guy “this content should not exist” camp
That cartel is simply flying high right now, convinced they've got the market by the balls.
They don't. Just give it most of 2026 and you'll see.
If my hobby is ruined and I can’t have fun, I’m going to be an asshole and make everyone else unhappy.
CoD is also huge on Playstation.
Totally niche appeal, yeah right.
Oh, we can only hope!
>This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
Including millions of gamers, but for the better.
Why can’t you let people enjoy their hobby?
But what's most insane is trying to draw any parallels between gaming and these other things - things that were literally engineered to ruin human lives, biologically (hard drugs) or psychologically (gambling). The harm and evil caused by these two industries is incomprehensible, and trying to fit gaming in among them both downplays the amount of suffering inflicted by gambling and hard drugs and villainizes normal people - like the hundreds of millions of people who play games in a sane, non-problematic way, or indie game devs who make games because they want to express themselves artistically.
Anyways, I gotta log off HN for a while. I can feel my gaming withdrawal kicking in. I've bankrupted myself four times by only spending my money on gaming, and I've been in and out of rehab centres and ERs as I've been slowly destroying my body with gaming in a spiral of deadly addiction. I think I'll have to panhandle and threaten strangers on the street to buy some Steam cards.
Yes.
Or to put it more succinctly, would you want your obituary to lead with your call of duty prowess?
Excellent satire.
Of course, back here in reality, it's possible to consume games and all of those other things in moderation, and most of us do.
Thank you for your consideration.
Not all games need to be that, but Ghost of Tsushima in GBA Pokemon style is not the same game at all. And is it badly designed? I don't think so either. Same for many VR games, which make immersion meaningful in itself.
We can all come up with a litany of bad games, AAA or indie, but as long as there's a set of games fully pushing the envelope and bringing new things to the table, better hardware will be worth it IMHO.
The whole point is to convey details of an area you never lived in, of an actual place you never visited.
I'd make the same argument for Half-Life Alyx or BioHazard, the visceral reaction you get from a highly detailed and textured world works at a different level than just "knowing" what you have in front of your eyes.
Your brain is not filling the gaps, it is taking in the art of the creator.
RE 7 Biohazard was made for the PS4! And its VR version and Half-Life Alyx are VR games, which probably do require higher graphics fidelity, so they're not exactly the same thing as conventional games.
That might be the fundamental divide, for that category of games I'm more on the VR camp and will settle for 2D only for convenience or availability.
I see it with different expectations than games like Persona or Zelda (or GTA?) which could compete solely on the mechanics and a lot more, and I get the feeling you're comparing it more to these genres?
Biohazard on PS4 was very meh to me; at that level I feel it could go down to Switch graphics to prioritize better game mechanics and more freedom of play. I never loved the clunkiness, as an action game it's pretty frustrating, and the VR game is even worse in gameplay quality. The immersiveness in VR is the only redeeming quality IMHO.
VR, sure, you want a lot of frames on two screens, and that requires beef, so the visual fidelity on the same GPU will be worse than on a monitor. But other than that, the graphical part of games has, if anything, flatlined for me.
Also, putting the money literally anywhere else is gonna have better results game-quality-wise. I want better stories and more complex/interesting systems, not a few more animated hairs.
To note, cost and hardware availability are, I think, one half of the critical reasons people don't get into VR (the other half being the bulkiness and the puke?). In a roundabout way, GPU-melting games helped get better hardware at mainstream prices. Until crypto and AI happened. And now the Steam Frame is faced with the RAM price situation.
> 5 years
I don't play it, but Infinity Nikki comes to mind, and the visuals are the core experience. I wonder how much a game like Arknights: Endfield taxes the player's hardware, given they're pushing the 3D modelling side.
I agree with you on the plateauing part, in that the gaming industry seems to have mostly shoved HiDPI into the corner. It costs so much more to produce a game at that visual quality in the first place, and PC makers and benchmarks focus on FHD performance, so the ROI is that much lower on the marketing side.
It kinda makes me sad, like being told "8-bit art is enough for images, we should focus on composition, how many Vermeer or Da Vinci-like painters do we expect anyway?"
Me neither, but the recommended requirements on Steam are like... an RTX 2060, so a 6-year-old mid-grade video card. We really don't need more power than we already have to make beautiful games.
> It kinda makes me sad, like being told "8-bit art is enough for images, we should focus on composition, how many Vermeer or Da Vinci-like painters do we expect anyway?"
Except it isn't? At this point more power is only really needed if you want to go hardcore into photorealism, and to actually use all that power you need a massive budget just to produce all the assets at the required quality.
It's like saying "if only painters had even smaller brushes, we could get photographic-quality paintings." Does it really make art that much better?
Steam's recommended specs are on the conservative side, usually adapted to play with the average settings.
Pushing the game settings to ultra at 4K gives it a mere 100 fps on an RTX 4090: https://www.youtube.com/watch?v=rju22K1lfQY
That's a lot of dedication, but yes, some people will really enjoy the game in its full splendor. Telling them what they need or don't need, or what they should enjoy misses the point of games IMHO (avid players have probably already spent more than a top end gaming PC on the dresses)
> Does it really make art that much better ?
Putting a technical limit on what makes or doesn't make art better sounds fundamentally off to me.
I kinda understand your point on diminishing returns, except we haven't even reached a good frame rate at HiDPI for 24-32"-ish screens. And we'll always move to the next level.
"XXX should be enough for everyone" kind of assertions have never panned out well IMHO.
My son is using that card today, and I'm amazed at everything that card can still power. I had a 5080, and just comparing a few games, I found that if he used Super Resolution correctly, he could set the other game settings the same as mine and his frame rate wasn't far off (in things like Fortnite, not Cyberpunk 2077).
There are many caveats there, of course. AMD's biggest problem is in the drivers/implementation for that card. Unlike NVidia's similar technology, it requires setting the game at a lower resolution which it then "fixes" and it tends to produce artifacts depending on the game/how high those settings go. It's a lot harder to juggle the settings between the driver and the game than it should be.
The other cool thing is they also have Frame Gen available in the driver to apply to any game, unlike DLSS FG, which only works in a few games. You can toggle it on in the AMD software just below the Super Res option. I quickly tried it in a few games and it worked great if you're already getting 60+ FPS, with no noticeable artifacts. Going from 30 to 60 doesn't work, though; too many artifacts. And the extra FPS are only visible in the AMD software's FPS counter overlay, not in other FPS counter overlays.
Don't get too worried. People still can and do vote with their wallets. An additional vector of attack against the greedy capitalists is the fact that the economy is not doing great either.
They cannot increase prices too much.
I also predict that the DDR5 RAM price hikes will not last until 2027 or even 2028 as many others think. I give it maximum one year, I'd even think the prices will start slightly coming down during summer 2026.
Reading and understanding the economy is neat and all, but in the modern age some people forget that the total addressable market is not infinite and that regular customers have relatively tight budgets.
This is true in general,
but the barrier to entry for gaming GPUs is massive (hundreds of billions).
Intel has been working at it for close to a decade and only now just about has a workable product, at the low end.
By the time those are depleted we'll have a new player.
I'm hoping the Chinese fabs can finally catch up enough to provide a meaningful alternative both for memory and compute. They're more or less the only ones still making consumer grade stuff in lots of other segments.
No, I prefer them touching grass and talking to some people, or getting a less addictive and time-wasting hobby.
It is so bad that it's almost impossible to buy a traditional desktop in regular computer stores; there are only fish tanks with rainbows on sale.
In the latter case, I'd expect patches for AMD or Intel to become a priority pretty quickly. After all, they need their products to run on systems that customers can buy.
Intel is just plain not capable of it because it's not really a GPU, more a framebuffer with a clever blitter.
NVIDIA, like everyone else on a bleeding edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.
The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
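As a rough illustration of why binning matters economically, here is a minimal sketch of the classic Poisson die-yield model; the defect density and die area are made-up illustrative numbers, not Nvidia's actual figures.

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson yield model."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Illustrative, made-up numbers: 0.1 defects/cm^2 on a large 8 cm^2 die.
defect_density, die_area = 0.1, 8.0
perfect = poisson_yield(defect_density, die_area)   # dies usable as the flagship part
flawed = 1.0 - perfect                               # dies that binning can still salvage

print(f"fully working dies: {perfect:.0%}, salvage candidates: {flawed:.0%}")
# With these numbers, roughly 55% of dies have at least one defect, which is
# why selling partially fused-off chips into another market is attractive.
```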
The B200 doesn't have any graphics capabilities. The datacenter chips don't have any graphics units; that would just be wasted die space.
As long as gaming GPUs compete for the same wafer space that AI chips use, the AI chips will be far more profitable for Nvidia.
If it does, I think it would be a good thing.
The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother anymore to optimize for the low end and thus they end up gatekeeping games and excluding millions of devices because for recent games, a discrete GPU is required even for the lowest settings.
Nowadays a game is only poorly optimized if it's literally unplayable or laggy, and you're forced to constantly upgrade your hardware with no discernible performance gain otherwise.
Not because the developers were lazy, but because newer GPUs were that much better.
If you think that the programmers are unmotivated (lazy) or incompetent, you're wrong on both counts.
The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.
The issue is that games have such high expectations that they didn’t have before.
There are very few "yearly titles" that allow you to nail down the software in a nicer way over time; it's always a mad dash to get it done, on a huge 1000+ person project that has to be permanently playable from MAIN, and where unit/integration tests would be completely useless the minute they were built.
The industry will end, but not because of "lazy devs"; it's the ballooned expectations, stagnant costs, increased team sizes, and a pathological contingent of people using games as a (bad) political vehicle without regard for the fact that they will be laid off if they can't eventually generate revenue.
And a friend of mine still mostly plays the goddamn Ultima Online, the game that was released 28 years ago.
Your expectations of that game are set appropriately. Same with a lot of indie games; the expectation can be that it's in early access for a decade-plus. You would never accept that from, say, Ubisoft.
I fully agree, and I really admire people working in the industry. When I see great games which are unplayable on the low end because of stupidly high minimum hardware requirements, I understand game devs are simply responding to internal trends within the industry, and especially going for a practical outcome by using an established game engine (such as Unreal 5).
But at some point I hope this GPU crunch forces this same industry to allocate time and resources, either at the engine or at the game level, to truly optimize for a realistic low end.
I don’t think any company that has given up their internal engine could invest 7 years of effort without even having revenue from a game to show for it.
So the industry will likely rally around Unreal and Unity, and I think a handful of the major players will release their engines on license... but Unreal will eat them alive due to its investments in dev UX (which is much, much higher than proprietary game engines' IME). Otherwise, the only engines that can really innovate are gated behind AAA publishers and their push for revenue (against investment for any other purpose).
All this to say, I'm sorry to disappoint you, it's very unlikely.
Games will have to get smaller and have better revenues.
But maybe, just maybe, they could request Epic or Unity to optimize their engines better for the lower end.
Optimisation is almost universally about tradeoffs.
If you are a general engine, you can’t easily make those tradeoffs, and worse you have to build guardrails and tooling for many cases, slowing things down further.
The best we can hope for is even better profiling tools from Epic, but they've been doing that for the last couple of years since Borderlands.
No T&L meant everything was culled, clipped, transformed, and per-vertex divided (perspective, lighting) on the CPU.
Then you have the brute-force approach. The Voodoo 1/2/3 doesn't employ any obvious speedup tricks in its pipeline. Every single triangle pushed into it is going to get textured (bilinear filtering, per-pixel divide), shaded (lighting, blending, fog applied), and then in the last step the card finally checks the Z-buffer to decide between writing all this computed data to the buffer or simply throwing it away.
It took a while before GPU devs started implementing low-hanging fruit optimizations https://therealmjp.github.io/posts/to-earlyz-or-not-to-early...
Hierarchical-Z, Fast Z clearing, Compressed Z buffer, Compressed Textures, Tiled shading. It all got added slowly one step at a time in early 2000.
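To make the ordering difference concrete, here is a toy software-rasterizer sketch of late-Z versus early-Z fragment processing; it is purely illustrative, and no particular GPU works exactly like this.

```python
from dataclasses import dataclass

@dataclass
class Fragment:
    x: int
    y: int
    depth: float

def shade(frag: Fragment) -> int:
    """Stand-in for the expensive per-pixel work: texturing, lighting, fog, blending."""
    return 0xFFFFFF  # dummy colour

def late_z(fragments, zbuffer, framebuffer):
    """Voodoo-era ordering: shade every fragment first, then check Z and maybe discard."""
    for f in fragments:
        colour = shade(f)                                      # expensive work happens unconditionally
        if f.depth < zbuffer.get((f.x, f.y), float("inf")):
            zbuffer[(f.x, f.y)] = f.depth
            framebuffer[(f.x, f.y)] = colour

def early_z(fragments, zbuffer, framebuffer):
    """Early-Z ordering: reject occluded fragments before paying for shading."""
    for f in fragments:
        if f.depth < zbuffer.get((f.x, f.y), float("inf")):   # cheap depth test first
            zbuffer[(f.x, f.y)] = f.depth
            framebuffer[(f.x, f.y)] = shade(f)                 # only visible pixels pay
```

Both produce the same image; the early-Z version simply never pays the shading cost for fragments that lose the depth test.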
For a while there you did have noticeable gameplay differences: those with GLQuake could play better, that kind of thing.
20 fps is not fine. I would consider that unplayable.
I expect at least 60, ideally 120 or more, as that's where the diminishing returns really start to kick in.
I could tolerate as low as 30 fps on a game that did not require precise aiming or reaction times, which basically eliminates all shooters.
Perhaps, but they also turned off Nanite, Lumen, and virtual shadow maps. I'm not a UE5 hater, but using its main features does currently come at a cost. I think these issues will eventually be fixed in newer versions and with better hardware, and at that point Nanite and VSM will become a no-brainer as they do solve real problems in game development.
Or even before hitting the shelves, cue the Trio3D and Mystique, but that's another story.
DOOM and Battlefield 6 are praised for being surprisingly well optimized for the graphics they offer, and some people bought these games for that reason alone. But I guess in the good old days good optimization would be the norm, not the exception.
This is an insane thing to say.
> Game and engine devs simply don't bother anymore to optimize for the low end
All games carefully consider the total addressable market. You can build a low end game that runs great on total ass garbage onboard GPU. Suffice to say these gamers are not an audience that spend a lot of money on games.
It’s totally fine and good to build premium content that requires premium hardware.
It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.
If Nvidia gamer GPUs disappear and devs were forced to build games that are capable of running on shit ass hardware the net benefit to gamers would be very minimal.
What would actually benefit gamers is making good hardware available at an affordable price!
Everything about your comment screams “tall poppy syndrome”. </rant>
But solitaire ran on a 486, and I don't see what part of the gameplay requires a massive CPU.
But the framebuffer for a full HD screen would fill most of the memory of a typical 486 computer, I think.
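A quick back-of-the-envelope check, assuming 24-bit colour and the roughly 4-8 MB of RAM a typical 486-era machine shipped with:

```python
# One full HD frame at 24-bit colour, versus a 486-era memory budget.
width, height, bytes_per_pixel = 1920, 1080, 3
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 2**20:.1f} MiB per frame")  # ~5.9 MiB
# Against roughly 4-8 MB of system RAM, a single frame (let alone a
# double-buffered pair) would indeed eat most of the machine's memory.
```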
I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people that the top end is a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But also feeling hard done by because your graphics hardware is stuck at 2025 levels for a bit is not that much of hardship really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better then the next upgrade would have.
It's not inconceivable that the overall result is a better computing ecosystem in the long run, the open source space in particular, where Nvidia has long been problematic. Or maybe it'll be a multi-decade gaming winter, but unless gamers stop being willing to throw large amounts of money at chasing the top end, someone will want that money even if Nvidia doesn't.
> In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better then the next upgrade would have.
Nah. The stone doesn’t have nearly that much blood to squeeze. And optimizations for ultralow-end may or may not have any benefit to high end. This isn’t like optimizing CPU instruction count that benefits everyone.
Would there be a huge drive towards debloating software to run again on random old computers people find in cupboards?
They'll just move to remote rendering you'll have to subscribe to. Computers will stagnate as they are, and all new improvements will be reserved for the cloud providers. All hail our gracious overlords "donating" their compute time to the unwashed masses.
Hopefully AMD and Intel would still try. But I fear they'd probably follow Nvidia's lead.
Game streaming works well for puzzle, story-esque games where latency isn't an issue.
Any game that requires high APM (actions per minute) will be horrible to play via streaming.
I feel as if I shouldn't really need to explain this on this site, because it should be blindingly obvious that this will always be an issue with any streamed game, for the same reason there is a several-second lag between what's happening at a live sports event and what you see on the screen.
The economics of it also have issues, as now you have to run a bunch more datacenters full of GPUs, and with an inconsistent usage curve leaving a bunch of them being left idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.
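For a rough sense of the latency gap, here is an illustrative input-to-photon budget; the numbers are assumptions for the sketch, not measurements from any real streaming service.

```python
# Very simplified input-to-photon budget: local rendering vs. cloud streaming.
def input_to_photon_ms(frame_ms: float, network_rtt_ms: float = 0.0,
                       encode_ms: float = 0.0, decode_ms: float = 0.0) -> float:
    """One frame of rendering plus whatever streaming overhead applies."""
    return frame_ms + network_rtt_ms + encode_ms + decode_ms

local = input_to_photon_ms(frame_ms=1000 / 120)                        # ~8 ms at 120 fps
streamed = input_to_photon_ms(frame_ms=1000 / 120, network_rtt_ms=30,
                              encode_ms=8, decode_ms=8)                # ~54 ms
print(f"local ~{local:.0f} ms, streamed ~{streamed:.0f} ms")
# A high-APM player feels that extra ~45 ms on every single input, which is
# why latency-sensitive genres tolerate streaming so poorly.
```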
Not that it's good or bad, though, but we could probably have something more akin to spot instances of GPUs being offered for gaming purposes.
I do see a lot of companies offering GPU access billed per second with instant shutdown/restart, I suppose, but overall I agree.
My brother recently came for the holidays, and I played PS5 for the first time on his Mac, connected to his room 70-100 km away. Honestly, the biggest factor in latency was the Wi-Fi connection (which was his phone's carrier). Overall it was a good enough experience, but I only played Mortal Kombat for a few minutes :)
What I meant is that the PS5 can run far away and you can still connect to it from your laptop, and even connect a controller to the laptop (as my brother did) to play with a controller; it all runs on the Mac but uses the PS5 itself.
All in all, I found it really cool for what its worth.
But if this does happen, it will be, in my opinion, the start of a slow death of the democratization of tech.
At best it means we're going to be relegated to last-gen tech, if even that, as this isn't a case of SAS vs. SATA or U.2 vs. M.2, but the very raw tech (chips).
41 more comments available on Hacker News