Look at How Unhinged GPU Box Art Was in the 2000s (2024)
Posted 3 months ago · Active 3 months ago
xda-developers.com · Tech story · High profile
Key topics: GPU History, Gaming Culture, Tech Nostalgia
The article showcases the wild and creative box art of GPUs from the 2000s, sparking nostalgia and discussion among commenters about the evolution of the tech industry and gaming culture.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion · First comment after 11m · Peak period: 36 comments in 0-3h · Avg per period: 9.6
Comment distribution: 115 data points (based on 115 loaded comments)
Key moments
- Story posted: Oct 19, 2025 at 9:32 PM EDT (3 months ago)
- First comment: Oct 19, 2025 at 9:43 PM EDT (11m after posting)
- Peak activity: 36 comments in 0-3h (hottest window of the conversation)
- Latest activity: Oct 21, 2025 at 3:54 PM EDT (3 months ago)
ID: 45639498 · Type: story · Last synced: 11/20/2025, 4:47:35 PM
I bought a small press book with a collection of this art and it was a fun little trip down memory lane, as I’ve owned some of the hardware (boxes) depicted in it.
For anyone else interested: https://lockbooks.net/pages/overclocked-launch
So I will probably install Linux (probably Debian) and move on and forget about those particular games... (~30 years since I first installed Linux on a PC!).
I built its successor in 2020, using a GPU from 2017. The longevity of the PS4 has given that thing serious legs, and I haven't seen the need to upgrade yet. It still runs what I want to run. It's also the first post-DOS x86/64 machine I've owned that has never had Windows installed.
Today, a layman couldn't chronologically sort the CoD games from the past 10 years by looks, play, or feel; the new FIFA and similar titles are _the_ same game with new teams added; and virtually every game made is a "copycat with their own twist" with almost zero technical invention.
Feels similar to how painting hasn’t had any revolution in new paints available.
I mostly play indie/retro/slightly-old games these days, so I mostly hear of the negatives for modern AAA, admittedly. I'm also tempted to complain about live service, microtransactions, gacha, season passes, and so on in recent big releases, but maybe that would be getting off-topic.
Just like Crysis did 18 years ago?
>it's no longer really feasible to keep most of your games installed all the time and ready to play.
Crysis took up around 5% of a common HDD back then. Today, the equivalent would be around 80 GiB, which is just about what Elden Ring with the DLC takes.
I think he meant the software side (game-systems wise), not hardware innovation
The AM5 platform is quite lacking when it comes to PCIe lanes - especially once you take USB4 into account. I'm hoping my current setup from 2019 survives until AM6 - but it seems AMD wants to keep AM5 for another generation or two...
x16 GPU + x4 NVMe + x4 USB = 24 direct CPU lanes covers 99% of the market, with add-ons behind the shared chipset bandwidth. The other 1% of the market are pushed to buy Threadripper/Epyc.
You'd be better off complaining about how Threadripper CPUs and motherboards are priced out of the enthusiast consumer market, than asking for the mainstream CPU platform to be made more expensive with the addition of IO that the vast majority of mainstream customers don't want to pay for.
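To make that lane budget concrete, here is a minimal sketch (a hypothetical mainstream allocation using the x16 + x4 + x4 figures from the comment above, not an official platform spec) that simply tallies how the direct CPU lanes get spent:

```python
# Hypothetical direct-CPU PCIe lane budget for a mainstream desktop platform,
# mirroring the x16 + x4 + x4 split described above (illustrative only).
cpu_lane_budget = {
    "GPU slot (x16)": 16,
    "Primary NVMe (x4)": 4,
    "USB4 controller (x4)": 4,
}

total = sum(cpu_lane_budget.values())
for device, lanes in cpu_lane_budget.items():
    print(f"{device:24} {lanes:2} lanes")
print(f"{'Total direct CPU lanes':24} {total:2}")

# Anything beyond this budget (extra NVMe drives, capture cards, NICs)
# sits behind the chipset and shares its uplink bandwidth.
```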
Now it's a node in my Proxmox cluster running transcodes for Jellyfin. The circle of life.
The only reason I'd upgrade is to improve performance for AI stuff.
I genuinely don't believe this to be true for AMD. I bought a 6600xt on Release Day and by the time I was able to build my complete PC, it had upstream linux kernel support. You can say what you will about AMD but any company that respects my freedoms enough to build a product with great linux support and without requiring any privacy invading proprietary software to use is a-ok in my book.
Fuck NVidia though.
That was 2021 though when AMD was still a relative underdog trying to claw market share from Nvidia from consumers. AMD of today has adjusted their prices and attitude to consumers to match their status as a CPU and GPU duopoly in the AI/datacenter space.
I remember when running Linux on your computer at all was hit or miss; these days I can go to Lenovo and buy a ThinkPad with Ubuntu preinstalled and know that it will just work.
And it looks like Linux-Libre also opposes CPU microcode updates[1], as if bugged factory microcode with known security vulnerabilities is any less non-free than fixed microcode. Recommending an alternative architecture that uses non-proprietary microcode I can understand; this I cannot.
[1] https://www.phoronix.com/news/GNU-Linux-Libre-4.16-Released
To me, this is a continuum with the box art of early games, where because the graphics themselves were pretty limited the box art had to be fabulous. Get someone like Roger Dean to paint a picture for the imagination. https://www.rogerdean.com/
The peak of this was the Maplin electronics catalogue: https://70s-sci-fi-art.ghost.io/1980s-maplin-catalogues/ ; the Radio Shack of UK electronics hobbyists, now gone entirely. Did they need to have cool art on the catalogue? No. Was it awesome? Yes.
https://spillhistorie.no/2025/10/03/legends-of-the-games-ind...
Turns out that the Psygnosis developers in the 1980s used him as a kind of single-shot concept artist. They would commission the box art first, then use that as inspiration for designing the actual game to go inside the box.
https://row.rarevinyl.com/products/anderson-bruford-wakeman-...
Weird shop; they never really got rid of any stock that was even theoretically useable, so it was at least partially a museum of outdated gadgets.
A reminder: Even years after inventing CUDA, Nvidia, the top GPU manufacturer, was fighting for survival. I'm not sure what saved them - perhaps crypto.
If you ignore the money, they appeared quite strong, but they struggled financially. Intel famously considered buying them around 2010 because they knew they could buy them cheap (Nvidia might not survive and wasn't in a position to negotiate). Thankfully, the Intel CEO killed the idea because he knew Jensen wouldn't work well with Intel.
Nvidia may not have been saved by "bean counters", but they do have a place in the world.
Games are no different: in Morrowind, gods ripped each other's penises off and used them as spears; in Skyrim, you fight dragons.
They're also great value; a couple months back I went to a local store and bought 100 or so "old" game CD/DVDs for less than $35, none scratched. For the price of one triple-A game, I'd probably have been able to get 250 at least.
Besides, in Skyrim you eat souls of hitherto immortal beings in an act of metaphysical cannibalism, and, among other things, get to witness firsthand exactly what happens to those souls you trap to fuel your fancy sword of zapping.
Meanwhile, in the background, Vivec might or might not have been kidnapped to be on the receiving end of that spear thing, and fascist elves are trying their hardest to undo the universe (it's not plot pertinent though), and also briefly did (or claimed to do) something to the moons (that are not moons, remember) that terrified an entire race into submission.
The point is, the lore is still there. You just have to pay attention, because it's not always spelled outright.
From full cases [0], CPU coolers included, to themed components [1], when it comes to gaming, makers are going above and beyond to create cool visuals.
[0] https://www.dospara.co.jp/gamepc/kuzuha.html
[1] https://www.yodobashi.com/product/100000001009108157/
Here's one of their mice: https://www.lofree.co/products/lofree-petal-mouse
Anime is certainly weird, but I wouldn't say in the right way.
You'll have to use the Internet Archive to see them all. [1] Several, like 'Dawn' for example, were quietly removed in 2020.
[0] https://www.nvidia.com/en-us/geforce/community/demos/
[1] http://web.archive.org/web/2019/https://www.nvidia.com/en-us...
They're still fun to interact with anyway, or just fun as a way to review what was hot shit at the time, but I couldn't get a few of the really old ones to run on Windows 10/11 this summer. A video is a lot better than just saying "well, I'm not going to build an old PC just to play this demo" and not seeing it at all.
Several models don't even have pictures of the card, but every one of them shows the crazy box.
They also still list all their old GPUs. Compare the wild boxes at the top with the TV tuner boxes at the bottom: https://support.hercules.com/en/cat-videocards-en/
Crazy contrast to me having spent the past weekend wondering whether cloud gaming services like GeForce Now are mature enough that I can fully move to a thin-client/fat-server setup for the little bit of gaming I still do.
You don't even need to create any internal tech - Steam Remote Play already has everything you need, and I successfully used it to play Battlefield from an AWS GPU instance (was even good enough for multiplayer).
Because the price is too low, more people want to buy a graphics card than the number of graphics cards that can be produced, so even people who would love to pay more can't get one.
Scalpers solve this mismatch by balancing the market: now people who really want to get a graphics card (with a given specification) and are willing to pay more can get one.
So, if you have a hate for scalpers, complain that the graphics card producer did not increase its prices. :-)
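A minimal sketch of that clearing-price argument, with a toy linear demand curve and entirely made-up numbers (none of these figures come from the thread or from real market data):

```python
# Toy illustration of the shortage argument above: hypothetical numbers,
# simple linear demand, no real market data.

def buyers_willing_to_pay(price: float) -> int:
    """Number of buyers willing to pay at least `price` (toy demand curve)."""
    # Assume 2,000 interested buyers at $0, falling linearly to 0 at $2,000.
    return max(0, int(2000 - price))

SUPPLY = 600   # cards that can actually be produced (hypothetical)
MSRP = 1000    # sticker price set below the clearing price (hypothetical)

demand_at_msrp = buyers_willing_to_pay(MSRP)
print(f"Buyers at MSRP: {demand_at_msrp}, cards available: {SUPPLY}, "
      f"shortage: {demand_at_msrp - SUPPLY}")

# The clearing price is where demand just meets supply; the gap between it
# and the MSRP is the margin scalpers capture.
clearing_price = next(p for p in range(0, 2001)
                      if buyers_willing_to_pay(p) <= SUPPLY)
print(f"Approximate clearing price: ${clearing_price}")
```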
Compared to my CPU (9950X3D), it's got a massive monolithic die measuring 750 mm² with over 4x the transistor count of the entire 9950X3D package. Beyond the graphics, it's got tensor and RT cores, dedicated engines for video decode/encode, and 32 GB of GDDR7 on the board.
Even basic integrated GPUs these days have far surpassed GPUs like the GTX 970, so you can get a very cheap GPU that draws its power through the CPU socket, at MSRP.
I'm a retired data center electrician, and my own GPU has been "loose" more than once. Really make sure that sucker is jammed in there/latched.
I'm not in the market for a 5090 or similar, but the other day I was looking at a lower-end model, an AMD 9060 or Nvidia 5060. What shocked me was the massive variation in prices for the same model (9060 XT 16 GB or 5060 Ti 16 GB).
The AMD could be had for anywhere from 400 to 600 euros, depending on the brand. What can explain that? Are there actual performance differences? I see models pretending to be "overclocked", but in practice they barely have a few extra MHz. I'm not sure if that's going to do anything noticeable.
Since I'm considering the AMD more and it's cheaper, I didn't take that close a look at the Nvidia prices.
Looks. I'm not joking. The market is aimed at people with a fishbowl PC case who care about having a cooler with an appealing design, an interesting PCB colour, and the flashiest RGB. Some may have a bit better cooling, but the price for that is also likely marked up several times, considering a full dual-tower CPU cooler costs $35.
I am not a tech wizard, but I think the major (and noticeable) difference would be the available tensor cores — currently Nvidia's tech is faster/better in the LLM/genAI world.
Obviously AMD jumped +30% last week from the OpenAI investment — so that is changing with current-model GPUs.
In fairness, the graphics card has many times more processing power than the rest of the components. The CPU is just there to run some of the physics engine and stream textures from disk.
There was also this [0] video of a Turbo Pascal tutorial posted recently here, which has an interesting intro :-)
[0] https://www.youtube.com/watch?v=UOtonwG3DXM
https://www.coolermaster.com/en-global/products/shark-x/
I think that the boxes initially reflected that.
My first accelerator (rather late) was that 3D Blaster Voodoo 2; the graphics on the box contributed to the emotion of holding it, and they looked better than in the picture.
I was mind-blown when I saw what the card could do, and I remember thinking that the box graphics did reflect its capabilities well.
I sure kept the box for many years.
I imagine that then the manufacturers felt compelled to keep making boxes which would stand out; and in part, yes, they tried to attract some purchases from people who didn't originally mean to get a new graphics card.
https://old.reddit.com/r/pcmasterrace/comments/y7wcd7/gpu_bo...
Later on I bought a Sapphire 9800 Pro "Atlantis" which had some T-1000-esque figure on the box art.
After that, a lot of stuff became more corporate and boring.
Can't believe this was a hobby for me and my dad during primary school, and now understanding how computers work has led me to my current full-time job, putting food on the table for my own children.
> GPU makers have all abandoned this practice, which is a shame as it provided something different through box art alone. Now, we're drowning in bland boxes and similar-looking graphics cards
I feel like there could be a more positive adjective than “unhinged” if you're going to turn around and praise it. OED sez “wildly irrational and out of touch with reality”. How about “whimsical”? I love this stuff and think we need to bring this kind of whimsy back to computing.
> There's a scantily dressed lady in armor
Author neglects to mention that ATi/AMD had a named ongoing marketing character for many many years — Ruby!
- Agent Ruby Demo Compilation https://www.youtube.com/watch?v=sUAuj0Jn8UI
- 2008 Ruby demo https://www.youtube.com/watch?v=0YjXCae4Gu0
- Ruby origin story https://web.archive.org/web/20071023192128/http://game.amd.c...
- ATI Agent Ruby™ Usage Guidelines 1.0 http://www.barbaraburch.com/portfolio/whitepaper6.pdf
- She even stuck around long enough for the ATi name to entirely disappear from AMD Radeon branding: https://i.imgur.com/uBWfzCA.jpeg https://www.youtube.com/watch?v=bwIMHX7rW8Q (2013)
- AMD-exclusive Ruby skin for Quake Champions https://www.youtube.com/watch?v=-LRSqC9n0Tc (2017)
> GeForce 6600 GT was enclosed inside a box featuring a lovely lady
Nvidia had several named demo characters too, but they removed all the pretty lady ones some time in 2020. Compare:
- https://web.archive.org/web/20200921115422/https://www.nvidi...
- https://www.nvidia.com/en-us/geforce/community/demos/
[Adam Sessler voice] I give this article a two… out of five.
Searched and managed to find an image:
https://www.reddit.com/r/pcmasterrace/comments/144323l/great...