People stuck using ancient Windows computers
Mood: calm
Sentiment: mixed
Category: other
Key topics: Legacy Systems, Windows, Backward Compatibility
The BBC article discusses individuals and organizations still using outdated Windows computers, sparking a discussion on the pros and cons of maintaining legacy systems and the challenges of upgrading or replacing them.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 56m after posting
Peak period: 32 comments (Day 1)
Avg / period: 19.5
Based on 39 loaded comments
Key moments
- Story posted: Aug 24, 2025 at 1:16 PM EDT (3 months ago)
- First comment: Aug 24, 2025 at 2:11 PM EDT (56m after posting)
- Peak activity: 32 comments in Day 1 (hottest window of the conversation)
- Latest activity: Aug 26, 2025 at 1:28 AM EDT (3 months ago)
People are upset because their hardware is still working??
Is it better if it just stopped working one day?
The Stago STA Compact (Max) automated coagulation analyzer.
The first version of this analyzer ran MS-DOS. It worked fine, but it was a bit difficult to use - it didn't have a mouse. There were some keyboard shortcuts, but mostly I had to use keyboard arrows and Enter/Esc to operate it.
Then there was an updated version (Max), which was basically the same analyzer with new brains: a different computer inside, a dual-core CPU, and Windows XP instead of MS-DOS. It is much, much worse than the MS-DOS version.
The database can only hold about 4-5 days' worth of results. When it gets almost full and the sample drawer is open, the internal MCUs time out while waiting for commands from the main CPU, which gets stuck busy displaying the samples window. And there are race conditions everywhere. If I scroll the results window while the analyzer adds or updates results in it, it gets confused and shows the new results on the wrong table rows, corresponding to other patients - yes, it's that bad.
It's obvious they tried to avoid race conditions as much as possible: for example, it can't print internal control results while the analyzer is running samples, it won't open the samples drawer while running the internal control from the reagents drawer, etc. I would prefer the old MS-DOS system any time.
The system being slow and old doesn't matter. It is running XP and airgapped. Sometimes you access the data by USB stick or by burning a CD-ROM. The software stack it runs mainly dumps sensor data onto a flat file, so it doesn't really need to be very robust. And sure, the ancient OptiPlex desktop idling all day drinks more electricity than a modern lightweight chip, but the extra couple of dollars a week in electricity costs, if that, is hardly a concern in a research setting.
For the people who use this old technology, life can get tedious. For four years, psychiatrist Eric Zabriskie would show up to his job at the US Department of Veterans Affairs (VA) and start the day waiting for a computer to boot up. "I had to get to the clinic early because sometimes it would take 15 minutes just to log into the computer," Zabriskie says. "Once you're in you try to never log out. I'd hold on for dear life. It was excruciatingly slow."
[...]
Most VA medical facilities manage health records using a suite of tools launched by the US government in 1997 called the Computerized Patient Record System (CPRS). But it works on top of an even older system called VistA – not to be confused with the Windows Vista operating system – which first debuted in 1985 and was originally built on the operating system MS-DOS.
The VA is now on its fourth attempt to overhaul this system after a series of fits and starts that dates back almost 25 years. The current plan is to replace it with a health record system used by the US Department of Defense by 2031. "VA remains steadfast in its commitment to implementing a modernised, interoperable Federal [electronic health record] system to improve health care delivery and positively impact patient care," says VA press secretary Pete Kasperowicz. He says the system is already live at six VA sites and will be deployed at 19 out of 170 facilities by 2026.
Thank god that Windows 11 only needs about 3 minutes to boot up and 2 more to be usable after logging in, in a corporate environment. How time flies. /s
I'd like to think that Linux as a platform for running such systems would have gotten a mention, but it seems the BBC is unaware it exists.
It's not a stunt I try to pull with Windows.
Old programs with statically linked dependencies might work, but you run into issues where the GUI framework is broken or incompatible or your window manager doesn't like it. Lots of little random stuff like that.
Windows is best in class at backwards compatibility, though whether that's a good thing is up for debate.
Being forced to maintain compatibility for all previously written APIs (and quite a large array of private details or undocumented features that applications ended up depending on) means Windows is quite restricted in how it can develop.
As a random example, any developer who has written significant cross-platform software will be able to attest that the file system on Windows is painfully slow compared to other platforms (MS actually had to add a virtual file system to Git at one point after they transitioned to it, because they have a massive repo that would struggle on any OS but choked especially badly on Windows). The main cause (at least according to one Windows dev blog post I remember reading) is that Windows added APIs to make it easy to react to filesystem changes. That's an obviously useful feature, but in retrospect it was a major error: so much depends on the filesystem that giving anything the ability to delay fs interaction really hurts everything. But now lots of software is built on that feature, so they're stuck with it.
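For context, here is a minimal C sketch of the kind of change-notification hook being described. The comment doesn't name the API, but ReadDirectoryChangesW is the standard Win32 call for reacting to directory changes; the watched path here is only a placeholder.

    /* Sketch: blocks until something under C:\watched changes. Every consumer
       of this mechanism adds bookkeeping to the filesystem's write path, which
       is the kind of overhead the parent comment describes. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HANDLE dir = CreateFileW(L"C:\\watched", FILE_LIST_DIRECTORY,
                                 FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                                 NULL, OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, NULL);
        if (dir == INVALID_HANDLE_VALUE)
            return 1;

        DWORD buf[1024];   /* DWORD-aligned buffer, as the API requires */
        DWORD bytes = 0;
        while (ReadDirectoryChangesW(dir, buf, sizeof(buf), TRUE,
                                     FILE_NOTIFY_CHANGE_FILE_NAME | FILE_NOTIFY_CHANGE_LAST_WRITE,
                                     &bytes, NULL, NULL)) {
            /* Only the first record in the buffer is printed in this sketch. */
            FILE_NOTIFY_INFORMATION *info = (FILE_NOTIFY_INFORMATION *)buf;
            wprintf(L"change: %.*ls\n",
                    (int)(info->FileNameLength / sizeof(WCHAR)), info->FileName);
        }
        CloseHandle(dir);
        return 0;
    }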
On the other hand, I believe the Linux kernel has very strict compatibility requirements, they just don’t extend to the rest of the OS, so it’s not like there’s a strict rule on how it’s all handled.
Linux has the obvious advantage that almost all the software will have source code available, meaning the cost of recompiling most of your apps for each update with adjusted apis is much smaller.
And for old software that you need, there’s always VMs.
Linus is very adamant about "not breaking userspace"
The main problem with backwards compatibility (imho) is glibc. You could always ship your software with all the dynamic libs that you need, but glibc makes it hard because it likes to move forward and break things.
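A small illustration of the glibc side of this (a sketch, assuming a glibc-based Linux system): binaries record which glibc version each symbol they use was introduced in, and a binary built against a newer glibc than the target machine's will typically refuse to start with "version GLIBC_x.yy not found". Comparing the build-time and runtime versions makes the mismatch visible.

    /* Print the glibc version this binary was built against vs. the one it
       is actually running on. A mismatch in the "wrong" direction is what
       breaks old-binary portability across distributions. */
    #include <stdio.h>
    #include <gnu/libc-version.h>

    int main(void)
    {
        printf("built against glibc %d.%d\n", __GLIBC__, __GLIBC_MINOR__);
        printf("running on glibc %s\n", gnu_get_libc_version());
        return 0;
    }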
The trouble is usually with other dynamically linked libraries not being available anymore on modern distributions.
Which ended with Windows 10. There are a lot of old Win95-era games which do not run on Windows 10.
I wonder if at some point virtualizing, and potentially adding a modern control layer on top of their current machines is a potential path forward.
The reality is that you need to keep upgrading and building new infrastructure, because inevitably the old one won't work or will no longer be enough to support the needs of its users. And when that happens, it will be even more painful and expensive to get it up and running again. And the best-case scenario would be that no one loses their life over it.
see: https://news.ycombinator.com/item?id=30505421
The article does not mention what OS is being used, but RT-11 was designed for "real-time" applications. That was released in 1973, so over 50 years ago.
I was happy with Windows XP. Windows Vista, 7, 8, 8.1, 10, and 11 added nothing to my quality of life that I can think of?
YMMV. TBH 7 was quite stable compared with 10 and 11.
Just don’t connect it to the internet.
> The only thing missing from Grigar's collection is a PC that reads five-and-a-quarter-inch floppy disks, she says. Despite their ubiquity, the machines are surprisingly hard to find. "I look on eBay, Craigslist, I have friends out looking for me, nothing. I've been looking for six years," she says. If you have one of these old computers lying around, and it still works, Grigar would love to hear from you.
Still in love with my Mac OS 10.6 (Snow Leopard) tax machine (offline).
I keep another machine of the same era (Intel Core2Duo) online with Win7Pro, for official paperwork/logins. Doesn't seem to be hacked / compromised, yet (what people usually say).
Also rocking modern Apple Silicon (M2 Pro/M3/M4), which is impressive equipment, particularly considering its minuscule power usage. The current 15" MacBook Air will stream video for the majority of a day on a single charge, and from Costco can occasionally be purchased for $849 (which includes an additional year of warranty).
We ended up creating a disk image, then emulating the machine in Hyper-V and passing through two USB-based serial ports. Works like a charm!