A Bubble That Knows It's a Bubble
Posted 4 months ago · Active 4 months ago
craigmccaskill.com · Tech · story · High profile
Sentiment: calm/mixed · Debate: 80/100
Key topics
AI Bubble
Tech Investment
Innovation
The article discusses the current AI bubble, drawing historical parallels with past tech bubbles, and the HN discussion explores the implications of this bubble on the tech industry and innovation.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 6h after posting
Peak period: 111 comments (Day 1)
Average per period: 21.3
Comment distribution: 128 data points (based on 128 loaded comments)
Key moments
- 01Story posted
Aug 24, 2025 at 6:02 PM EDT
4 months ago
Step 01 - 02First comment
Aug 25, 2025 at 12:02 AM EDT
6h after posting
Step 02 - 03Peak activity
111 comments in Day 1
Hottest window of the conversation
Step 03 - 04Latest activity
Sep 5, 2025 at 1:51 AM EDT
4 months ago
Step 04
Commoditization of this scale of compute is definitely going to be a boon for many fields of research. Unfortunately fundamental public research is exactly what is being cut right now in the US.
Long term, I think the real winners are going to be in robotics. Still an unsolved field, but Waymo proves that even a nearly 20-year slog to the finish line is viable. And robotics infrastructure may be more robust to obsolescence than the underlying compute. I find it odd that so many companies are making humanoid robots, though... over-engineering that reeks of bubble economics and possible fraud.
If you want your robot to be a helper around the general population's houses, for example, you would aim to make a general-purpose bot capable of stairs, ladders, lying down, reaching high, stepping over things, and holding awkward weights and loads while doing all of the above. Pinch, twist, push, pull, in all the degrees of motion a human has, etc.
If we applied the same logic, there should be a massive effort to ditch wheelchairs and build exoskeletons instead.
Bipedal robots are more expensive to develop, build and maintain, more limited in their payloads, and because of the additional complexity, less reliable.
The most viable use case of AI is bullshitting humans, which is still a multi-billion-dollar market. Infrastructure, hooray!
Exoskeletons can't match that.
I mean, I wouldn't buy either unless I could be certain it's not uploading all data to the cloud and isn't fully controlled by a user-hostile company, but if we're talking fantasy tech à la Detroit: Become Human... yeah, I'd be willing to spend a lot of money to have all chores taken care of by a humanoid robot.
And before someone talks nonsense again w.r.t. "you already can, just pay someone to do it for you"... I do not want to have strangers in my home. This is also essentially why I wouldn't want any cloud-connected bot anywhere in it.
But that's going to be hilarious. Imagine your internet goes down while the bot is halfway down your stairs, or in the middle of pouring a drink. Very fun.
It's super easy to come up with scenarios that a wheeled bot can't cope with, but again "good enough, cheap enough" will probably see lots of wheeled bots on the market. I am just trying to show why the pioneering companies would be interested in bipedal bots, it's a long term play.
Lastly, the elephant in the room is that basically all general purpose bots are a euphemism for military bots that will need to operate in unknowable conditions.
Exactly: we need legs where they are specifically needed, and we already have wheeled robots, so building legged robots that can move like a human will cover many cases we currently cannot cover.
And even more important are arms and hands; legs are a precursor to that. They are much simpler, so it's smart to start with legs and then try to make good arms and hands.
So instead the government gets involved and demands a change to the built environment instead of a speculative bet on the idea of a new technology.
I have seen how robots currently behave when they lose their footing though, and I'd be bloody terrified to be strapped into one.
Maybe wheelchair users and robot manufacturers can join forces in getting wheeled locomotion into more spaces, but I think homes will always be a challenge, as stairs are a requirement for denser living and elevators are expensive.
Knowing the kind of markup on wheelchairs that makes a YouTuber's wheelchair look like a bargain (see the JerryRigEverything wheelchair), I can't imagine how much the US healthcare "industry" would charge for a "medical grade" exoskeleton.
I doubt robots will actually end up in every household, but a niche luxury product and a utility for businesses makes some sense. Even if you think about it from that perspective, robot makers would still want them to be a universal robot, not dozens of unique use-case bots.
If a business can pay 30k for a general purpose extra set of hands I think that would be a no brainer, and I think the wealthy would see it similarly.
Sounds like a limited market/growth potential, hard to amortize the huge R&D etc. Could happen but will never justify the current levels of investment required.
I had a friend who got a Sun cluster for basically free when the 2000 dot com bubble burst. And when we were doing recreational math contests a couple of years later it was slower than our laptops.
So is it very likely that a load of today's GPU compute will still be competitive next year or the year after?
The AI bubble bursting will kill investment in the next gen hardware in the west.
But China will come to market with its first gen, which it is currently building to replace its dependency on the West, and will leapfrog the West, etc. China isn't really completely dependent on competing in our AI bubble; it's using AI for its own things and will plough on even when the Western bubble bursts. Seems obvious?
Still, there has been so much talk about the AI bubble bursting last week, and this is the best write-up.
We are not getting the same insane gains from node shrinks anymore.
Imagine the bubble pops tomorrow. You would have an excess of compute using current gen tech, and the insane investments required to get to the next node shrink using our current path might no longer be economically justifiable while such an excess of compute exists.
It might be that you need to have a much bigger gap than what we are currently seeing in order to actually get enough of a boost to make it worthwhile.
Not saying that is what would happen, I'm just saying it's not impossible either.
All the investment in AI should help bring infrastructure up to a higher level; power distribution and cooling, for example, are at a much higher level than they otherwise would have been.
Who knows what use that might have if it suddenly becomes incredibly cheap.
(this is my silver lining thinking)
Fair point; maybe you could show me a 50-year-old rail that is still worthy of being ridden. ;)
Even a 20 year old rail is problematic from what I understand (from a UK perspective).
What needs to be done first is the gravel (ballast) and then also the ties. Turnouts/switches with motors are also expensive, and of course the signal boxes. The big deal now is adopting newer technologies like ETCS.
What needs the fastest maintenance nowadays, though, is software :-).
What's the corresponding infrastructure of AI? The major cost - the GPUs - are effectively obsolete after 3-5 years. The physical location of the datacenters, power, cooling and fibre that connects them might be the lasting infrastructure. Is datacenter location important? Are we actually building up new power sources (apart from endless announcements about FANGs opening nuclear power stations, which as far as I'm aware have not happened yet)?
A big one here may also be increased technological literacy, the rise of a new UI paradigm (chat with a non-human), and the structuring of so much data in the world that previously existed but was hard to meaningfully leverage because it was unstructured.
And, last but not least, lowering the barrier to entry for starting tech companies and launching a new generation of SMB-like tech startups that don't need to take VC money and scale to survive, and as a result can solve problems facing niche industries (not to be confused with things like Wix or Etsy, which lowered the barrier for businesses selling real-world products to create an online presence).
If nothing else, mainstreaming AI will have the same impact mainstreaming spreadsheets did.
Is it all about the actual GPUs though, is that the only "infrastructure" being built? A list from the top of my head of things that I'd say do last:
1. Data center buildings (take a while to build, contents completely aside).
2. Organisations and processes for running operations and procurement in said data centers - doesn't take decades to build for sure, but it's something worthwhile to already have.
3. Advances in the actual chips, i.e. more powerful processing units.
4. Advances in chip fabrication.
5. Chip fabrication facilities and organisations (similar to #1 and #2).
So sure, GPUs are highly temporary. But a lot of the things being developed and built around them much less so.
I do think one possible bubble burst scenario is that we'll have cheap compute available for decades but not a lot of great ideas of what to do with it. That is not unlike the 2000s I suppose.
The GPU hardware rots and becomes obsolete in a matter of years, but the national infrastructure required to support the physical sites isn't going away. Things such as...
- improved power distribution networks
- logistics arrangements to build and support the DC sites
- lots and lots of new fibre interconnects to support the massive bandwidth needs
- hopefully: better power delivery planning laws
- plumbing infrastructure, because all that hardware requires cooling
Some of the DC sites will be decommissioned from their initial use, but given the physical security requirements, might morph into handy higher-security industrial facilities with only small repurposing. Such reuse cases would especially benefit from improved logistics (see above).
This is not as hardy as fiberoptic communication lines used to build the internet, or railroad lines used to build transportation infrastructure.
I think at least with CPUs the depreciation has slowed down a lot compared to 15 years ago.
Regarding robot form factor: I'd rather have R2-D2 than C-3PO. I don't want anything approaching the uncanny valley; I want a machine that does handy things!
I like the term "democratize investing" here. "We're granting the masses the privilege of dumping their lifesavings into this overhyped project, so we can make a clean exit".
> Yes, retail can buy Nvidia, but they can’t access pre-IPO rounds where the real speculation happens. This concentration among professional investors won’t prevent a bubble, but it might prevent the kind of widespread financial devastation that followed previous crashes.
What year is this from? The author might want to do a recent news search.
When you are selling 5 dollars for 1 dollar, doubling revenue is easy. It just creates more losses; same with OpenAI.
On a software-focused forum like HN, I'm surprised people still don't understand the grow-at-all-costs-until-you're-one-of-the-top-2-or-3-left model. There have been dozens of examples of tech companies losing money for years just to become highly profitable after.
People are still not getting that big tech is investing like their lives depend on it because they do. GenAI can render the core businesses of big tech obsolete.
Sure, but there have been thousands of tech companies that lost money year over year and went bust. Odds are that any given AI company will end up losing a lot of money. Maybe the asymmetric potential payoff is worth the risk in certain cases, but it's not crazy to be skeptical about Anthropic or any other hot company.
That is if US sanctions don't kill the DSA.
E.g. someone borrowing against their higher property value(s) to put a down payment on another property.
Leverage is the amplifier. And I don’t see many self-circulating capital flows. I expect contractions to be reasonable for this bubble, or more realistically industry stagflation.
Here's the mechanism in simple terms:
When US manufacturing jobs moved to China in the 2000s, American workers saw their incomes drop dramatically - like a factory worker going from $30/hour at Ford to $12/hour at Walmart. Instead of accepting lower living standards, the system created an alternative solution through housing and credit.
As home prices rose rapidly (often 10-15% annually), workers could borrow against their home's appreciation through equity loans and refinancing. A worker whose house went from $150,000 to $300,000 could borrow $50,000 to maintain their lifestyle - buying trucks, boats, and continuing to consume as if their income hadn't dropped.
This created a win-win illusion: China got manufacturing jobs, US companies got higher profits from cheap labor, Americans got cheaper goods at stores like Walmart, and workers felt wealthy despite earning less. Nobody complained because everyone seemed to benefit in the short term.
The system worked as long as home prices kept rising, allowing people to keep borrowing against appreciation. But when housing prices stopped climbing around 2005, the illusion collapsed - workers were left with lower wages, massive debt, and no way to keep borrowing.
This mechanism essentially allowed America to maintain consumption by borrowing against future wealth rather than addressing the fundamental problem of job losses. The 2008 financial crisis was the inevitable result when this unsustainable system finally broke down.
Blockchain, NFTs and 3D printing are still around and have vacuumed up billions and billions without the average person being able to tell an impact on their lives.
But at the time it was going to be the next big thing transforming everything.
Same as 3D printing. Certainly cool and useful in some niche contexts, but it has not disrupted manufacturing.
At the same time, Printables and MakerWorld are flooded with… toys. They gamified their platforms, and a ton of "thingy" models, e.g. generic planter pots (some of them just renders, never even printed!), is the result.
This certainly hides the benefit but I very much think it’s there.
But if we look at the types of predictions made in the early days (print a house in a day for under $5k, print any food you want at home, obviate factories because you can make anything at home...), almost none of that has come to pass.
And that doesn't mean it's a bad technology. Most technologies don't revolutionize the average person's life, but can still change corners of civilization.
But compare that to the internet, which has literally changed how we do basically everything in our daily lives.
I think the point is that most technologies are like 3D printing while the current narrative is that AI will be more like the internet.
On the other side, you get complex topologies and very specialised parts. Again, pretty hard to scale, and limited demand.
In the end it is manufacturing, and manufacturing is huge. But it also generally does not have great margins, and it has a lot of competition. So 3D printing would end up there with the others, say makers of CNC machines, various presses and so on. A multi-billion-dollar industry, but not tech.
Housing is back …
Dotcom came back…
Nothing was a bubble. Dotcom was into a new paradigm shift with mobile in less than a decade. These aren't even significant timelines when you think about it.
So you pull out of the AI hype today, fine. These past recent bubbles show that everything ramps back up within five years.
AI-is-hype people are delusional. The computer has never been able to do what it’s doing today. We could only dream of it.
Sure, but do the math. It doesn’t work out yet. This stuff burns money and energy. Either revenue has to go up A LOT or costs do have to come down A LOT (or quality has to suffer by using smaller models).
Electricity will become very cheap during the day at least with solar continuing its declining trajectory.
But that assumption would be as silly as yours.
Ironic how you can contradict yourself without realizing. The fact that something "came back", meant it WAS a bubble that popped.
The former can be overvalued (see housing pre-2008), but we'll never come to the conclusion that it's useless or only needed in niche use cases. In that case, the item itself isn't really the bubble. The bubble is in what enables the irrational prices (e.g. subprime mortgages).
The latter can definitely be a bubble where the technology just isn't useful for a given use case (or at all).
This article is based on Altman's "bubble" comment.
There is absolutely nothing else left to invest in when it comes to software development, this is it.
It’s so painfully obvious but so many AI doomers use it as evidence.
He doesn’t want a talent war with Meta and Apple. And Meta has responded by signaling a truce in the talent war by saying they’re freezing AI hiring.
"AI is an existential risk for humanity, that's why we have to dump all resources we have into building it".
"It's critically important that AI as an industry is regulated, but also we'll pull out of the EU if they try to regulate us"
"AI is an existential risk for humanity ...". ... so you should trust only us to build it
"AI as an industry should be regulated ..." ... to make it harder for newcomers on the market.
The flow of money to spur innovation is exactly like a "Cambrian Explosion". We should do this more often, with biotech and future fields to come.
OTOH all the VR headsets gathering dust now didn't turn out to be quite as useful as those fiber optic cables. And I'm not sure what will remain after the AI bubble pops except for a massive matrix multiplication overcapacity ;)
I also wouldn't call all the money being funneled into a single technology a "Cambrian Explosion", it's the opposite of that, an organism being propped up that wouldn't survive on its own in a competitive environment.
[0] "millions of ordinary investors watched their retirement accounts and college funds evaporate. The same middle-class Americans who had been told they were foolish not to participate in the ‘new economy’ now faced financial ruin. Teachers’ pension funds were halved. Family savings meant for homes and education vanished"
And pray we don't enter a "lost decade" (which is closer to 30 years, now) like Japan.
I read quotes like this and am reminded that people commonly forget that money is just a competitive resource we use to outbid each other for _real_ things. Money moves around; it isn't lost or "completely vaporized": someone receives it on the other side of the transaction. It is still in circulation and can still be used to outbid people for real things, just by different people.
Also, pets.com still exists, it just forwards to petsmart.com.
But money is an abstraction of wealth, and wealth absolutely can be destroyed, in multiple ways:
1. It can be physically destroyed - if I break a window, that's wealth that is destroyed. That window now needs to be replaced, which costs materials and labor, which could've gone to building something new instead.
2. It can be spent on things that end up not used. If five years from now, those millions of GPUs are no longer in use, we created them for nothing instead of creating more of something people would use.
3. Wealth can be spent on the less important things, rather than the more important things. This is not exactly wealth being destroyed, just built more slowly, because instead of building lots of new wealth (via innovation, say) we're creating less valuable things.
I don't think any of the above are relevant to AI, btw.
That's true, but the thing that's lost is the economic/productive capacity that the money was spent on, that could have been used for other (better) purposes.
For example, if I raise $100mn in a frothy market, and spend it on employing 100 Engineers on $1mn/yr salaries for 1 year before ultimately going bankrupt, it's true that the money doesn't disappear, as it was simply transferred from the VCs to the Engineers, but what's spent/consumed is the Engineers' time. Society can never get those 100 person-years back, and the VCs have to write their capital investment to 0.
The other comments are separately true - money is created by bank borrowing and destroyed by loans being repaid or going bad. Periods of speculation often result in increasing leverage (e.g. borrowing to buy stocks/houses), which does result in the destruction of money when it unwinds (as well as damage to bank's balance sheets, which can become problematic when it happens at a large enough scale - see 2008).
On the other hand, the monetary value of the stock market (and other assets) going up and down does create or destroy "money". From a financial point of view, it's not a zero sum game.
The 2014 doc was a pretty wild read for me when it came out - it changed my perspective quite a bit.
[1]:https://www.bankofengland.co.uk/-/media/boe/files/quarterly-...
[2]: https://www.goodreads.com/book/show/58796370-can-t-we-just-p...
Of course, assuming that this would be the only thing where economic gains come from is already such a laughably bearish vision. It's just that that's all you need for the bubble-thesis to fall flat.
Also whatever LLM productivity gains are currently happening are being massively subsidized. Once companies switch out of lighting money on fire mode most of these products will get dramatically worse and more expensive. Maintaining a cutting edge LLM isn't a railroad that you build once and can run and manage for centuries at a fraction of the initial price, they require constant expensive investment.
If that's true, then we are in a bubble by definition. When AI development eventually stagnates, failing to deliver on these promises, valuations will correct fast (and painfully). What happens then to Nvidia and other hardware companies? And what about the massive AI investments currently propping up the economy [1]? These would also be slashed, messing up the entire supply chain that's gearing up to meet this demand.
While I agree the technology is great and useful, I believe we are in bubble territory. I believe it's unlikely to be as transformative as the CEOs and VCs funding these companies claim.
[1] https://sherwood.news/markets/the-ai-spending-boom-is-eating...
I don't think this is unique; most bubbles historically, as far back as the South Sea bubble, have had a lot of people aware of the irrationality but investing in an attempt to profit from it.
I'd even go so far as to say, this is exactly what makes bubbles so volatile as opposed to normal "market corrections". If the dotcom boom had been all people who really believed they were sensibly evaluating the internet's financial potential, I don't think we'd have seen them jump ship quite so quickly.
I won't predict the future, but another point about historic bubbles: they almost all go on much further than people think they will before collapse.
Things seem to be slowly changing in Japan.
At this point it is wrong to speak about Japan's "lost decade" - it should be "lost decades".
In my opinion, that happens because of the JPY/USD carry trade. This reverses as they raise interest rates (which have been kept artificially low in Japan for decades).
It looks likely it could take all the big software companies with it, and all the big cloud providers. It may well kill most GPU vendors, most datacentre and hosting companies. Industrial-scale LLMs are propping up the entire cloud business, and that itself was bloated and overgrown. SaaS was a mistake. Anything -aaS was a mistake.
I'd _like_ to see this kill off MS, Oracle etc.
Intel is teetering. NVidia is probably screwed. AMD may follow.
There's geopolitics here too. China wants Taiwan and has actively been divesting from Western hardware and software. So has Russia. Lots of Linux growth there: it's free, it works, they can just take it.
And there's rapid climate change too, which is starting to become visible.
Everyone who manufactures in Taiwan may well be doomed. But ditto everyone in the tropics, in the newer tech centres: Malaysia, Thailand, etc.
Everyone who gets chips from Taiwan is probably screwed. Everyone who assembles in PacRim and SE Asia too.
That will take down most Western companies.
Apple might weather it: it sells hardware, and it has its own unique OS family. But others make its hardware for it -- in those areas.
Chinese tech may bloom.
Small scale individual FOSS will be OK.
Stuff reusing legacy tech, that can run on old kit.
Everyone's deprecating x86-32. That may bite them hard.
Everything dependent on virtual stuff and public cloud, everything dependent on K8s and remote datacentres, everything you can't run locally on kit you own that sits in premises you own.
That includes a lot of the games industry.
Non-commercial OSes will be OK.
Bad times for RHEL and the clones. Bad times for SUSE and maybe Canonical.
Maybe OK for Debian. Good for Arch & Alpine & Slackware.
Stuff that needs GPUs, bad. Stuff that works fine in standard def on CPU graphics, good.
But I am just indulging my own biases and skepticism here, I freely admit.
;-)
I'm not so sure I want MS to go away though. While I'm not a fan of MS and I've never even really used a Windows computer beside it being mandated in school and university, I don't want Windows to go away.
When the masses come to Linux, it will simply become worse and locked down. There are already enough invested companies who will want to do so as soon as it becomes feasible and enough influenceable users are on Linux.
Windows is better than macOS in terms of user control, so I also don't want the masses to go to macOS.
What I would welcome is, Desktop Linux to not be a rounding error and the OS market to become more competitive, so that MS needs to improve Windows and can't ship whatever crap of the day they want.
I would have liked the solution proposed by the Department of Justice's original judge in the 1990s anti-monopoly case, though: to split Microsoft into separate companies: one for apps, one for OSes (maybe with development tools). These days, maybe another for public cloud services.
https://en.wikipedia.org/wiki/United_States_v._Microsoft_Cor....
The first judge, Thomas Penfield Jackson, wanted them split up.
https://en.wikipedia.org/wiki/Thomas_Penfield_Jackson
He got booted out and replaced with the much more conciliatory Colleen Kollar-Kotelly.
https://en.wikipedia.org/wiki/Colleen_Kollar-Kotelly
Mostly forgotten now but the fact that this Brit can remember the names of 2 US judges over 28 years later shows how important this case was at the time.
It is why MS bundled Internet Explorer 4 into Windows 98, calling the result "Active Desktop". That flawed broken "Windows Explorer with built-in Web rendering" was the design that the KDE project copied when it created the KDE desktop – as opposed to the much simpler cleaner desktop of Windows 95 and Windows NT 4.
Lawrence Lessig demonstrated this in court: https://www.theregister.com/1999/11/22/who_the_heck_is_lawre...
He managed to remove IE, and show the result still worked fine, demonstrating that Win98 did not need IE.
That in turn led to the app 98Lite, which removes it and which still exists.
https://www.litepc.com/98lite.html
It led to NLite, which works on Win10.
https://www.nliteos.com/
In other words the ripples have not died away. That court case affected the design and implementation of FOSS OSes today built by programmers who hadn't been born when the lawsuit happened.
I don't know exactly why Microsoft chose to combine a browser and the file explorer. Maybe it was solely to keep their monopoly. However exposing the Internet as a file system is not so far fetched. Most protocols (e.g. HTTP, FTP) are about the transmission of files. To me it sounds totally sensible to implement HTTP as a file system driver and then have the "browser" only consist of rendering, without any network features. Honestly that sounds like a really cool idea. It results in a browser that transparently browses from websites hosted on servers to websites hosted on the disk. Saving websites could be really simple. Uploading also.
Passing parameters to a website could be the same, as executing a program. page?foo=bar&baz would be ./page --foo=bar --baz. su -c "systemctl enable --runtime" becomes path://root@/systemctl/enable?runtime . Of course you need a secure sandbox, otherwise you just built remote code execution as a service to anyone.
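A minimal sketch of that query-string-to-argv mapping in Python; the helper name url_to_argv and the exact flag convention are my own assumptions for illustration, not anything from the comment:

    from urllib.parse import urlsplit, parse_qsl

    def url_to_argv(url: str) -> list[str]:
        # Treat the URL path as the program and each query parameter as a flag,
        # so "page?foo=bar&baz" becomes ["./page", "--foo=bar", "--baz"].
        parts = urlsplit(url)
        argv = ["./" + parts.path.lstrip("/")]
        # keep_blank_values=True preserves valueless flags like "baz"
        for key, value in parse_qsl(parts.query, keep_blank_values=True):
            argv.append(f"--{key}={value}" if value else f"--{key}")
        return argv

    print(url_to_argv("page?foo=bar&baz"))  # ['./page', '--foo=bar', '--baz']

And, as the comment already notes, actually executing anything reached this way would need a strict sandbox.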
I can tell you. It was clear at the time but that time was 27-28 years back.
Microsoft didn't like it when anyone else made big money off the PC platform, and it got jealous when anyone started "making bank" from tools that MS didn't offer.
In the early to mid-1990s Netscape made hundreds of millions from its eponymous, industry-leading, rich-media-capable web browser (codenamed "Mozilla".)
Every PC and Mac had Netscape on it. It was proprietary, free for personal and non-commercial use -- but corporates had to pay to license it. And they did. In the early days of the WWW, Netscape was the browser.
Whereas when Windows NT (1993) and Windows 95 (1995) launched, they did not include a web browser at all. Bill Gates' circa 1995 book The Road Ahead barely even mentions the Internet at all.
Instead Win95 came with a client for the proprietary Microsoft Network, which extended the Win95 desktop, called "Explorer", and it also included clients for Microsoft Mail -- right on the desktop -- and Microsoft's proprietary chat protocol.
An optional extra, "Microsoft Plus!", included a fairly poor web browser, bought in from Spyglass and rebadged Internet Explorer.
This was a £40 add on to Win95.
Netscape made a killing. Microsoft got angry and wanted revenge. In one leaked quote it wanted "to knife Netscape in the back."
So it started offering IE as a free download for Windows 3.1, Windows NT, and Windows 95. It started a relatively rapid development programme to improve it. IE 1 was very poor, and IE 2 wasn't much better, but IE 3 was all right.
This angered a lot of people.
1. Competitor launches product
2. Product gets successful, makes lots of money.
3. Microsoft pours money and effort into making a rival product -- OK, fine.
4. MS makes it free, and offers it as a free upgrade for existing users. Not fine.
Then came IE 4. This was launched with a big splash. It was built into the shell.
Instead of simply showing folders, the Explorer now rendered their contents as HTML and used the IE engine to show them. The desktop backdrop was Web content and could change through the day. You could pin a Web bookmarks bar to the Taskbar, or have it floating. There was also a floating list of web thumbnails and shortcuts. The .HLP help file format was replaced with new HTML help. Optionally, icons became web links: name underlined, and opened with a single click.
This new "improved" desktop was offered as a free upgrade for Win95 and NT 4.
And a year or so later the new "Active Desktop" was bundled as part of the new Windows 98.
There is a single functional improvement in AD that was not connected with shoving everything possible through the browser engine: it was multithreaded.
In Win95 and NT4, if you start a file copy or move, the shell locks up until it's finished. You can't use other windows or start programs. In Active Desktop, you can: the file operation trundles along and you can keep working. That is the sole non-Web-related difference.
This kind of behaviour is illegal: it's called "restraint of trade". You can't just make a free rival to a competitor's paid one, bundle your rival app so everyone gets it like it or not, and simply get away with it. That's anti-competitive behaviour. So MS made its developers go out of the way to find ways to integrate IE4 deeply into the Win98 desktop so it could claim it was an OS component and not anti-competitive bundling.
Netscape complained to the US government. The US government acted.
But after years of fighting, MS got off Scot-free. Netscape collapsed and was split up and sold off. AOL got the browser (but used IE as it had a secret back-room deal with MS), Sun got the web server, and the unfinished unreleased next version of the browser was made FOSS and a new non-profit foundation set up and named after the browser's internal codename.
https://www.mozilla.org/
All those bolted-on shell extras to justify the presence of a web browser? That is the design KDE copied, not the original, much smaller and faster Explorer desktop.
> That's anti-competitive behaviour.
Yes, that is much more obviously illegal behaviour than the current iteration of the trend. I envy that previous decade's legal system for that obviousness.
> the shell locks up until it's finished.
You can still have this. If you create a folder in an open dialog, and the Windows Defender kicks in or it's a network mount, the inner window of that dialog freezes. Same for searches sometimes in the Windows Explorer. If you press the Window close button several times, the whole shell (including the taskbar) crashes and restarts.
> the Explorer now rendered their contents as HTML and used the IE engine to show them. The desktop backdrop was Web content and could change through the day. You could pin a Web bookmarks bar to the Taskbar, or have it floating.
I think if it hadn't happened as anti-competitive behaviour, but as a real OS rewrite, it could have been a really good feature. People could write their blog posts/social media posts like normal documents and then upload them by a simple drag-and-drop in the Windows Explorer. I think we would have quite a different web than today's walled gardens. (I know this is easily doable with a text editor, ssh/ftp and a bind mount, but the layman doesn't do this.)
Oh, nice. Just what you want.
There was context at the time of Win98, good and bad.
Jobs had just come back to Apple. He cancelled most of its WIP projects. His other company's OS, NeXTstep, was set to replace classic MacOS.
But he didn't just axe everything, and among other things, he set the classic MacOS developers to salvaging as much as they could from the failed "Copland" project to make a multitasking next-generation classic MacOS.
They took quite a lot of the UI tech from the MacOS "Finder", broadly its desktop.
The result was MacOS 8, and one of the big things in its new desktop, as well as loadable themes and pop-up drawers (which never made it across to OS X) was... multithreaded file operations. This was a huge win in the day: Macs were a bit slow anyway so being able to keep working while the Finder kept copying was a huge boost.
So, that was new out in July 1997. Microsoft was left behind: the Win95 desktop couldn't do that, and neither could the fancy pre-emptive multitasking NT 4, released in 1996.
I do miss the era of bold OS experimentation like that. Now, it's just "how can we embed as many cloud services they'll have to pay for?"
https://en.wikipedia.org/wiki/AI_winter
I've known about this since around 2010 or before; anti-tech Luddites will act like it was never a thing. Shut up.
“When I see a bubble forming, I rush in to buy, adding fuel to the fire,” goes one of George Soros’s well-known quotes. “That is not irrational.”
Perez goes back to the technology of canals:
* https://en.wikipedia.org/wiki/Technological_Revolutions_and_...
People getting excited for something (perceived as) new is part of the human character.
If you're worried about money, this is horrible news. If you just want to get shit done, it's great news, only if you can avoid losing personal agency long enough to survive the crash.
We're headed towards the hockey stick in terms of what people using AI can do. I'm rapidly learning that even ChatGPT5 can get confused, and lose sight of goals, but not in the hallucination variety, just the bog standard way people end up trapped in rabbit holes. I'm learning how to talk to it and get it back on track.
AI really can be productive, but it still needs guidance to be really useful.
The clearest example is in AI-generated visual content. If you dig through what people are doing, it's clear that only a very small percentage of users are actually getting truly high-quality, ready-for-production content, while the rest are just prompting out pure slop. There is a skill level to this that hasn't really permeated the mainstream.
Once that happens, we might see some of that 95% waste figure change to, maybe, 50% waste
The "fearmongering" he is trying to create, can be seen as self-serving, so his opinions should be taken with a very big grain of salt.