Did People in the 90s Worry About the Efficiency of the Internet?
So I’m confused about what the future will look like. If this level of efficiency compounds, even for a few years, we would be required to spend a compounding amount of money to match it, right? The alternative is that we move to a 4 (or 3?) day work week, or UBI, or what? If we don’t match the spending, companies will consolidate - both in terms of personnel and competition.
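The compounding worry above can be made concrete with a toy calculation. The 30% annual efficiency gain below is a made-up number for illustration, not a claim about actual AI-driven gains:

```python
# Toy illustration of the compounding worry in the post; the 30% annual
# efficiency gain (r = 0.30) is a hypothetical rate, not real data.
# If rivals get (1 + r)x more efficient each year, a firm that doesn't
# adopt the same tools must scale its spending by (1 + r)**t to keep pace.

def spend_multiplier(r: float, years: int) -> float:
    """Spending multiplier needed to match a rival whose
    efficiency compounds at rate r per year for `years` years."""
    return (1 + r) ** years

for t in range(1, 6):
    print(f"year {t}: {spend_multiplier(0.30, t):.2f}x baseline spend")
```

Even at modest assumed rates, the required spending roughly triples within five years, which is the "compounding amount of money" the question is pointing at.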
What is going to happen? What is next? Was there any concept similar to this 30 years ago and I’m just worried for no reason?
The author is concerned about the potential impact of AI on productivity and the economy, drawing parallels with the emergence of the internet in the 90s, and is seeking perspectives on what the future might hold.
Snapshot generated from the HN discussion
Discussion Activity
Moderate engagement
- First comment: 7m after posting
- Peak period: 8 comments in 0-6h
- Avg / period: 3.2
Based on 16 loaded comments
Key moments
- Story posted: Oct 21, 2025 at 8:10 PM EDT (3 months ago)
- First comment: Oct 21, 2025 at 8:17 PM EDT (7m after posting)
- Peak activity: 8 comments in 0-6h (hottest window of the conversation)
- Latest activity: Oct 24, 2025 at 8:11 AM EDT (3 months ago)
Before investment in IT became widespread, the expected return on investment in terms of productivity was 3-4%. This average rate developed from the mechanization/automation of the farm and factory sectors. With IT though, the normal return on investment was only 1% from the 1970s to the early 1990s.
https://en.wikipedia.org/wiki/Productivity_paradox
Measurement or Management?: Revisiting the Productivity Paradox of Information Technology. http://www.diw.de/documents/publikationen/73/38739/v_00_4_9....
Then came the 2000s-to-2020s productivity slowdown, aka productivity paradox 2.0. https://en.wikipedia.org/wiki/Productivity_paradox#2000_to_2...
Increased automation has always led to jobs and professions going extinct. We don't have typists or lamplighters anymore. It hasn't been a problem before because those people generally just found new work.
The present level of hysteria really is unprecedented. Coupled with that, the very explicit goals of AI are to remove jobs almost everywhere, forever. Paying people to work is now unfashionable.
So jobs being automated has always been a thing, people have always worried about it and there was always someone opining about it in newspapers. The world moved on because we had a functional labor market. The current cycle is absolutely hysterically over-hyped. The end result won't be nearly as catastrophic for the labor market as Altman et al want you to believe. Mainly because their AI goals are almost certainly unattainable. However, the damage to the economy and labor market is and will continue to be quite severe.
It won't be as bad as everyone wants you to think, but it's still gonna suck
Is there a practical difference? In either case we are likely out of a job
Were you alive when lamplighters or whalers or typists were put out of business?
Or is it just fun to say that it just so happens that the peak of hysteria is right this very minute?
At least there was a lot of hope that digitizing a lot of things would be faster and more convenient.
Perhaps secretaries, postal workers, and travel agents should have been worried but I think most industries did not see the Internet as a threat.
Waiting weeks for mail order items or using a paper card catalog in a library weren’t fun.
At the same time, I don’t think people were expecting shopping malls to decline as they have. Buying shoes and clothes online would have seemed absurd.
In the movie “The Net”, the idea of ordering food delivery over the Internet seemed a bit far-fetched at the time — it was very early days for online commerce.
What’s going to happen next will probably more closely resemble the economic crash of the early 20th century: the population goes through a subprime-style debt cutoff for the 25% of U.S. households who can’t afford 1-bedroom rent, starving a lot of corporations of the workers and buyers who are propping up “run it until the well is dry” businesses (after their workers’ cars are repossessed and there’s no unprofitable public transit to compensate).
Much of this era’s political exploitation is under the banner “privatization”, which is simply trying to open up more markets that can be harvested until the field is barren, as otherwise they’d have to invest in the ones that have already been destroyed. So look for government-regulated monopolies that cannot be run at a profit — as an example, the postal mail has been a multi-decade quest for this outcome. Privatized sewers and roadway services (planning, paving, painting, signaling) come to mind; imagine how much profit a company could generate by forcing an entire city to toll roads in order to extract the most profit for the rich owners, etc.
In the early 1970s, I read atomic energy books written in the late 50s to early 60s. Widespread fission or even fusion production would make electricity too cheap to meter in 10 or 20 years. The prediction back then was for a 3-day work week real soon. The vibe was very positive, though: people would still have jobs, they'd pay well, but everyone would work less and have more leisure time.
If this cycle is anything like the dot com cycle, there will be billions of dollars in capital invested in AI “stuff”. Data centers, various LLMs, derivatives of LLMs, shells of derivatives of LLMs, and other tangential things that claim to be AI. Eventually some anal retentive shareholder activist will ask some pretty basic questions about return on investment, the wisdom of investing so much in capital that depreciates rapidly, the actual value of all of this.
Truth be told, a lot of the predictions from the peak dot com era came true, it just took another decade of technology development and the widespread deployment of broadband. The hype cycle inevitably outpaces the market reality by several years, even if elements of it are true.
And a lot of the “efficiencies” of moving commerce online simply got appropriated by new middlemen. Amazon, Google, Apple each take their transactional vigs. Hard to argue that the current advertising supported media market is efficient when the most successful sites have to meter access to content with subscriptions (and chum ads that burn your CPU).
UBI? Not going to happen in an allegedly capitalist society like the US. We're all temporarily embarrassed millionaires who resent paying anything to support someone else's lifestyle. Far more likely to eliminate entire categories of jobs and careers.
It’s curious to me that the investor class will pour billions of dollars into “AI” over the coming years seeking to replace labor costs instead of investing in improving the efficiency of the existing labor pool. In some ways this is like the outsourcing/offshoring rager the investor class had over the past thirty years (that was the thing people should have worried about in the 1990s but did not). In the goal to shave pennies per share of costs and juice market returns we wiped out entire job categories and industries in the US. Sure, we got cheaper devices and other manufactured goods, but ignored the social costs.
So, what will happen next? It’s a big muddle. If you’ve spent billions investing in various LLM processing systems, can you reasonably expect to generate revenues and profits from the very people who are now unemployed or underemployed due to the very LLMs/AIs/algorithms you’ve invested in?
“Claims and counterclaims about the likely effects of computers on work in America had also abounded since Wiener. Would the machines put enormous numbers of people out of work? Or would they actually increase levels of employment? By the late seventies, it appeared, they had done neither.”
The 90s web was essentially dialup, so people were hell-bent on keeping designs simple and file sizes small.
When consumer high speed became a thing, the opposite happened: it was extravagant to have tons of content.
Many standards were up in the air, and web development had several privatized niches that led to dead-end projects. The challenge was to find one that was cheap to license, worked on your platforms of choice, and that your audience would use.
A lot of stuff was thrown at the wall to see if it would stick, and didn't.
Worrying about efficiency was not a prime factor; getting something that worked and was stable was more important.
I guess you could draw a parallel with AI now: a lot of the work is figuring out what to do with it and making its output consistent and reliable long term (cheaply), which is probably more on people's minds than making it efficient.
Now those problems are gone. The big problem now is people. Most people writing the code to deliver content online are overpaid and grossly incapable. Maybe if you give them 150 MB in dependencies, 10x more time, and lots of false praise they can deliver a few lines of text using a giant framework.
What’s next in the coming 30 years? Putting text on screen in a web browser is becoming far too expensive. Businesses will have to consider their options, like alternate formats. One thing we do know is that they will not train their employees unless all other options are exhausted.
Those things are not practical. Human beings are designed to work. Before the Industrial Era, it was all hard labor. Then came the machines and it became "hard labor with machines". Then came the Internet era and it became "work with computers". Now with AI, it will just be "work with AI". The number of hours/days will remain the same. Employers will still expect you to work 5 days a week if you have a job. It is just that they will expect your output to be 50-100x higher using AI.