Generation X May Be the First to Need a Universal Basic Income
Posted 3 months ago · Active 3 months ago
thehill.com · Other · story
Tone: calm / mixed
Debate: 40/100
Key topics: Universal Basic Income · Generation X · Job Security · AI Impact
An op-ed discusses how Generation X may be the first generation to need a universal basic income due to AI-driven job insecurity, sparking a thoughtful discussion among HN users.
Snapshot generated from the HN discussion
Discussion Activity
First comment: 36m after posting
Peak period: 14 comments in 0-2h
Avg per period: 4.6
Comment distribution: 23 data points (based on 23 loaded comments)
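The per-period average above is just comment count divided by the number of active 2-hour buckets. A minimal sketch of that bucketing; the comment timestamps here are hypothetical (the snapshot only reports the aggregates), chosen so the totals match the 23 comments, the 14-comment 0-2h peak, and the 4.6 average:

```python
from collections import Counter

# Hypothetical comment offsets in hours after posting (real per-comment
# times are not in the snapshot); 23 comments total.
offsets = [0.5] * 14 + [2.1, 2.8, 3.3, 4.4, 5.0, 6.2, 7.1, 26.0, 26.8]

# Bucket into 2-hour periods, then derive the widget's stats.
buckets = Counter(int(h // 2) for h in offsets)
peak_period, peak_count = max(buckets.items(), key=lambda kv: kv[1])
avg = len(offsets) / len(buckets)

print(f"peak: {peak_count} comments in {peak_period * 2}-{peak_period * 2 + 2}h")
print(f"avg per active period: {avg:.1f}")
```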
Key moments
1. Story posted: Oct 6, 2025 at 8:51 AM EDT (3 months ago)
2. First comment: Oct 6, 2025 at 9:27 AM EDT (36m after posting)
3. Peak activity: 14 comments in 0-2h, the hottest window of the conversation
4. Latest activity: Oct 7, 2025 at 11:19 AM EDT (3 months ago)
ID: 45490825 · Type: story · Last synced: 11/20/2025, 1:45:02 PM
I find this so odd. Is the idea that there is some imminent threshold at which AI will simply stop improving?
If AI keeps improving at the current pace, the disappearance of all jobs is the minimum floor of change.
Even if I accept your premise (AI continues to improve at the existing pace), I don't see how AI is going to replace, say, welders, in a year. Or even bankers. (Yeah, an AI might be able to do the financial part. But bankers - as opposed to tellers - are about relationships with customers, and the AI isn't going to substitute there.)
1. The exponential curve of AI progress
2. The lack of any mechanism that will immediately wall off AI progress at this specific exponential part of the curve
>I don't see how AI is going to replace, say, welders, in a year.
This article is about white-collar jobs. However, if human-level intelligence (let alone surpassing human-level intelligence) just becomes a matter of compute, robotics will quickly follow. That being said, I'd expect this to take years, not a year.
> But bankers - as opposed to tellers - are about relationship with customers, and the AI isn't going to substitute there
Are you sure about that? Even GPT-4o is an example of how hungry people are for the relationships these AIs present.
AI seems to need more and more power and expense thrown at it to get from, say, a 70% answer to an 80% one, then just as much again to reach 85%, then just as much again to reach 87.5%. Speed of progress in the lower percentages is not an indicator of speed in the higher percentages.
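The pattern this comment describes (each equal unit of spend buying only half the previous gain) is geometric convergence toward a ceiling. A toy model; the 90% ceiling and the unit-of-spend framing are purely illustrative assumptions, chosen so the 70/80/85/87.5 sequence falls out:

```python
# Toy diminishing-returns model: each equal unit of spend halves the
# remaining gap to an assumed 90% accuracy ceiling.
CEILING = 90.0
START = 70.0

def accuracy(units_of_spend: int) -> float:
    """Accuracy after spending the given number of equal-cost increments."""
    return CEILING - (CEILING - START) * 0.5 ** units_of_spend

for n in range(4):
    print(n, accuracy(n))  # 0 70.0, 1 80.0, 2 85.0, 3 87.5
```

Under this model no amount of spend ever crosses the ceiling, which is the commenter's point: early progress per dollar says nothing about late progress per dollar.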
Completely irrelevant.
Just because something happened in the past, doesn't mean it will happen in the future. Especially when you're talking with vague generalities (like you are) about something where the devil is in the details.
And then there are the personal effects, which comments like yours completely fail to address. If you lose your job and end up working at Walmart for 10% of the salary for the rest of your career, while some other guy someplace else gets a new job that pays 75% of your old one, is that a good outcome? People aren't fungible, especially when thinking about themselves.
> Just because something happened in the past, doesn't mean it will happen in the future
Say no more, let's ignore what always happened and go in the opposite way.
Oh come on man. The tech companies hope to create a technology that will automate the human intellect. I hope they fail, but we have to take the goal seriously. Don't talk to me about steam boats.
>> Just because something happened in the past, doesn't mean it will happen in the future
> Say no more, let's ignore what always happened and go in the opposite way.
How about you try to think about this situation, this technology, instead of essentially cargo culting and superficially reasoning "this is a technology, and that was a technology 100 years ago; both are technologies, so they'll both play out the same."
You're kinda missing the point. I suppose what I propose would be a "basic income scheme," but not UBI ("Universal basic income (UBI)[note 1] is a social welfare proposal in which all citizens of a given population regularly receive a minimum income..."). I consider the essential feature of UBI that most of the profits from AI are reserved for the billionaires and other shareholders who own those systems, and only the minimum amount needed to prevent the extreme desperation that can lead to revolution is socialized.
I'm proposing all the profits from AI be socialized, and the billionaires get no more than you or I.
https://en.wikipedia.org/wiki/Social_credit