Generative AI as Seniority-Biased Technological Change
Source: papers.ssrn.com · Posted 4 months ago · Active 4 months ago
Key topics: AI Adoption, Job Market, Career Development
A Harvard study found that firms adopting generative AI are cutting junior hiring while growing senior roles, sparking debate on whether AI is reshaping the workforce or if economic factors are driving the trend.
Snapshot generated from the HN discussion
Discussion Activity
- Very active discussion
- Peak period: 64 comments in 0-2h
- Average per period: 12.3
- Based on 160 loaded comments
Key moments
- Story posted: Sep 16, 2025 at 9:24 AM EDT (4 months ago)
- First comment: Sep 16, 2025 at 9:24 AM EDT (0s after posting)
- Peak activity: 64 comments in the 0-2h window, the hottest stretch of the conversation
- Latest activity: Sep 17, 2025 at 5:01 PM EDT
ID: 45261930 · Type: story · Last synced: 11/20/2025, 8:56:45 PM
No, I was being sarcastic.
Currently, part of the problem is the taboo around using AI coding tools in undergrad CS programs. And I don't know the answer. But someone will find the right way to teach new/better ways of working with and without generative AI. It may just become second nature to everyone.
That's not true anymore in the smart phone / tablet era.
5-10 years ago my wife had a gig working with college kids and back then they were already unable to forward e-mails and didn't really understand the concept of "files" on a computer. They just sent screenshots and sometimes just lost (like, almost literally) some document they had been working on because they couldn't figure out how to open it back up. I can't imagine it has improved.
Some people don't want to hear that, but...
I just want devs who actually read my PR comments instead of feeding them straight into an LLM and resubmitting the PR.
Pretty sure it's a self-destructive move for a CS or software engineering student to pass foundational courses like discrete math, intro to programming, and algorithms & data structures using an LLM. You can't learn how to write if all you do is read. The LLM will one-shot the homework, and the student just passively reads the code.
On more difficult and open coursework, LLMs seem to work pretty well at assisting students. For example, in the OS course I teach, I usually give students a semester-long project on writing an x86 32-bit kernel from scratch with simple preemptive multitasking. An LLM definitely makes difficult things much more approachable; students can ask it "dumb basic questions" (what is a pointer? an interrupt? a page fault?) without fear of judgement.
But due to the novelty and open-ended nature of the requirements ("toy" file system, no DMA, etc.), playing a slot machine with an LLM just won't cut it. Students need to actually understand what they're trying to achieve, and at that point they can just write the code themselves.
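The preemptive-multitasking part of such a project boils down to a timer interrupt forcing a context switch. As a rough illustration only (the real coursework is x86 kernel code; task names, tick counts, and the quantum here are made up), the round-robin idea can be modeled in a few lines of Python:

```python
from collections import deque

def round_robin(tasks, quantum):
    """Toy model of preemptive multitasking: a timer 'interrupt'
    fires every `quantum` ticks and forces a switch to the next
    runnable task. `tasks` maps task name -> ticks of work left."""
    ready = deque(tasks.items())
    finished = []
    while ready:
        name, remaining = ready.popleft()     # dispatch the next task
        remaining -= min(quantum, remaining)  # run until the timer fires
        if remaining:
            ready.append((name, remaining))   # preempted: back of the queue
        else:
            finished.append(name)             # task exited
    return finished

# Three tasks of different lengths, 2-tick quantum:
print(round_robin({"A": 5, "B": 2, "C": 3}, 2))  # ['B', 'C', 'A']
```

Short tasks finish early while long ones cycle through the ready queue, which is the fairness property the timer interrupt buys you.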
But if they're not hired...?
Kind of like that meme or how two AIs talking to each other spontaneously develop their own coding for communication. The human trappings become extra baggage.
It's pretty hard for a non-big tech company to pay big tech level salaries.
I personally turned down an Apple offer because they required 3 days in office, and went with a much smaller fully remote team.
I think this is the gambit that we have already committed to.
From company interns. Internships won't go away; there will just be fewer of them. For example, some companies will turn down interns because they do not have the time to train them due to project load.
With AI, now employed developers can be picky on whether or not to take on interns.
Cheap labor. It doesn't take that much to train someone to be somewhat useful, in many cases. The main educators are universities and trade schools, not companies.
And if they want more loyalty, they can always provide more incentives for juniors to stay longer.
At least in my bubble it's astonishing how it's almost never worth it to stay at a company. You'd likely get overlooked for promotions and salary rises are almost insultingly low.
You get a lot in the interim!!! I started at Andersen Consulting (now Accenture.) The annual attrition was ~20%, but they still invested over a year of training into me.
But it worked:
- They needed grunt work in early years (me, working 75hr billable weeks). Not sure how much of this is viable now given LLMs
- They had great margins on the other four years. Not sure how much of this is viable now, as margins have shrunk in the past 25yrs as there is way more competition
- They used me to train the next cohort in years 4/5
- I appreciated the training and gave them 60hr billable weeks on average for five years
It was a brutal and exhausting five years, but I'm forever thankful to Andersen Consulting/Accenture for the experience.
It's like how the generic "we take anyone" online security degree has poisoned that market -- nothing but hordes of entry-level goobers, but no real heavy hitters on the mid-to-high end. Put another way, the market is tight but there are still reasonable options for seniors.
Then again, we live under capitalism.
Take the software development sector as example: if we replace junior devs by AI coding agents and put senior devs to review the agent's work, how will we produce more seniors (with wide experience in the sector) if the juniors are not coding anymore?
Other than that, I guess developing software in some capacity while doing a non-strictly-software job - say, in accounting, marketing, healthcare, etc. This might not be a relevant number of people if 'vibe coding' takes hold and the fundamentals are not learned or are ignored by these accountants, marketers, healthcare workers, etc.
If that is the case, we'd have a lot of 'informed beginners' with 10+ years of experience tangentially related to software.
Edit: As a result of the above, we might see an un-ironic return to the 'learn to code' mantra in the following years. Perhaps now qualified 'learn to -actually- code'? I'd wager a dollar on that discourse popping up in ~5 years time if the trend of not hiring junior devs continues.
I'm half-joking, but I wouldn't be surprised to see all sorts of counterpoint marketing come into play. Maybe throw in a weird traditional bent to it?
> (Pretentious, douche company): Free-range programming, the way programming was meant to be done; with the human touch!
All-in-all, I already feel severely grossed out any time a business I interact with introduces any kind of LLM chatbot shtick and I have to move away from their services; I could genuinely see people deriving a greater disdain for the fad than there already is.
The question is... is this based on existing capability of LLMs to do these jobs? Or are companies doing this on the expectation that AI is advanced enough to pick up the slack?
I have observed a disconnect in which management is typically far more optimistic about AI being capable of performing a specific task than are the workers who currently perform that task.
And to what extent is AI-related job cutting just an excuse for what management would want to do anyway?
6-12 months in, if the AI bet doesn't pay off, they just stop spending money on it: cancel/don't renew contracts and move some teams around.
For full-time entry hires, we typically don't see meaningful positive productivity (their cost is less than what they produce) for 6-8 months. Additionally, entry-level hires take time away from senior folks, reducing their productivity. And if you need to cut payroll cost, it's far more complicated, and worse for morale, than just cutting AI spend.
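That 6-8 month break-even can be sketched with a toy model. All numbers here are hypothetical, and the linear ramp is a simplifying assumption, not data from the comment:

```python
def first_profitable_month(monthly_cost, full_output, ramp_months):
    """First month in which the hire's monthly output exceeds their
    monthly cost, assuming output ramps linearly from zero up to
    full_output over ramp_months. Returns None if it never does."""
    for month in range(1, ramp_months + 1):
        if full_output * month / ramp_months >= monthly_cost:
            return month
    return ramp_months + 1 if full_output >= monthly_cost else None

# Hypothetical: cost of 1.0 unit/month, ramping toward 1.5 units
# of output over a 10-month ramp:
print(first_profitable_month(1.0, 1.5, 10))  # 7 -- inside the 6-8 month window
```

Note this only finds the month output first exceeds cost; recouping the cumulative deficit from the early months takes longer still, which is the commenter's point about entry hires being a slow investment.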
So given the above, plus an economy that seems pre-recession (or already in one, according to some leading indicators), it seems best to wait or hire very cautiously for the next 6-8 months at least.
I think it's more to do with the outsourcing. Software is going the same way as manufacturing jobs. Automation hurts a little, but outsourcing kills.
I can imagine that there were a decent number of execs who tried chatgpt, made some outlandish predictions and based some hiring decisions upon those predictions though.
This paper looks kinda trashy - confusing correlation with causation and clickbaity.
From https://news.ycombinator.com/item?id=45131866 :
> In 2017 Trump made businesses have to amortize these [R&D] expenses over 5 years instead of deducting them, starting in 2022 (it is common for an administration to write laws that will only have a negative effect after they're gone). This move wrecked the R&D tax credit. Many US businesses stopped claiming R&D tax credits entirely as a result. Others had surprise tax bills.
Then companies bought their own stock instead of investing in labor:
"S&P 500 Buybacks Now Outpace All R&D Spending in the US" (2019) https://news.ycombinator.com/item?id=21762582
People just want the same R&D tax incentives back:
"Tell HN: Help restore the tax deduction for software dev in the US (Section 174)" (2025 (2439 points)) https://news.ycombinator.com/item?id=44226145
For the tech side, we've reduced behavioral questions and created an interview that allows people to use cursor, LLMs, etc. in the interview - that way, it's impossible to cheat.
We have folks build a feature on a fake code base. Unfortunately, more junior folks now seem to struggle a lot more with this problem.
The other part is that you can absolutely tell during a live interview when someone is using an LLM to answer.
It slices through the bullshit fast. Either the person I'm interviewing is a passionate problem solver, and will be tripping over themselves to describe whatever oddball thing they've been working on, or they're either a charlatan or simply not cut out for the work. My sneaking suspicion is that we could achieve similar levels of success in hiring for entry level positions at my current company if we cut out literally the entirety of the rest of the interviews, asked that one question, and hired the first person to answer well.
companies must do this, 'cause if they don't then their competition will (i.e. the pressure)
of course, we can collectively decide to equally value labor and profit, as a symbiotic relationship that incentivizes long term prosperity. but where's the memes in that
This is what I see, but want to hear others' experiences.
When you constrict the market like they have done, you naturally get distortions, and the adversarial nature of the market fails to perform economic calculation potentially leading to grave consequences. Even at this point, whipsaws would be quite destructive. I know people who have abandoned their careers due to lack of job availability for the foreseeable future. They were searching for years with no recovery.
When you destroy a pipeline of career development for short-term profit which is possible because of the decoupled nature of money-printing/credit facility, decisions made are psychologically sticky. When there is no economic benefit for competency, the smart people leave for a market where this is possible.
The smart people I know right now are quietly preparing for socio-economic collapse as a result of runaway money-printing. When you have a runaway, dangerous machine, you can either step back and let it destroy itself (isolating yourself), or you can accelerate the breakdown of its dependent cycles. Many choose the former, since the latter carries existential risk for no real benefit in the short term, even though the latter would result in the least amount of casualties.
70% of the economy covers the white-collar market, which will be gone soon, not because the jobs can be replaced by AI, but because the business leaders in consolidated industries have all decided to replace workers, becoming the instruments of their own demise through deflationary economics.
Preserve the dependencies that promote critical thinking, rational thought, and measured reason.
Aside from that, figure out what you need for survival beyond the basics in a non-permissive environment without a rule of law; and build a community of people that have these skills. The lone wolf always dies, the pack survives.
Austere Medicine, Food Preservation, Guns/Self Defense, Information Recon, etc... Map dependencies, costs, and yields (information that is largely kept confidential these days).
Learn how to make stuff from scratch, and learn which areas you can leapfrog given knowledge of how those technologies developed in their industries.
Most of the materials we rely on today would not be available because we rely on knowledge of chemistry which few have.
The actual factors that determine the Wealth of Nations are based in the ability of the individual citizen to be self-sufficient and produce necessities from scratch that will last without excessive recurring cost, through a division of labor. Adam Smith's book covers these factors in great detail, and while LTV is now defunct/refuted, the cost portion implicit in the making of intermediate goods resulting from such valuation/economics is not, and remains valid.
Debt-based money-printing, on the other hand, is all about distorting the market, allowing nationalized industry to outcompete legitimate companies through slave labor under financial engineering. The debasement stolen from everyone holding the currency is in fact slave labor. A runaway positive-feedback system that destroys everything it touches (eventually), tainted by Ahriman.
Even if a company has somehow managed to successfully replace all human labor with AI and fire 100% of its human workforce, the revenue wouldn't necessarily spike.
I have always kept that latter point to myself at work - the next generation have to learn somewhere, and training them can be a pleasure (depending on the person) - but even quite good folk with 4-5 years of experience need (what feels like) a lot of assistance to understand/follow the architecture laid out for them.
I don’t use AI to help me code because IME it’s no better than an enthusiastic junior dev who doesn’t learn, and helping them learn is the main reason I might want to work with such a dev.
- Generative AI is genuinely capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is convincing people who make hiring decisions that it is capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is being used as an excuse to not hire during an economic downturn.
Could even still be other things too.
I think it's the expectation. We have some publicized examples of needing to hire people back. My company isn't laying off, but they also aren't hiring much, especially at the entry level. They're also trying to force attrition with PIPs and stuff. If it was a right now thing, they'd just layoff.
Also, just because coding gets 30% faster (say some studies) doesn't mean that's a 30% reduction in headcount, since many tasks are about design and other non-coding work. This seems to further point to the lack of hiring being anticipatory, in my estimation.
So companies reduce junior hiring because their work is relatively less valuable, and they can meet their goals by shuffling existing resources. When they can't do that, they go for seniors since the immediate bang for the buck is higher (ofc, while depleting the global pipeline that actually produces seniors in the long run, in a typical tragedy of the commons)
To really test the implied theory that using AI enables cutting junior hiring, we need to see it in a better economy, in otherwise growing companies, or with some kind of control (though not sure how this would really be possible).
I'm not disputing your point, but I'm curious: given that the main headline measures that we tend to see about the US economy right now involve the labour market. How do you establish the counterfactual?
I think we might be seeing this now but headlines get more clicks with AI taking our jobs.
Note that last time round it took the media a year or so to _notice_; it didn't immediately become clear what was going on.
When more than half of the population has a 5% or higher mortgage rate then you can start to say it’s not that high.
https://calculatedrisk.substack.com/p/fhfas-national-mortgag...
They are the world’s largest asset manager. So goes the economy, so goes the economic outcome of their clients.
You’re not “investing” in anyone if their tenure is going to be 2-3 years with the first one doing negative work.
And why should juniors stay? Because of salary compression and inversion, where HR determines raises but the free market determines comp for new employees, it makes sense for them to jump ship to make more money. I've seen this at every company I've worked for, from startups, to mid-size companies, to boring old enterprise companies, to BigTech.
Where even managers can’t fight for employees to get raises at market rates. But they can get an open req to pay at market rates when that employee leaves.
And who is incentivized to care about "the organization" when line-level managers and even directors are incentivized to care about the next quarter to the next year?
My broader point is that when these short-term incentives dominate, organizations (and societies) lose the capacity to build for the long term. That’s exactly why governance frameworks matter: they help create safeguards against purely short-term dynamics — whether in HR policy or in AI policy.
Everything is short term. Just look at the equity market - it's all pricing, not intrinsic valuation based on forecasting cash flows out in perpetuity.
Folks need to wake up to this realisation and just accept it as a flaw of the system we operate in. Until the system is revised and redesigned, it's not gonna change.
But that’s exactly why we need governance frameworks: markets alone won’t correct for long-term stability. Well-designed institutions can act as the counterweight — whether in finance or in AI policy.
I guess you'd need to trust the company, which is hard to come by.
That’s why governance frameworks (whether in labor or in AI) matter: they provide external guarantees of trust where bilateral promises may not hold.
This is cause for government intervention.
>Practically the whole planet is experiencing population decline now.
That's objectively false.
Your statement about populations declining is just untrue.
That world was 30 years ago. In 2025 world average total fertility rate is 2.2, which is a shade above replacement rate (2.1). And 2.2 is a 10% drop since 2017 alone (when it was 2.46).
Because life expectancy is higher, the population will continue to increase. But not "rapidly".
[1]: https://www.youtube.com/watch?v=f7_e_A_vFnk
Many of the largest countries are experiencing similar declines, with fewer and fewer countries maintaining large birth rates.
Young people are cheap and they love AI!
I’m old too - 51. But I consistently tell young people to chase money. Target BigTech or adjacent companies, “grind leetCode”, avoid startups and Monopoly money “equity”, etc.
One person I mentored as an intern in 2021, and again when they came back the next year, is now 25 years old making $220K (as a Solution Architect, not a developer), and I couldn't be happier for them. They make the same as I make now. But I've already raised two (step)kids, bought and sold the big house in the burbs, etc., and love working remotely.
I told them to do whatever it takes to be seen, promote themselves, build a network internally and with clients and make all the money they can.
And the entire time I'm watching this I'm just thinking that they don't realize that they are only demonstrating the tools that are going to replace their own jobs. Kinda sad, really. Demand for soft skills and creatives is going to continue to decline.
Dev jobs too.
I personally think we're still a ways from the latter...
In the late 90s you were considered a prodigy if you understood how to use a search engine. I had so many opportunities simply because I could find and retain information.
So LLMs have solved this. Knowing a framework or being able to create apps is not a marketable skill any longer. What are we supposed to do now?
It’s the soft skills that matter now. Being well liked has always been more important in a job than being the best at it. We all know that engineer who knows they are hot shit but everyone avoids because they are insufferable.
Those marketing people don’t need to spend a week on their deck any longer. They can work the customer relationship now.
Knowing how to iterate with an LLM to give the customer exactly what they need is the valuable skill now.
Entry-level jobs get "hollowed out" in a stagnant economy regardless of "AI".
AI = not hiring because there's no new work, but spinning it as "AI". Markets are hungry for any utterance of the word AI from the CEO.
So ridiculous. But we've collectively decided to ignore the BS as long as we can scam each other and pray we're not the last one holding the bag.
You have to somehow have the discipline to avoid getting caught up in the noise until the hype starts to fade away.
Is this a case of "correlation does not imply causation?"
28 more comments available on Hacker News