AI Is Wiping Out Entry-Level Tech Jobs, Leaving Graduates Stranded
Key topics
The notion that AI is decimating entry-level tech jobs has sparked a lively debate, with some commenters questioning whether the tech industry is simply slowing down, masking its stagnation behind AI advancements. While some pointed to LLMs, Claude, and Stable Diffusion as revolutionary technologies, others countered that these aren't enough to offset the perceived drought of innovative breakthroughs. A few commenters chimed in with examples of exciting emerging tech, such as Neuralink, quantum computing, and green steel, which reignited the discussion. As the conversation unfolded, it became clear that the impact of AI on the job market is complex, with some benefiting as consumers of code, while others struggle to adapt.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 2m after posting
Peak period: 139 comments (0-12h)
Avg / period: 32 comments
Based on 160 loaded comments
Key moments
- 01 Story posted: Dec 16, 2025 at 12:37 PM EST (17 days ago)
- 02 First comment: Dec 16, 2025 at 12:39 PM EST (2m after posting)
- 03 Peak activity: 139 comments in the 0-12h window, the hottest stretch of the conversation
- 04 Latest activity: Dec 21, 2025 at 2:39 PM EST (12 days ago)
Also what does three prove? Is three supposed to be a benchmark of some kind?
I would wager every year there are dozens, probably hundreds, of novel technologies being successfully commercialized. The rate is exponentially increasing.
New procedural generation methods for designing parking garages.
New manufacturing approaches for fuselage assembly of aircraft.
New cold-rolled steel shaping and folding methods.
New solid state battery assembly methods.
New drug discovery and testing methods.
New mineral refinement processes.
New logistics routing software.
New heat pump designs.
New robotics actuators.
See what I mean?
I would wager we are very far from peak complexity, and as long as complexity keeps increasing there will always be opportunities to do meaningful innovative work.
2. We may be at the peak complexity that our sources of energy will support. (Though the transition to renewables may help with that.)
3. We may be at the peak complexity that humans can stand without too many of them becoming dehumanized by their work. I could see evidence for this one already appearing in society, though I'm not certain that this is the cause.
2. Kardashev? You think we are at peak energy production? Fusion? Do you see energy usage slowing down, speeding up, or staying constant?
3. Is the evidence you're seeing appear in society just evidence you're seeing appear in media? If media is an industry that competes for attention, and the best way to get and keep attention is not telling truth but novel threats + viewpoint validation, could it be that the evidence isn't actually evidence but misinformation? What exactly makes people feel dehumanized? Do you think people felt more or less dehumanized during the great depression and WW2? Do you think the world is more or less complex now than then?
From the points you're making you seem young (maybe early-mid 20s) and I wonder if you feel this way because you're early in your career and haven't experienced what makes work meaningful. In my early career I worked jobs like front-line retail and maintenance. Those jobs were not complex, and they felt dehumanizing. I was not appreciated. The more complex my work has become, the more creative I get to be, the more I'm appreciated for doing it, and the more human I feel. I can't speak for "society" but this has been a strong trend for me. Maybe it's because I work directly for customers and I know the work I do has an impact. Maybe people who are caught up in huge complex companies tossed around doing valueless meaningless work feel dehumanized. That makes sense to me, but I don't think the problem is complexity, I think the problem is getting paid to be performative instead of creating real value for other people. Integrity misalignment. Being paid to act in ways that aren't in alignment with personal values is dehumanizing (literally dissolves our humanity).
I've had meaningful work, and I've enjoyed it. But I'm seeing more and more complexity that doesn't actually add anything, or at least doesn't add enough value to be worth the extra effort to deal with it all. I've seen products get more and more bells and whistles added that fewer and fewer people cared about, even as they made the code more and more complex. I've seen good products with good teams get killed because management didn't think the numbers looked right. (I've seen management mess things up several other ways, too.)
You say "Maybe it's because I work directly for customers and I know the work I do has an impact". And that's real! But see, the more complex things get, the more the work gets fragmented into different specialties, and the (relative) fewer of us work directly with customers, and so the fewer of us get to enjoy that.
Like the ability for computers to generate images/videos/songs so reliably that we are debating whether it is going to ruin human artists... whether you think that is terrible or good, it would be dumb to say "nothing is happening in tech".
https://www.danshapiro.com/blog/2025/12/i-made-the-xkcd-impo...
The xkcd comic is from 11 years ago (September 2014).
Isn’t the sales pitch that they greatly expand accessibility and reduce cost of a variety of valuable work? Ok, so where’s the output? Where’s the fucking beef? Shit’s looking all-bun at the moment, unless you’re into running scams, astroturfing, spammy blogs, or want to make ELIZA your waifu.
2. Copilot for Windows Notepad
3. Copilot for Windows 11 Start Menu
I'm bored.
Nobody, not even Apple, was using the term "Apple Silicon" in 2010.
The first M series Macs shipped November 2020.
seems to translate to a 6.1% unemployment rate and 16.5% underemployment rate?
https://www.finalroundai.com/blog/computer-science-graduates...
Blame the article for using suboptimal numbers, but the "wiping out" part is definitely justified when talking about jobs for graduates
If you click through to new york fed's website, the unemployment figures are 4.8% for "recent college graduates (aged 22-27)", 2.7% for all college graduates, and 4.0% for all workers. That's elevated, but hardly "wiping out".
https://www.signalfire.com/blog/signalfire-state-of-talent-r...
Starting at 2019 and saying "pre-pandemic levels" might be a bit disingenuous since that was a leap to a boom... and the bust we're seeing now.
https://www.cbre.com/insights/articles/tech-boom-interrupted
Yes, tech hiring in 2025 is down from 2019. That's a lot like saying "tech hiring is down from 2000" in 2003. And while 2019 might have been the third-highest year for investment as of 2020, according to this it has since been surpassed in 2021, 2022, and 2024
https://kpmg.com/xx/en/media/press-releases/2025/01/2024-glo...
So why have graduate hires continued to decline since 2023? It seems funds have been diverted from junior hiring into AI investments.
However, as others have remarked, this might be a case of "AI is not taking your jobs, AI investment is taking your jobs"
Junior hiring might pick up again once the spending spree is over
The U.S. has a national security interest in completely stopping all of it. They don't, because every administration is paid not to.
Regulate tech, ban labor export, ban labor import, protect your country from the sellout.
It's not a secret that companies do not want to hire Americans. Americans are expensive and demand too many benefits, like fair pay, healthcare, and vacations. They also are (mostly) at-will. H1B solves all these problems. When that doesn't work, there are 400 Infosys-likes available to export that labor cheaply. We have seen this with several industries, the most prominent recent one being auto manufacturing.
All that matters is that the next quarter's earnings are more than the last. No one hates the American worker more than Americans. Other countries have far better worker protections than we do.
I see no reason H1B couldn't be fixed by imposing a high barrier to entry (a $500k one-time fee) and maintenance ($100k per year), then forcing H1B holders to be paid at the highest bracket in their field. If H1Bs are what their proponents say - necessary for rare talent not found elsewhere - then this fee should be pennies on the value they provide. I also see no reason we can't tax exported labor in a similarly extreme manner. If the labor truly can't be found in America, the high price of that labor in taxes and fees should be dwarfed by its added value.
If high fees and taxes on H1B and exported labor don't make sense, then the only conclusion is that the vast majority of H1Bs and exported labor are not "rare talent" and thus aren't necessary.
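Rough arithmetic on that proposal (the fee figures are from the comment above; the salary is a hypothetical placeholder for "highest bracket" pay):

```python
# Cost of one H-1B hire under the proposed fee schedule.
# Fees are from the comment; the salary is a made-up placeholder.
one_time_fee = 500_000
annual_fee = 100_000
salary = 400_000      # hypothetical top-of-bracket compensation
years = 3             # length of an initial H-1B term

total = one_time_fee + years * (annual_fee + salary)
print(f"3-year cost: ${total:,}")             # $2,000,000
print(f"per year:    ${total / years:,.0f}")  # ~$666,667
# If the hire is truly rare talent, ~$667k/yr should be pennies
# next to the value they provide; if not, the fee acts as a filter.
```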
My initial reaction would be that these people, unfortunately, got scammed, and that the scammers-promising-abundant-high-paying-jobs have now found a convenient scapegoat?
AI has done nothing so far to reduce the backlog of junior developer positions from where I can see, but, yeah, that's all in "Europoor" and "EU residency required" territory, so what do I know...
During COVID we were struggling to retain good developers that just couldn't deal with the full-remote situation[1], and afterwards, there was a lull in recent graduates.
Again, this is from a EU perspective.
[1] While others absolutely thrived, and, yeah, we left them alone after the pandemic restrictions ended...
The post-pandemic tech hiring boom was well documented both at the time and retrospectively. Lots of resources on it available with a quick web search.
So, please elaborate?
Availability heuristic.
Job openings for graduates are significantly down in at least one developed nation: https://www.theguardian.com/money/2025/jun/25/uk-university-...
Was AI also responsible for that market? This seems a bit unsupported.
Plus, that decline seems specious anyway (as in: just about visible when you only observe the top 5% of the chart), and the UK job market has always been very different from the EU they left behind.
No business cares about that question, just like the Once-ler didn't care how many Truffula trees were left.
(i.e. this cynical complaint is exactly the opposite of the cynical complaint about managers/directors engaging in empire building.)
Since this isn't the 1800s anymore, there won't be any major revolutions, but I expect way more societal violence going forward. If you have no hope for the future, it's not hard to go down very dark paths quickly, usually through no fault of your own, sadly.
Now add how easy it is for malicious actors to get an audience and how LLM tech makes this even easier to do. Nice recipe for a powder keg.
what if we all just blame the youth?
I think that might fix the situation
I'm sure they were saying the same thing in the 1800s
Other than that, I am guessing junior roles will move offshore to supply the body shops where the corporate IT work has been going.
AI is sucking up investment and AI hype is making executives stupid. Hundreds of billions of dollars that used to go towards hiring is now going towards data centers. But AI is not doing tech jobs.
These headlines do nothing but increase the hype by pointing towards the wrong cause entirely.
Even agentic computing (i.e. an AI doing anything of its own accord for tech-savvy users, never mind average users) is new this year. I would argue it's still pretty far from widespread. Neither my wife nor my kids, despite my explaining repeatedly, even know what that is, never mind caring.
I'm repeating the mantra from before, and I get that it's not useful. But no, it's not AI wiping out entry-level jobs. It's governments failing to prop up the economy.
On the plus side, this means it can be fixed. However, I very much doubt the current morons in charge are going to ...
I don’t think we’ve seen any net drop in tech jobs on account of LLMs (yet). I actually think they (or rather, spending on projects using them) are countering a drop that was going to happen anyway due to other factors (tightening credit being a huge one; business investment hesitation due to things like batshit crazy and chaotic handling of tariffs; consumer sentiment; etc.)
AI is eating the boring tasks juniors used to grind: data cleaning, basic fixes, report drafts. Companies save cash, skip the ramp-up, and wonder why their mid-level pipeline is drying up. Sarcastic bonus: great for margins, sucks for growing actual talent.
Long term though, this forces everyone to level up faster. Juniors who grok AI oversight instead of rote coding will thrive when the real systems engineering kicks in. Short term pain, massive upside if you adapt.
I will include this thread in the https://hackernewsai.com/ newsletter.
It's like if you waited until college to start learning to play piano, and wonder why you can't get a job when you graduate. You need a lot of time at the keyboard (pun intended) to build those skills.
At the company where I work (one of the FAANGs), there is suddenly a large number of junior IC roles opening up. This despite the trend of the last few years to only hire L5 and above.
My read of the situation:
- junior level jobs were sacrificed as cost cutting measures, to allow larger investment in AI
- some analysts read this as “the junior levels are being automated! Evidence: there is some AI stuff, and there are no junior roles!”
- but it was never true, and now the tide is turning.
I’m not sure I ever heard anybody in my company claim that the dearth of junior openings was due to "we are going to automate the juniors". I think all of that narrative came from external analysts trying to read the tea leaves too hard. And wannabes like Marc Benioff pretending to be tech leaders, but that's a helpful reminder that Benioff is simply "not serious people".
Maybe there was some idea that if AI actually solved software engineering in a few years you wouldn't need any more SWEs. Industry is moving away from that idea this year.
The expectations for juniors, and how seniors work with them, will certainly change, but it's way too early to be making doomsday predictions.
Of course, that's easy for me to say when I'm not the one who just spent thousands of dollars and 4 years of their life to land in an environment where getting a job is going to be challenging, to say the least.
Seems to me the companies are mostly in a holding pattern: sure, if an important project needs more bodies, it's probably okay to hire. I suspect that lots of teams have to make do until further notice.
Are some teams using AI instead of hiring junior engineers? I don't think there's any doubt about that. It's also a trial period to better understand what the value-add is.
Based on listening to engineers on various podcasts, almost all of them describe the current level of AI agents as equivalent to a junior engineer: eager and convinced they know a lot, but with a lot left to learn. But we're getting closer to the point where a well-thought-out Skill [1] can do a pretty convincing job of replacing a junior engineer.
But—at the rate AI is improving, a company that doesn't adopt AI for software engineering will be at a competitive disadvantage compared to its peers.
[1]: https://www.anthropic.com/engineering/equipping-agents-for-t...
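For context, the Skills described at that link are folders containing a SKILL.md file: YAML frontmatter (name, description) followed by markdown instructions the agent loads on demand. A minimal sketch, with an invented skill name and invented steps:

```markdown
---
name: triage-bug-reports
description: Reproduce, bisect, and draft fixes for incoming bug reports. Use when an issue is labeled "bug" and includes reproduction steps.
---

# Triaging bug reports

1. Reproduce the issue locally using the steps in the report.
2. Write a failing regression test before touching any code.
3. If reproduction succeeds on an older tag, bisect recent commits.
4. Draft a minimal fix and open a PR referencing the issue.
```

The point of the format is that the description tells the agent when to pull the skill into context, so the full instructions don't have to live in every prompt.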
The people that comment as such are either so disconnected from the software development process or so bought in on the hype that they are forgetting what the point of a junior role is in the first place.
If you hire a junior and they're exactly as capable as a junior 3 years later (about as long as the current AI wave has lasted), many organizations would consider letting that employee go. The point of hiring a junior is that you get a (relative to the market) cheap investment with a long-term payoff. Within 1-2 years, if they are any good, they will not be very junior any more (depending on domain, of course). There is no such promise or guarantee with AI, and employing an army of junior engineers that can't really "learn" is not a future I want to live in as a mid-career senior-ish person.
Of course, you can say "oh, it'll improve, don't worry" but I live in the present and I simply do not see that. I "employ" a bunch of crappy agents I have to constantly babysit. If I had spent the money on a junior I would only have to babysit for the first little while and then they can be more autonomous.
When this dries up because the market stops investing in junior engineers, you're left with absolutely nothing after a mere handful of years. I worry about this future a lot. Luckily I am in a shop that does not mind hiring them (now is a super good time to find junior talent at a good price too).
Most engineers don't work at FAANG. Most _good_ engineers DONT work at FAANG. FAANG is still composed of almost all good engineers. Most software engineers are NOT _good_.
All of these things are simultaneously true.
Most of your junior engineering hires will never develop to FAANG levels, and as such are never in a position to seriously compete for those FAANG salaries. The vast majority of devs, even in the US, are perfectly adequate (note: not great, adequate) to act as developers for non-FAANG companies at non-FAANG wages. This is the kind of developer universities are churning out at insane rates.
"We’ve seen this act before. When companies are financially stressed, a relatively easy solution is to lay off workers and ask those who are not laid off to work harder and be thankful that they still have jobs. AI is just a convenient excuse for this cost-cutting. "
But a big part of it, to me, is looking at the job data[0]. If you look at dev postings over this period, you can see hiring spiked during the pandemic, peaked in early-to-mid 2022, and is currently lower than in any other industry.
Tech loves booms and busts, with hiring and everything else. But more than anything the tech industry loves optics. The market has rewarded the industry for hiring during the pandemic and in the past year it has rewarded them for laying people off "because AI". And as the new year comes around they'll get rewarded for hiring again as they "accelerate development" even more. Our industry is really good at metric hacking and getting those numbers to keep going up.
I think the problem is we've perverted ("over optimized") the market. You have to constantly have stock growth. The goal is to become the best, but you lose the game by winning. I think a good example of this comes from an article I read a few months ago[1]. It paints AWS in a bad light, but if you pull out the real data you'll see AWS had a greater increase in absolute users than GCloud (you can also estimate this easily from the article). But with the stock market it is better to be the underdog with growth than the status quo with constant income[2]. What a weird way to optimize our businesses. You are rewarded for becoming the best, but you are punished for being the best. It feels like only a matter of time before companies start tanking on purpose because they can't go up anymore, so they need to make room to go up. IDK, does anyone else feel like this is insane?
[0] Go to "Sector" then add "Software Development" to the chart https://data.indeed.com/#/postings
[1] https://www.reuters.com/business/world-at-work/amazon-target...
[2] It doesn't take a genius to figure out you'd have made more money investing $100 in GCloud vs $100 in AWS (in this example). The percentage differential is all that matters, and judging by percentage growth punishes having a large existing userbase. You get double the percentage growth going from 1 user to 100 than from 10 million to 500 million, yet any person who isn't severely mentally incapacitated would conclude the latter is a better business.
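A quick back-of-the-envelope check of that footnote's arithmetic (the user counts are the footnote's own hypotheticals):

```python
# Percentage growth rewards a tiny base far more than huge absolute gains.
def pct_growth(start: int, end: int) -> float:
    """Percentage growth from start to end."""
    return (end - start) / start * 100

small = pct_growth(1, 100)                   # +99 users
large = pct_growth(10_000_000, 500_000_000)  # +490,000,000 users

print(f"1 -> 100:    {small:,.0f}% growth")  # 9,900%
print(f"10M -> 500M: {large:,.0f}% growth")  # 4,900%
print(f"ratio: {small / large:.2f}x")        # ~2x, despite an absolute gain
                                             # ~5 million times smaller
```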
Then we blame the other group of students for not going there and picking majors where the jobs aren’t.
We need some kind of apprenticeship program honestly, or AI will solve the thing entirely and let people follow their honest desires and be able to live reasonably in the world.
I always find it hilarious that people treat transformer tech as a public good. Transformers, like any other tech out there, are owned by large tech companies. Short of forcing the few companies that own the top models to abide by your rules, there is no chance OpenAI is going to give itself up to the government. And even if they did, it means nothing if Microsoft/Amazon/Google/etc do not provide you with the facilities to deploy the model.
A more realistic outcome is that Big Tech will collude with governments to keep a certain autonomy and restrict its use to the elites
I think college is useless for the ones out there who already know how to code, collaborate, and have the other skills the industry is looking for. Many out there are developing high-level projects on GitHub and other places without having any degree.
Also, most of the stuff you learn in college has absolutely no relation to what you will do in the industry.
Sure I learned lots of stuff I've never used. Like relational algebra. But I also learned lots of stuff I use a lot, and it's unlikely I'd have studied most of that stuff on my own. During my degree I also had time and opportunity to pursue lots of other topics outside the mandated course material, you're not limited to what they force you to learn.
So sure if you have the motivation, discipline and resourcefulness to learn all that stuff on your own go right ahead. Most people aren't even close. Most people are much better off with a degree.
In my experience, those who lack these don't have a chance in tech in the first place, so save yourself a lot of debt.
But until then we in the US live in a capitalist hellscape where we have to prioritize survival which means only focusing on marketable skills to get a job. After that one can pay for college once they can afford it if they want that experience for personal enrichment.
I don't think one can seriously argue that. This is as much a meme as anything. I know it's popular to rag on devs writing inefficient software, but there are plenty of apps with functions where a user couldn't possibly notice the difference between O(n^2) and O(1). You wouldn't take the time to make everything O(1) for no speedup just because someone told you that's what good code is; that's just wasting dev time.
In fact, one of the first things you learn is that O(1) can be slower. Constant time is not good if the constant is big and n is small.
I fixed one where a report took 25 minutes to generate; after switching out an O(n^2) list lookup for a dict it took less than 5. Still embarrassingly slow, but a lot better.
There are also a lot of cases where it didn't matter when the dev wrote it and they had 400 rows in the db, but 5 years later there are a lot more rows, so now it matters.
Doesn't cost anything to just use a better algorithm. Usually takes exactly the same amount of time, and even if it is marginally slower at small n values who cares? I don't give a shit about saving nanoseconds. I care about the exponential timewaste that happens when you don't consider what happens when the input grows. For small inputs it doesn't matter what you do. Everything is fast when the input is small. That's why it makes sense to prefer low complexity by default.
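A minimal sketch of the kind of fix described a couple of comments up (the order/customer data is made up; the pattern is the point):

```python
# O(n*m): for each order, scan the whole customer list for a match.
def enrich_slow(orders: list[dict], customers: list[dict]) -> list[dict]:
    out = []
    for order in orders:
        cust = next(c for c in customers if c["id"] == order["customer_id"])
        out.append({**order, "customer_name": cust["name"]})
    return out

# O(n + m): build a dict once, then each lookup is O(1) on average.
def enrich_fast(orders: list[dict], customers: list[dict]) -> list[dict]:
    by_id = {c["id"]: c for c in customers}  # one O(m) pass
    return [
        {**order, "customer_name": by_id[order["customer_id"]]["name"]}
        for order in orders
    ]
```

Same output, barely more code; at 400 rows you'd never notice the difference, at a few hundred thousand it's the difference between 25 minutes and 5.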
There is almost no reason not to delegate the work, especially low-level grunt work.
People disputing this are either in denial or lacking the skill set to leverage AI.
One or two more Opus releases from Anthropic and this field is cooked
If you have the skills to divide large projects into small tasks, just like we did before AI, you can feed most of the tasks to AI now.
Frontend, backend, animations, design, infra, distributed systems engineering, networking.
Really weird.
Now people can just search stack overflow quicker for the wrong answer, and even more confidently than ever before.
Is it possible that your job is simply not that difficult to begin with?
An inexperienced junior engineer delegating all their work to an LLM is an absolute recipe for disaster, both for the coworkers and product. Code reviews take at least 3x as long. They cannot justify their decisions because the decisions aren't theirs. I've seen it first hand.
I can't think of an example where an LLM will get in the way of the 90% of stuff people do. The 10% will always be bespoke and need a human to drive it forward, as they are the ones who create demand for the code / work etc.
Enterprise software is a different beast - large, fragile [quasi]monoliths; good luck getting [current] AI to make meaningful fixes and/or develop features in it. And even if AI manages to speed up actual development multiple times, the impact would still be small, as actual development takes a relatively small share of overall work in enterprise software. Of course it will come here too, just somewhat later than at places like FAANG.
"Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India" https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...
Companies don't want to pay US salaries, the cost of living in the US is not going down, and engineering talent in India is cheaper: you can hire 2 devs for the cost of 1 US dev. Why would you ever have any US engineering devs?
It won't change organically unless the cost of Indian engineers goes up or the cost of US engineers goes down.
Who has more control over government, the people or the 0.0001%? There is no "US", you are not part of the club.
Since the workers are hired for cost over quality, they're typically incompetent. Though many have learned to parasitize SME and support staff expertise by asking highly specific questions in an extended sequence. It's a salami-slicing strategy where the majority of the work ends up being performed by those SMEs and support staff while the incompetent workers collect the paychecks and credit. I'm pushing my teams to more aggressively identify and call out this behavior, but it's so systemic that it's an endless battle with every new project coming in the door.
Personal frustrations aside, it's very dangerous from both economic and national security perspectives for India to be building and administering so much of the West's IT infrastructure. Our entire economy depends on it, yet we're voluntarily concentrating that dependency in a single foreign nation. A Pacific conflict alone could sever us from the majority of our IT workforce, regardless of India's intentions.
When I went to Japan, it felt like all kinds of people were doing all kinds of jobs many hours into the day, whether managing an arcade, selling tickets at the station, working at a konbini, or whatever other small job. Maybe we need to stop giving the new generation such lofty ideas and stop portraying blue-collar jobs as "foreigner" or "failure" jobs.
- very few teams have headcount or are expecting to grow
- the number of interview requests I get has dropped off a cliff
So BigTech is definitely hiring less IMHO.
That said, I am not sure if it's only or even primarily due to replacement by AI. I think there's generally a lot of uncertainty about the future, and the AI investment bubble popping, and hence companies are being extra cautious about costs that repeat (employees) vs costs that can be stopped whenever they want (buying more GPUs).
And in parallel, they are hoping that "agents" will reduce some of the junior hiring need, but this hasn't happened at scale in practice, yet.
I would expect junior SWE hiring to slowly rebound, but likely stabilize at a slower pace than in the pre-layoff years.
I only want to point out that evidence of less hiring is not evidence for AI-anything.
As others have pointed out, here and previously, things like outsourcing to India, or for Europe to Eastern Europe, are also going strong. That's another explanation for fewer jobs "here": they are not gone, they just moved to cheaper places. As has been going on for decades, it just continues unevenly.
https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...
> Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India
https://news.microsoft.com/source/asia/2025/12/09/microsoft-...
> Microsoft invests US$17.5 billion in India to drive AI diffusion at population scale
Yes many over-rely on LLMs, but new engineers see possibilities we've stopped noticing and ask the questions we've stopped asking. Experience is invaluable, but it can quietly calcify into 'this is just how things are done.'
Bad article. Hope a human didn't write it.
It's true that a lot of things which were once junior contributor things are now things I'd rather do, but my scarce resource is attention. And humans have a sufficiently large context window and self-agentic behaviour that they're still superior to a machine.
In fact there are possibly other macro-economic effects at play:
1. The inability to deduct engineering costs for tax purposes in the year they were spent: "Under the Tax Cuts and Jobs Act (TCJA) from 2017, the law requires companies to amortize (spread out) all domestic R&D expenses, including software development costs, over five years, starting in tax years after December 31, 2021, instead of deducting them immediately. This means if you spend $100,000 on software development in 2023, you can only deduct 1/5th (or $20,000) each year over five years" (a worked example follows this list)
2. End of zero-interest rates.
3. Pandemic-era hiring bloat - let's be honest, we hired too many non-technical people; companies are still letting attrition take its course (~10%/yr where I am) instead of firing.
4. Strong dollar. My company is moving seats to Canada, Ireland, and India instead of hiring in the US. Getting 1.5-2 engineers in Ireland instead of 1 senior on the US west coast.
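A quick illustration of point 1's cash-flow effect, using the simplified straight-line schedule quoted above (the actual rule uses a mid-year convention, and the 21% corporate rate is an assumption):

```python
# $100k of 2023 software R&D, amortized evenly over five years
# instead of being expensed immediately.
rnd_spend = 100_000
annual_deduction = rnd_spend / 5          # $20,000 per year

# The cash leaves in year one, but only 1/5 of it reduces taxable
# income that year, so taxable income is overstated by the rest:
deferred = rnd_spend - annual_deduction   # $80,000 deducted in later years
extra_year1_tax = deferred * 0.21         # at the 21% US corporate rate
print(f"deduction per year: ${annual_deduction:,.0f}")  # $20,000
print(f"extra year-1 tax:   ${extra_year1_tax:,.0f}")   # $16,800 vs pre-2022 rules
```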
Otherwise, AI is an accelerator to make more money and increase profits and efficiency. Yes, it has a high cost, but so does/did Cloud and every SaaS product we've bought/integrated.