AI is wiping out entry-level tech jobs, leaving graduates stranded (restofworld.org)
125 points by cratermoon 18 hours ago | 156 comments




I’m not sure if this is true.

At the company where I work (one of the FAANGs), a large number of junior IC roles is suddenly opening up, despite the trend of the last few years to hire only L5 and above.

My read of the situation:

- junior level jobs were sacrificed as cost cutting measures, to allow larger investment in AI

- some analysts read this as “the junior levels are being automated! Evidence: there is some AI stuff, and there are no junior roles!”

- but it was never true, and now the tide is turning.

I’m not sure I ever heard anybody in my company claim that the dearth of junior openings was due to “we are going to automate the juniors”. I think all of that narrative was external analysts trying to read the tea leaves too hard. And wannabes like Marc Benioff pretending to be tech leaders, but that’s a helpful reminder that Benioff is simply “not serious people”.


In addition, the industry has been going through a massive correction post-Covid, with all the free money drying up. Any impact AI is having is mixed up with that.

The expectations for juniors, and how seniors work with them, will certainly change, but it's way too early to be making doomsday predictions.

Of course, that's easy for me to say when I'm not the one who just spent thousands of dollars and 4 years of their life to land in an environment where getting a job is going to be challenging, to say the least.


Agree, the death of the junior SWE is greatly exaggerated. (At least in FAANG)

Maybe there was some idea that if AI actually solved software engineering in a few years you wouldn't need any more SWEs. Industry is moving away from that idea this year.


The death, maybe, but not the lack of hiring. At $BIGCORP, where I work, I haven't seen an externally hired junior dev in at least 2 years in an extended team of ~100 people.

We are hiring juniors again now, after not doing so for the last few years.

My prediction is that you will see that trend reverse soon. Have the teams become top-heavy?

My prediction is that you won't see it reverse too soon, but that AI has nothing to do with it. It's just (for now, until the AI bubble itself bursts) a convenient scapegoat for people who haven't come to grips with the broad economic malaise outside of, but not caused by, AI.

I'm with you on this, though I do think some people are true believers. Say a lie enough times, right?

But a big part of it to me is looking at the job data[0]. If you look at software development postings over this period, you can see hiring peaked in early-to-mid 2022 during the pandemic boom, but postings are currently lower than in any other sector.

Tech loves booms and busts, with hiring and everything else. But more than anything the tech industry loves optics. The market has rewarded the industry for hiring during the pandemic and in the past year it has rewarded them for laying people off "because AI". And as the new year comes around they'll get rewarded for hiring again as they "accelerate development" even more. Our industry is really good at metric hacking and getting those numbers to keep going up. As long as it looks like a good decision then people are excited and the numbers go up.

I think the problem is we've perverted ("over optimized") the market. You have to constantly have stock growth. The goal is to become the best, but you lose the game by winning. I think a good example of this is from an article I read a few months ago[1]. It paints AWS in a bad light, but if you pull out the real data you'll see AWS had a greater increase in absolute users than GCloud (you can also estimate this easily from the article). But with the stock market, it is better to be the underdog with growth than the status quo with constant income[2].

What a weird way to optimize our businesses. You are rewarded for becoming the best, but you are punished for being the best. Feels like only a matter of time before they start tanking on purpose because you can't go up anymore, so you need to make room to go up[3]. I mean we're already trading on speculation. We're beyond tech demos pushing stock up (already speculative) and now our "demos" are not even demonstrations but what we envision tech that hasn't been built to look like. That's much more speculative than something that is in beta! IDK, does anyone else feel like this is insane? How far can we keep pushing this?

[0] Go to "Sector" then add "Software Development" to the chart https://data.indeed.com/#/postings

[1] https://www.reuters.com/business/world-at-work/amazon-target...

[2] Doesn't take a genius to figure out you'd have made more money investing $100 in GCloud vs $100 in AWS (in this example). The percentage differential is all that matters, and measuring by percentage growth punishes having a large existing userbase. You get double the percentage growth going from 1 user to 100 than from 10 million to 500 million, yet any sane person would conclude the latter is a better business.

[3] Or at least play a game of hot potato. Sounds like a collusion ring in waiting. e.g. AWS stagnates, lets Azure take a bunch of users, Azure stagnates and users switch to AWS. Gives both the ability to "grow" and I'm sure all the users will be super happy with constantly switching and all the extra costs of doing so...
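The arithmetic in footnote [2] checks out and can be verified with a few lines of Python (the user counts are the footnote's illustrative numbers, not real AWS/GCloud figures):

```python
def pct_growth(before: float, after: float) -> float:
    """Percentage growth from `before` to `after`."""
    return (after - before) / before * 100

small_base = pct_growth(1, 100)                    # 1 user -> 100 users
large_base = pct_growth(10_000_000, 500_000_000)   # 10M users -> 500M users

# The tiny base wins on percentage growth despite trivial absolute growth.
print(f"{small_base:.0f}% vs {large_base:.0f}%")   # 9900% vs 4900%
```

The 9900% vs 4900% result is the "double the percentage growth" claim: roughly a 2x gap, even though the second business added half a billion users.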


I guess the AI returns are not arriving as soon as expected.

At my company, we're actively lowering our off-shore dev count in favor of on-shore devs. We're small, but we're growing, so we're hiring about one junior dev a year. This alone doesn't mean anything, but it adds another data point to the conversation.

I agree that AI wasn't genuinely replacing junior roles to any important extent, and the larger investment in AI is spot on. Fast Company had exactly this take in November in "AI isn’t replacing jobs. AI spending is". https://www.fastcompany.com/91435192/chatgpt-llm-openai-jobs...

"We’ve seen this act before. When companies are financially stressed, a relatively easy solution is to lay off workers and ask those who are not laid off to work harder and be thankful that they still have jobs. AI is just a convenient excuse for this cost-cutting. "


That narrative never sat right with me. That all these companies suddenly decided AI was going to replace humans? Just an obvious pit to fall into, and one that conveniently feeds the "AI is taking your job" meme. Your read makes MUCH more sense.

FAANG has shed between 81,000 and 87,000 workers in the past 5 years; I suspect a significant chunk of these jobs aren't coming back.

Seems to me the companies are mostly in a holding pattern: sure, if an important project needs more bodies, it's probably okay to hire. I suspect that lots of teams have to make do until further notice.

Are some teams using AI instead of hiring junior engineers? I don't think there's any doubt about that. It's also a trial period to better understand what the value-add is.

Based on listening to engineers on various podcasts, almost all of them describe the current level of AI agents as being equivalent to a junior engineer: they're eager and think they know a lot but they also have a lot to learn. But we're getting closer to the point where a well-thought out Skill [1] can do a pretty convincing job of replacing a junior engineer.

But at the rate AI is improving, a company that doesn't adopt AI for software engineering will be at a competitive disadvantage compared to its peers.

[1]: https://www.anthropic.com/engineering/equipping-agents-for-t...

    Meta (Facebook)

    2022: ~11,000 employees (13% of workforce)
    2023: ~10,000 employees plus 5,000 open positions eliminated
    2024: Multiple smaller rounds totaling ~100-200 employees
    2025: ~3,600 employees (5% of workforce, performance-based cuts)
    Total: Approximately 24,700-25,000 employees

    Amazon

    2022: ~10,000 employees
    2023: ~17,000 employees (split between multiple rounds)
    2024: Smaller targeted cuts
    2025: ~14,000 employees announced
    Total: Approximately 41,000+ employees

    Google (Alphabet)

    2023: ~12,000 employees (6% of workforce)
    2024: Multiple smaller rounds, hundreds of employees
    2025: Several hundred in Cloud division and other areas
    Total: Approximately 15,000-20,000 employees

    Apple
    Apple has been an outlier among FAANG companies:

    2022-2023: Minimal layoffs (hiring freeze instead)
    2024: ~700+ employees (primarily from canceled Apple Car project and microLED display teams)
    2025: Small cuts in sales and other divisions
    Total: Approximately 800-1,000 employees (significantly less than peers)

    Netflix

    2022: ~450 employees across two rounds (150 + 300)
    2023: Smaller targeted cuts in animation and drama divisions
    2024-2025: Minimal additional cuts
    Total: Approximately 500-600 employees

    Overall FAANG Totals

    Across all five companies over the past 5 years: approximately 81,000-87,000 workers
    have been laid off, with the vast majority occurring in 2022-2023
    during the post-pandemic correction period.

> Based on listening to engineers on various podcasts, almost all of them describe the current level of AI agents as being equivalent to a junior engineer: they're eager and think they know a lot but they also have a lot to learn. But we're getting closer to the point where a well-thought out Skill [1] can do a pretty convincing job of replacing a junior engineer.

People who comment like this are either so disconnected from the software development process, or so bought into the hype, that they are forgetting what the point of a junior role is in the first place.

If you hire a junior and they're exactly as capable as a junior 3 years later (about how far we're in now) many organizations would consider letting that employee go. The point of hiring a junior is that you get a (relative to the market) cheap investment with a long-term payoff. Within 1-2 years if they are any good, they will not be very junior any more (depending on domain, of course). There is no such promise or guarantee with AI, and employing an army of junior engineers that can't really "learn" is not a future I want to live in as a mid-career senior-ish person.

Of course, you can say "oh, it'll improve, don't worry," but I live in the present and I simply do not see that. I "employ" a bunch of crappy agents I have to constantly babysit, only to output more work "units" than I could before, at the cost of some quality. If I had spent the money on a junior, I would only have to babysit for the first little while, and then they could be more autonomous. Even if agents can improve beyond this, relying on the moat of "AI" provider companies to make that happen is not exactly comfortable either.


> The point of hiring a junior is that you get a (relative to the market) cheap investment with a long-term payoff.

This is only a consideration if you can pay enough to keep the junior for the long term pay off.

Companies that aren't offering Big Tech compensation find it very difficult to compete on this.

The best juniors will get a job paying more than your company can offer within 2 years. The worst juniors will remain stuck at "still hasn't progressed beyond what they could do after the first month."

In this situation, unless the company can continue to offer pay increases to match what Big Tech can offer, it is disadvantageous to hire a junior developer.


This is absolutely FUD.

Most engineers don't work at FAANG. Most _good_ engineers DONT work at FAANG. FAANG is still composed of almost all good engineers. Most software engineers are NOT _good_.

All of these things are simultaneously true.

Most of your junior engineering hires will never develop to FAANG levels, and as such will never be in a position to seriously compete for those FAANG salaries. The vast majority of devs, even in the US, are perfectly adequate (note: not great, adequate) to act as developers for non-FAANG companies at non-FAANG wages. This is the kind of developer universities are churning out at insane rates.


It doesn't help that a lot of the graduates I've talked to or interviewed seemed to treat a compsci degree as nothing more than a piece of paper they needed to get to be handed a high paying tech job. If you're motivated enough to learn enough job skills to be useful on your own then I guess you can treat your degree that way. But if you got through 4 years through cheating and minmaxing the easiest route possible and wound up with no retained skills to show for it? Congrats, you played yourself and fell for the "college is useless" meme. Coulda just skipped the student loans and bombed interviews without the 4 year degree.

> Coulda just skipped the student loans and bombed interviews without the 4 year degree.

I think college is useless for the ones out there who already know how to code, collaborate, and exercise the other skills the industry is looking for. Many out there are developing high-level projects on GitHub and elsewhere without having any degree.

Also, most of the stuff you learn in college has absolutely no relation to what you will do in the industry.


Personally, I disagree. Software engineering encompasses a lot more than frontend dev work. In previous engineering positions, I’ve used linear regression, evolutionary computation, dynamic algorithms, calculus, image processing, linear algebra, circuit design, etc. almost all of which I originally learned as part of my computer science degree.

Just because you won't use it doesn't mean it's not useful. Lots of programmers use math. Lots of programmers use DSA knowledge on a daily basis - and if you aren't you're probably writing bad code. I see a lot of O(n^2) code or worse making apps slow for no reason. Pretty basic stuff that most people don't understand despite taking a whole class on it.

Sure I learned lots of stuff I've never used. Like relational algebra. But I also learned lots of stuff I use a lot, and it's unlikely I'd have studied most of that stuff on my own. During my degree I also had time and opportunity to pursue lots of other topics outside the mandated course material, you're not limited to what they force you to learn.

So sure if you have the motivation, discipline and resourcefulness to learn all that stuff on your own go right ahead. Most people aren't even close. Most people are much better off with a degree.


> motivation, discipline and resourcefulness

In my experience, those who lack these do not have a chance in tech in the first place, so save yourself a lot of debt.


You know there's a lot of places in this world where education is pretty much free. Turns out it's good for society when people do well, and most first world countries have figured that out and leaned into the whole helping each other thing. We even get to be sick or injured without losing our job and getting a $100,000 bill, it's crazy.

> Lots of programmers use DSA knowledge on a daily basis - and if you aren't you're probably writing bad code. I see a lot of O(n^2) code or worse making apps slow for no reason

I don't think one can seriously argue that. It's as much a meme as anything. I know it's popular to rag on devs writing inefficient software, but there are plenty of apps with functions where a user couldn't possibly notice the difference between O(n^2) and O(1). You wouldn't take the time to make everything O(1) for no speedup just because someone told you that's what good code is; that's just wasting dev time.

In fact, one of the first things you learn is that O(1) can be slower. Constant time is not good if the constant is big and n is small.


Obviously I'm not talking about the cases where it doesn't matter. I'm talking about the cases where it does.

I fixed one where a report took 25 minutes to generate, and after switching out an O(n^2) list lookup for a dict it took less than 5. Still embarrassingly slow, but a lot better.

There's also a lot of cases where it didn't matter when the dev wrote it and they had 400 rows in the db, but 5 years later there's a lot more rows, so now it matters.

Doesn't cost anything to just use a better algorithm. It usually takes exactly the same amount of time to write, and even if it is marginally slower at small n, who cares? I don't give a shit about saving nanoseconds. I care about the quadratic (or worse) time waste that happens when you don't consider what happens when the input grows.

For small inputs it doesn't matter what you do. Everything is fast when the input is small. That's why it makes sense to prefer low complexity by default.
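For what it's worth, the dict swap described above can be sketched like this (the rows and keys are hypothetical stand-ins; the actual report code isn't shown in the thread):

```python
import time

# Hypothetical data: 5,000 (key, value) pairs and 500 keys to look up.
rows = [(i, f"item-{i}") for i in range(5_000)]
wanted = list(range(0, 5_000, 10))

# O(n^2) overall: a linear scan of `rows` for every key we look up.
t0 = time.perf_counter()
slow = [next(v for k, v in rows if k == key) for key in wanted]
t_scan = time.perf_counter() - t0

# O(n) overall: build a dict once, then each lookup is O(1) on average.
t0 = time.perf_counter()
index = dict(rows)
fast = [index[key] for key in wanted]
t_dict = time.perf_counter() - t0

assert slow == fast  # identical results, very different scaling
print(f"linear scan: {t_scan:.4f}s  dict: {t_dict:.4f}s")
```

Same output either way; only the asymptotics change, which is exactly why it doesn't matter at 400 rows and matters a lot five years later.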


It has happened several times: junior web devs can't find jobs, junior Java devs can't find jobs, etc., usually after a surge in the related tech area. We had a large overall surge in tech around Covid time, and as usual there is some adjustment now.

The dotcom bubble had comp sci lecture halls with students overflowing into the hallway. I don’t blame people, it’s migratory. Jobs and resources are there, so, go there.

Then we blame the other group of students for not going there and picking majors where the jobs aren’t.

We need some kind of apprenticeship program honestly, or AI will solve the thing entirely and let people follow their honest desires and live reasonably in the world.


> AI will solve the thing entirely and let people follow their honest desires and live reasonably in the world.

I always find it hilarious that people treat transformer tech as a public good. Transformers are, like any other tech, owned by large tech companies. Short of forcing the few companies who own the top models to abide by your rules, there is no chance OpenAI is going to give itself up to the government. And even if it did, it means nothing if Microsoft/Amazon/Google/etc. do not provide you with the facilities to deploy the model.

A much more realistic outcome is that Big Tech will collude with governments to keep a certain autonomy and restrict its use to the elites.


AI? Ah, India.

"Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India" https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...


I can confirm from consulting experience that India is where the jobs went. My office provides professional services to North American and European industrial customers in manufacturing and distribution. Roughly 85% of these customers have fully Indian IT teams. Running a SOQL query in our Salesforce instance for 'Devi', 'Singh', and 'Kumar' yields over two thousand hits across client contacts alone.

Since the workers are hired for cost over quality, they're typically incompetent. Though many have learned to parasitize SME and support staff expertise by asking highly specific questions in an extended sequence. It's a salami-slicing strategy where the majority of the work ends up being performed by those SMEs and support staff while the incompetent workers collect the paychecks and credit. I'm pushing my teams to more aggressively identify and call out this behavior, but it's so systemic that it's an endless battle with every new project coming in the door.

Personal frustrations aside, it's very dangerous from both economic and national security perspectives for India to be building and administering so much of the West's IT infrastructure. Our entire economy depends on it, yet we're voluntarily concentrating that dependency in a single foreign nation. A Pacific conflict alone could sever us from the majority of our IT workforce, regardless of India's intentions.


Currently looking for a new role in biotech, and it seems like at many companies it is almost 40:1 India vs. United States roles being posted. This is in R&D, not even manufacturing.

If the US actually cared about retaining jobs for the people they would enforce ratios of onshore / offshore with heavy taxes if companies did not reach that ratio.

Companies don't want to pay US salaries, the cost of living in the US is not going down, and the cost of engineering talent in India is cheaper: you can hire 2 devs for the cost of 1 US dev. Why would you ever have any US engineering devs?

It won't change organically unless the cost of India engineers goes up or the cost of US engineers goes down.


The US could just require that Indian workers be paid the same as US workers; then companies would be incentivized to hire more from home. You are correct, it is far cheaper to hire Indian workers due to tax incentives and regulations.

> If the US actually

Who has more control over government, the people or the 0.0001%? There is no "US", you are not part of the club.


Microsoft recently announced the intent to train 20 MILLION Indian workers.

Saying that "we're firing to use AI" makes you look like you have ROI on your AI investments and you're keeping up.

In fact there are possibly other macro-economic effects at play:

1. The inability to deduct engineering for tax purposes in the year they were spent: "Under the Tax Cuts and Jobs Act (TCJA) from 2017, the law requires companies to amortize (spread out) all domestic R&D expenses, including software development costs, over five years, starting in tax years after December 31, 2021, instead of deducting them immediately. This means if you spend $100,000 on software development in 2023, you can only deduct 1/5th (or $20,000) each year over five years"

2. End of zero-interest rates.

3. Pandemic-era hiring bloat: let's be honest, we hired too many non-technical people, and companies are still letting attrition take place (~10%/yr where I am) instead of firing.

4. Strong dollar. My company is moving seats to Canada, Ireland, and India instead of hiring in the US. Getting 1.5-2 engineers in Ireland instead of 1 senior on the US west coast.
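The amortization in point 1 is easy to see with the quote's own numbers. This is a sketch of the simplified even 1/5-per-year split described in the quote (actual Section 174 rules apply a mid-year convention in the first year, so real schedules differ slightly):

```python
def amortization_schedule(expense: float, years: int = 5) -> list[float]:
    """Spread a domestic R&D expense evenly across `years` tax years."""
    return [expense / years] * years

schedule = amortization_schedule(100_000)
print(schedule)  # [20000.0, 20000.0, 20000.0, 20000.0, 20000.0]
```

So a $100,000 engineering salary in 2023 only shelters $20,000 of income that year, which makes headcount look far more expensive on the books than it used to.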

Otherwise AI is an accelerator to make more money, increase profits and efficiency. Yes it has a high cost, but so does/did Cloud, every SaaS product we've bought/integrated.


No it's not. There is no shortage of tech problems to solve and there are no tech jobs that AI can do alone.

AI is sucking up investment and AI hype is making executives stupid. Hundreds of billions of dollars that used to go towards hiring is now going towards data centers. But AI is not doing tech jobs.

These headlines do nothing but increase the hype by pointing towards the wrong cause entirely.

Edit: You cannot square these headlines https://news.ycombinator.com/item?id=46289160


It might be a question of where the seniors put their time: coaching juniors or working with AI tools.

My senior SWE job at FAANG has essentially turned into prompting Opus 4.5.

There is almost no reason to delegate the work, especially low level grunt work.

People disputing this are either in denial, or lacking the skill set to leverage AI.

One or two more Opus releases from Anthropic and this field is cooked.


What kind of work do you do that is simple enough that can be accomplished solely through prompting?

What kind of work do you do that CAN'T be divided into tasks that can be accomplished mostly through prompting?

distributed systems, log diving, deployments, etc

The golden handcuff type where you update documentation with new UI elements.


Any sort of web-tech-based development.

Frontend, backend, animations, design, infra, distributed systems engineering, networking.


It's a troll account called llmslave made a couple months ago. Odds are low it's even a human.

It all depends on how you prompt, and the prompt system you've set up. When done well, you just "steer" the code/system. Quite amazing to see it come together. But there are multiple layers to this.

> lacking the skill set to leverage AI

Is it possible that your job is simply not that difficult to begin with?


yes, but so are most jobs like mine

What job is so difficult that LLMs can't give an experienced user an order-of-magnitude gain in efficiency?

An order of magnitude, really? An experienced user with an LLM is going to accomplish in 2026 what would have otherwise taken until 2036?

Yes, I personally think so. In the hands of an experienced user, you can crank out work that would otherwise take days or even weeks, and get to the meat of the problem you care about much quicker. Just churning out bespoke boilerplate code is a massive time saver, as is using LLMs to narrow in on docs, features, etc. Even high-level mathematicians are beginning to incorporate LLM use (early days, though).

I can't think of an example where an LLM will get in the way for 90% of the stuff people do. The other 10% will always be bespoke and need a human to drive it forward, as they are the ones who create demand for the code/work.


sounds about right

The problem is many users are not experienced. And the more they rely on AI to do their work, the less likely they are to ever become experienced.

An inexperienced junior engineer delegating all their work to an LLM is an absolute recipe for disaster, both for the coworkers and product. Code reviews take at least 3x as long. They cannot justify their decisions because the decisions aren't theirs. I've seen it first hand.


I agree totally; most people are not experienced, and there is a weird situation where the productivity gains are bifurcated. I have also seen a lot of developers unable to steer the LLM because they can't pick up on issues they would otherwise have learned through experience. Interesting to see what will happen, but it's probably gonna be a shit show for younger devs.

Unfortunately, I have the same experience.

It seems you registered this account a couple of months ago only to basically repeat this opinion over and over (sprinkled with some anti-science opinions on top).

Really weird.


the world has changed, have you caught up?

It hasn't really.

Now people can just search stack overflow quicker for the wrong answer, and even more confidently than ever before.


It's nothing like Stack Overflow.

Yeah, it's even more likely to give you a non-working answer.

username checks out

The best part about your account is the people who don't understand the satire and unironically agree with you :D

Great engineering effort was spent to make software at FAANG built on clear, service-oriented, modular architectures, and thus easy to develop for. Add to that a good organization of process where engineers spend most of their time doing actual dev work.

Enterprise software is a different beast: large, fragile [quasi]monoliths. Good luck for [current] AI to make meaningful fixes and/or develop features in them. And even if AI manages to speed up actual development multiple times, the impact would still be small, as actual development takes a relatively small share of overall work in enterprise software. Of course it will come here too, just somewhat later than at places like FAANG.


Unfortunately if it takes you 4 years to significantly upskill in tech, you are learning way too slow to survive in this industry. Most of the major innovators I know are dropouts, because they realized college is suited to train you to work in academia, where very few jobs exist, almost no one worth working for cares about degrees anymore, and the debt only makes surviving harder.

IMO the best education and credentials come from picking interesting projects you have no idea how to do, then learn everything in your way to ship them as open source so potential employers can see your work.

If you can get a degree on a scholarship for free, wonderful, but college should be viewed as more of a hobby or a way to network, rather than a way of obtaining marketable technical skills.


I don’t agree that “college is to train you to work in academia”.

I work in FAANG, none of my colleagues are dropouts.

Many BigTech founders are dropouts, but that’s a separate game altogether.


I would agree FAANGs are an exception who have historically hired almost exclusively academics who hire other academics, and it shows. They let many coast for years at a time and get away with being a specialist unable to deliver value outside of their specialization or rapidly learn new skills. Many get the job with academic success and treat the job as a continuation of their academic career. Many get "tenure" and can do whatever they want and are effectively paid to just not work for competitors.

I know lots of people working at those orgs that brag about how well they get away with doing nothing of value and we all know these people (but of course not everyone is like that).

No offense but I do not feel the overwhelming majority of roles at these companies are delivering value to humanity apart from shareholders, or something most people should aspire towards in a career, and do not think most of the skills learned in these orgs are all that useful in the world outside those walls.

Also those same FAANGs are clearly aware of the above at some level and doing mass layoffs, or not replacing people who leave, and those workers are having a really hard time finding a home in the non-FAANG working world where they are expected to be highly motivated generalists.


Reminds me of that comic where the dog runs a ball up to his owner with the thought bubble "Throw!" When the owner goes to take the ball, the dog steps back, thinking, "No take! Only throw!"

So in the glorious future, we'll only need senior devs to manage AI. No juniors! Only seniors!


Outsourcing, end of ZIRP, end of R&D tax credit. Macro-economic conditions are pushing companies to do more with fewer people. AI might be helping with this, but it's pure marketing BS to blame it for the state of tech employment.

What happens when there are no more entry-level humans to be promoted to mid-level, and so on?

> What happens when there are no more entry-level humans to be promoted to mid-level, and so on?

No business cares about that question, just like the Onceler didn't care how many Truffula trees were left. It's not their problem. Business is business, and business must grow, regardless of crummies in tummies, you know.


It even has a name: the tragedy of the commons. I have been saying this constantly for the last few years with all this AI hype over LLMs going on. But with business focus really narrowing down to short time frames, what do you expect?

That line always hits hard whenever I read that story to my kids.

The "business" doesn't care about this, but individual employees care about their job duties, not their business. And some of them do have a job duty where they care about this.

(i.e. this cynical complaint is exactly the opposite of the cynical complaint about managers/directors engaging in empire building.)


Well, look at what has always happened in society when young people have no hope for the future: massive societal disruption, mostly in the forms of revolution and violence.

Since this isn't the 1800s anymore there won't be any major revolutions but I expect way more societal violence going forward. If you have no hope for the future it's not hard to go to very dark paths quickly, usually through no fault of your own sadly.

Now add how easy it is for malicious actors to get an audience and how LLM tech makes this even easier to do. Nice recipe for a powder keg.


> Since this isn't the 1800s anymore there won't be any major revolutions

I'm sure they were saying the same thing in the 1800s


Well I have an idea:

what if we all just blame the youth?

I think that might fix the situation


In the COBOL world, lots of highly paid senior consultants come in and out of retirement to support systems.

Other than that, I am guessing junior roles will move offshore to supply the body shops where the corporate IT work has been going.


Not that I think one should put too much stock in headlines. But "wiping out" seems to translate to a 6.1% unemployment rate and a 16.5% underemployment rate?

https://www.finalroundai.com/blog/computer-science-graduates...


I think the numbers you are arguing with here are for all employees, not just fresh graduates.

Blame the article for using suboptimal numbers, but the "wiping out" part is definitely justified when talking about jobs for graduates


When you see 6.1% unemployment for computer science new grads, that invariably comes from

https://www.newyorkfed.org/research/college-labor-market#--:...

Computer science is tied for the fourth-lowest underemployment and has the seventh-highest unemployment... and also the highest early career median wage.

That needs to be compared to the underemployment chart https://www.newyorkfed.org/research/college-labor-market#--:... and the unemployment chart https://www.newyorkfed.org/research/college-labor-market#--:... (and make sure to compare that with 2009).

Computer science is not getting wiped out by AI. Entry level jobs exist, though people may need to reset their expectations (note that median job being $80k) from getting a $150k job out of college - that was always the exception rather than the average.

There are average jobs out there that people with a "want to be on the coast making $150k" or "must be remote so I don't have to relocate" attitude are thumbing their noses at.


I see people posting this all the time without mentioning that the page says "based on data from 2023." As someone who graduated in 2025, I can tell you that the market has changed significantly since then - Trump won election in 2024 and tariffs went into effect in 2025, for one.

It would be justified if AI were actually the cause, but this article does nothing to prove that. The only "tech jobs" that can even demonstrate direct replacement are call-center type roles. Everything else is just loosely blamed on AI, which is a convenient scapegoat as billions of dollars of investment are redirected from hiring to building data centers.

>I think the numbers you are arguing with here are for all employees, not just fresh graduates.

If you click through to new york fed's website, the unemployment figures are 4.8% for "recent college graduates (aged 22-27)", 2.7% for all college graduates, and 4.0% for all workers. That's elevated, but hardly "wiping out".


The article refers to this article from May, which claims a 50% reduction in graduate tech hiring since pre-pandemic levels, and a 25% reduction since 2023:

https://www.signalfire.com/blog/signalfire-state-of-talent-r...


The chart with that data is https://cdn.prod.website-files.com/6516123533d9510e36f3259c/...

Starting at 2019 and saying "pre-pandemic levels" might be a bit disingenuous since that was a leap to a boom... and the bust we're seeing now.

https://www.cbre.com/insights/articles/tech-boom-interrupted

    At $113B, 2019 was the third-highest year on record for VC deal volume.
    2019 had the second-highest volume of “mega rounds” ($100M deals or greater)–mega rounds represented 44% of total annual deal volume.
    Revenue grew by an average of 12.2% in 2019 and the total revenues of the tech giants was greater than the GDP of four of the G20 nations.
Yes, tech hiring in 2025 is down from 2019. That's a lot like saying "tech hiring is down from 2000" in 2003.

Thanks for the context, but there hasn't been a general tech sector crash since 2019, so I don't think the 2000-03 comparison is apt.

And while 2019 might have been the third-highest year for investment on record at the time, according to this it has since been surpassed in 2021, 2022, and 2024:

https://kpmg.com/xx/en/media/press-releases/2025/01/2024-glo...

So why have graduate hires continued to decline since 2023? It seems funds have been diverted from junior hiring into AI investments.

However, as others have remarked, this might be a case of "AI is not taking your jobs, AI investment is taking your jobs"

Junior hiring might pick up again once the spending spree is over


Interesting. At least some of this has to be the bullwhip effect modeled with employers as retail, universities as suppliers, and graduating students as further back suppliers. The 4 year lead time in production of employable labour causes a whip crack backwards through the supply chain when there is a sudden shift at the retail end.
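The whip crack can be sketched as a toy simulation (hypothetical numbers and a deliberately naive enrollment policy, just to illustrate the mechanism): a one-time dip in employer demand, fed through a 4-year enrollment-to-graduation lag, produces a sustained oscillation in graduate supply far wider than the original shock.

```python
LEAD = 4  # years from enrollment to graduation

demand = [100] * 3 + [60] * 4 + [100] * 13  # sudden dip, then recovery
enrollment = []
graduates = []
for year, d in enumerate(demand):
    # Naive policy: enroll to match current demand, corrected for last
    # year's over/undersupply.
    surplus = (graduates[-1] - demand[year - 1]) if graduates else 0
    enrollment.append(max(0, d - surplus))
    # This cohort graduates LEAD years out; before the first tracked
    # cohort matures, assume the pipeline was sized for the old steady state.
    graduates.append(enrollment[year - LEAD] if year >= LEAD else 100)

print(min(graduates), max(graduates))  # → 20 180
```

With a demand swing of only 40, graduate supply swings by 160 here - exactly the upstream amplification the bullwhip effect predicts.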

It's true that a lot of things which were once junior contributor things are now things I'd rather do, but my scarce resource is attention. And humans have a sufficiently large context window and self-agentic behaviour that they're still superior to a machine.


The 'recent graduates' quoted in this article all seem to be from (for lack of a better description) 'developing countries' hoping to get a (again, generalizing) 'high-paying FAANG job'.

My initial reaction would be that these people, unfortunately, got scammed, and that the scammers-promising-abundant-high-paying-jobs have now found a convenient scapegoat?

AI has done nothing so far to reduce the backlog of junior developer positions from where I can see, but, yeah, that's all in "Europoor" and "EU residency required" territory, so what do I know...


For the last few decades it was offshoring that filled the management agenda in the way AI does today, so it doesn't seem surprising to me that the first gap would be in the places you might offshore a testing department to, etc.

Offshoring has the exact same benefits/problems that AI has (i.e. it's cheap, yet you have to specify everything in excruciating detail) and has not been a significant factor in junior hiring, like, ever, in my experience.

My experience is that it is not a reduction in work in the place being offshored from, but it changes the shape of the labor market, certainly in the places being offshored to. Replace offshore labor with something cheaper and a lot of juniors in top offshore locations are the quickest to feel it. Local juniors might be worth hiring again, even if they need a lot of oversight, once agents make them questionably productive.

Currently helping with hiring, and I can't help but reflect on how it has changed over the past couple of years. We are now filtering for much stronger candidates across all experience levels, but the junior side of the scale has been affected much more. Where previously we would take the top 5% of junior applicants that made it past the first phone screen, now it's below 2%.

> AI has done nothing so far to reduce the backlog of junior developer positions from where I can see

Job openings for graduates are significantly down in at least one developed nation: https://www.theguardian.com/money/2025/jun/25/uk-university-...


"This article was amended on 26 June 2025 to clarify that the link between AI and the decline in graduate jobs is something suggested by analysts, rather than documented by statistics"

Plus, that decline seems specious anyway (as in: just about visible if you only look at the top 5% of the chart), and the UK job market has always been very different from the EU they left behind.


Am I reading this article correctly: the job market was worse in 2017?

Was AI also responsible for that market? This seems a bit unsupported.


Consider what happened in the UK in 2016.

And, as usual, no mention of the massive shortsighted overhiring during the post-covid bull market.

Again, in my experience, that simply never happened, at least not with regard to junior positions.

During COVID we were struggling to retain good developers that just couldn't deal with the full-remote situation[1], and afterwards, there was a lull in recent graduates.

Again, this is from a EU perspective.

[1] While others absolutely thrived, and, yeah, we left them alone after the pandemic restrictions ended...


Huh. It sounds like your perspective isn't just EU focused but N=1, based solely on your company.

The post-pandemic tech hiring boom was well documented both at the time and retrospectively. Lots of resources on it available with a quick web search.


I never claimed a broad perspective. But I've yet to see a "post-pandemic hiring boom" anywhere in junior-level-IT jobs in the EU, and a quick trip to Google with those exact words turned up nothing either.

So, please elaborate?


My personal experience is that it's not AI wiping out jobs, it's offshoring.

What if cool new tech is just slowing down and AI is masking it?

Not a "what if". Can you name 3 new cool technologies that have come out in the last 5 years?

1. Copilot for Microsoft PowerPoint

2. Copilot for Windows Notepad

3. Copilot for Windows 11 Start Menu


Nah man, I’m still waiting for Copilot for vim.

Yeah. Where are all the great new Mac native apps putting Electron to shame, the avalanche of new JS frameworks, and affordable SaaS to automate more of life? AI can write decent code, so why am I not benefiting from that as a consumer?

Well, if you're a consumer of code, then technically you benefit. Otherwise, you probably won't notice it as much.

It's almost like a lot of our technologies were pretty mature already and an AI trained on 'what has been' has little to offer with respect to 'what could be'.

oof that's profound. Really nice closing thought for 2025.

Incredibly cheaper batteries and solar panels. Much better induction stoves.

LLMs, Apple Silicon, self-driving cars just off the top of my head without really thinking about it.

GPT-2 was 6 years ago, the first Apple silicon (though not branded as such at the time) was 15 years ago, and the first public riders in autonomous vehicles happened around 10 years ago. Also, 2/3 of those are "AI".

> the first Apple silicon (though not branded as such at the time) was 15 years ago

Nobody, not even Apple was using the term "Apple Silicon" in 2010.

The first M series Macs shipped November 2020.


1 year is being pedantic. Apple Silicon is clearly referring to the M series chips which have disrupted and transformed the desktop/laptop market. Self driving also refers to the recent boom and ubiquity of self driving vehicles.

M series is an iteration of the A series, "disrupting markets" since 2010. LLMs are an iteration of SmarterChild. "Self driving vehicles" are an iteration of the self-parking and lane-assist vehicles of the last decade.

I'm bored.


All of those things are more than 5 years old.

Neuralink and quantum computers are hitting interesting milestones, with Microsoft releasing a processor chip for quantum computing. Green steel is another interesting one, though not as 'sexy' as the previous two.

I didn't believe the quantum stuff, so I googled it. I'm shocked how far it's come. Even China has some kind of photonic quantum chips now.

Wait so quantum is going to actually deliver something useful within the next 10-20 years??

Uhhh, LLMs? The shit computers can do now is absurd compared to 2020. If you showed engineers from 2020 Claude, Cursor, and Stable Diffusion and didn't tell them how they worked their minds would be fucking exploding.

So LLMs exist therefore nothing else is worth the time? That’s sort of the gist of HN these past few years

Moreover: people’ve been crowing about LLM-enabled productivity for longer than it took a tiny team to conceive and build goddamn Doom. In a cave! With a box of scraps!

Isn’t the sales pitch that they greatly expand accessibility and reduce cost of a variety of valuable work? Ok, so where’s the output? Where’s the fucking beef? Shit’s looking all-bun at the moment, unless you’re into running scams, astroturfing, spammy blogs, or want to make ELIZA your waifu.


No, I was just skeptical of the GP's assertion that tech hasn't produced anything "cool" in the last 5 years when it has been a nonstop barrage of insane shit that people are achieving with LLMs.

Like the ability for computers to generate images/videos/songs so reliably that we are debating if it is going to ruin human artists... whether you think that is terrible or good it would be dumb to say "nothing is happening in tech".


This was posted earlier today:

https://www.danshapiro.com/blog/2025/12/i-made-the-xkcd-impo...

The xkcd comic is from 11 years ago (September 2014).


Surely you have realized by now that a large portion of the HN userbase is here for get rich quick schemes.

ahh, brings me back to the blockchain days, and the many excuses people came up with to use them instead of a SQL database for whatever reason

It’s really incredible how quickly people take things for granted.

LLMs are one, granted. GP asked for three, though.

GGP's question doesn't make sense though. What does it mean for a technology to "come out"?

Also what does three prove? Is three supposed to be a benchmark of some kind?

I would wager every year there are dozens, probably hundreds, of novel technologies being successfully commercialized. The rate is exponentially increasing.

New procedural generation methods for designing parking garages.

New manufacturing approaches for fuselage assembly of aircraft.

New cold-rolled steel shaping and folding methods.

New solid state battery assembly methods.

New drug discovery and testing methods.

New mineral refinement processes.

New logistics routing software.

New heat pump designs.

New robotics actuators.

See what I mean?


Great list, and most of those don't involve big tech. I think what your list illustrates is that progress is being made, but it requires deep domain expertise.

Technology advances like a fractal stain, ever increasing the diversity of jobs to be done to decrease entropy locally while increasing it globally.

I would wager we are very far from peak complexity, and as long as complexity keeps increasing there will always be opportunities to do meaningful innovative work.


1. We may be at the peak complexity that our population will support. As the population stops growing, and then starts declining, we may not have the number of people to maintain this level of specialization.

2. We may be at the peak complexity that our sources of energy will support. (Though the transition to renewables may help with that.)

3. We may be at the peak complexity that humans can stand without too many of them becoming dehumanized by their work. I could see evidence for this one already appearing in society, though I'm not certain that this is the cause.


1. Human potential may be orders of magnitude greater than what people are capable of today. Population projections may be wrong.

2. Kardashev? You think we are at peak energy production? Fusion? Do you see energy usage slowing down, speeding up, or staying constant?

3. Is the evidence you're seeing appear in society just evidence you're seeing appear in media? If media is an industry that competes for attention, and the best way to get and keep attention is not telling truth but novel threats + viewpoint validation, could it be that the evidence isn't actually evidence but misinformation? What exactly makes people feel dehumanized? Do you think people felt more or less dehumanized during the great depression and WW2? Do you think the world is more or less complex now than then?

From the points you're making you seem young (maybe early-to-mid 20s), and I wonder if you feel this way because you're early in your career and haven't experienced what makes work meaningful. In my early career I worked jobs like front-line retail and maintenance. Those jobs were not complex, and they felt dehumanizing. I was not appreciated. The more complex my work has become, the more creative I get to be, the more I'm appreciated for doing it, and the more human I feel. I can't speak for "society", but this has been a strong trend for me.

Maybe it's because I work directly for customers and I know the work I do has an impact. Maybe people who are caught up in huge complex companies, tossed around doing valueless, meaningless work, feel dehumanized. That makes sense to me, but I don't think the problem is complexity; I think the problem is getting paid to be performative instead of creating real value for other people. Integrity misalignment. Being paid to act in ways that aren't in alignment with personal values is dehumanizing (it literally dissolves our humanity).


Not even close. I'm 63. You would be nearer the mark if you guessed that I was old, tired, and maybe burned out.

I've had meaningful work, and I've enjoyed it. But I'm seeing more and more complexity that doesn't actually add anything, or at least doesn't add enough value to be worth the extra effort to deal with it all. I've seen products get more and more bells and whistles added that fewer and fewer people cared about, even as they made the code more and more complex. I've seen good products with good teams get killed because management didn't think the numbers looked right. (I've seen management mess things up several other ways, too.)

You say "Maybe it's because I work directly for customers and I know the work I do has an impact". And that's real! But see, the more complex things get, the more the work gets fragmented into different specialties, and the (relative) fewer of us work directly with customers, and so the fewer of us get to enjoy that.


I am not a hiring manager, but if there is more supply of non-junior workers, doesn't it make sense to hire those instead, especially if compensation might even be lower than before? You hire juniors to develop them, or because experienced devs are too expensive or there simply aren't enough of them. If there are enough of them on the market at a more reasonable price, wouldn't choosing from that cohort make more sense?

This article asserts 7 times that jobs are being replaced by AI, and the only data to substantiate it is a link to an EY report that is paywalled, doesn't hold up to the text of the link, and doesn't hold up to what contemporary journalists wrote about the report.

Bad article. Hope a human didn't write it.


I had the privilege of working with a great SWE intern this year. Their fresh ideas and strong work ethic made a real impact. Experienced engineers need this kind of energy.

Yes many over-rely on LLMs, but new engineers see possibilities we've stopped noticing and ask the questions we've stopped asking. Experience is invaluable, but it can quietly calcify into 'this is just how things are done.'


Everyone loves blaming AI for entry-level woes, but check the numbers: CS grads hit 6.1% unemployment while nursing sits at 1.4%. That's not "wiping out" jobs, that's oversupply meeting picky hiring after years of "learn to code" hype.

AI is eating the boring tasks juniors used to grind: data cleaning, basic fixes, report drafts. Companies save cash, skip the ramp-up, and wonder why their mid-level pipeline is drying up. Sarcastic bonus: great for margins, sucks for growing actual talent.

Long term though, this forces everyone to level up faster. Juniors who grok AI oversight instead of rote coding will thrive when the real systems engineering kicks in. Short term pain, massive upside if you adapt.

I will include this thread in the https://hackernewsai.com/ newsletter.


Basic coding to solve simple problems is something that high schoolers and even bright middle schoolers can do. By the time I was in college I had been coding for most of a decade. Part of the issue is that many of the folks coming out of school started learning this stuff WAY too late.

It's like if you waited until college to start learning to play piano, and wonder why you can't get a job when you graduate. You need a lot of time at the keyboard (pun intended) to build those skills.


Youth unemployment is up, and among new hires in general, because of uncertain and deteriorating business conditions.

We have new grads, they could not be replaced by AI. If you have new grads AI can replace I’m not sure why it required a college degree.

What's your field, broadly, if you don't mind sharing?

Entry-level jobs have been getting wiped out for at least 5 years, including tech jobs, which includes 2 years in which not even GPT-3.5 was available. That was the first version that would reasonably respond to any useful question. And if you're being honest, other entry-level jobs are far worse off than tech jobs. Entry-level bakers ... outright don't really exist anymore.

Even agentic computing (i.e. an AI doing anything of its own accord for tech-savvy users, never mind average users) is new this year. I would argue it's still pretty far from widespread. Neither my wife nor my kids, despite my explaining repeatedly, even know what that is, never mind caring.

I'm repeating the mantra from before, and I get that it's not useful. But no, it's not AI wiping out entry-level jobs. It's governments failing to prop up the economy.

On the plus side, this means it can be fixed. However, I very much doubt the current morons in charge are going to ...


I’d go farther and guess that the tech job market would be even worse today without every company with at least 500 headcount (and many smaller than that), whether a tech company or not, putting money into “AI initiatives”.

I don’t think we’ve seen any amount of a net drop in tech jobs on account of LLMs (yet). I actually think they’re (spending on projects using them, that is) countering a drop that was going to happen anyway due to other factors (tightening credit being a huge one; business investment hesitation due to things like batshit crazy and chaotic handling of tariffs; consumer sentiment; et c)


Evidence I can give in support of the article:

- very few teams have headcount or are expecting to grow

- the number of interview requests I get has dropped off a cliff

So BigTech is definitely hiring less IMHO.

That said, I am not sure if it's only or even primarily due to replacement by AI. I think there's generally a lot of uncertainty about the future, and the AI investment bubble popping, and hence companies are being extra cautious about costs that repeat (employees) vs costs that can be stopped whenever they want (buying more GPUs).

And in parallel, they are hoping that "agents" will reduce some of the junior hiring need, but this hasn't happened at scale in practice, yet.

I would expect junior SWE hiring to slowly rebound, but likely stabilize at a slower pace than in the pre-layoff years.


> Evidence I can give in support of the article:

I only want to point out that evidence of less hiring is not evidence for AI-anything.

As others have pointed out, here and previously, things like outsourcing to India, or for Europe to Eastern Europe, are also going strong. That's another explanation for fewer jobs "here", but they are not gone, they just moved to cheaper places, as has been going on for decades; it just continues unevenly.

https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...

> Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India

https://news.microsoft.com/source/asia/2025/12/09/microsoft-...

> Microsoft invests US$17.5 billion in India to drive AI diffusion at population scale


Maybe rather than telling everyone to "learn to code" we could have told them to do jobs they are more suited to doing: serving food, nursing, construction etc. all which have tangible benefits to society.

When I went to Japan, it felt like all kinds of people were doing all kinds of jobs many hours into the day, whether it was managing an arcade, selling tickets at the station, working at a konbini, or whatever small job. Maybe we need to not give such lofty ideas to the new generation, and stop representing blue-collar jobs as "foreigner" or "failure" jobs.


For that to work, we would first need to make those blue-collar jobs into ones that actually pay well enough for people to thrive instead of merely survive.

Japan doesn't do that. Those part time jobs don't pay very well. There's just much lower overhead to having them, you don't have to own a car or a giant house to be in commute range etc.

Just to be clear, not everywhere facing rising unemployment, or needing an influx of foreigners for jobs citizens don't want to do, is America, and not everyone doing those jobs was part-time. But it is true, they do have a huge part-time workforce.

Both the economy and culture play a role in the types of jobs people aim for. Tech used to be for "nerds" now it is "cool" because they watched The Social Network and Bill Gates told them they are smart enough to be part of the club.

H1B and foreign worker visas are the real cause; AI is political cover, and it's a lie.

Big tech is doing it on purpose with H1Bs and the exportation of labor, to capture the market in India and non-China Asia. They are desperate and afraid.

The U.S. has a national security interest in completely stopping all of it. They don't, because every administration is paid not to.

Regulate tech, ban labor export, ban labor import, protect your countries from the sellout.


I don't see why you're being downvoted. Aside from being a little inflammatory your premise is correct.

It's not a secret that companies do not want to hire Americans. Americans are expensive and demand too many benefits, like fair pay, healthcare, and vacations. They also are (mostly) at-will. H1B solves all these problems. When that doesn't work, there are 400 Infosys-likes available to export that labor cheaply. We have seen this with several industries, the last most prominent one being auto manufacturing.

All that matters is that the next quarters earnings are more than the last. No one hates the American worker more than Americans. Other countries have far better worker protections than us.

I see no reason H1B couldn't be solved by having a high barrier to entry ($500k one-time fee) and maintenance ($100k per year). Then, force them to be paid at the highest bracket in their field. If H1Bs are what their proponents say - necessary for rare talent not found elsewhere - then this fee should be pennies on the value they provide. I also see no reason we can't tax exported labor in a similarly extreme manner. If the labor truly can't be found in America, the high price of the labor in tax and fee terms should be dwarfed by its added value.

If it is not the case that high fees and taxes on H1B and exported labor make sense, then the only conclusion is that the vast majority of H1Bs and exported labor are not "rare talent" and thus aren't necessary. They can come through the normal immigration routes and integrate into the workforce as naturalized Americans.


What exactly are the normal immigration routes? Employment-based immigration (H1B) is the only avenue that makes sense for a skilled worker. And usually skilled immigrants are the ones a country wants to attract.

There are dozens of visas one can apply for, and many of them will fast-track citizenship. Talent comes in on these all the time.


