
My opinion is that AI isn’t actually the root of the problem here.

It’s that we are heading towards a big recession.

As in all recessions, people come up with all sorts of reasons why everything is fine until it can’t be denied anymore. This time, AI was a useful narrative to have lying around.


I think a kind of AI complacency has set in. Companies are just in chill mode right now, laying off people here and there while waiting for AI to get good enough to do actual work.


Everyone is bracing for a labor supply shock. It will move in the direction opposite what investors expect.

2030 will be 2020 all over again.


Why?


If (a) companies lay too many people off because the magic robots will make engineers unnecessary, (b) the pipeline collapses because being a software engineer looks like an undesirable career that's being replaced by robots, and (c) it emerges that the robots are kinda bullshit, then there's going to be one hell of a shortage.

When I started a CS degree in 2003, we were still kinda in the "dot com crash has happened, no-one will ever hire a programmer again" phase, and there were about 50 people in my starting class. I think the same course two years ago had about 200. The 'correct' number, for actual future demand, was certainly closer to 200 than 50, and the industry as a whole had a bit of a labour crunch in the early 2010s in particular.


I believe we are vastly underestimating the number of programmers needed, as some companies reap unusually high rewards from hiring programmers. Companies like Google can pay huge sums of money to programmers because they make even higher sums of money from the programmer's work.

This means that they inflate programmer salaries, which makes it impossible for most companies that could benefit from software development to hire developers.

We could probably have five times as many software developers as we have now, and they would not be out of work; they would only decrease average salaries for programmers.


But if only Google or similarly sized companies can pay that well, and there are tons of programmers, obviously the average salary will balance out lower than what Google pays but will still be competitive to the thousands of programmers who didn't get hired at Google.


>but will still be competitive to the thousands of programmers who didn’t get hired at Google

Why would this be the case? Many programmers join Google or Meta (or similar tier companies) and immediately double or triple their income. Software salaries are famously bimodal and people often transition from the lower mode to the higher mode practically overnight.

In fact (and I'm not an economist) I conjecture that the lower mode exists because the upper mode exists. That is, people purposefully don't really care what their salary is (i.e. don't put upward wage pressure) when they're at lower-mode companies because they know one day they'll make the leap to the upper-mode. In other words, the fact that Google-tier companies pay well allows other companies to pay poorly because those guys are just padding their resumes to get a 350k job at Google and don't really care whether Bank of Nowhere pays them $90k or $110k.


People absolutely do care what their salary is. And most people never work at Google...


Well clearly not enough to make the two modes meet.


You could make this argument about almost literally every field.


If a company could benefit from software developers but can't afford them, then they can purchase SaaS offerings written by companies that can afford developers. I don't think we've run out of opportunities to improve the business world with software quite yet.


The fact that there is a market for these products, but they are almost universally terrible, supports my point.


I think it might be worse than that, as staff reductions are across the board, not just in software development roles. My hope is that startup creation will be unprecedented, taking advantage of the complacency. They will wonder why AI deleted their customers when they thought it was supposed to delete their employees.


Holding on for that sweet sweet pay bump after the coming AI winter


Combine a bunch of factors:

1) Fewer students are studying computer science. I'm faculty at a top CS program, and we saw our enrollment decline for the first time ever. Other universities are seeing similar slowdowns in enrollment [1]

2) Fewer immigrants are coming to the United States to work and live; the US is perhaps looking at its first population decline ever [2]

3) Current juniors are being stunted by AI, they will not develop the necessary skills to become seniors.

4) Seniors retiring faster because they don't want to have to deal with this AI crap, taking their knowledge with them.

So we're looking at a negative bubble forming in the software engineering expertise pipeline. The money people are hoping that AI can become proficient enough to fill that space before everything bursts. Engineers, per usual, are pointing out the problem before it becomes one, and no one is listening.

[1]: https://www.theatlantic.com/economy/archive/2025/06/computer...

[2]: https://nypost.com/2025/09/03/us-news/us-population-could-sh...


1. OBBB rolled back the R&D deduction changes in Section 174 that (allegedly) triggered the layoffs and froze hiring in 2022-2023.

2. It looks like rates will keep going down.

3. Fewer people are going into CS due to the AI hysteria. You might say oh there's a 4 year lag, but not quite. We should see an immediate impact from career changers, CS grads choosing between career and professional school, and those switching out of CS careers.

The tech AI fear hysteria is so widespread that I've even heard of people avoiding non-SWE tech careers like PM.


The first thing I thought of was Benioff saying he cut thousands of customer support roles because AI can do it better, then turning around and giving a lackluster earnings report with revised-down guidance, and the stock tanking.


I have never, ever seen SVPs, CEOs, and PMs completely misunderstand a technology before. And I agree with you, I think it's more of an excuse to trim fat--actual productivity is unlikely to go up (it hasn't at our Fortune 500 company)


>> productivity is unlikely to go up

I wonder how that would even be measured? I suppose you could do it for roles that do the same type of work every day, i.e. perhaps there is some statistical relevance to the number of calls taken in a call center per day, or something like that. On the software development side, however, productivity metrics are very hard to quantify. Of course, you can make a dashboard look however you want, but it's essentially impossible to tie those metrics to NPV.


Productivity = profit / employees


> I have never, ever seen SVPs, CEOs, and PMs completely misunderstand a technology before.

I'm legit not sure if that's sarcasm or not


> we are heading towards a big recession

Who is we? One country heading into a recession is hardly enough to nudge the trend of "all things code"


The last US recession that didn't also pull in the rest of the western world was in 1982, over 40 years ago. Western Europe, Aus, NZ, Canada, and the US all largely rise and sink on the same tides, with differences measured in degrees.


Enough of the tech industry is America-based that a US recession is enough to do much more than nudge the trend of "all things code". Much as I would prefer that it were not so.


America's recessions are global recessions.


Sadly yes - "When America sneezes, the World catches cold"


If that “one country” is the US and not, say, Burkina Faso, it has a major impact on financing, and software has an unusually high share of positions dependent on speculative investment for future return rather than directly related to current operations.


Traditionally, there are two strategies:

1) Use the network thread pool to also run application code. Then your entire program has to be super careful not to block or do CPU-intensive work. This is efficient, but leads to difficult-to-maintain programs.

2) The network thread pool passes work back and forth with a separate application executor. That way, the network thread pool is never starved by the application, since there are essentially two different work queues. This works great, but now every request performs multiple thread hops, which increases latency.

There has been a lot of interest lately in combining scheduling and work-stealing algorithms to create a best-of-both-worlds executor.

You could imagine, theoretically, an executor that auto-scales, maintains different work queues, and tries to avoid thread hops when possible, but ensures there are always threads available for the network.
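
For concreteness, here is a minimal sketch of strategy 2, assuming a Java-style server built on java.util.concurrent; the names (TwoPoolServer, onRequest, handle, write) are hypothetical and not from any particular framework:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    // Sketch of strategy 2: a small pool dedicated to network I/O hands each
    // request off to a separate application pool, so blocking or CPU-heavy
    // application code can never starve the network threads. The price is an
    // extra thread hop (enqueue + wake-up) in each direction.
    public class TwoPoolServer {
        private final ExecutorService networkPool = Executors.newFixedThreadPool(2);
        private final ExecutorService appPool = Executors.newFixedThreadPool(8);

        // Called on a network thread once a request has been read off the socket.
        public void onRequest(String rawRequest) {
            appPool.submit(() -> {
                String response = handle(rawRequest);      // may block or burn CPU
                networkPool.submit(() -> write(response)); // hop back to write out
            });
        }

        private String handle(String request) { return "ok: " + request; } // business logic
        private void write(String response) { System.out.println(response); } // socket write
    }

A best-of-both-worlds executor would effectively collapse those two submit calls into per-core queues with work stealing, only paying for a hop when a task turns out to be slow.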


That really seems to be the defining characteristic of the 21st century elite: they’re shameless and proud of it.


Only 21st century? Have you read any history at all?


This has actually been one of the ideas floated by regulators.

The idea is that merit-based admissions is actually pretty complicated, so we can allow individual universities to continue experimenting with their own implementations and approaches.

However, we can hold them accountable by grading them based on retrospective data.


I think there are two things to keep in mind.

1) Apple and Firefox have enough resources to implement the most recent web standards. When you see a feature which goes un-implemented for too long, it’s almost surely because nobody was even working on it because of internal resourcing fights.

2) Devs aren’t created equal. It’s possible for a team of 8 people to be 10x more productive than another team of 8.


> When you see a feature which goes un-implemented for too long, it’s almost surely because nobody was even working on it because of internal resourcing fights.

Or because they are reluctant to implement it for technical reasons? Not every "standard" that gets thrown on the table and implemented by Google is a brilliant idea.


I think a problem with AI productivity metrics is that a lot of the productivity is made up.

Most enterprise code involves layers of interfaces. So implementing any feature requires updating 5 layers and mocking + unit testing at each layer.

When people say “AI helps me generate tests”, I find that this is what they are usually referring to. Generating hundreds of lines of mock and fake data boilerplate in a few minutes, that would otherwise take an entire day to do manually.

Of course, the AI didn’t make them more productive. The entire point of automated testing is to ensure software correctness without having to test everything manually each time.

The style of unit testing above is basically pointless, because it doesn't actually accomplish that goal. All the unit tests could pass, and the only thing you've tested is that your canned mock responses and asserts are in sync in the unit-testing file.
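
A sketch of the kind of test I mean, assuming a typical Java service/repository layering with hypothetical names (UserService, UserRepository) and JUnit 5:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    // Hypothetical layers: a service that does nothing but delegate to a repository.
    interface UserRepository { String findName(long id); }

    class UserService {
        private final UserRepository repo;
        UserService(UserRepository repo) { this.repo = repo; }
        String userName(long id) { return repo.findName(id); }
    }

    class UserServiceTest {
        @Test
        void returnsNameFromRepository() {
            // The canned fake and the expected value are defined in this same
            // file, so the test can only confirm the file agrees with itself.
            // No real behavior is exercised anywhere.
            UserRepository fake = id -> "Ada";
            assertEquals("Ada", new UserService(fake).userName(42L));
        }
    }

AI can emit hundreds of lines of this in minutes, but it still doesn't tell you whether the real repository, the real database, or the real wiring works.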

A problem with how LLMs are used is that they help churn through useless bureaucratic BS faster. But the problem is that there’s no ceiling to bureaucracy. I have strong faith that organizations can generate pointless tasks faster than LLMs can automate them away.

Of course, this isn’t a problem with LLMs themselves, but rather with the organizational context in which I see them frequently being used.


I think it's appropriate to be skeptical of new tools, and to point out failure modes in an appropriately respectful, prosocial way. Kudos.

Something that crosses my mind is whether AI-generated tests are necessarily tests with fakes and stubs that exercise no actual logic, what expertise is required to notice that, and whether it is correctable.

Yesterday, I was working on some OAuth flow stuff. Without replayed responses, I'm not quite sure how I'd test it without writing my own server, and I'm not sure how I'd develop the expertise to do that without, effectively, just returning the responses I expected.

It reminds me that if I eschewed tests with fakes and stubs as untrustworthy in toto, I'd be throwing the baby out with the bathwater.
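
For what it's worth, the replay approach I'd reach for looks roughly like this; a minimal sketch, assuming a hypothetical TokenEndpoint seam and a response recorded once from a real exchange:

    import java.util.Map;

    // Hypothetical seam the OAuth client code depends on; in production it
    // performs the real HTTP POST to the provider's token endpoint.
    interface TokenEndpoint {
        Map<String, String> exchangeCode(String authorizationCode);
    }

    // Replay-style fake: returns a response captured once from a real exchange.
    class RecordedTokenEndpoint implements TokenEndpoint {
        @Override
        public Map<String, String> exchangeCode(String authorizationCode) {
            return Map.of(
                "access_token", "recorded-access-token",
                "token_type", "Bearer",
                "expires_in", "3600");
        }
    }

    public class OAuthFlowCheck {
        public static void main(String[] args) {
            TokenEndpoint endpoint = new RecordedTokenEndpoint();
            Map<String, String> token = endpoint.exchangeCode("fake-authorization-code");
            // The fake exercises our own parsing/validation logic without a live
            // provider -- that's the baby worth keeping in the bathwater.
            if (!"Bearer".equals(token.get("token_type"))) {
                throw new AssertionError("unexpected token type: " + token.get("token_type"));
            }
            System.out.println("parsed access token: " + token.get("access_token"));
        }
    }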


But this is the problem. Our premier academic institutions shouldn’t merely exist as job training programs for big tech.

If anything, tech is still one of the better off fields in the university.

Look at history or literature programs for where this is heading. I’d imagine that most literature majors don’t even read at all these days. As recently as 50 years ago, the requirement involved hundreds of pages of reading per week, sustained over a 4-year period.

Honestly, just close down the university at this point, if all it wants to do is print out degree certificates for social signaling in the job market.


oldpersonintx2, your account is shadowbanned.

Which colleges did you send your kids to, what kind of degrees (just bachelors? undergrad and grad?), and how many kids?

The $800k figure without that context tells us nothing. If that's for 2 kids to get a BA/BS/BE, you got ripped off. If it's for 4 or 5 kids it makes much more sense when examining current costs.


[flagged]


I understand your feelings about this but on HN we still need you to follow the guidelines, which include avoiding uppercase for emphasis and avoiding personal swipes like this:

> I'm laughing at your naive take

https://news.ycombinator.com/newsguidelines.html


What's more interesting is that it's their second account, because the previous one was banned. What's HN policy on ban evasion?

Previous account: https://news.ycombinator.com/user?id=oldpersonintx


Thanks for that. If an account is banned but the user signs up a new account and starts contributing positively and respecting the guidelines, that’s a good outcome. If they just pick up where they left off, that’s what we call a “serial troll” and we’ll ban the new account with fewer or no warnings.


The C++ standard committee is definitely smart. But language design requires sense beyond just being smart.

They didn’t do the best with what they had. Sure, some problems were caused by C backwards compatibility.

But so much of the complexity and silliness of the language was invented by the committee themselves.


As someone who really enjoys C++, I would say that the current issues are caused by the lack of preview implementations before features are voted into the standard. This is just broken, but there are not enough people around to turn the ship around.

Those folks eventually move to something else and adopt "C++ the good parts" instead.


I’d also point out that, even in the compiler space, there are basically no production compilers written in Haskell or OCaml.

I believe those two languages are themselves self-hosting, so I'm not saying it’s impossible. And I have no clue about the technical merits.

But if you look around programming forums, there’s this idea that "OCaml is one of the leading languages for compiler writers", which seems to be a completely made-up claim.


I don't know that many production compilers are written in them, but how much of that is compilers tending towards self-hosting once they get far enough along these days? My understanding is that early Rust compilers were written in OCaml, but they transitioned to Rust to self-host.


What do you define as a production compiler? Two related languages have compilers built in Haskell: PureScript and Elm.

Also, Haskell has parsers for all major languages. You can find them on Hackage with the `language-` prefix: language-python, language-rust, language-javascript, etc.

https://hackage.haskell.org/packages/browse?terms=language


Obviously C is the ultimate compiler of compilers.

But I would call Rust, Haxe and Hack production compilers. (As mentioned by a sibling comment, Rust has bootstrapped itself since its early days. But that doesn't diminish the fact that OCaml was the choice before bootstrapping.)


Most C compilers are written in C++ nowadays.


Yes, C and C++ have an odd symbiosis. I should have said C/C++.


Most C and C++ developers take umbrage at combining them. Since C++11, and especially C++17, the languages have diverged significantly. C is still largely compatible (outside of things like uncasted malloc) since its rules are still largely valid in C++; but both have gained fairly substantial incompatibilities with each other. Writing a pure C++ application today will look nothing like a modern C app.

RAII, iterators, templates, object encapsulation, smart pointers, data ownership, etc. are entrenched in C++, while C is still raw pointers, no generics (no, _Generic doesn’t count), procedural code, void* casting, manual malloc/free, etc.

I code in both, and enjoy each (generally for different use cases), but certainly they are significantly differing experiences.


Unfortunately we still have folks writing C++ in the style of pre-C++98 with no desire to change.

It is like adopting TypeScript, but the only thing they do is rename the file extension for better VS Code analysis.

Another one is C++ "libraries" that are plain C with extern "C" blocks.


Sure, and we also still have people coding in K&R-style C. Some people are hard to change in their ways, but that doesn't mean the community/ecosystem hasn't moved on.

> Another one is C++ "libraries" that are plain C with extern "C" blocks.

Sure, and you also see "C Libraries" that are the exact same. I don't usually judge the communities on their exceptions or extremists.


What are you on? Rust was written in OCaml, and Haxe is still going strong after 25 years with an OCaml-based compiler, and is very much production-grade.


We must be looking at different compilers.


I agree with this: airline price discrimination is an example of the market actually working correctly.

Most people could afford to pay more for airline tickets. It’s just that they’ve done the math and they don’t want to.

I plan carefully and don’t have to rebook my flights. I pack my own food. I leave behind that second pair of shoes so that I only bring a carry on.

I’d rather do all of this so that I have an extra $100 to spend on a nice meal or experience at my destination.

If any of those things are important to you, then you can have them! You just have to pay for them.

I don’t see why the government needs to mandate that airlines provide certain services like checked bags (which would require increasing minimum ticket prices). Why not use price signals to allow each individual to tailor the experience for themselves?

I feel like everyone leaves out that last part. Everyone wants extras when they don’t have to pay for them. Are the people arguing for more regulation mentioning the tidbit that ticket prices will go up?

I know this is veering into the political, but I just don’t understand the ideology that most people seem to have. Which is that we need to use the power of the government to force people to buy things they don’t want.

