Hacker News

> what happens if the market is right and this is the "new normal"?

Then there will be an oversupply of programmers, salaries will crash, and lots of people will have to switch careers. It's happened before.



Some people will lose their homes. Some marriages will fail from the stress. Some people will choose to exit life because of it all.

It's happened before and there's no way we could have learned from that and improved things. It has to be just life changing, life ruining, career crippling. Absolutely no other way for a society to function than this.


That's where the post-scarcity society AI will enable comes in! Surely the profits from this technology will allow these displaced programmers to still live comfortable lives, not just be hoarded by a tiny number of already rich and powerful people. /s


I'd sooner believe that a unicorn will fly over my house and poop out rainbow skittles on my lawn. Yeah /s for sure!

You and I both know we're probably headed for revolutionary times.


It's not as simple as putting all programmers into one category. There can be an oversupply of web developers and, at the same time, an undersupply of COBOL developers. If you are a very good developer, you will always be in demand.


> If you are a very good developer, you will always be in demand.

"Always", in the same way that five years ago we'd "never" have an AI that can do a code review.

Don't get me wrong: I've watched a decade of promises that "self-driving cars are coming real soon now, honest"; the latest news about Tesla is that its system can't cope with leaves. I certainly *hope* that a decade from now we'll still be having much the same conversation about AI taking senior programmer jobs, but "always" is a long time.


Five years ago we had pretty good static analysis tools for popular languages which could automate certain aspects of code reviews and catch many common defects. Those tools didn't even use AI, just deterministic pattern matching. And yet due to laziness and incompetence many developers didn't even bother taking full advantage of those tools to maximize their own productivity.
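The kind of deterministic, non-AI check the parent describes can be sketched in a few lines of Python with the standard `ast` module. This is an illustrative toy, not any particular tool's implementation: it flags one classic defect (mutable default arguments) by pure pattern matching over the syntax tree.

```python
import ast

def find_mutable_defaults(source: str) -> list[int]:
    """Return line numbers of function defs with list/dict/set defaults.

    Purely deterministic static analysis: parse the source, walk the
    tree, and pattern-match on node types. No AI involved.
    """
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # defaults covers positional args; kw_defaults covers
            # keyword-only args (entries may be None, which never matches)
            for default in node.args.defaults + node.args.kw_defaults:
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    hits.append(node.lineno)
    return hits

sample = "def add(item, bucket=[]):\n    bucket.append(item)\n    return bucket\n"
print(find_mutable_defaults(sample))  # -> [1]
```

Real linters (pylint, flake8 plugins, SonarQube rules) are essentially large catalogs of checks like this one, which is why they could automate parts of code review long before LLMs.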


The devs themselves can still be lazy: Claude and Copilot code review can be automated on all pull requests at the PM's demand, and the PM can be lazy and ask the LLMs to integrate themselves.

And the LLMs can use the static analysis tools.
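As a rough sketch of what "automated on all pull requests" could look like: a CI workflow that runs the Claude Code CLI non-interactively on each PR. The workflow name, runner, secret name, and prompt below are illustrative assumptions, not an official integration; all that is taken from the real tools is `actions/checkout` and the CLI's `-p` (print) mode.

```
# Hypothetical GitHub Actions workflow (names are illustrative).
# Assumes a runner with the `claude` CLI installed and an
# ANTHROPIC_API_KEY secret configured.
name: llm-code-review
on: [pull_request]
jobs:
  review:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4
      - name: Run LLM code review
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
        run: |
          # -p runs claude non-interactively and prints the result;
          # it can invoke static analysis tools itself if they're installed
          claude -p "Code review the changes in this pull request" > review.txt
```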


An LLM can run the static analysis tool and copy/paste its output onto your PR, sure. I'm not sure I would call that "doing code review".


> copy/paste

I did not say that.

That it can *also* use tools to help, doesn't mean it can *only* get there by using tools.

They can *also* just do a code review themselves.

As in, I cloned a repo of some of my old manually-written code, cd'd into it, ran `claude`, and gave it the prompt "code review" (or something close to that), and it told me a whole bunch of things wrong with it, in natural language, even though I didn't have the relevant static analysis tools for those languages installed.


> I cloned a repo of some of my old manually-written code, cd'd into it, ran `claude`, and gave it the prompt "code review" (or something close to that), and it told me a whole bunch of things wrong with it, in natural language, even though I didn't have the relevant static analysis tools for those languages installed.

Well sure, but was the result any better than that of installing and running the tools? If the AI can provide better or at least different (but accurate!) PR feedback from conventional tools, that's interesting. If it's just offering the same thing (which is not really "code review" as I'd define it, even if it is something that code reviewers in some contexts spend some of their time on) through a different interface, that's much less interesting.


I can't even imagine what time wasting bs the LLMs are finding with static analysis tools! It's all just a circle jerk everywhere now.


Static analysis was pretty limited imho. It wasn't finding anything that interesting. I spent untold hours trying to satisfy SonarQube in 2021 & 2022. It was total shit busy work they stuck me with because all our APIs had to have at least 80% code coverage and meet a moving target of code analysis profiles that were updated quarterly. I had to do a ton of refactoring on a lot of projects just to make them testable. I barely found any bugs and after working on over 100 of those stupid things, I was basically done with that company and its bs. What an utter waste of time for a senior dev. They had to have been trying to get me to quit.


Even if someday we get AI that can generalize well, the need for a person who actually develops things using AI is not going anywhere. The thing with AI is that you cannot make it responsible; there will still be a human in the loop who is responsible for conveying ideas to the AI and controlling its results, and that person will be the developer. Senior developers are not hired just because they are smart or can write code or build systems; they are also hired to share the load of responsibility.

Someone with a name, an employment contract, and accountability is needed to sign off on decisions. Tools can be infinitely smart, but they cannot be responsible, so AI will shift how developers work, not whether they are needed.


Even where a human in the loop is a legal obligation, it can be QA or a PM, roles as different from "developer" as "developer" is from "circuit designer".


A PM or QA can sign off only on process or outcome quality. They cannot replace the person who actually understands the architecture and the implications of technical decisions. Responsibility is about being able to judge whether the system is correct, safe, maintainable, and aligned with real-world constraints.

If AI becomes powerful enough to generate entire systems, the person supervising and validating those systems is, functionally, a developer — because they must understand the technical details well enough to take responsibility for them.

Titles can shift, but the role doesn't disappear. Someone with deep technical judgment will still be required to translate intent into implementation and to sign off on the risks. You can call that person "developer", "AI engineer", or something else, but the core responsibility remains technical. PMs and QA do not fill that gap.


> They cannot replace the person who actually understands the architecture and the implications of technical decisions.

LLMs can already do that.

What they can't do is be legally responsible, which is a different thing.

> Responsibility is about being able to judge whether the system is correct, safe, maintainable, and aligned with real-world constraints.

Legal responsibility and technical responsibility are not always the same thing; technical responsibility is absolutely in the domain of PM and QA, legal responsibility ultimately stops with either a certified engineer (which software engineering famously isn't), the C-suite, the public liability insurance company, or the shareholders depending on specifics.

Ownership requires legal personhood, which isn't the same thing as philosophical personhood, which is why corporations themselves can be legal owners.


If LLMs truly "understood architecture" in the engineering sense, they would not hallucinate, contradict themselves, or miss edge cases that even a mid-level engineer catches instinctively.

They are powerful tools but they are not engineers.

And it's not about legal responsibility at all. Developers don't go to jail for mistakes, but they are responsible within the engineering hierarchy. A pilot is not legally liable for Boeing's corporate decisions, and the plane can mostly fly on autopilot, but you still need a pilot in the cockpit.

AI cannot replace the human whose technical judgment is required to supervise, validate, and interpret AI-generated systems.


AI can do code review? Do people actually believe this? We have an MR LLM bot, and it is wrong 95% of the time.


I have used it for code review.

Like everything else they do, it's amazing how far you can get even if you're incredibly lazy and let it do everything itself, though of course that's a bad idea because it's got all the skill and quality of result you'd expect if I said "endless horde of fresh grads unwilling to say 'no' except on ethical grounds".


I've been taking self-driving cars to get around regularly for a year or more.


Waymo and Tesla already operate in certain areas. But even where the tech is ready,

regulation is still very much a thing.


“certain areas” is a very important qualifier, though. Typically areas with very predictable weather. Not discounting the achievement just noting that we’re still far away from ubiquity.


Waymo is doing very well around San Francisco, which is certainly very challenging city driving. Yes, it doesn't snow there. Maybe areas with winter storms will never have autonomous vehicles. That doesn't mean there isn't a lot of utility created even now.


My original point, clearly badly phrased given the responses I got, is that the promises have been exceeding the reality for a decade.

Musk's claims about what Teslas would be able to do weren't limited to just "a few locations"; it was "complete autonomy" and "you'll be able to summon your car from across the country"… by 2018.

And yet, 2025, leaves: https://news.ycombinator.com/item?id=46095867



I'm young; please, when was that, and in what industry?


After the year 2000: the dot-com bust.

A tech employee posted that he'd looked for a job for 6 months, found none, and had joined a fast food shop flipping burgers.

That turned tech workers switching to "flipping burgers" into a meme.


What was a little different then was that tech jobs paid about 30% more than other jobs; it wasn't anything like the highs we have seen the last few years. I used to describe it as: you used to have the nicer house on the block, but in the 2010s onward FAANG salaries had people living in whole other neighborhoods. So switching out of the industry, while painful, was not as traumatic. Obviously, though, having to actually flip burgers was a move of desperation and traumatic. The .com bust was largely centered on SV as well; in NYC (where I live) there was some fallout, but there was still a tailwind of businesses of all sorts expanding their tech footprint. So while you may not have been able to land at a hot startup and dream of getting rich in an IPO, by the end of 2003 things had mostly stabilized and you could likely have landed a somewhat boring corporate job, even if it was just building internal apps.

I feel like there are a lot of people in school or recently graduated, though, who had FAANG dreams and never considered an alternative. This is going to be very difficult for them. I now feel, especially as tech has gone truly borderless with remote work, that this downturn is way worse than the .com bust. It has just dragged on for years now, with no real end in sight.


I used to watch all of the "Odd Todd" episodes religiously. Does anyone else remember that Adobe Flash-based "TV show" (before YouTube!)?


After the .com implosion, tech jobs of all kinds went from "we'll hire anyone who knows how to use a mouse" to the tech jobs section of the classifieds being omitted entirely for 20 months. There have been other bumps in the road since then, but that was a real eye-opener.


Well, same as covid, right? Digital/tech companies overhired because everyone was at home, and at the same time the rise of AI reduced headcount.

covid overhiring + AI usage = the most massive layoffs we've seen in decades


It was nothing like covid. The dot com crash lasted years where tech was a dead sector. Equity valuations kept declining year after year. People couldn't find jobs in tech at all.

There are still plenty of tech jobs these days, just less than there were during covid, but tech itself is still in a massive expansionary cycle. We'll see how the AI bubble lasts, and what the fallout of it bursting will be.

The key point is that the going is still exceptionally good. The posts talking about experienced programmers having to flip burgers in the early 2000s are not an exaggeration.


After the first Internet bubble popped, service levels in Silicon Valley restaurants suddenly got a lot better. Restaurants that had struggled to hire competent, reliable employees suddenly had their pick of applicants.

History always repeats itself in the tech industry. The hype cycle for LLMs will probably peak within the next few years. (LLMs are legitimately useful for many things but some of the company valuations and employee compensation packages are totally irrational.)


The defense industry in southern California used to be huge until the 1980s. Lots and lots of ex-defense industry people moved to other industries. Oil and gas has gone through huge economic cycles of massive investment and massive cut-backs.



