The recurring dream of replacing developers

(caimito.net)

150 points | by glimshe 6 hours ago

36 comments

  • jackfranklyn 4 hours ago
    The pattern that gets missed in these discussions: every "no-code will replace developers" wave actually creates more developer jobs, not fewer.

    COBOL was supposed to let managers write programs. VB let business users make apps. Squarespace killed the need for web developers. And now AI.

    What actually happens: the tooling lowers the barrier to entry, way more people try to build things, and then those same people need actual developers when they hit the edges of what the tool can do. The total surface area of "stuff that needs building" keeps expanding.

    The developers who get displaced are the ones doing purely mechanical work that was already well-specified. But the job of understanding what to build in the first place, or debugging why the automated thing isn't doing what you expected - that's still there. Usually there's more of it.

    • zby 3 hours ago
      Classic Jevons Paradox: when something gets cheaper, the market for it grows. The unit cost shrinks, but the number of units bought grows by more than the cost shrinks.
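      To make that arithmetic concrete, here is a toy sketch (my own illustration with hypothetical numbers, not from the thread) using a constant-elasticity demand curve, where a falling unit price more than pays for itself in units sold:

```python
# Toy illustration of the Jevons Paradox (hypothetical numbers).
# Assumes constant-elasticity demand Q = k * price^(-e) with e > 1,
# so total spending (price * Q) rises as the unit price falls.

def units_demanded(price, k=100.0, elasticity=1.5):
    """Units bought at a given unit price under the toy model."""
    return k * price ** -elasticity

for price in (10.0, 5.0, 2.5):
    q = units_demanded(price)
    print(f"price={price:5.2f}  units={q:7.2f}  total_spend={price * q:7.2f}")
```

      With elasticity above 1, halving the price nearly triples the units sold here, so total spending grows even as each unit gets cheaper.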
      • enos_feedler 3 hours ago
        Of course that is true. The nuance here is that software isn't just getting cheaper; the activity of building it is changing. Instead of writing lines of code, you are writing requirements. That shifts who can do the job: the customer might be able to do it themselves. That removes a market rather than growing one. I am not saying the market will collapse, just be careful applying a blunt theory to a technological shift this profound, one that isn't just lowering cost but changing the entire process.
        • AstroBen 9 minutes ago
          > The customer might be able to do it themselves

          Have you ever paid for software? I have, many times, for things I could build myself

          Building it yourself as a business means you need to staff people, taking them away from other work. You need to maintain it. It's rarely going to be good ROI.

          No matter how good these tools get, they can't read your mind. It takes real work to get something production ready and polished out of them

        • lotu 1 hour ago
          You say that like someone who has been coding for so long that you have forgotten what it's like to not know how to code. The customer will have little idea what is even possible and will ask for a product that doesn't solve their actual problem. AI is amazing at producing answers you previously would have looked up on Stack Overflow, which is very useful. It often can type faster than I can, which is also useful. However, if we were going to see the exponential improvement towards AGI that AI boosters talk about, we would have already seen the start of it.

          When LLMs first showed up publicly it was a huge leap forward, and people assumed it would continue improving at the rate they had seen but it hasn't.

          • akhil08agrawal 51 minutes ago
            Exactly. The customer doesn't know what's possible, but increasingly neither do we unless we're staying current at frontier speed. AI can type faster and answer Stack Overflow questions. But understanding what's newly possible, what competitors just shipped, what research just dropped... that requires continuous monitoring across arXiv, HN, Reddit, Discord, Twitter. The gap isn't coding ability anymore. It's information asymmetry. Teams with better intelligence infrastructure will outpace teams with better coding skills. That's the shift people are missing.
        • array_key_first 48 minutes ago
          There are also technical requirements, which in practice you will need to write for real applications. They can be produced by people who can't program, but the work is very close to programming: you reach a level of specification where you're designing schemas, format specs, high-level algorithms, and APIs. Programmers can be, and are, good at this, and the non-programmers who do it well would make good programmers.

          At my company, we call them technical business analysts. Their director was a developer for 10 years, and then skyrocketed through the ranks in that department.

        • ngrilly 1 hour ago
          "Thinking clearly about complexity" is much more that writing requirements.
          • mistrial9 1 hour ago
            "yours is not to reason why, yours is just to do, or die"

            (a variation of Tennyson's "Theirs not to reason why, / Theirs but to do and die")

    • jjmarr 5 minutes ago
      The remaining developers also got a big pay bump.
    • xnx 3 hours ago
      Machinery made farmers more efficient and now there are more farmers than ever.
      • tejtm 1 hour ago
        Pre-industrial revolution, something like 80+ percent of the population was involved in agriculture. I question the assertion that there are more farmers now, especially since an ever-growing percentage of farms are not even owned by corporeal entities, never mind actual farmers.

        ooohhh I think I missed the intent of the statement... well done!

      • wordpad 1 hour ago
        Machinery and scale efficiencies made the cost of entry higher than ever, though.

        That's not the case for IT, where the entry barrier has been reduced to nothing.

      • asdff 1 hour ago
        The machinery replaced a lot of low skill labor. But in its wake modern agriculture is now dependent on high skill labor. There are probably more engineers, geologists, climatologists, biologists, chemists, veterinarians, lawyers, and statisticians working in the agriculture sector today than there ever were previously.
      • teaearlgraycold 58 minutes ago
        There’s only so much land and only so much food we need to eat. The bounds on what software we need are much wider. But certainly there is a limit there as well.
      • victorbjorklund 2 hours ago
        Is this sarcasm?
    • cryptonector 26 minutes ago
      Right! Sysadmins got displaced, but many became developers.
    • AstroBen 16 minutes ago
      > lowers the barrier to entry, way more people try to build things, and then those same people need actual developers when they hit the edges of what the tool can do

      I was imagining companies expanding the features they wanted and was skeptical that would be close to enough, but this makes way more sense

    • Palomides 3 hours ago
      >every "no-code will replace developers" wave actually creates more developer jobs, not fewer

      you mean "created", past tense. You're basically arguing it's impossible for technical improvements to reduce the number of programmers in the world, ever. The idea that only humans will ever be able to debug code or interpret non-technical user needs seems questionable to me.

      • groundzeros2015 3 hours ago
        This doesn’t seem immediately false. Industrial society creates more complexity and specializations. There is more work to do all the time.
        • Retric 3 hours ago
          Actual AI seems like a possibility here.

          Also, the percentage of adults working has been dropping for a while. Retirees used to be a tiny fraction of the population; that's no longer the case. People spend more time being educated, or in prison, etc.

          Overall people are seeing a higher standard of living while doing less work.

          • groundzeros2015 1 hour ago
            > Also the percentage of adults working has been dropping for a while.

            There are lots of negative reasons for this that aren’t efficiency. Aging demographics. Poor education. Increasing complexity leaves people behind.

            • Retric 14 minutes ago
              Efficiency is why things continue to work as fewer people work. Social programs, bank accounts, etc. are just an abstraction; you need a surplus, or the only thing that changes is who starves.
    • pydry 3 hours ago
      This suggests that the latent demand was large, but it still doesn't prove it is unbounded.

      At some point the low-hanging automation fruit gets tapped out. What can be put online that isn't there already? Which business processes are obviously going to be made an order of magnitude more efficient?

      Moreover, we've never had more developers and we've exited an anomalous period of extraordinarily low interest rates.

      The party might be over.

      • fn-mote 3 hours ago
        Look at traditional manufacturing. Automation has made massive inroads. Not as much of the economy directly supports (e.g., auto) manufacturers as it used to (stats check needed). Nevertheless, there are plenty of mechanical engineering jobs. Not so many lower-skill line-worker jobs in the US any more, though. You have to ask yourself which category you are in (by analogy). Don't be the SWE working on the assembly line.
        • pydry 2 hours ago
          >Don’t be the SWE working on the assembly line.

          The job is literally building automation.

          There is no equivalent to "working on the assembly line" as an SWE.

          >Not so many lower skill line worker jobs in the US any more, though

          Because Globalization.

          • lotu 1 hour ago
            Yes, there totally are: web development and shovelware app development are two that I can think of off the top of my head.
            • pydry 1 hour ago
              That's an assembly line just one churning out cheap crap.
  • dijit 4 hours ago
    I've watched this pattern play out in systems administration over two decades. The pitch is always the same: higher abstractions will democratise specialist work. SREs are "fundamentally different" from sysadmins, Kubernetes "abstracts away complexity."

    In practice, I see expensive reinvention. Developers debug database corruption after pod restarts without understanding filesystem semantics. They recreate monitoring strategies and networking patterns on top of CNI because they never learned the fundamentals these abstractions are built on. They're not learning faster: they're relearning the same operational lessons at orders of magnitude higher cost, now mediated through layers of YAML.

    Each wave of "democratisation" doesn't eliminate specialists. It creates new specialists who must learn both the abstraction and what it's abstracting. We've made expertise more expensive to acquire, not unnecessary.

    Excel proves the rule. It's objectively terrible: 30% of genomics papers contain gene name errors from autocorrect, JP Morgan lost $6bn from formula errors, Public Health England lost 16,000 COVID cases hitting row limits. Yet it succeeded at democratisation by accepting catastrophic failures no proper system would tolerate.

    The pattern repeats because we want Excel's accessibility with engineering reliability. You can't have both. Either accept disasters for democratisation, or accept that expertise remains required.

    • chis 1 hour ago
      If Kubernetes didn't in any way reduce labor, then the 95% of large corporations that adopted it must all be idiots? I find that kinda hard to believe. It seems more likely that Kubernetes has been adopted alongside increased scale, such that sysadmin jobs have just moved up to new levels of complexity.

      It seems like in the early 2000s every tiny company needed a sysadmin, to manage the physical hardware, manage the DB, custom deployment scripts. That particular job is just gone now.

      • dijit 1 hour ago
        You’re absolutely right that sysadmin jobs moved up to new levels of complexity rather than disappeared. That’s exactly my point.

        Kubernetes didn’t democratise operations, it created a new tier of specialists. But what I find interesting is that a lot of that adoption wasn’t driven by necessity. Studies show 60% of hiring managers admit technology trends influence their job postings, whilst 82% of developers believe using trending tech makes them more attractive to employers. This creates a vicious cycle: companies adopt Kubernetes partly because they’re afraid they won’t be able to hire without it, developers learn Kubernetes to stay employable, which reinforces the hiring pressure.

        I’ve watched small companies with a few hundred users spin up full K8s clusters when they could run on a handful of VMs. Not because they needed the scale, but because “serious startups use Kubernetes.” Then they spend six months debugging networking instead of shipping features. The abstraction didn’t eliminate expertise, it forced them to learn both Kubernetes and the underlying systems when things inevitably break.

        The early 2000s sysadmin managing physical hardware is gone. They’ve been replaced by SREs who need to understand networking, storage, scheduling, plus the Kubernetes control plane, YAML semantics, and operator patterns. We didn’t reduce the expertise required, we added layers on top of it. Which is fine for companies operating at genuine scale, but most of that 95% aren’t Netflix.

        • stoneforger 1 hour ago
          All this is driven by numbers. The bigger you are, the more money they give you to burn. No one is really working to solve problems; it's 99% managing complexity driven by shifting goalposts. No one really wants to build to solve a problem. It's a giant financial circlejerk: everybody wants to sell, rinse and repeat; the line must go up. No one says stop, because at 400mph hitting the brakes will get you killed.
    • krackers 1 hour ago
      All abstractions are leaky abstractions. E.g. C is a leaky abstraction because what you type isn't actually what gets emitted (try the same code in two different compilers and one might vectorize your loop while the other doesn't).
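      A quick Python instance of the same phenomenon (my example, not the commenter's): the "decimal arithmetic" abstraction leaks its binary floating-point implementation, so what you type is not quite what gets computed.

```python
# The "decimal number" abstraction leaks its IEEE-754 binary
# representation: 0.1 and 0.2 are not exactly representable,
# so the sum is not exactly 0.3.
result = 0.1 + 0.2
print(result)         # prints 0.30000000000000004
print(result == 0.3)  # prints False
```

      As with the compiler example, working productively at this layer eventually requires knowing what the layer below actually does.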
    • walterbell 3 hours ago
      > accept disasters for democratisation

      Will insurance policy coverage and premiums change when using non-deterministic software?

      • aleph_minus_one 2 hours ago
        Rather: Barely any insurance company will likely be willing to insure this because of the high unpredictability and high costs in case of disasters.
  • jackfranklyn 6 minutes ago
    The pattern I've noticed building tooling for accountants: automation rarely removes jobs, it changes what the job looks like.

    The bookkeepers I work with used to spend hours on manual data entry. Now they spend that time on client advisory work. The total workload stayed the same - the composition shifted toward higher-value tasks.

    Same dynamic played out with spreadsheets in the 80s. Didn't eliminate accountants - it created new categories of work and raised expectations for what one person could handle.

    The interesting question isn't whether developers will be replaced but whether the new tool-augmented developer role will pay less. Early signs suggest it might - if LLMs commoditise the coding part, the premium shifts to understanding problems and systems thinking.

  • MontyCarloHall 4 hours ago
    It's not so much about replacing developers, but rather increasing the level of abstraction developers can work at, to allow them to work on more complex problems.

    The first electronic computers were programmed by manually re-wiring their circuits. Going from that to being able to encode machine instructions on punchcards did not replace developers. Nor did going from raw machine instructions to assembly code. Nor did going from hand-written assembly to compiled low-level languages like C/FORTRAN. Nor did going from low-level languages to higher-level languages like Java, C++, or Python. Nor did relying on libraries/frameworks for implementing functionality that previously had to be written from scratch each time. Each of these steps freed developers from having to worry about lower-level problems and instead focus on higher-level problems. Mel's intellect is freed from having to optimize the position of the memory drum [0] to allow him to focus on optimizing the higher-level logic/algorithms of the problem he's solving. As a result, software has become both more complex but also much more capable, and thus much more common.

    (The thing that distinguishes gen-AI from all the previous examples of increasing abstraction is that those examples are deterministic and often formally verifiable mappings from higher abstraction -> lower abstraction. Gen-AI is neither.)

    [0] http://catb.org/jargon/html/story-of-mel.html

    • SkiFire13 4 hours ago
      > It's not so much about replacing developers, but rather increasing the level of abstraction developers can work at, to allow them to work on more complex problems.

      People do and will talk about replacing developers though.

      • MontyCarloHall 4 hours ago
        Were many of the aforementioned advancements marketed as "replacing developers"? Absolutely. Did that end up happening? Quite the opposite; each higher-level abstraction only caused the market for software and demand for developers to grow.

        That's not to say developers haven't been displaced by abstraction; I suspect many of the people responsible for re-wiring the ENIAC were completely out of a job when punchcards hit the scene. But their absence was filled by a greater number of higher-level punchcard-wielding developers.

        • Palomides 4 hours ago
          the infinite-fountain-of-software machine seems more likely to replace developers than previous innovations, and the people pushing the button will not be, in any current sense of the word, programming
          • fn-mote 3 hours ago
            You absolutely need to be trying to accomplish these things personally to understand what is/will be easy and where the barriers are.

            Recognizing the barriers & modes of failure (which will be a moving target) lets you respond competently when you are called. Raise your hourly rate as needed.

    • ori_b 3 hours ago
      The goal of AI companies is to replace all intellectual labor. You can argue that they're going to fail, but it's very clear what the actual goal is.
      • billy99k 44 minutes ago
        One of my clients is an AI startup in the security industry. Their business model is to use AI agents to perform the initial assessment and then cut the security contractors hours by 50% to complete the job.

        I don't think AI will completely replace these jobs, but it could reduce job numbers by a very large amount.

    • smj-edison 3 hours ago
      I think one thing I've heard missing from discussions though is that each level of abstraction needs to be introspectable. LLMs get compared to compilers a lot, so I'd like to ask: what is the equivalent of dumping the tokens, AST, SSA, IR, optimization passes, and assembly?

      That's where I find the analogy on thin ice, because somebody has to understand the layers and their transformations.
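      For comparison, here is what that introspection looks like in a conventional toolchain, using Python's own standard library as a stand-in (my example, not the commenter's): every layer from source to bytecode can be dumped and read by a human.

```python
# Dumping the intermediate layers a conventional toolchain exposes,
# using Python's stdlib compiler machinery as the example.
import ast
import dis

source = "def double(x):\n    return x * 2\n"

# Layer 1: source -> abstract syntax tree
tree = ast.parse(source)
print(ast.dump(tree, indent=2))

# Layer 2: AST -> compiled code object -> bytecode ("assembly")
code = compile(source, "<example>", "exec")
dis.dis(code)
```

      There is no analogous dump of what happens between an LLM's prompt and its emitted code, which is the gap the comment is pointing at.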

      • fn-mote 3 hours ago
        “Needs to be” is a strong claim. The skill of debugging complex problems by stepping through disassembly to find a compiler error is very specialized. Few can do it. Most applications don’t need that “introspection”. They need the “encapsulation” and faith that the lower layers work well 99.9+% of the time, and they need to know who to call when it fails.

        I’m not saying generative AI meets this standard, but it’s different from what you’re saying.

        • smj-edison 3 hours ago
          Sorry, I should clarify: it's needs to be introspectable by somebody. Not every programmer needs to be able to introspect the lower layers, but that capability needs to exist.

          Now I guess you can read the code an LLM generates, so maybe that layer does exist. But, that's why I don't like the idea of making a programming language for LLMs, by LLMs, that's inscrutable by humans. A lot of those intermediate layers in compilers are designed for humans, with only assembly generation being made for the CPU.

    • AndrewKemendo 4 hours ago
      I think the thing that's so weird to me is this idea that we all have to internalize the concept of transistor switching as the foundational, unchangeable root of computing, and that therefore anything abstracted too far from it is somehow not real computing, or some mess like that.

      This ignores completely that when you programmed vacuum-tube computers it was an entirely different type of abstraction than you use with MOSFETs, for example.

      I'm finding myself in the position where I can safely ignore any conversation about engineering with anybody who thinks that there is a "right" way to do it, or that there's any kind of ceremony or thinking pattern that needs to stay stable.

      Those are all artifacts of humans desiring very little variance, things they've encoded because it takes real energy to reconfigure your own internal state model to a new paradigm.

  • CodingJeebus 5 hours ago
    > Which brings us to the question: why does this pattern repeat?

    The pattern repeats because the market incentivizes it. AI has been pushed as an omnipotent, all-powerful job-killer by these companies because shareholder value depends on enough people believing in it, not whether the tooling is actually capable. It's telling that folks like Jensen Huang talk about people's negativity towards AI being one of the biggest barriers to advancement, as if they should be immune from scrutiny.

    They'd rather try to discredit the naysayers than actually work towards making these products function the way they're being marketed, and once the market wakes up to this reality, it's gonna get really ugly.

    • lotu 1 hour ago
      Yes very much so, if they could make their product do the things they claim they would be focused on doing that, not telling people to stop being naysayers.
    • psychoslave 4 hours ago
      >The pattern repeats because the market incentivizes it.

      Market is not universal gravity, it's just a storefront for social policy.

      No political order, no market, no market incentives.

  • cryptonector 27 minutes ago
    > Yet demand for software far exceeds our ability to create it.

    In particular, the demand for software tools grows faster than our ability to satisfy it. More demand exists than the people who would do the demanding can imagine. Many people who are not software engineers can now write themselves micro software tools using LLMs; this ranges from homemakers to professionals of every kind. But the larger systems that require architecting, designing, building, and maintaining will continue to require some developers. Fewer, perhaps, but perhaps such systems will also proliferate.

  • strict9 4 hours ago
    As I've heard from mid-level managers and C-suite types across a few dev jobs: staff are the largest expense, and the technology department is the largest cost center. I disagree, because Sales couldn't exist without a product, but that's a lost point.

    This is why those same mid level managers and C suite people are salivating over AI and mentioning it in every press release.

    The reality is that costs are being reduced by replacing US teams with offshore teams. And the layoffs are being spun as a result of AI adoption.

    AI tools for software development are here to stay, will accelerate in the coming months and years, and there will be advances. But cost reductions are largely realized via onshore-to-offshore replacement.

    The remaining onshore teams must absorb much more slack and fixes, and in a way end up being more productive.

    • Tade0 2 hours ago
      > The reality is that costs are being reduced by replacing US teams with offshore teams.

      Hailing from an outsourcing destination I need to ask: to where specifically? We've been laid off all the same. Me and my team spent the second half of 2025 working half time because that's the proposition we were given.

      What is this fabled place with an apparent abundance of highly skilled developers? India? They don't make on average much less than we do here - the good ones make more.

      My belief is that spending on staff just went down across the board because every company noticed that all the others were doing layoffs, so pressure to compete in the software space is lower. Also all the investor money was spent on datacentres so in a way AI is taking jobs.

    • jandrese 3 hours ago
      > I disagree because Sales couldn't exist without a product

      There are a lot of counterexamples throughout history.

      • fn-mote 3 hours ago
        This is too cryptic. Be clearer what you mean. Ponzi schemes?
        • jasonjmcghee 2 hours ago
          Many companies aren't selling anything special or are just selling an "idea".

          Like Liquid Death, which sells water for a strangely high amount of money, entirely on sales/marketing.

          International Star Registry gives you a piece of paper and a row in a database that says you own a star.

          Many luxury things are just because it's sold by that luxury brand. They are "worth" that amount of money for the status of other people knowing you paid that much for it.

  • svilen_dobrev 4 hours ago
    Science is hated because its mastery requires too much hard work, and, by the same token, its practitioners, the scientists, are hated because of the power they derive from it. - Dijkstra, 1989

    https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD104...

  • PeterStuer 5 hours ago
    The reverse is the developers' recurring dream of replacing non-IT people, usually with a 100% online, automated, self-promoting SaaS. AI is also the latest incarnation of that.
    • mjevans 4 hours ago
      When do we get the Star Trek / Orville dream of every job is a good job?
  • _pdp_ 1 hour ago
    I was skeptical until 3-4 months ago, but my recent experience has been entirely different.

    For context: we're the creators of ChatBotKit and have been deploying AI agents since the early days (about 2 years ago). These days, there's no doubt our systems are self-improving. I don't mean to hype this (judge for yourself from my skepticism on Reddit) but we're certainly at a stage where the code is writing the code, and the quality has increased dramatically. It didn't collapse as I was expecting.

    What I don't know is why this is happening. Is it our experience, the architecture of our codebase, or just better models? The last one certainly plays a huge role, but there are also layers of foundation that now make everything easier. It's a framework, so adding new plugins is much easier than writing the whole framework from scratch.

    What does this mean for hiring? It's painfully obvious to me that we can do more with less, and that's not what I was hoping for just a year ago. As someone who's been tinkering with technology and programming since age 12, I thought developers would morph into something else. But right now, I'm thinking that as systems advance, programming will become less of an issue—unless you want to rebuild things from scratch, but AI models can do that too, arguably faster and better.

    It is hard to convey that kind of experience.

    I am wondering if others are seeing it too.

    • akhil08agrawal 52 minutes ago
      I'm seeing it too, but there's a distinction I think matters: AI isn't replacing the thinking, it's shifting where the bottleneck is. You mention systems are self-improving and code quality has increased dramatically. But the constraint isn't execution anymore. It's judgment at scale. When AI collapses build time from weeks to hours, the new bottleneck becomes staying current with what's actually changing. You need to know what competitors shipped, what research dropped, what patterns are emerging across 50+ sources continuously. Generic ChatGPT can't do that. It doesn't know what YOU care about. It starts from scratch every time. The real question is how do you build personal AI that learns YOUR priorities and filters the noise? That's where the leverage is now.

      Excited for the future :)

    • monus 52 minutes ago
      Agreed. I don’t know if it will create or eliminate jobs but this is certainly another level from what we’ve seen before.

      Over the last two months, calling LLMs even an internet-level invention has started to feel like underselling them.

      You can see the sentiment shift happening in recent months among all the prominent, experienced devs too.

    • skybrian 36 minutes ago
      My guess: projects "learn" every time we improve documentation, add static analysis, write tests, make the API's clearer, and so on. Once newly started agents onboard by reading AGENTS.md, they're a bit "smarter" than before.

      Maybe there's a threshold where improvements become easy, depending on the LLM and the project?

      As a hobbyist programmer, I feel like I've been promoted to pointy-haired boss.
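      As a sketch of what that onboarding file might contain (hypothetical contents and commands; AGENTS.md has no fixed schema):

```markdown
# AGENTS.md (hypothetical example)

## Build and test
- `make test` runs the full suite; `make lint` must pass before commits.

## Conventions
- All public functions need docstrings and type hints.
- Database access goes through `store/`; never query SQL from handlers.

## Known pitfalls
- The integration tests under `tests/sync/` need a local Redis running.
```

      Each lesson the team writes down here is one a freshly started agent no longer has to relearn.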

  • rectang 3 hours ago
    Can semi-technical people replace developers if those semi-technical people accept that the price of avoiding developers is a commitment to minimizing total system complexity?

    Of course semi-technical people can troubleshoot, it's part of nearly every job. (Some are better at it than others.)

    But how many semi-technical people can design a system that facilitates troubleshooting? Even among my engineering acquaintances, there are plenty who cannot.

    • mattgreenrocks 3 hours ago
      Remains to be seen for production settings.

      My guess is no. I've seen people talk about understanding the output of their vibe coding sessions as "nerdy," implying they're above that. Refusing to vet AI output is the kiss of death to velocity.

      • rectang 3 hours ago
        > Refusing to vet AI output is the kiss of death to velocity.

        The usual rejoinder I've seen is that AI can just rewrite your whole system when complexity explodes. But I see at least two problems with that.

        The first: AI is impressively good at extracting intent from a ball of mud with tons of accidental complexity, and I think we can expect it to continue improving. But when a system has a lot of inherent complexity, and it's poorly specified, the task is much harder.

        The second is that small, incremental, reversible changes are the most reliable way to evolve a system, and AI doesn't repeal that principle. The more churn, the more bugs — minor and major.

  • zozbot234 28 minutes ago
    Is this a real article or just AI-generated text? This whole text has a lot of very weird phrasing in it, also it's so strange how it just seems to keep trudging on and on without ever getting to the point. Actual human-written articles are not like this.
  • jwsteigerwalt 23 minutes ago
    Nothing says it like this quote: “quickly discovered that readable syntax didn’t eliminate the complexity of logic, data structures, or system design”
  • BiraIgnacio 1 hour ago
    "This time is different"

    - Me, the last time it wasn't different

  • ilaksh 2 hours ago
    I think that programming as a job has already changed. Because it is hard for most people to tell the difference between someone who actually has programming skills and experience versus someone who has some technical ingenuity but has only ever used AI to program for them.

    Now the expectation from some executives or high level managers is that managers and employees will create custom software for their own departments with minimal software development costs. They can do this using AI tools, often with minimal or no help from software engineers.

    It's not quite the equivalent of having software developed entirely by software engineers, but it can be a significant step up from what you typically get from Excel.

    I have a pretty radical view that the leading edge of this stuff has been moving much faster than most people realize:

    2024: AI-enhanced workflows automating specific tasks

    2025: manually designed/instructed tool calling agents completing complex tasks

    2026: the AI Employee emerges -- robust memory, voice interface, multiple tasks, computer and browser use. They manage their own instructions, tools and context

    2027: Autonomous AI Companies become viable. AI CEO creates and manages objectives and AI employees

    Note that we have had the AI Employee and AI Organization for a while in different, somewhat weak forms. But in the next 18 months or so, as the model and tooling abilities continue to improve, they will probably be viable for a growing number of business roles and businesses.

  • zkmon 1 hour ago
    The real reason is that expectations and requirements increased whenever tools improved productivity or solved problems. This kept complexity growing and the work flowing. Just because you use cars instead of horses, it doesn't mean you get more free time.
  • xnx 5 hours ago
    Don't take it personally. All businesses want to reduce costs. As long as people cost money, they'll want to reduce people.
    • bill_joy_fanboy 4 hours ago
      Which is why quiet quitting is the logical thing.

      Managers and business owners shouldn't take it personally that I do as little as possible and minimize the amount of labor I provide for the money I receive.

      Hey, it's just business.

      • tbrownaw 3 hours ago
        And you do this honestly, by negotiating reduced hours for the same pay or by negotiating piecework rather than time-based pay. Right?
        • bill_joy_fanboy 2 hours ago
          Like any shrewd businessman, I negotiate to receive the highest price possible for the minimum cost on my end. This is how business is done.
        • nullorempty 3 hours ago
          What does the term mean? I think the answer to your question is obvious.
      • safety1st 4 hours ago
        What a nihilistic perspective and empty life.

        If the deck is stacked against labor and in favor of the owner, become the owner. Start a business. Create things that are better. Enrich the world. Put food on the table for a few people in the process.

        Be something instead of intentionally being nothing. Win.

        • bill_joy_fanboy 2 hours ago
          > What a nihilistic perspective and empty life.

          Equally nihilistic are owners, managers, and leaders who think they will replace developers with LLMs.

          Why care about, support, defend, or help such people? Why would I do that?

        • Giefo6ah 3 hours ago
          Let's say the average firm has 10 workers. 90% of people are nihilists and empty lifers?

          Do I want to lead a business filled with losers?

    • bdcravens 4 hours ago
      The irony being that software, and developers, have often been the tool for reducing head count.
    • CodingJeebus 4 hours ago
      > Don't take it personally. All businesses want to reduce costs. As long as people cost money, they'll want to reduce people.

      "Don't take it personal" does not feed the starving and does not house the unhoused. An economic system that over-indexes on profit at the expense of the vast majority of its people will eventually fail. If capitalism can't evolve to better provide opportunities for people to live while the capital-owning class continues to capture a disproportionate share of created economic value, the system will eventually break.

      • bdcravens 4 hours ago
        While not an incorrect statement, trillions of dollars have been paid to software developers to build software that invariably reduced labor costs.
        • CodingJeebus 4 hours ago
          You're absolutely correct on that. The technology industry, at least the segment driven by VC (which is a huge portion of it), is funded based on ideas that the capital-owning class thinks is a good idea. Reducing labor costs is always an easy sell when you're trying to raise a round.
          • bdcravens 3 hours ago
            Even in boring development jobs. For example, one of my first development jobs was for a large hospital, building an intranet app to make nurse rounds more efficient so they didn't have to hire as many.
    • psychoslave 4 hours ago
      Some businesses want to reduce costs. Some want to tackle the challenge of using resources available in the most profitable manner, including making their employees grow to better contribute in tackling tomorrow's challenges.

      A business leadership board that only considers people as costs is looking at the world through sociopathic lenses.

      • cannonpalms 14 minutes ago
        This is just a layer of emotion on top of raw capitalism. And it will always prove to be a lie when push comes to shove.
  • arvindh-manian 1 hour ago
  • johanbcn 4 hours ago
    The link redirects back to the blog index if your browser is configured for Spanish, because the site forces the language to Spanish and the article is not available in Spanish.

    Here's an archived link: https://archive.is/y9SyQ

  • klodolph 5 hours ago
    Kind of off topic but this has got to be one of my least favorite CSS rules that I’ve seen in recent memory:

      .blog-entry p:first-letter {
        font-size: 1.2em;
      }
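    For contrast, a deliberate drop cap would usually be scoped to the opening paragraph only and sized so it reads as intentional rather than like a rendering glitch. A sketch (class name assumed from the snippet above):

```css
/* Hypothetical alternative: a real drop cap on the first paragraph only,
   instead of a 1.2em bump on every paragraph's first letter. */
.blog-entry p:first-of-type::first-letter {
  font-size: 3em;
  float: left;
  line-height: 1;
  padding-right: 0.1em;
}
```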
    • backwardsponcho 7 minutes ago
      I love me a good drop cap, but you're right in that this example is not good.
  • akhil08agrawal 1 hour ago
    This resonates with what I'm experiencing, but I think the article misses the real shift happening now.

    The conversation shouldn't be "will AI replace developers". It should be "how do humans stay competitive as AI gets 10x better every 18 months?"

    I watched Claude Code build a feature in 30 minutes that used to take weeks. That moment crystallised something: you don't compete WITH AI. You need YOUR personal AI.

    Here's what I mean: Frontier teams at Anthropic/OpenAI have 20-person research teams monitoring everything 24/7. They're 2-4 weeks ahead today. By 2027? 16+ weeks ahead. This "frontier gap" is exponential.

    The real problem isn't tools or abstraction. It's information overload at scale. When AI collapses execution time, the bottleneck shifts to judgment. And good judgment requires staying current across 50+ sources (Twitter, Reddit, arXiv, Discord, HN).

    Generic ChatGPT is commodity. What matters is: does your AI know YOUR priorities? Does it learn YOUR judgment patterns? Does it filter information through YOUR lens?

    The article is right that tools don't eliminate complexity. Personal AI doesn't eliminate complexity either. It amplifies YOUR ability to handle complexity at frontier speed.

    The question isn't about replacement. It's about levelling the playing field. And frankly, we are all still figuring out how this will play out in the future. And if you have any solution that can help me level up, please hit me up.

    • teaearlgraycold 1 hour ago
      What feature is it that Claude Code built in 30 minutes?
      • pirates 51 minutes ago
        For some reason, everyone who says things like this never follows up with anything concrete, doesn't share prompts or snippets, etc.
  • walterbell 5 hours ago
    > Understanding this doesn’t mean rejecting new tools. It means using them with clear expectations about what they can provide and what will always require human judgment.

    Speaking of tools, that style of writing rings a bell.. Ben Affleck made a similar point about the evolving use of computers and AI in filmmaking, wielded with creativity by humans with lived experiences, https://www.youtube.com/watch?v=O-2OsvVJC0s. Faster visual effects production enables more creative options.

  • submeta 36 minutes ago
    What I’m seeing is that seniors need fewer juniors, not because seniors are being replaced, but because managers believe they can get the same output with fewer people. Agentic coding tools reinforce that belief by offloading the most time-consuming but low-complexity work. Tests, boilerplate, CRUD, glue code, migrations, and similar tasks. Work that isn’t conceptually hard, just expensive in hours.

    So yes, the market shifts, but mostly at the junior end. Fewer entry-level hires, higher expectations for those who are hired, and more leverage given to experienced developers who can supervise, correct, and integrate what these tools produce.

    What these systems cannot replace is senior judgment. You still need humans to make strategic decisions about architecture, business alignment, go or no-go calls, long-term maintenance costs, risk assessment, and deciding what not to build. That is not a coding problem. It is a systems, organizational, and economic problem.

    Agentic coding is good at execution within a frame. Seniors are valuable because they define the frame, understand the implications, and are accountable for the outcome. Until these systems can reason about incentives, constraints, and second-order effects across technical and business domains, they are not replacing seniors. They are amplifying them.

    The real change is not “AI replaces developers.” It is that the bar for being useful as a developer keeps moving up.

  • SonnyTark 4 hours ago
    I recently did a higher-education contract for one semester in a highly coding-focused course. I have a few years of teaching experience pre-LLMs, so I could evaluate the impact firsthand. My conclusion is that academic education as we know it is basically broken forever.

    If educators use AI to write/update the lectures and the assignments, students use AI to do the assignments, then AI evaluates the student's submissions, what is the point?

    I'm worried about some major software engineering fields experiencing the same problem. If design and requirements are written by AI, code is mostly written by AI, and users are mostly AI agents. What is the point?

    • UncleEntity 3 hours ago
      >> What is the point?

      To replace humans permanently from the work force so they can focus on the things which matter like being good pets?

      Or good techno-serfs...

  • blahnjok 4 hours ago
    > We’re still in that same fundamental situation. We have better tools—vastly better tools—but the thinking remains essential.

    But less thinking is essential, or at least that’s what it’s like using the tools.

    I’ve been vibing code almost 100% of the time since Claude 4.5 Opus came out. I use it to review itself multiple times, and my team does the same, then we use AI to review each others’ code.

    Previously, we whiteboarded and had discussions more than we do now. We definitely coded and reviewed more ourselves than we do now.

    I don’t believe that AI is incapable of making mistakes, nor do I think that multiple AI reviews are enough to understand and solve problems, yet. Some incredibly huge problems are probably on the horizon. But for now, the general claim that “AI will not replace developers” is false; our roles have changed: we are managers now, and for how long?

    • cannonpalms 9 minutes ago
      Those whiteboarding sessions and discussions used to serve as useful opportunities for context building. Where will that context be built within the cycle now? During a production incident?
    • mattgreenrocks 3 hours ago
      You made the choice to change your development workflow to that. You chose to abdicate thinking to the LLM.

      If it’s working for you, then great. But don’t pretend like it is some natural law and must be true everywhere.

  • gedy 1 hour ago
    It might just be the companies I have worked for in the past 25 years, but engineers were virtually always the ones who made sense of whatever vague idea product and UX were trying to make. It's not just code-monkey follow-the-mockup stuff. AI code tools don't really solve that.
  • cyanydeez 5 hours ago
    Mythical Man Month -> Mythical AI Agent Swarm
  • erichocean 5 hours ago
    Spreadsheets replaced developers for that kind of work, while simultaneously enabling multiple magnitudes more work of that type to be performed.
    • ozim 4 hours ago
      I do agree, that’s like my go to thought.

      Citizen developers were already there doing Excel. I have seen basically full fledged applications in Excel since I was in high school which was 25 years ago already.

      • tacostakohashi 4 hours ago
        If anything, there were a bunch of low barrier to entry software development options like HyperCard, MS Access, Visual Basic, Delphi, 4GLs etc. around in the 90s, that went away.

        It feels like programming then got a lot harder with internet stuff that brought client-server challenges, web frontends, cross platform UI and build challenges, mobile apps, tablets, etc... all bringing in elaborate frameworks and build systems and dependency hell to manage and move complexity around.

        With that context, it seems like the AI experience / productivity boost people are having is almost like a regression back to the mean and just cutting through some of the layers of complexity that had built up over the years.

    • 65 3 hours ago
      And I would argue spreadsheets still created more developers. Analytics teams need developers to put that data somewhere, to transform it for certain formats, to load that data from a source so they can create spreadsheets from it.

      So now instead of one developer lost and one analyst created, you've actually just created an analyst and kept a developer.

  • dvcoolarun 3 hours ago
    A few observations from the current tech + services market:

    Service-led companies are doing relatively better right now. Lower costs, smaller teams, and a lot of “good enough” duct-tape solutions are shipping fast.

    Fewer developers are needed to deliver the same output. Mature frameworks, cloud, and AI have quietly changed the baseline productivity.

    And yet, these companies still struggle to hire and retain people. Not because talent doesn’t exist, but because they want people who are immediately useful, adaptable, and can operate in messy environments.

    Retention is hard when work is rushed, ownership is limited, and growth paths are unclear. People leave as soon as they find slightly better clarity or stability.

    On the economy: it doesn’t feel like a crash, more like a slow grind. Capital is cautious. Hiring is defensive. Every role needs justification.

    In this environment, it’s a good time for “hackers” — not security hackers, but people who can glue systems together, work with constraints, ship fast, and move without perfect information.

    Comfort-driven careers are struggling. Leverage-driven careers are compounding.

    Curious to see how others are experiencing this shift.

    • dangus 3 hours ago
      Let’s not forget that we are just now recovering from the market corrections of the pandemic. Pandemic level tech industry hiring was insane and many of those companies who later held layoffs were just sending the growth line back to where it should be.

      I think pressure to ship is always there. I don’t know if that’s intensifying or not. I can understand where managers and executives think AI = magical work faster juice, but I imagine those expectations will hit their correction point at some time.

    • relaxing 2 hours ago
      > Service-led companies are doing relatively better right now

      who

  • bdcravens 4 hours ago
    It's like developers are only now awakening to the reality that despite being paid well, they never were the capitalists.
  • heliumtera 4 hours ago
    Business quacks keep being bamboozled because, it turns out, implementation is the only thing that matters, and hacker culture has outlived every single promise to eradicate it.
  • Twey 4 hours ago
    This is the best explanation of (my take on) this I've seen so far.

    On top of the article's excellent breakdown of what is happening, I think it's important to note a couple of driving factors about why (I posit) it is happening:

    First, and this is touched upon in the OP but I think could be made more explicit: a lot of people who bemoan the existence of software development as a discipline see it as a morass of incidental complexity. This is significantly an instance of Chesterton's Fence. Yes, there certainly is incidental complexity in software development, or at least complexity that is incidental at the level of abstraction most corporate software lives at. But as a discipline we're pretty good at eliminating it when we find it, though it sometimes takes a while, and the speed with which we iterate means we eliminate it a lot faster than most other disciplines. A lot of the complexity that remains is actually irreducible, or at least we don't yet know how to reduce it.

    A case in point: programming language syntax. The syntax of modern programming languages (where the commas go, whether whitespace means anything, how angle brackets are parsed) looks to the uninitiated like a jumble of arcane nonsense that must be memorized before one can start really solving problems, and indeed it's a real barrier to entry that non-developers, budding developers, and sometimes seasoned developers have to contend with. But it's also (a selection of competing frontiers of) the best language we have, after many generations of rationalistic and empirical refinement, for humans to unambiguously specify what they mean at the semantic level of software development as it stands. For a long time now, programming language syntax hasn't been constrained by the complexity or performance of parser implementations. Instead, modern programming languages tend toward simpler formal grammars because those make it easier for _humans_ to understand what's going on when reading the code.

    AI tools promise to (amongst other things; don't come at me, AI enthusiasts!) replace programming language syntax with natural language. But natural language is actually a terrible syntax for clearly and unambiguously conveying intent! If you want a more venerable example, look at mathematical notation: a language that has never been constrained by computer implementation but was developed by humans, for humans, to read and write meaning in subtle domains efficiently and effectively. Mathematicians started with natural language and, through a long process of iteration, arrived at modern-day mathematical notation. There's no push to replace mathematical notation with natural language because, even though that would definitely make some parts of the mathematical process easier, we've discovered through hard experience that it makes the process as a whole much harder.

    Second, humans (as a gestalt, not necessarily as individuals) always operate at the maximum feasible level of complexity, because there are benefits to be extracted from the higher complexity levels, and if we operate below our maximum complexity budget we're leaving those benefits on the table. From time to time we really do manage to hop up the ladder of abstraction, at least as far as mainstream development goes. But the complexity budget we save by no longer needing to worry about the details we've abstracted over immediately gets reallocated to the upper abstraction levels, buying things like development velocity, correctness guarantees, or UX sophistication. This implies that the sum total of complexity involved in software development will always remain roughly constant. That is of course a win, since we can produce more and better software (assuming we really have abstracted over those low-level details and they're not waiting for the right time to leak through our nice clean abstraction layer and bite us…), but as a process it will never reduce the total amount of ‘software development’ work to be done, whatever kinds of complexity that may come to comprise.

    In fact, anecdotally, it seems to be subject to some kind of Braess' paradox: the more software we build, and the more our society runs on software, the higher the demand for software becomes. If you think about it, this is quite a natural consequence of the ‘constant complexity budget’ idea. As we know, software is made of decisions (https://siderea.dreamwidth.org/1219758.html), and the more ‘manual’ labour we free up at the bottom of the stack, the more complexity budget we free up for the high-level decisions at the top. But there's no cap on decision-making! If you ever find yourself with spare complexity budget after making all your decisions, you can always use it to make decisions about how you make decisions, ad infinitum, and yesterday's high-level decisions become today's menial labour.

    The only way out of that cycle is to develop intelligences (software, hardware, wetware…) that can not only reason better than humans at a particular level of abstraction but also climb the ladder faster than humanity as a whole: singularity, to use a slightly out-of-vogue term. If we as a species fall off the bottom of the complexity window then there will no longer be a productivity-driven incentive to ideate, though I rather look forward to a luxury-goods market of all-organic artisanal ideas :)

    • daxfohl 2 hours ago
      I don't even think that "singularity-level coding agents" get us there. A big part of engineering is working with PMs, working with management, working across teams, working with users, to help distill their disparate wants and needs down into a coherent and usable system.

      Knowing when to push back, when to trim down a requirement, when to replace a requirement with something slightly different, when to expand a requirement because you're aware of multiple distinct use cases to which it could apply, or even a new requirement that's interesting enough that it might warrant updating your "vision" for the product itself: that's the real engineering work that even a "singularity-level coding agent" alone could not replace.

      An AI agent almost universally says "yes" to everything. They have to! If OpenAI starts selling tools that refuse to do what you tell them, who would ever buy them? And maybe that's the fundamental distinction. Something that says "yes" to everything isn't a partner, it's a tool, and a tool can't replace a partner by itself.

      • Twey 1 hour ago
        I think that's exactly an example of climbing the abstraction ladder. An agent that's incapable of reframing the current context, given a bad task, will try its best to complete it. An agent capable of generalizing to an overarching goal can figure out when the current objective is at odds with the more important goal.

        You're correct in that these aren't really ‘coding agents’ any more, though. Any more than software developers are!

    • njhnjh 3 hours ago
      > don't come at me AI enthusiasts!

      no need to worry; none of them know how to read well enough to make it this far into your comment

      • daxfohl 2 hours ago
        Actually they're the only ones who do: copy and paste into chatgpt with "distill this please".
        • Twey 1 hour ago
          Twitter has a lot to answer for!
  • krater23 5 hours ago
    The link doesn't work for me; I just get thrown to the main page after a second.
    • DeadlineDE 4 hours ago
      Looks like the article was pulled down? I could not find it in the English archive either.
  • assaddayinh 2 hours ago
    [dead]
  • jalapenos 6 hours ago
    The dumb part of this is: so who prompts the AI?

    Well probably we'd want a person who really gets the AI, as they'll have a talent for prompting it well.

    Meaning: knows how to talk to computers better than other people.

    So a programmer then...

    I think it's not that people are stupid. I think there's actually a glee behind the claims AI will put devs out of work - like they feel good about the idea of hurting them, rather than being driven by dispassionate logic.

    Maybe it's the ancient jocks vs nerds thing.

    • kankerlijer 5 hours ago
      Outside of SV the thought of More Tech being the answer to ever greater things is met with great skepticism these days. It's not that people hate engineers, and most people are content to hold their nose while the mag7 make 401k go up, but people are sick of Big Tech. Like it or not, the Musks, Karps, Thiels, Bezos's have a lot to do with that.
      • cyanydeez 5 hours ago
        Popularity gets you nowhere though. What matters is money and money. Those 401k holders are tied down to the oligarchy.
        • psychoslave 4 hours ago
          Not imputing that to you, but it seems like there are people out there who believe money is all that matters. The map with the richest details won't save anyone in a territory that has been turned into a wasteland, unable to produce a single apple on the whole land.
          • cyanydeez 6 minutes ago
            Yes, but that's because capitalism is mostly built on the idea of fungibility. So yeah, Americans have told themselves for over a century that whatever they need, they can just substitute money and get it eventually. All other things aside, that's a pretty toxic, if not downright psychotic, way to reframe your relationship with society and other people.
    • peacebeard 5 hours ago
      Devs are where projects meet the constraints of reality and people always want to kill the messenger.
      • nasmorn 4 hours ago
        No high paid manager wants to learn that their visionary thinking was just the last iteration of the underpants gnome meme. Some things sound good at first but unfortunately are not that easy to actually do
      • booleandilemma 5 hours ago
        Devs are where the project meets reality in general, and this is what I always try to explain to people. And it's the same with construction, by the way. Pictures and blueprints are nice but sooner or later you're going to need someone digging around in the dirt.
    • benoau 5 hours ago
      Some people just see it as a cost. At one "tech" startup I worked at, I got a lengthy pitch from a sales exec arguing that we shouldn't have a software team at all, that we'd never be able to build anything useful without spending millions, and that the money would be better spent on the sales team, although then they'd have nothing to sell lmfao. And the real laugh was that the dev team was heavily subsidized by R&D grants anyway.
    • spwa4 5 hours ago
      Even that is the wrong question. The whole promise of the stock market, of AI is that you can "run companies" by just owning shares and knowing nothing at all. I think that is what "leaders" hope to achieve. It's a slightly more dressed get-rich-quick scheme.

      Invest $1000 into AI, have a $1000000 company in a month. That's the dream they're selling, at least until they have enough investment.

      It of course becomes "oh, sorry, we happen to have taken the only huge business for ourselves. Is your kidney now for sale?"

      • rvz 4 hours ago
        > Invest $1000 into AI, have a $1000000 company in a month. That's the dream they're selling, at least until they have enough investment.

        But you need to buy my AI engineer course for that first.

    • rvz 5 hours ago
      Who fixes the unmaintainable mess that the AI created from the vibe coder's prompts?

      The Vibe Coder? The AI?

      Take a guess who fixes it.

      • lkjdsklf 4 hours ago
        The real question is, do you even need to fix it? Does it matter?

        The reason those things matter in a traditional project is because a person needs to be able to read and understand the code.

        If you're vibe coding, that's no longer true. So maybe it doesn't matter. Maybe the things we used to consider maintenance headaches are irrelevant.

        • otabdeveloper4 3 hours ago
          The reason those things matter in a traditional project is because the previous developers fucked up, and the product is now crashing and leaking money and clients like a sinking Titanic.
      • tosapple 4 hours ago
        For now, training these things on code and logic is the first step of building a technological singularity.
    • plagiarist 5 hours ago
      They don't need to put all developers out of work to have a financial impact on the career.
    • heliumtera 4 hours ago
      The day you successfully implement your solution with a prompt, your solution is valued at the cost of a prompt. There is no value anymore in anything easily achieved by generative tools. The value now lies in one of:

      a. generative technology that still requires a substantial amount of coordination, curation, and compute power;

      b. substantial amounts of data;

      c. scarce intellectual human work.

      And scarce but intellectually undemanding human work has been dropped from the list of valuable things.

    • dboreham 5 hours ago
      > who prompts the AI

      LLMs are a box where the input has to be generated by someone/something, but also the output has to be verified somehow (because, like humans, it isn't always correct). So you either need a human at "both ends", or some very clever AI filling those roles.

      But I think the human doing those things probably needs slightly different skills and experience than the average legacy developer.

      • reactordev 5 hours ago
        Rules engines were designed for just such a thing: validating input and output. You don't need a human to prompt AI, you need a pipeline.

        While a single LLM won't replace you, a well-designed system of flows for software engineering using LLMs will.

        • Alex_L_Wood 4 hours ago
          Well, who designs the system of flows?
          • lkjdsklf 4 hours ago
            If you ask the AI labs, the AI systems themselves will build these kinds of workflows for themselves.

            That's the goal.

          • reactordev 2 hours ago
            Business Analysts

            (or rather, Business People)

    • duskdozer 5 hours ago
      How about another AI? And who prompts that AI? You're right - another AI!
      • lkjdsklf 4 hours ago
        With all these AIs chaining and prompting eachother, we're approaching the point where some unlucky person is going to ask an AI something and it will consume all the energy in the universe trying to compute the answer.
        • in_a_society 42 minutes ago
          Only to get in response: “INSUFFICIENT DATA FOR MEANINGFUL ANSWER”
        • lstodd 31 minutes ago
          The answer would be 42.
  • JohnCClarke 50 minutes ago
    Consider what happened to painters after the invention of photography (~1830s). At first the technology was very limited and no threat at all to portrait and landscape painters.

    By the 1860s artists were feeling the heat and responded by inventing all the "isms" - starting with impressionism. That's kept them employed so far, but who knows whether they'll be able to co-exist with whatever diffusion models become in 30 years.

    • EEBio 9 minutes ago
      But the 18th century artist who did portraits and wedding paintings is the today’s (wedding) photographer.

      Does it take less money to commission a single wedding photo rather than a wedding painting? Yes. But many more people commission them and usually in tens to hundreds, together with videos, etc.

      An 18th century wedding painter wasn’t in the business of paintings, but in the business of capturing memories and we do that today on much larger scale, more often and in a lot of different ways.

      I’d also argue more landscape painters exist today than ever.

    • bflesch 5 minutes ago
      I can't take these kinds of comments seriously at all. You're totally off topic and offering a platitude comparing apples to oranges.