• Feweroptions@sh.itjust.works · 1 year ago

    I think AI isn’t going to replace the upper level of programmers, but I do believe the absolute number of programmers will drop as AI completes more and more of the labor involved in coding.

    A lot of entry level jobs just won’t exist anymore, because the AI will do the typing work while a small number of people manage the AI.

    And this will apply to pretty much all white collar work - at least that’s my prediction.

    I believe that besides blue collar jobs, AI will practically eradicate the middle class, and sadly there won’t be a UBI to pick up the slack. But maybe I’m just too damn cynical.

    • fidodo@lemm.ee · 1 year ago

      I think that’s assuming a relatively consistent level of scope, but the scope will just get bigger. How big is your feature backlog? Will you be able to easily get through it even with the help of AI? How big will your feature backlog get if implementation friction is lowered?

      • AggressivelyPassive@feddit.de · 1 year ago

        But those people don’t need to be programmers.

        The reality is that most software is complex but trivial. It's a bunch of requirements, but there's no magic behind it. An AI that can turn a written text containing the requirements into a decently running program will replace tons of developers.

        And since a future AI that's actually trained on software won't have problems juggling 300 requirements at once (the way humans do), it's relatively easy to trust the result.

        • dust_accelerator@lemmy.world · 1 year ago

          it’s relatively easy to trust the result.

          … just as easy as taking the responsibility for it if it fails?

          • AggressivelyPassive@feddit.de · 1 year ago

            Do human programmers not fail?

            I don’t want to hype AI, but you’re basically comparing a high school graduate AI (lots of general knowledge, no specialization) with a perfect senior dev. But that’s not really fair.

            As soon as an AI works better than the average developer in a given area, it will replace them. Simple as that.

            Of course it will make errors, but the question is whether the extra errors, compared to a human, are worth the savings.

            Just a quick example: let's say you need 10 devs at 100k a year each, and they produce errors worth 200k a year. That means costs of 1.2 million a year.

            If an AI costs 100k in licenses, replaces 5 devs, and only adds, say, 200k in errors, you're still at only 1 million a year.
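            The arithmetic above can be sketched out as a quick back-of-the-envelope calculation. This is only an illustration of the commenter's numbers, assuming "adds 200k in errors" means the total yearly error cost rises from 200k to 400k:

            ```python
            # Hypothetical cost comparison based on the figures in the comment above.
            # Assumption: the AI doubles total error cost from 200k to 400k per year.

            DEV_SALARY = 100_000  # per developer, per year

            # Status quo: 10 devs plus 200k/year in error costs
            human_only = 10 * DEV_SALARY + 200_000  # 1,200,000

            # With AI: 5 devs remain, 100k in licenses, 400k in error costs
            with_ai = 5 * DEV_SALARY + 100_000 + 400_000  # 1,000,000

            print(human_only - with_ai)  # yearly savings
            ```

            Under these (very rough) assumptions, the AI setup comes out 200k cheaper per year, which is the comparison the comment is making.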

    • gnus_migrate@programming.dev · 1 year ago

      Honestly, all the claimed use cases of generative AI for coding are much more easily fulfilled with normal tools. You can't perform mass refactorings with it, because you'd need to manually check its output or prove that the generated code is correct, and it can't generate code that well unless your domain is well documented online, which isn't the case for most companies.

      There are places where generative AI will replace workers, especially in art, which makes it all the more important to ensure that whoever has their work used in training data is fairly compensated for the value that work generates for the AI company. For programming, however, I personally don't see a ton of value in what exists today, at least.