• Knusper@feddit.de · 1 year ago

        I’m also not worried. Software complexity generally grows in proportion to the complexity of the requirements. And on most projects I’ve been a part of, no one could have told you all the requirements even after we’d figured them out.

        The code + test code is usually the only document that describes the requirements. And with high-level languages, there’s not that much boilerplate around the codified requirements either. Besides, we can use LLMs for that boilerplate ourselves.

      • Holyginz@lemmy.world · 1 year ago

        Lmao, I’m not a programmer, although I know how, and even if I were, I wouldn’t be worried, for good reason: AI requires explicit instructions for everything. So in order to use it to code, you need to be a programmer.

        • Meho_Nohome@sh.itjust.works · 1 year ago

          I’m not a programmer and I’ve used it to code. It rarely works the first time around, but I’m sure it will quickly become more accurate.

            • 30p87@feddit.de · 1 year ago

            Even if it could produce bug-free, safe, and manageable code, all of this falls apart once you need multiple files, such as in projects containing more than 50 lines of code or so. It could replace some of my scripts, though not without many tries, as my setup is pretty specific, but it could never even start to code any of my real projects.

      • RandomVideos@programming.dev · 1 year ago

        If an AI were made that was smarter than programmers, couldn’t it make a smarter AI, which could make an even smarter AI, and so on?