• space_comrade [he/him]@hexbear.net · 1 year ago

    I don’t think it’s gonna go that way. In my experience, the bigger the chunk of code you make it generate, the more wrong it’s gonna be, and not just proportionally because it’s a larger chunk of code: it gets exponentially more wrong.

    It’s only good for generating small chunks of code at a time.

    • FunkyStuff [he/him]@hexbear.net · 1 year ago

      It won’t be long (maybe 3 years max) before industry adopts some technique for automatically prompting an LLM to generate code that fulfills a certain requirement, then iteratively improving it against test data until it passes all the test cases; something like the loop sketched below. And I’m pretty sure there already are ways to get LLMs to generate the test cases themselves. So this could go nightmarishly wrong very, very fast if industry adopts that technology and starts integrating hundreds of unnecessary libraries or pieces of code that the AI just learned to “spam” everywhere, so to speak. These things are way dumber than we give them credit for.
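
      A minimal sketch of that generate-and-test loop, assuming a hypothetical llm_complete() call standing in for whatever model API actually gets used; everything else is plain stdlib:

      ```python
      # Sketch only: llm_complete() is a hypothetical stand-in for a real
      # model API call; this is the loop shape, not a real product.
      import subprocess
      import sys
      import tempfile


      def llm_complete(prompt: str) -> str:
          """Hypothetical LLM call; swap in a real client here."""
          raise NotImplementedError


      def run_tests(candidate: str, tests: str):
          """Write the candidate code plus its tests to a temp file, run pytest."""
          with tempfile.NamedTemporaryFile("w", suffix="_test.py", delete=False) as f:
              f.write(candidate + "\n\n" + tests)
              path = f.name
          result = subprocess.run(
              [sys.executable, "-m", "pytest", path, "-q"],
              capture_output=True,
              text=True,
          )
          return result.returncode == 0, result.stdout + result.stderr


      def generate_until_green(requirement: str, tests: str, max_iters: int = 10):
          """Regenerate code, feeding failures back in, until the tests pass."""
          prompt = f"Write Python code that satisfies: {requirement}"
          for _ in range(max_iters):
              candidate = llm_complete(prompt)
              passed, report = run_tests(candidate, tests)
              if passed:
                  return candidate
              # The whole "technique": shove the failure output back into
              # the prompt and hope the next attempt is better.
              prompt = (
                  f"This code failed its tests.\n\nCode:\n{candidate}\n\n"
                  f"Test output:\n{report}\n\nFix the code."
              )
          return None  # gave up; a human has to look at it after all
      ```

      The nightmare part: nothing in that loop checks whether the result is any good, only whether it survives whatever test cases something equally dumb generated for it.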

      • space_comrade [he/him]@hexbear.net · 1 year ago (edited)

        Oh, that’s definitely going to lead to some hilarious situations, but I don’t think we’re gonna see a complete breakdown of the whole IT sector. There’s no way companies/institutions that do really mission-critical work (kernels, firmware, automotive/aerospace software, certain kinds of banking/finance software, etc.) will let AI write that code any time soon. The rest of the stuff isn’t really that important, and it isn’t that big of a deal if it breaks for a few hours/days because the AI spazzed out.

        • FunkyStuff [he/him]@hexbear.net · 1 year ago

          Agreed, I don’t expect it to break absolutely everything, but I do expect software development to get very hairy when you have to work with whatever bloated mess the AI is creating.

        • SmoothIsFast@citizensgaming.com · 1 year ago

          If you have seen the crunch before demos on military projects, you might start to think the other way. I doubt the bigger vendors will change much, but you definitely could see contracts being won for stuff that’s just AI-generated, because they got some base manager to eat up a proposal filled with buzzwords. I’d be more worried about that causing more contract bloat and wasted resources in critical systems going to these vaporware solutions. Then you take general government contracts, which go to the lowest bidder, and you’re gonna see a ton of AI bullshit start cropping up and bloating our systems, because some high-school kid got ChatGPT to make a basic website and now thinks he’s the AI website god.

          Plus, I work in the financial sector now, and they have been eating up all the AI buzzwords like fucking hotcakes. The devs all know it will be a shit show, but the executives thinking it’s a great idea won’t hear any of it, because think of the efficiency and the bonuses they could get if they cut the implementation timeline down to a quarter. They aren’t seeing the vulnerabilities, the maintenance costs, and the LLM’s lack of understanding that will cause massive long-term issues, regardless of whether they can get a buggy alpha created.

    • theluddite@lemmy.ml · 1 year ago

      Yes, I agree. I meant the fundamental problem with the idea of LLMs writing more and more of our code, even if they get quite good.