• peto (he/him)@lemm.ee · 2 months ago

    Isn’t the entire purpose of copilot that it shouldn’t need much in the way of training? I think the extent of it at my employer is “this is the one you use.”

    I’ve tried it a few times, and the only thing it seems remotely good for is when your recollection of a source is too fuzzy to form a traditional search query around. “What’s that book series I read in the early 2000s about kids who traveled to another world, where the things they brought back just looked like junk?” That kind of question.

    • Amanduh@lemm.ee · 2 months ago

      That’s my favorite use of AI: remembering old-ass movies from my childhood that I only have fragments of memories about.

    • Sc00ter@lemm.ee · 2 months ago

      This was our company too. They struck some sort of deal with ChatGPT where we use their base model but aren’t connected to their machine learning. Feels like a pretty reasonable approach, in my opinion.

      So our training was: “Use ours. Don’t use anyone else’s, because we don’t want our proprietary information out there where it can never be scrubbed from the internet.”

    • Tar_Alcaran@sh.itjust.works · 2 months ago

      It’s pretty decent at unimportant optimisation tasks with limited options. Like “I’m driving from X to Y, my friend travels by train from Z, what are good places to pick them up?”

    • ggppjj@lemmy.world · 2 months ago

      I’m a self-taught C# dev, and I’ve found tremendous success just describing what I want to do in dumb language that I’d feel stupid asking people about IRL, and that isn’t googleable without already knowing what terms like “null-coalescing” and “non-merchandise supergroup” are describing.
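      For anyone who hasn’t run into the term, “null-coalescing” refers to C#’s `??` and `??=` operators. A minimal sketch (the variable names here are made up for illustration):

      ```csharp
      // `??` evaluates to the left operand unless it is null,
      // in which case it falls back to the right operand.
      string? nickname = null;
      string display = nickname ?? "anonymous";

      // `??=` assigns to the target only if it is currently null.
      nickname ??= "guest";

      Console.WriteLine(display);   // falls back, since nickname was null
      Console.WriteLine(nickname);  // assigned by ??=
      ```

      It’s exactly the kind of feature that’s hard to search for if you only know what you want (“use a default when this thing is null”) but not what it’s called.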

      There are a lot of patterns that don’t have obvious names and that aren’t easily described without laying out a specific scenario in a way that might only make sense institutionally, or with additional context that your average person might not have. ChatGPT is fairly good at being the “buddy you have a bunch of in-jokes with who can remember things better than you.” I can skip a lot of the explaining of why I need to do a thing a certain way that I’d have to do with my coworkers (none of whom are programmers), and I can get helpful answers to programming questions that my coworkers don’t know the answers to.

      It’s frustrating to see this incredibly advanced context-aware autocorrect on steroids get used in ways that don’t acknowledge the inherent strengths of what LLMs are actually great at doing. It’s infuriating to have that potential be actively misused and packaged as a service and have that mediocre service sold to you once a month as a necessity by idiots in suits watching a line on a chart.

      • peto (he/him)@lemm.ee · 2 months ago

        Yeah, this is much the same kind of use. If you work on the assumption that it’s just something that has read everything, and everything that has been written about everything, you can find its utility. Folk want it to be some kind of fact genie, but the only facts it knows are which words go together, and it literally doesn’t know the difference between real and made up.