• passepartout · 2 months ago

    If you have a supported GPU, you could try Ollama (with Open WebUI); it works like a charm.
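    For anyone who wants to try this, a rough sketch of the setup (assuming Linux with Docker; the install script, model tag, and ports below are the defaults from the Ollama and Open WebUI docs, so adjust for your system):

    ```shell
    # Install Ollama (Linux; see ollama.com for other platforms)
    curl -fsSL https://ollama.com/install.sh | sh

    # Pull and chat with a model from the command line
    ollama run llama3.2

    # Optional: Open WebUI in Docker, pointed at the local Ollama instance
    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main
    ```

    After that, the web UI should be reachable at http://localhost:3000.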

        • bi_tux@lemmy.world · 2 months ago

          I tried it with my CPU (Llama 3.0 7B), but unfortunately it ran really slowly (I have a Ryzen 5700X).

          • tomjuggler@lemmy.world · 2 months ago

            I ran it on my dual-core Celeron and… just kidding. Try the mini Llama 1B; I'm in the same boat with a Ryzen 5000-something CPU.

      • passepartout · 2 months ago

        I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D