Phoenix3875@lemmy.world to Programmer Humor@programming.dev · 5 days ago
Work of pure human soul (and pure human sweat, and pure human tears) [image post, lemmy.world]
passepartout · 5 days ago
If you have a supported GPU, you could try Ollama (with Open WebUI); works like a charm.
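For anyone wondering what that setup looks like in practice: once Ollama is serving on its default port (11434), anything can talk to it over plain HTTP. A minimal sketch in Python, assuming the server is running locally and a model has already been pulled (the model tag here is just an example):

```python
# Minimal sketch: query a local Ollama server via its REST API.
# Assumes `ollama serve` is running on the default port (11434)
# and that the model named below has already been pulled.
import json
import urllib.request

def generate(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one complete response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(generate("Why is the sky blue?"))
```

Open WebUI then sits in front of that same API as a browser chat interface.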
bi_tux@lemmy.world · 5 days ago
You don't even need a supported GPU; I run Ollama on my RX 6700 XT.
passepartout · 4 days ago
I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D
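For context: the RX 6700 XT (gfx1031) isn't on ROCm's official support list, and the workaround people commonly report is overriding the detected GPU version so ROCm treats the card as the supported gfx1030. A hedged sketch of launching the server that way; this is a community workaround, not an official AMD or Ollama guarantee, and it may not work on every card or driver version:

```python
# Hedged sketch: launch `ollama serve` with the ROCm override commonly
# reported to make "unsupported" RDNA2 cards (e.g. RX 6700 XT / gfx1031)
# work by presenting themselves as the officially supported gfx1030.
import os
import subprocess

env = os.environ.copy()
env["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"  # spoof gfx1030 for ROCm

# Blocks until interrupted; in practice run this in its own terminal/session.
subprocess.run(["ollama", "serve"], env=env)
```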
BaroqueInMind@lemmy.one · 4 days ago
You don't even need a GPU; I can run Ollama with Open WebUI on my CPU with an 8B model, fast af.
bi_tux@lemmy.world · 4 days ago
I tried it on my CPU (with Llama 3 7B), but unfortunately it ran really slow (I've got a Ryzen 5700X).
tomjuggler@lemmy.world · 3 days ago
I ran it on my dual-core Celeron and… just kidding. Try the mini Llama 1B. I'm in the same boat with a Ryzen 5000-something CPU.
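If you go the small-model CPU route, here's a sketch using the official Python client (`pip install ollama`). I'm assuming the "mini Llama 1B" refers to the `llama3.2:1b` tag; adjust the tag if your registry names it differently:

```python
# Hedged sketch using the official `ollama` Python client.
# Assumes a local Ollama server is already running; "llama3.2:1b" is
# assumed to be the small 1B model tag meant in the comment above.
import ollama

ollama.pull("llama3.2:1b")  # one-time download of the model weights

response = ollama.chat(
    model="llama3.2:1b",
    messages=[{"role": "user", "content": "Summarize ROCm in one sentence."}],
)
print(response["message"]["content"])
```

The 1B model trades answer quality for speed, which is usually the right trade on a CPU-only box.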