Distro Focuses (lemmy.ca)
Hellfire103@lemmy.ca to linuxmemes@lemmy.world, English · 15 hours ago
Lucy :3 · 9 hours ago:
If you have a decent GPU or CPU, you can just set up ollama with ollama-cuda/ollama-rocm and run llama3.1 or llama3.1-uncensored.

1985MustangCobra@lemmy.ca · 9 hours ago:
I have a Ryzen 5 laptop. Not really decent enough for that workload, and I'm not crazy about AI.

Lucy :3 · 8 hours ago:
I bet even my Pi Zero W could run such a model*
* with 1 character per hour or so

1985MustangCobra@lemmy.ca · 8 hours ago:
Interesting. Well, it's something to look into, but I'd like a place to communicate with like-minded people.
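For anyone curious, a minimal sketch of the setup Lucy describes. The package names are Arch Linux's (other distros package ollama differently), the service name is an assumption based on the Arch package, and actual speed depends heavily on your hardware:

```shell
# Install ollama with the backend matching your GPU (Arch package names):
#   sudo pacman -S ollama-cuda   # NVIDIA GPUs
#   sudo pacman -S ollama-rocm   # AMD GPUs
#   sudo pacman -S ollama        # CPU-only fallback

# Start the background server (service name assumed from the Arch package),
# then pull the model and open an interactive chat:
sudo systemctl start ollama
ollama run llama3.1              # downloads the model on first use
```

On a machine without a supported GPU, the same commands fall back to CPU inference, which works but is much slower, as the Ryzen 5 comment suggests.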