I’d like to self-host a large language model (LLM).

I don’t mind if I need a GPU and all that; at least it will be running on my own hardware, and it will probably even be cheaper than the $20 per month everyone is charging.

What LLMs are you self hosting? And what are you using to do it?

  • cmgvd3lw@discuss.tchncs.de
    2 months ago

    I am not self-hosting an LLM on a server, but I run one on my laptop with Alpaca: Google’s Gemma 2B. On my hardware it’s pretty slow, but it kind of gets the work done. My hardware is getting old; I need to upgrade soon.
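
    For anyone curious: Alpaca is a GNOME front end that runs models through Ollama, so a roughly equivalent CLI setup looks like the sketch below (assuming Ollama is installed; the exact model tag and download size may differ):

    ```shell
    # Fetch the quantized Gemma 2B weights from the Ollama registry
    # (a couple of GB; runs on CPU if no GPU is available).
    ollama pull gemma:2b

    # Start an interactive chat with the model, entirely on local hardware.
    ollama run gemma:2b
    ```

    On older CPU-only hardware, smaller quantized models like this are about the practical limit, which matches the slow-but-usable experience described above.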