Me running an LLM at home:
The same image, but the farmer is standing in front of a field of poppies (for opioid production)
I am researching doing the same, but know nothing about running my own yet. Did you train your LLM for programming in any way, or just download and run an open-source one? If so, which model etc. do you use?
Run an open-source one. Training requires lots of knowledge and even more hardware resources and time. Fine-tuned models are available for free online, so there is not much point in training one yourself.
Options are
https://github.com/oobabooga/text-generation-webui
https://github.com/Mozilla-Ocho/llamafile
https://github.com/ggerganov/llama.cpp
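To give a sense of what the llama.cpp route involves: you build it from source and point it at a GGUF model file. Roughly like this, though the build steps and binary name have changed between versions and the model filename below is just a placeholder:

```
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
# any GGUF model works; this filename is only an example
./main -m models/mistral-7b-instruct.Q4_K_M.gguf -p "Hello" -n 128
```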
I recommend llamafile, as it is the easiest option to run. The GitHub repo has everything you need in the “quick start” section.
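The quick start is basically: download a .llamafile, mark it executable, and run it, and it serves a chat UI in your browser. A rough sketch on Linux/macOS, with a placeholder filename (grab a real one from the repo's download links):

```
# placeholder filename; download an actual .llamafile first
chmod +x llava-v1.5-7b-q4.llamafile
./llava-v1.5-7b-q4.llamafile
# by default it starts a local web UI, usually at http://localhost:8080
```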
Though the default is a bit restricted on Windows: since llamafiles bundle the LLM weights into the executable, and Windows has a 4 GB limit on executable size, you're restricted to very small models. Workarounds are available, though!
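The usual workaround, as far as I know, is to keep the weights outside the executable: run a small llamafile binary renamed to .exe and load a separate GGUF file with the -m flag (the same flag llama.cpp uses), so the .exe itself stays under the 4 GB limit. The filenames here are placeholders:

```
# assumes llamafile accepts external GGUF weights via -m; filenames are examples
llamafile.exe -m mistral-7b-instruct.Q4_K_M.gguf
```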