Installs text-generation-webui into a Docker container using CPU-only mode (llama.cpp).
NOTE: This addon is not the preferred way to run llama.cpp as part of Home Assistant and will not be updated.
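
For reference, the addon roughly amounts to running text-generation-webui inside a container with CPU-only inference. The sketch below is illustrative only: the image name, volume path, and port mapping are assumptions, not the addon's actual build; consult the addon's Dockerfile and config for the real invocation.

```bash
# Rough equivalent of what the addon wraps (illustrative, not the actual build):
# run text-generation-webui in a container, forcing CPU-only generation.
docker run -d \
  --name text-generation-webui \
  -p 7860:7860 \                      # default Gradio web UI port
  -v /path/to/models:/app/models \    # assumed host path for GGUF/llama.cpp models
  my-text-generation-webui-image \    # hypothetical image name
  python server.py --listen --cpu     # --cpu forces CPU-only inference
```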