Releases: acon96/home-llm
v0.2.12
- Fix cover ICL examples
- Allow setting the number of ICL examples
- Add min P and typical P sampler options
- Recommend models during setup
- Add JSON mode for the Ollama backend
- Fix missing default options
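Of the new sampler options, min P is the simplest to illustrate: it discards candidate tokens whose probability falls below a fraction (`min_p`) of the most likely token's probability, then renormalizes the rest. A minimal sketch of that filtering step (the function name and dict-based interface are illustrative, not this component's actual code):

```python
def min_p_filter(probs: dict[str, float], min_p: float = 0.05) -> dict[str, float]:
    """Drop tokens with probability below min_p * max probability, then renormalize."""
    threshold = min_p * max(probs.values())
    kept = {tok: p for tok, p in probs.items() if p >= threshold}
    total = sum(kept.values())
    return {tok: p / total for tok, p in kept.items()}
```

Unlike a fixed top-k cutoff, the kept set shrinks when the model is confident and widens when it is uncertain.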
v0.2.11
- Add prompt caching
- Expose llama.cpp runtime settings
- Build llama-cpp-python wheels with GitHub Actions
- Install wheels directly from GitHub
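Prompt caching works by reusing the already-evaluated KV cache for whatever prefix the new prompt shares with the previous one, so only the differing tail needs a forward pass. A hedged sketch of the prefix-matching idea (token IDs as plain ints; not the integration's actual code):

```python
def reusable_prefix_len(cached_tokens: list[int], new_tokens: list[int]) -> int:
    """Count how many leading tokens match the cached prompt and can be skipped."""
    n = 0
    for cached, new in zip(cached_tokens, new_tokens):
        if cached != new:
            break
        n += 1
    return n
```

Only `new_tokens[n:]` then needs to be evaluated; the cache covers the first `n` tokens, which is a large saving when the system prompt and device list are identical between requests.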
v0.2.10
v0.2.9
v0.2.8
v0.2.7
v0.2.6
v0.2.5 (alpha)
- Fix the Ollama max tokens parameter
- Fix GGUF download from Hugging Face
- Update the bundled llama-cpp-python to 0.2.32
- Add parameters to function calling for the dataset and component, plus a model update
v0.2.4 (alpha)
- Fix API key auth on model load for text-generation-webui
- Add support for the Ollama API backend
v0.2.3 (alpha)
- Fix API key auth
- Support the chat completion endpoint
- Refactor to make adding more remote backends easier
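The chat completion endpoint follows the OpenAI-compatible shape that text-generation-webui exposes, with the API key sent as a bearer token. A minimal sketch of assembling such a request (field names follow the OpenAI `/v1/chat/completions` convention; the helper is illustrative, not this component's code):

```python
def build_chat_request(api_key: str, model: str,
                       messages: list[dict]) -> tuple[dict, dict]:
    """Return (headers, body) for an OpenAI-style /v1/chat/completions call."""
    headers = {
        "Authorization": f"Bearer {api_key}",  # API key as a bearer token
        "Content-Type": "application/json",
    }
    body = {"model": model, "messages": messages}  # messages: role/content dicts
    return headers, body
```

Keeping the request-building logic separate from the transport is also what makes it straightforward to slot in further remote backends.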