Breaking Changes
Replace `roles.yaml` with `roles/<name>.md` (see #804)
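As a sketch of this migration (using a hypothetical role named `coder`; the exact fields may differ from your config), an entry formerly kept in `roles.yaml`:

```yaml
# Old format: all roles defined together in a single roles.yaml
- name: coder
  prompt: |
    You are a senior software engineer. Answer concisely with code.
```

would move to its own file, `roles/coder.md`, whose body is the prompt text itself.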
Migrate `ollama`/`qianwen`/`cloudflare` clients to `openai-compatible` clients:
```diff
- - type: ollama
-   api_base: http://localhost:11434
+ - type: openai-compatible
+   name: ollama
+   api_base: http://localhost:11434/v1
- - type: qianwen
+ - type: openai-compatible
+   name: qianwen
- - type: cloudflare
-   account_id: xxx
-   api_base: https://api.cloudflare.com/client/v4
+ - type: openai-compatible
+   name: cloudflare
+   api_base: https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/v1
```
Client Changes
- migrate `ollama` to `openai-compatible`
- migrate `qianwen` to `openai-compatible`
- migrate `cloudflare` to `openai-compatible`
- add github
- add ai21
- add huggingface
New Features
- support builtin website crawling (`recursive_url`) (#786)
- do not check the model's support for function calls (#791)
- enable custom `api_base` for most clients (#793)
- support github client (#798)
- support ai21 client (#800)
- replace `roles.yaml` with `roles/<name>.md` (#810)
- save temp session as `temp-<timestamp>` if `save_session: true` (#811)
- webui use querystring as settings (#814)
- webui support RAG (#815)
- migrate `ollama`/`qianwen` clients to `openai-compatible` (#816)
- migrate `cloudflare` client to `openai-compatible` (#821)
- add huggingface client (#822)
- use dynamic batch size for embedding (#826)
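The dynamic embedding batch size can be pictured as grouping inputs under a size budget rather than using a fixed count (a minimal illustrative sketch only; the actual heuristic in #826 may differ, and `max_chars` is an assumed parameter):

```python
def dynamic_batches(texts, max_chars=2000):
    """Group texts into batches whose combined length stays under a character budget."""
    batches, current, size = [], [], 0
    for text in texts:
        # Start a new batch when adding this text would exceed the budget.
        if current and size + len(text) > max_chars:
            batches.append(current)
            current, size = [], 0
        current.append(text)
        size += len(text)
    if current:
        batches.append(current)
    return batches

# A long document gets its own batch; short ones share a batch.
batches = dynamic_batches(["a" * 1500, "b" * 800, "c" * 100])
print([len(batch) for batch in batches])
```

This keeps each embedding request roughly the same total size, so one oversized document no longer forces every batch to shrink.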
Bug Fixes
- incorrect function call handling with session in non-REPL (#777)
- claude fails to run tools with zero arguments (#780)
- invalid model error when switching roles if the model_id is the same as the current one (#788)
- incomplete stream response in proxy LLM api (#796)