0.2

@simonw released this 15 Dec 05:05 · 28 commits to main since this release
  • The Mistral models now support options: -o temperature 0.7, -o top_p 0.1, -o max_tokens 20, -o safe_mode 1, -o random_seed 12. #2
  • Support for the Mistral embeddings model, available via llm embed -m mistral-embed -c 'text goes here'. #3
  • Prompts run with the --no-stream option now use the non-streaming Mistral API.