Releases: simonw/llm-mistral

0.9

02 Dec 00:15
  • Track token usage using the new capability introduced in LLM 0.19. #15

0.9a0

20 Nov 23:16
Pre-release
  • Track token usage. #15

0.8

19 Nov 00:56
  • Provide async versions of the Mistral models, compatible with LLM 0.18. #13
  • Added a pixtral-large alias for the new Pixtral Large model. #14
    llm -m pixtral-large describe -a https://static.simonwillison.net/static/2024/pelicans.jpg

0.7

29 Oct 04:16
  • Support for sending images to the pixtral-12b vision model using the new llm prompt -a image.jpeg option. #12

0.6

16 Oct 18:54
  • Fixed an issue where an invalid API key caused llm models to fail with an error. #10
  • Added new aliases for Ministral 3B and 8B: llm -m ministral-3b hi and llm -m ministral-8b hi. #11

0.5

24 Jul 15:55
  • New alias llm -m mistral-nemo "prompt goes here" for Mistral's NeMo model. Thanks, Thaddée Tyl. #9
  • The new Mistral Large 2 model is already supported by the llm -m mistral-large alias, since that alias uses the latest release in the Large series.

0.4

16 Jul 16:28
  • Documentation for the llm mistral refresh command, which can be used to refresh the list of available Mistral API models.
  • New default aliases: llm -m codestral for the latest release of Codestral and llm -m codestral-mamba for the latest release of the new Codestral Mamba. #8

0.3.1

18 Apr 03:06
  • No longer raises an error if you run llm models without first setting a Mistral API key. #6
  • Mixtral 8x22b is now available as llm -m mistral/open-mixtral-8x22b 'say hello'. New installations will get this model automatically - if you do not see the model in the llm models list you should run llm mistral refresh to update your local cache of available models. #7

0.3

26 Feb 16:05
  • Support for the new Mistral Large model - llm -m mistral-large "prompt goes here". #5
  • All Mistral API models are now supported automatically - LLM fetches a list of models from their API the first time the plugin is installed, and that list can be refreshed at any time using the new llm mistral refresh command.
  • When using the Python API a model key can now be set using model.key = '...' - thanks, Alexandre Bulté. #4

0.2

15 Dec 05:05
  • Mistral LLM models now support options: -o temperature 0.7, -o top_p 0.1, -o max_tokens 20, -o safe_mode 1, -o random_seed 12. #2
  • Support for the Mistral embeddings model, available via llm embed -m mistral-embed -c 'text goes here'. #3
  • The --no-stream option now uses the non-streaming Mistral API.