Releases: simonw/llm-mistral
0.9
0.9a0
0.8
0.7
0.6
0.5
- New alias `llm -m mistral-nemo "prompt goes here"` for Mistral's NeMo model. Thanks, Thaddée Tyl. #9
- The new Mistral Large 2 model is already supported by the `llm -m mistral-large` alias, since that alias uses the latest release in the Large series (see the Python sketch after these notes).
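Not part of the original release notes: a minimal Python sketch of using these aliases through the `llm` Python API, assuming the plugin is installed and a Mistral API key is already configured (for example with `llm keys set mistral`). The exact underlying model IDs the aliases resolve to are an assumption.

```python
import llm

# Aliases such as "mistral-nemo" and "mistral-large" resolve to whichever
# underlying Mistral API model the plugin currently maps them to, so
# "mistral-large" picks up the latest release in the Large series.
model = llm.get_model("mistral-nemo")
print(model.model_id)  # the underlying model ID registered by the plugin

response = model.prompt("prompt goes here")
print(response.text())
```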
0.4
- Documentation for the `llm mistral refresh` command, which can be used to refresh the list of available Mistral API models.
- New default aliases: `llm -m codestral` for the latest release of Codestral and `llm -m codestral-mamba` for the latest release of the new Codestral Mamba. #8
0.3.1
- No longer raises an error if you run `llm models` without first setting a Mistral API key. #6
- Mixtral 8x22b is now available as `llm -m mistral/open-mixtral-8x22b 'say hello'`. New installations will get this model automatically - if you do not see the model in the `llm models` list you should run `llm mistral refresh` to update your local cache of available models (sketched below). #7
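Not in the original notes: a hedged sketch of that check from the Python side, assuming `llm.get_model()` raises `llm.UnknownModelError` for models missing from the local cache, as in current versions of LLM. `llm mistral refresh` itself is a CLI command.

```python
import llm

# The full model ID works in place of an alias. On an installation whose
# local cache predates this model, get_model() raises UnknownModelError;
# running `llm mistral refresh` updates the cached list of Mistral models.
try:
    model = llm.get_model("mistral/open-mixtral-8x22b")
except llm.UnknownModelError:
    raise SystemExit("Model not found - try running: llm mistral refresh")

print(model.prompt("say hello").text())
```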
0.3
- Support for the new Mistral Large model - `llm -m mistral-large "prompt goes here"`. #5
- All Mistral API models are now supported automatically - LLM fetches a list of models from their API the first time the plugin is installed, and that list can be refreshed at any time using the new `llm mistral refresh` command.
- When using the Python API a model key can now be set using `model.key = '...'` (see the sketch below) - thanks, Alexandre Bulté. #4
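A minimal sketch of the `model.key` option mentioned in the last item, assuming the plugin otherwise reads the key stored under the `mistral` key name; the key value here is a placeholder.

```python
import llm

model = llm.get_model("mistral-large")
# Set the API key directly on the model object (added in 0.3) instead of
# relying on a key stored with `llm keys set mistral`.
model.key = "your-mistral-api-key"  # placeholder value

print(model.prompt("prompt goes here").text())
```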