
Releases: matlab-deep-learning/llms-with-matlab

v4.0.0: Structured Output

29 Oct 10:27

This release includes new features and bug fixes.

Ensure output format using structured output

You can now use structured output when generating text with openAIChat or azureChat objects.

In LLMs with MATLAB, you can specify the structure of the output in two different ways.

  • Specify a valid JSON Schema directly.
  • Specify an example structure array that adheres to the required output format. The software automatically generates the corresponding JSON Schema and provides this to the LLM. Then, the software automatically converts the output of the LLM back into a structure array.

To do this, set the ResponseFormat name-value argument of openAIChat, azureChat, or generate to one of the following:

  • A string scalar containing a valid JSON Schema.
  • A structure array containing an example that adheres to the required format, for example: ResponseFormat=struct("Name","Rudolph","NoseColor",[255 0 0])
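As a sketch of the second approach, assuming a valid OpenAI API key is configured, generating structured output from an example struct might look like this (the prompt and field names are illustrative):

```matlab
% Create a chat object (requires an OpenAI API key).
chat = openAIChat("You are a helpful assistant.");

% Describe the required output format with an example struct.
% The software derives a JSON Schema from this example and
% converts the model output back into a matching struct.
prototype = struct("Name","Rudolph","NoseColor",[255 0 0]);

% Generate text; the result is a struct with the same fields.
reindeer = generate(chat, "Describe a famous reindeer.", ...
    ResponseFormat=prototype);

disp(reindeer.Name)
```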

For more information on structured output, see https://platform.openai.com/docs/guides/structured-outputs.

Argument name changes

The Model name-value argument of ollamaChat has been renamed to ModelName. However, you can still use Model instead.

The Deployment name-value argument of azureChat has been renamed to DeploymentID. However, you can still use Deployment instead.
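A minimal illustration of the renamed arguments (the model and deployment names here are placeholders):

```matlab
% New argument names:
chat = ollamaChat(ModelName="mistral");
chat = azureChat(DeploymentID="my-gpt4-deployment");

% The old names still work:
chat = ollamaChat(Model="mistral");
chat = azureChat(Deployment="my-gpt4-deployment");
```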

v3.4.0: Documentation, support for OpenAI o1 models, general updates

27 Sep 14:35

This release includes new features, new documentation, and bug fixes.

New Features

Support for OpenAI® o1 models

You can now use the OpenAI models o1-mini and o1-preview to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "o1-mini" or "o1-preview".
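For example, connecting to o1-mini might look like the following sketch (assumes an OpenAI API key is configured; the prompt is illustrative):

```matlab
% Create a chat object backed by the o1-mini model.
chat = openAIChat(ModelName="o1-mini");

% Generate text as usual.
txt = generate(chat, "Explain backpropagation in one paragraph.");
```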

Temporarily override model parameters when generating text

You can now set model parameters such as MaxNumTokens and ResponseFormat for a single API call by using the corresponding name-value arguments of the generate function. The generate function then uses the specified parameters for text generation instead of the corresponding model parameters of the openAIChat, azureChat, or ollamaChat input.

For a full list of supported model parameters, see generate.
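A sketch of the override, assuming an OpenAI API key is configured (prompt and token limits are illustrative):

```matlab
% The chat object carries its own default model parameters.
chat = openAIChat("You are a concise assistant.", MaxNumTokens=1000);

% Override MaxNumTokens for this call only; the chat object
% keeps its original value for later calls.
txt = generate(chat, "Summarize the plot of Hamlet.", MaxNumTokens=50);
```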

Support for min-p sampling in Ollama™

You can now set the minimum probability ratio to tune the frequency of improbable tokens when generating text using Ollama models. You can do this in two different ways:

  1. When you create an ollamaChat object, specify the MinP name-value argument.
  2. When you generate text using the generate function with an ollamaChat object, specify the MinP name-value argument.
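The two options above can be sketched as follows (assumes a local Ollama server with the mistral model pulled; model name, prompt, and MinP values are illustrative):

```matlab
% Option 1: set MinP when creating the chat object.
chat = ollamaChat("mistral", MinP=0.05);

% Option 2: override MinP for a single call to generate.
txt = generate(chat, "Write a haiku about autumn.", MinP=0.1);
```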

New Documentation

There is now detailed documentation for the features included in LLMs with MATLAB. You can find these pages in a new functions directory inside the doc directory.

Full Changelog: v3.3.0...v3.4.0

v3.3.0: Vision support in Ollama

09 Aug 13:37

Vision Support in Ollama

This release adds support for vision models in Ollama™, allowing you to use the ollamaChat function to describe images.

For an example, see Understanding the Content of an Image.
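As a rough sketch, describing an image with a vision-capable Ollama model might look like this (assumes a local Ollama server with a vision model such as llava pulled; the image path is a placeholder, and the messageHistory/addUserMessageWithImages usage should be checked against the repository documentation):

```matlab
% Connect to a vision-capable local model.
chat = ollamaChat("llava");

% Attach an image to a user message.
messages = messageHistory;
messages = addUserMessageWithImages(messages, ...
    "What do you see in this image?", "peppers.png");

% Generate a description of the image.
txt = generate(chat, messages);
```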

Full Changelog: v3.2.0...v3.3.0

v3.2.0: Supporting Ollama servers not on localhost:11434

24 Jul 10:05

What's Changed

Full Changelog: v3.1.0...v3.2.0

v3.1.0: Update OpenAI models

19 Jul 09:00

What's Changed

Full Changelog: v3.0.0...v3.1.0

v3.0.0: Ollama and Azure Support

24 Jun 13:46

Adding Ollama® and Azure® support

With this release, we have added ollamaChat and azureChat, which connect to local LLMs (through an Ollama installation) or to deployments hosted on the Azure OpenAI® Service. See Ollama.md and Azure.md for details.
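As a sketch, connecting to each backend might look like the following (the model name, deployment name, and prompt are placeholders; azureChat also requires endpoint and key configuration as described in Azure.md):

```matlab
% Connect to a local model served by Ollama.
chat = ollamaChat("mistral");

% Connect to an Azure OpenAI Service deployment.
chat = azureChat("You are a helpful assistant.", ...
    Deployment="my-gpt4-deployment");
```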

Also included are updates (such as support for newer OpenAI® models), bug fixes, and general code and test improvements. See the list below.

Auto-generated What's Changed List

New Contributors

Full Changelog: v2.0.0...v3.0.0

v2.0.0

27 Jan 11:48
  • The Functions parameter was deprecated by OpenAI; use Tools instead
  • Added support for parallel function calls
  • Added support for JSON output
  • Added support for DALL·E
  • Added support for GPT-4V
  • Added the latest versions of all models
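As an illustrative sketch of the Tools workflow (the function name, parameter, and addParameter usage here are hypothetical and should be checked against the repository documentation):

```matlab
% Define a function the model may request a call to.
f = openAIFunction("get_weather", "Get the current weather for a city");
f = addParameter(f, "city", type="string", description="City name");

% Pass it through Tools instead of the deprecated Functions parameter.
chat = openAIChat("You are a helpful assistant.", Tools=f);
```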

v1.0.0: Initial release

15 Jan 09:55

Fixed the arguments block.