
Releases: TabbyML/tabby

v0.12.1-rc.0

11 Jun 01:49
Pre-release
Release 0.12.1-rc.0

aim-downloader@0.12.1-rc.0
http-api-bindings@0.12.1-rc.0
llama-cpp-server@0.12.1-rc.0
ollama-api-bindings@0.12.1-rc.0
tabby@0.12.1-rc.0
tabby-common@0.12.1-rc.0
tabby-db@0.12.1-rc.0
tabby-db-macros@0.12.1-rc.0
tabby-download@0.12.1-rc.0
tabby-git@0.12.1-rc.0
tabby-inference@0.12.1-rc.0
tabby-scheduler@0.12.1-rc.0
tabby-schema@0.12.1-rc.0
tabby-webserver@0.12.1-rc.0

Generated by cargo-workspaces

v0.12.0

06 Jun 03:40

πŸš€ Features

  • Support GitLab SSO
  • Support connecting to self-hosted GitHub / GitLab instances
  • Repository context is now utilized in the "Code Browser" as well

🧰 Fixes and Improvements

  • llama-server from llama.cpp is now distributed as a separate binary, allowing for more flexible configuration
  • The HTTP API is out of experimental status: you can now connect Tabby to models through an HTTP API. The following APIs are currently supported:
    • llama.cpp
    • ollama
    • mistral / codestral
    • openai
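As a sketch of how such a connection could be configured, the fragment below points Tabby's chat model at a local Ollama instance through its OpenAI-compatible API. The field names follow the shape described in Tabby's model configuration documentation, but the endpoint, model name, and exact keys here are illustrative; check the documentation for your version before using them.

```toml
# ~/.tabby/config.toml (hypothetical values)
[model.chat.http]
kind = "openai/chat"
model_name = "mistral"                        # any model served by your backend
api_endpoint = "http://localhost:11434/v1"    # Ollama's OpenAI-compatible endpoint
```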

πŸ’« New Contributors

Full Changelog: v0.11.0...v0.12.0

v0.12.0-rc.5

05 Jun 14:24
Pre-release
v0.12.0-rc.5

v0.12.0-rc.4

03 Jun 03:55
Pre-release
Release 0.12.0-rc.4

aim-downloader@0.12.0-rc.4
http-api-bindings@0.12.0-rc.4
llama-cpp-server@0.12.0-rc.4
ollama-api-bindings@0.12.0-rc.4
tabby@0.12.0-rc.4
tabby-common@0.12.0-rc.4
tabby-db@0.12.0-rc.4
tabby-db-macros@0.12.0-rc.4
tabby-download@0.12.0-rc.4
tabby-git@0.12.0-rc.4
tabby-inference@0.12.0-rc.4
tabby-scheduler@0.12.0-rc.4
tabby-schema@0.12.0-rc.4
tabby-webserver@0.12.0-rc.4

Generated by cargo-workspaces

v0.12.0-rc.3

30 May 15:19
Pre-release
v0.12.0-rc.3

v0.12.0-rc.2

29 May 15:42
Pre-release
v0.12.0-rc.2

v0.12.0-rc.1

29 May 01:27
Pre-release
v0.12.0-rc.1

v0.12.0-rc.0

27 May 08:37
Pre-release
v0.12.0-rc.0

v0.11.1

15 May 06:10

For release notes of the v0.11 minor version, please refer to v0.11.0.

🧰 Fixes and Improvements

  • Fixed the display of files with special characters in their paths. (#2081)
  • Resolved the issue where non-admin users were unable to see the repository in the Code Browser. (#2110)

v0.11.0

11 May 04:52

⚠️ Notice

  • BREAKING: The --webserver flag is now enabled by default in tabby serve. To turn off the webserver and only use OSS features, use the --no-webserver flag.
  • The /v1beta/chat/completions endpoint has been moved to /v1/chat/completions, while the old endpoint is still available for backward compatibility.
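A minimal sketch of what these two changes look like in practice, assuming a server on the default port 8080 (the model name and request body are illustrative, not from the release notes):

```shell
# OSS-only mode: explicitly disable the webserver, which is now on by default
tabby serve --model StarCoder-1B --no-webserver

# The chat endpoint now lives under /v1; /v1beta remains as an alias
curl http://localhost:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "hello"}]}'
```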

πŸš€ Features

  • Added storage usage statistics in the System page.
  • Added support for integrating repositories from GitHub and GitLab using personal access tokens.
  • Introduced a new Activities page to view user activities.
  • Included an Ask Tabby feature in the source code browser to provide in-context help from AI.
  • Upgraded llama.cpp to version b2715.
  • Implemented incremental indexing for faster repository context updates.

🧰 Fixes and Improvements

  • Changed the default model filename from q8_0.v2.gguf to model.gguf in MODEL_SPEC.md.
  • Excluded activities from deactivated users in reports.

πŸ’« New Contributors

Full Changelog: v0.10.0...v0.11.0