Add LiteLLM as a representation model #2213

Merged
merged 5 commits into master on Dec 10, 2024

Conversation

@MaartenGr (Owner) commented Nov 13, 2024

What does this PR do?

Adds LiteLLM as a representation model, which opens up support for many additional LLM providers.

To use this, you will need to install the litellm package first:

pip install litellm

Then, get an API key from any supported provider (for instance, OpenAI) and use it as follows:

import os
from bertopic.representation import LiteLLM
from bertopic import BERTopic

# set ENV variables
os.environ["OPENAI_API_KEY"] = "your-openai-key"

# Create your representation model
representation_model = LiteLLM(model="gpt-3.5-turbo")

# Use the representation model in BERTopic on top of the default pipeline
topic_model = BERTopic(representation_model=representation_model)
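
Since LiteLLM routes the model string to the matching provider, other providers should work the same way. A minimal sketch, assuming the model name is passed straight through to LiteLLM (the Anthropic key and model name here are only examples):

import os
from bertopic.representation import LiteLLM
from bertopic import BERTopic

# set ENV variables for the provider you want to use
os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"

# Any model string that LiteLLM recognizes should work here
representation_model = LiteLLM(model="claude-3-haiku-20240307")

# Use the representation model in BERTopic on top of the default pipeline
topic_model = BERTopic(representation_model=representation_model)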

To do (in a separate PR)

  • vLLM support
  • Add instructions for ollama (a rough sketch follows below)
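
The ollama instructions are left for a separate PR, but a rough sketch of what they might look like (assuming LiteLLM's "ollama/" model prefix and a local Ollama server running on its default port; not part of this PR):

from bertopic.representation import LiteLLM
from bertopic import BERTopic

# Hypothetical sketch: LiteLLM routes "ollama/..." models to a local
# Ollama server (default http://localhost:11434), so no API key is needed
representation_model = LiteLLM(model="ollama/llama3")
topic_model = BERTopic(representation_model=representation_model)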

Before submitting

  • This PR fixes a typo or improves the docs (if yes, ignore all other checks!).
  • Did you read the contributor guideline?
  • Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes (if applicable)?
  • Did you write any new necessary tests?

@Skar0 commented Nov 16, 2024

Hi! Just asking out of curiosity: would it make sense to use the LangChain connection here together with the LangChain representation, instead of having a separate implementation altogether? I understand it might not be the preferred approach, but I came across this and thought it might be worth mentioning 🙂

@MaartenGr (Owner, Author) replied:
@Skar0 Thanks for sharing! Although that is a perfectly reasonable approach, LangChain changes its API quite often and has a large set of dependencies to take into account. In contrast, LiteLLM is light on dependencies and its API shouldn't change much since it adheres to OpenAI's offering.

I figured that a lighter alternative is welcome in this case.
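
To illustrate the point about API stability, here is a minimal sketch of LiteLLM on its own (not part of this PR): it exposes a single completion() function that mirrors the OpenAI chat-completions request/response format, so switching providers mostly comes down to changing the model string.

import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "your-openai-key"

# Same request/response shape as the OpenAI chat-completions API
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Describe this topic: model, topics, clustering"}],
)
print(response.choices[0].message.content)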

@MaartenGr merged commit 84dbf36 into master on Dec 10, 2024
6 checks passed