
🦙 LLAMA embeddings #1

Open
jobergum opened this issue Apr 16, 2023 · 2 comments

Comments

@jobergum

Will you add support for LLaMA? 🦙 embeddings are not normalized to unit length like the OpenAI embeddings, meaning they can represent both direction and magnitude.
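A small sketch of the point being made here (my own illustration, not code from this repository): once embeddings are normalized to unit length, cosine similarity and the dot product coincide and magnitude information is lost; with unnormalized vectors like LLaMA's, the dot product also reflects magnitude.

```python
import numpy as np

def normalize(v):
    """Scale a vector to unit length (direction only)."""
    return v / np.linalg.norm(v)

a = np.array([3.0, 4.0])   # magnitude 5
b = np.array([0.6, 0.8])   # same direction, magnitude 1

# Cosine similarity ignores magnitude: identical directions score 1.0.
cos = float(np.dot(normalize(a), normalize(b)))

# The raw dot product mixes direction and magnitude.
dot = float(np.dot(a, b))

print(cos)  # 1.0
print(dot)  # 5.0
```

So a store that keeps vectors unnormalized can rank by dot product and let "larger" embeddings matter, whereas unit-normalized stores only compare direction.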

@rjadr
Contributor

rjadr commented Apr 16, 2023

I like the direction where this is headed, but I like alpacas a magnitude more.

jdagdelen pushed a commit that referenced this issue Jun 27, 2023
Change self.vectors to be initialized as None instead of as an empty list
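A hedged sketch of the idea behind that commit (the class and method names below are hypothetical, not HyperDB's actual API): initializing the store as `None` rather than `[]` cleanly distinguishes "no vectors yet" from "an array exists", so the first insert can establish the array's dtype and shape directly instead of appending to a Python list.

```python
import numpy as np

class VectorStore:
    """Minimal illustration of None-vs-empty-list initialization."""

    def __init__(self):
        # Was: self.vectors = []  -- now None signals "not yet initialized".
        self.vectors = None

    def add(self, vecs):
        vecs = np.atleast_2d(np.asarray(vecs, dtype=np.float32))
        if self.vectors is None:
            # First insert sets dtype and dimensionality in one step.
            self.vectors = vecs
        else:
            self.vectors = np.vstack([self.vectors, vecs])

store = VectorStore()
store.add([1.0, 2.0])
store.add([[3.0, 4.0], [5.0, 6.0]])
print(store.vectors.shape)  # (3, 2)
```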
@johndpope

I may be mistaken, but it seems like there are no examples using this with ChatGPT?
https://github.com/search?q=hyperdb+chatgpt&type=code&p=4
