
Add plumbing for additional model parameters. #21

Open
hhughes wants to merge 1 commit into main
Conversation


hhughes commented Oct 1, 2023

Defaults are set to the current values in llama-cpp-python. Descriptions are taken from the llama-cpp-python documentation where defined.

New parameters:

- suffix
- temperature
- top_p
- logprobs
- echo
- stop
- frequency_penalty
- presence_penalty
- repeat_penalty
- top_k
- stream
- tfs_z
- mirostat_mode
- mirostat_tau
- mirostat_eta
- model

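For reference, here is a minimal sketch of how these parameters could be declared, assuming the plugin follows the usual llm pattern of a pydantic-backed options class. Field names mirror the keyword arguments of llama-cpp-python's `create_completion()`; the defaults shown are believed to match llama-cpp-python at the time of this PR, but verify them against your installed version. This is an illustration, not the exact code in the PR.

```python
# Hypothetical sketch of the new options -- not the exact code in this PR.
from typing import List, Optional

from pydantic import BaseModel, Field


class CompletionOptions(BaseModel):
    # Defaults are intended to mirror llama-cpp-python's create_completion();
    # check the installed llama-cpp-python version before relying on them.
    suffix: Optional[str] = Field(
        default=None, description="A suffix to append after the generated text."
    )
    temperature: float = Field(
        default=0.8, description="Sampling temperature; higher is more random."
    )
    top_p: float = Field(default=0.95, description="Nucleus sampling threshold.")
    logprobs: Optional[int] = Field(
        default=None, description="Number of logprobs to return per token."
    )
    echo: bool = Field(default=False, description="Echo the prompt in the output.")
    stop: Optional[List[str]] = Field(
        default=None, description="Strings that stop generation when produced."
    )
    frequency_penalty: float = Field(default=0.0)
    presence_penalty: float = Field(default=0.0)
    repeat_penalty: float = Field(default=1.1)
    top_k: int = Field(default=40)
    stream: bool = Field(default=False)
    tfs_z: float = Field(default=1.0)
    mirostat_mode: int = Field(default=0)
    mirostat_tau: float = Field(default=5.0)
    mirostat_eta: float = Field(default=0.1)
    model: Optional[str] = Field(default=None)
```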

hhughes commented Oct 1, 2023

Motivation: I wanted to be able to tune temperature, top_k and top_p when using llama via the llm package. I copied almost all of the current params wholesale; if you think some do not belong, I'm happy to adjust the CL.

I left out stopping_criteria, logits_processor and grammar because pydantic didn't like their types (and I don't need them for now).
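For illustration, the plumbing could forward only the explicitly-set options to llama-cpp-python along these lines. This is a sketch that assumes an options object like the one above and a loaded `llama_cpp.Llama` model; the function and parameter names are illustrative, not the PR's actual code.

```python
# Hypothetical forwarding sketch -- names are illustrative, not the PR's code.
from llama_cpp import Llama


def run_completion(llm: Llama, prompt_text: str, options):
    # Drop unset (None) values so llama-cpp-python applies its own defaults
    # for anything the caller did not set explicitly.
    kwargs = {
        key: value
        for key, value in options.dict().items()  # .model_dump() on pydantic v2
        if value is not None
    }
    # Llama.__call__ delegates to create_completion(), which accepts these
    # keyword arguments; it returns a dict, or an iterator when stream=True.
    return llm(prompt_text, **kwargs)
```

Once wired through, the options can be set per call from the llm CLI using its standard `-o name value` syntax, for example `llm -m <model-id> -o temperature 0.2 -o top_k 20 'Write a haiku'`.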


aawadat commented Jul 6, 2024

