
Anthropic Prompt Caching and more #108

Merged
merged 21 commits into from
Aug 29, 2024
Conversation

biobootloader
Member

@biobootloader commented Aug 27, 2024

Add support for Anthropic prompt caching. See scripts/caching.py for usage.
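For context, a minimal sketch of the kind of request body Anthropic's prompt caching expects: a content block is marked cacheable with a `cache_control` field. This shows the raw Messages API payload shape only; how `scripts/caching.py` actually wraps it is not shown here, and `build_cached_request` is a hypothetical helper for illustration.

```python
def build_cached_request(system_prompt: str, user_text: str) -> dict:
    """Build an Anthropic Messages API body whose system prompt is marked for caching."""
    return {
        "model": "claude-3-5-sonnet-20240620",
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_prompt,
                # "ephemeral" is the cache type Anthropic's prompt caching uses
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_text}],
    }

request = build_cached_request("You are a helpful assistant.", "Hello!")
```

Repeated requests that share the cached system block can then reuse it instead of reprocessing the prompt.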

Other things:

  • SpiceMessage is no longer just openai.ChatCompletionMessageParam
  • many internal types were switched to Pydantic models
  • we no longer auto-convert as much to force things to work with Anthropic; we raise an error instead
  • message chaining: get a messages object from the client:
messages = (
    client.new_messages()
    .add_system_message("You are a helpful assistant.")
    .add_user_message("list 5 random species of birds")
)
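A minimal, self-contained sketch of how such a chaining builder can work: each `add_*` method appends a message and returns `self`. Class and method names mirror the snippet above, but the real Spice implementation (Pydantic-based) will differ in detail.

```python
from dataclasses import dataclass, field


@dataclass
class SpiceMessage:
    role: str
    content: str


@dataclass
class Messages:
    _messages: list = field(default_factory=list)

    def add_system_message(self, content: str) -> "Messages":
        self._messages.append(SpiceMessage("system", content))
        return self  # returning self is what enables chaining

    def add_user_message(self, content: str) -> "Messages":
        self._messages.append(SpiceMessage("user", content))
        return self


messages = (
    Messages()
    .add_system_message("You are a helpful assistant.")
    .add_user_message("list 5 random species of birds")
)
```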

@granawkins left a comment


LGTM. I like the update to pydantic models and using .attr instead of ['attr'], and adding cache as a message input seems convenient.
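To illustrate the `.attr` vs `['attr']` point: Pydantic models expose validated fields as attributes rather than dict keys. The `SpiceMessage` model below is a stand-in for illustration, not the actual Spice class.

```python
from pydantic import BaseModel


class SpiceMessage(BaseModel):
    role: str
    content: str


msg = SpiceMessage(role="user", content="hello")
print(msg.role)  # attribute access; typos raise AttributeError immediately
# vs. the old dict style msg["role"], where a wrong key only fails at lookup time
```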

@biobootloader biobootloader merged commit 1b1ca8e into main Aug 29, 2024
2 checks passed