Add Support for Prompt Caching in SpiceMessages Class #107
This update introduces support for Anthropic's prompt caching feature in the `SpiceMessages` class of the `spice` library. The changes enable faster and more cost-efficient API calls by reusing cached prompt prefixes. Additionally, cache performance metrics are tracked to verify cache hits and the number of input tokens cached.

**Key Changes:**
- **Update `SpiceMessages` Class:** Add a `cache` argument to the message creation methods, and set the `cache_control` parameter based on the `cache` argument.
- **Modify Message Creation Functions:** Accept and forward the new `cache` argument.
- **Track Cache Performance Metrics:** Update the `get_response` method in the `Spice` class to handle the new API response fields related to caching, and update the `client.extract_text_and_tokens` method accordingly.

**Example Usage:**
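The example usage from the original PR body did not survive extraction. As a stand-in, here is a minimal, self-contained sketch of the idea: a `cache` flag on a message-creation method translating into Anthropic's `cache_control` field. Note this is not the actual `spice` API; the class body, the `add_user_message` name, and its signature are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class SpiceMessages:
    """Illustrative stand-in (not the real spice class): collects messages
    and tags them with Anthropic-style cache_control when cache=True."""
    messages: list = field(default_factory=list)

    def add_user_message(self, text: str, cache: bool = False) -> "SpiceMessages":
        content: dict = {"type": "text", "text": text}
        if cache:
            # Anthropic marks a cacheable prompt prefix via cache_control
            # with the "ephemeral" cache type.
            content["cache_control"] = {"type": "ephemeral"}
        self.messages.append({"role": "user", "content": [content]})
        return self

msgs = SpiceMessages()
msgs.add_user_message("Long shared system context...", cache=True)
msgs.add_user_message("What changed since the last run?")
print(msgs.messages[0]["content"][0].get("cache_control"))  # {'type': 'ephemeral'}
print("cache_control" in msgs.messages[1]["content"][0])    # False
```

The design point is that callers opt in per message; only messages flagged with `cache=True` carry the `cache_control` marker, so non-cached messages are serialized unchanged.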
**Acceptance Criteria:**

- The `SpiceMessages` class should support the `cache` argument.
- The `get_response` method should log cache performance metrics.
- The `cache` argument should be handled gracefully for non-Anthropic clients.

Closes #106
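For the metrics criterion, a hedged sketch of how cached-token counts could be read off a response's usage block. The field names `cache_read_input_tokens` and `cache_creation_input_tokens` match Anthropic's Messages API usage object; the `cache_stats` helper itself is hypothetical, not part of `spice`, and defaults to zero for providers that do not report these fields:

```python
from types import SimpleNamespace

def cache_stats(usage) -> dict:
    """Extract cache metrics from a usage block, defaulting to 0 for
    providers (e.g. non-Anthropic clients) without cache fields."""
    return {
        "cache_read_input_tokens": getattr(usage, "cache_read_input_tokens", 0) or 0,
        "cache_creation_input_tokens": getattr(usage, "cache_creation_input_tokens", 0) or 0,
    }

# Simulated Anthropic usage block: 1024 prompt tokens served from cache.
anthropic_usage = SimpleNamespace(
    input_tokens=37, cache_read_input_tokens=1024, cache_creation_input_tokens=0
)
# Simulated non-Anthropic usage block: no cache fields at all.
other_usage = SimpleNamespace(prompt_tokens=512)

print(cache_stats(anthropic_usage))
print(cache_stats(other_usage))
```

Using `getattr` with a default is one way to satisfy the "handled gracefully for non-Anthropic clients" criterion: the same code path works whether or not the provider reports cache usage.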
Thanks for using MentatBot. Give comments a 👍 or 👎 to help me improve!