added support for bedrock/claude3 #781
Conversation
```
[aws] # in .secrets.toml
bedrock_region = "us-east-1"
```
Note that you have to enable access to the foundation models before using them. Please refer to [this document](https://docs.aws.amazon.com/bedrock/latest/userguide/setting-up.html) for more details.
If you are using a claude-3 model, please configure the following settings, since some parameters are incompatible with claude-3.
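The exact snippet is cut off above; based on the drop_params option this PR adds to the litellm section of configuration.toml, the setting appears to map to litellm's drop_params flag. A minimal Python sketch of the effect (the model name and message are placeholders, and the mapping to pr-agent's setting is an assumption):

```python
import litellm

# Assumption for illustration: enabling pr-agent's [litellm] drop_params setting
# has the same effect as litellm's module-level flag below. When enabled, litellm
# silently drops request parameters the target model does not accept
# (per the review discussion, temperature is one such parameter for Bedrock Claude-3).
litellm.drop_params = True

response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Summarize this pull request."}],
    temperature=0.2,  # dropped instead of raising an error when drop_params is set
)
```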
`AmazonAnthropicClaude3Config` does not support the `temperature` parameter.
Discussion:
Another approach could be to handle the temperature inside the chat_completion function, conditionally on whether the model is Bedrock's claude-3.
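A rough sketch of that conditional handling (illustrative only; the wrapper name, model-name check, and default value are assumptions, and it simply omits temperature for Bedrock Claude-3 models, in line with the comment above):

```python
import litellm

def chat_completion(model: str, messages: list, temperature: float = 0.2, **kwargs):
    # Illustrative sketch: pass temperature only for models that accept it,
    # since AmazonAnthropicClaude3Config does not support the parameter here.
    if not model.startswith("bedrock/anthropic.claude-3"):
        kwargs["temperature"] = temperature
    return litellm.completion(model=model, messages=messages, **kwargs)
```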
```diff
@@ -68,6 +70,7 @@ def __init__(self):
             )
         if get_settings().get("AWS.BEDROCK_REGION", None):
             litellm.AmazonAnthropicConfig.max_tokens_to_sample = 2000
+            litellm.AmazonAnthropicClaude3Config.max_tokens = 2000
```
In litellm, a separate config class, AmazonAnthropicClaude3Config, is defined for Claude-3.
Similar to the following pull request, the value of max_tokens is hardcoded here as well:
#483 (comment)
Discussion:
Another approach is to decide within chat_completion which configuration to set based on the model: either AmazonAnthropicConfig or AmazonAnthropicClaude3Config.
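A sketch of that model-based selection (the helper name and the model-name check are illustrative assumptions; get_settings is pr_agent's settings accessor, and both config classes appear in the diff above):

```python
import litellm
from pr_agent.config_loader import get_settings

def configure_bedrock_config(model: str) -> None:
    # Illustrative sketch: choose which litellm config class to adjust based on the model.
    if not get_settings().get("AWS.BEDROCK_REGION", None):
        return
    if "anthropic.claude-3" in model:
        litellm.AmazonAnthropicClaude3Config.max_tokens = 2000
    else:
        litellm.AmazonAnthropicConfig.max_tokens_to_sample = 2000
```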
/describe
1 similar comment
/describe
/help
PR-Agent was enabled for this repository. To use it, please link your git user with your CodiumAI identity here.
PR Agent Walkthrough 🤖
Welcome to the PR Agent, an AI-powered tool for automated pull request analysis, feedback, suggestions and more. Here is a list of tools you can use to interact with the PR Agent:
(1) Note that each tool can be triggered automatically when a new PR is opened, or called manually by commenting on a PR.
(2) Tools marked with [*] require additional parameters to be passed.
/describe
PR-Agent was enabled for this repository. To use it, please link your git user with your CodiumAI identity here.
PR Description updated to latest commit (0cdd3bf)
/describe
PR-Agent was enabled for this repository. To use it, please link your git user with your CodiumAI identity here.
PR Description updated to latest commit (3bae515)
Hi @koid, thanks for the PR. Is the litellm version upgrade essential for this PR? As a side note, did you form an impression of Claude-3 Opus, compared to GPT-4, on code? Is it competitive?
Hi @mrT23, thanks for the review. litellm's Claude-3 support status is as follows, so the version update is required. I'm sorry, I haven't had the chance to try Claude-3 Opus yet; I plan to try it soon.
PR Review
Code feedback:
✨ Review tool usage guide
Overview: with a configuration file, use the following template. See the review usage page for a comprehensive guide on using this tool.
User description
added support for bedrock/claude3
Type
enhancement, documentation
Description
- Introduced drop_params to ensure compatibility with Claude3 models.
- Added documentation for the drop_params setting.
- Updated the litellm library version to 1.31.10 for enhanced functionality and compatibility.

Changes walkthrough
__init__.py: Add Support for Bedrock Claude3 Models
pr_agent/algo/__init__.py
- Added the bedrock/anthropic.claude-3-sonnet-20240229-v1:0 and bedrock/anthropic.claude-3-haiku-20240307-v1:0 models (see the sketch after this walkthrough).

litellm_ai_handler.py: Enhance Litellm AI Handler for Claude3 Compatibility
pr_agent/algo/ai_handlers/litellm_ai_handler.py
- Ensured compatibility with Claude3 models.
- Set max_tokens for AmazonAnthropicClaude3Config.

configuration.toml: Add Drop Params Configuration Option
pr_agent/settings/configuration.toml
- Introduced the drop_params configuration option in the litellm section.

additional_configurations.md: Update Documentation for Claude3 Model Configuration
docs/docs/usage-guide/additional_configurations.md
- Documented drop_params for Claude3 compatibility.

requirements.txt: Update Litellm Library Version
requirements.txt
- Updated the litellm library version from 1.29.1 to 1.31.10.
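For reference, a sketch of the kind of entries the __init__.py change adds: pr_agent registers model token limits in a MAX_TOKENS mapping, and the numeric limits shown here are assumptions, not values taken from this diff.

```python
# pr_agent/algo/__init__.py (sketch): new Bedrock Claude-3 entries in the model registry.
MAX_TOKENS = {
    # ... existing models ...
    "bedrock/anthropic.claude-3-sonnet-20240229-v1:0": 100000,  # assumed limit
    "bedrock/anthropic.claude-3-haiku-20240307-v1:0": 100000,   # assumed limit
}
```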