Add support for Anthropic Claude LLM #233

Open
M-Tahar opened this issue Dec 6, 2024 · 1 comment

Labels: enhancement (New feature or request)

M-Tahar commented Dec 6, 2024

Proposal to add Anthropic's Claude as a supported LLM provider in bee-agent-framework.

Motivation:

  • Claude offers strong performance and capabilities that would benefit Bee Stack users
  • Adds another provider option for users alongside the existing OpenAI, watsonx, and Ollama integrations
  • Complements existing providers with Claude's specific strengths (longer context windows, strong coding abilities, etc.)
  • Provides an alternative for users who prefer Claude's approach to AI safety and alignment

Implementation Plan:

  1. Add Claude adapter in src/adapters/claude

    • Implement API integration using Anthropic's official SDK (@anthropic-ai/sdk); see the sketch after this list
    • Support both Claude 2 and Claude 3 models
    • Handle Claude-specific features like system prompts and tool use
  2. Update configuration:

    • Add Claude-specific environment variables (ANTHROPIC_API_KEY)
    • Add Claude as a supported backend option in bee-stack setup
    • Add model configuration options (temperature, max tokens, etc.)
  3. Documentation:

    • Add setup instructions for Claude API
    • Document Claude-specific features and configurations
    • Update example configurations
  4. Testing:

    • Add unit tests for Claude adapter
    • Add integration tests
    • Add example usage in documentation
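
To make step 1 more concrete, here is a minimal sketch of what the adapter's core call could look like with Anthropic's official Node SDK (@anthropic-ai/sdk). The `ClaudeChatLLM` class, its `generate` method, and the default model ID are illustrative placeholders rather than the framework's actual adapter interface, which the real implementation would extend:

```ts
import Anthropic from "@anthropic-ai/sdk";

// Illustrative skeleton only -- the real adapter would extend the
// framework's base LLM classes under src/adapters/claude.
export class ClaudeChatLLM {
  protected readonly client: Anthropic;

  constructor(
    protected readonly modelId: string = "claude-3-5-sonnet-latest", // placeholder model id
    apiKey: string | undefined = process.env.ANTHROPIC_API_KEY,
  ) {
    this.client = new Anthropic({ apiKey });
  }

  async generate(prompt: string, systemPrompt?: string): Promise<string> {
    // Claude takes the system prompt as a top-level `system` parameter,
    // not as an entry in the `messages` array.
    const response = await this.client.messages.create({
      model: this.modelId,
      max_tokens: 1024,
      system: systemPrompt,
      messages: [{ role: "user", content: prompt }],
    });

    // The response content is a list of blocks; keep only the text blocks.
    return response.content
      .map((block) => (block.type === "text" ? block.text : ""))
      .join("");
  }
}
```

The constructor reads the ANTHROPIC_API_KEY environment variable from step 2, so configuration would stay consistent with the env-var approach proposed above.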

Technical Details:

  • Will follow the existing adapter patterns used by the other providers
  • Will implement proper error handling for Claude-specific API responses (a sketch follows below)
  • Will ensure compatibility with bee-stack's existing monitoring and observability features
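
As one possible shape for the Claude-specific error handling mentioned above, the SDK exposes typed error classes (e.g. Anthropic.APIError, Anthropic.RateLimitError) that the adapter could translate into a framework-level error. `FrameworkLLMError` below is a hypothetical placeholder, not an existing bee-agent-framework class:

```ts
import Anthropic from "@anthropic-ai/sdk";

// Hypothetical placeholder for whatever error type the framework
// expects adapters to throw.
export class FrameworkLLMError extends Error {
  constructor(
    message: string,
    public readonly retryable: boolean = false,
  ) {
    super(message);
  }
}

// Translate SDK errors into a single framework-level error type.
export function wrapAnthropicError(err: unknown): FrameworkLLMError {
  if (err instanceof Anthropic.RateLimitError) {
    // 429s are transient and worth retrying with backoff.
    return new FrameworkLLMError(`Claude rate limit hit: ${err.message}`, true);
  }
  if (err instanceof Anthropic.APIError) {
    // Other API errors (authentication, bad request, overloaded, ...) carry an HTTP status.
    return new FrameworkLLMError(`Claude API error (${err.status}): ${err.message}`);
  }
  return new FrameworkLLMError(`Unexpected error while calling Claude: ${String(err)}`);
}
```

Distinguishing retryable errors (rate limits, overloaded responses) from permanent ones (invalid key, malformed request) should let the framework's existing retry and monitoring machinery treat Claude the same way it treats the other providers.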

Would love to contribute this feature if the maintainers are interested! Let me know if you have any specific requirements or preferences for the implementation.

Tomas2D added the enhancement (New feature or request) label Dec 6, 2024

Tomas2D (Contributor) commented Dec 6, 2024

Hello, @M-Tahar, and thank you for your interest in Bee.

To start, please first go through our Contributing Guidelines, followed by the Adding a new LLM provider (adapter) instructions (https://i-am-bee.github.io/bee-agent-framework/#/llms?id=adding-a-new-provider-adapter). Feel free to open a draft PR where we can discuss.

Relevant PRs (adding a new LLM provider):
