feat: added observability for llm check #27
Merged
Autogenerated PR Description
Pull Request Description
This pull request adds observability for litellm by introducing success and failure callbacks that log to Supabase. It modifies the configuration settings, adds new functionality to the LLMProvider class, and updates the chat_completion method. Together, these changes enhance the observability and logging capabilities of the language model provider.
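The diff itself is not shown here, so as a rough illustration only, the wiring described above might look like the following self-contained sketch. The names LLMProvider and chat_completion come from this summary; the callback mechanics, payload shape, and the note about litellm's built-in Supabase callback (which reads its credentials from environment variables) are assumptions, not the actual implementation in this PR.

```python
class LLMProvider:
    """Hypothetical sketch of an LLM provider with success/failure logging."""

    def __init__(self, enable_observability: bool = False):
        self.success_callbacks = []
        self.failure_callbacks = []
        if enable_observability:
            # In the real integration this would presumably register
            # litellm's Supabase callback (configured via environment
            # variables) rather than these local placeholders.
            self.success_callbacks.append(self._log_success)
            self.failure_callbacks.append(self._log_failure)

    def _log_success(self, payload: dict) -> None:
        # Placeholder: a real callback would insert a row into Supabase.
        print(f"logged success: {payload['model']}")

    def _log_failure(self, payload: dict) -> None:
        print(f"logged failure: {payload.get('error')}")

    def chat_completion(self, model: str, messages: list) -> dict:
        try:
            # Placeholder response; the real method would call the LLM.
            response = {"model": model, "choices": [{"message": "..."}]}
            for cb in self.success_callbacks:
                cb({"model": model, "response": response})
            return response
        except Exception as exc:
            for cb in self.failure_callbacks:
                cb({"model": model, "error": str(exc)})
            raise
```

The design keeps logging out of the request path's happy-path logic: callbacks fire after a completion succeeds or fails, so disabling observability is just a constructor flag.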
Motivation
The motivation behind these changes is to improve the observability of the language model provider, litellm, by enabling logging of success and failure callbacks to Supabase. This will provide better insights into the behavior and performance of the language model.
Potential Concerns
Observability logging introduces a dependency on Supabase, which requires proper handling of its environment variables and may affect overall system performance. It is important to ensure that the logging functionality does not introduce security vulnerabilities or performance bottlenecks.
Recommendations
It is recommended to thoroughly test the observability logging, including edge cases and error scenarios. Additionally, consider documenting the usage and configuration of the observability feature to ease future maintenance and troubleshooting.
-- Generated with love by Cloud Code AI
Original Description
Added observability for litellm