Kyma Companion AI Chat #3436

Open
1 of 22 tasks
mrCherry97 opened this issue Oct 30, 2024 · 0 comments
Labels
Epic, lifecycle/frozen (indicates that an issue or PR should not be auto-closed due to staleness)

Comments

mrCherry97 commented Oct 30, 2024

Description

As Sebastian and Mathew, I would like to have a Kyma Companion AI Chat in the Busola Dashboard that allows users to talk with the Kyma Companion backend, which helps automate the analysis and troubleshooting of a Kyma cluster using an LLM (Large Language Model).

Acceptance Criteria

  • The AI chat is easily accessible in the Busola Dashboard
  • The UI follows the designs:
    Kyma Companion-compressed.pdf
  • The Busola Dashboard is integrated with the Kyma Companion API (gh tools /kyma/ai-force/tree/main/docs/api-structure)
  • The user can navigate from the AI Chat to the specific edit/create view of a resource to copy-paste the response from the AI Chat
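The navigation criterion above could be met by deriving the target edit/create route from the resource an AI answer refers to. A minimal sketch in TypeScript, assuming a hypothetical Busola-style route pattern (the real route layout may differ):

```typescript
// Hypothetical helper: builds an edit/create URL for a resource
// mentioned in an AI Chat response. The route pattern below is an
// assumption for illustration, not Busola's actual routing.
interface ResourceRef {
  cluster: string;
  namespace?: string; // cluster-scoped resources have no namespace
  kind: string;       // plural, lowercase, e.g. 'deployments'
  name?: string;      // omitted when creating a new resource
}

function resourceEditUrl(ref: ResourceRef): string {
  const scope = ref.namespace
    ? `namespaces/${encodeURIComponent(ref.namespace)}/`
    : '';
  const target = ref.name
    ? `${encodeURIComponent(ref.name)}/edit`
    : 'create';
  return `/cluster/${encodeURIComponent(ref.cluster)}/${scope}${ref.kind}/${target}`;
}
```

With a helper like this, the chat component only needs a resource reference (which the backend response would have to include) to render a "open in editor" link next to the suggested YAML.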

To integrate the Kyma Companion into Busola based on the POC changes, the following steps are required:

API / Backend related:

  • Add cluster authorization to all API requests
  • Integrate POST /conversations endpoint to initialize conversations
  • Integrate POST /conversations/{conversation_id}/messages endpoint to send and receive messages
  • Handle streaming of responses
  • Integrate GET /conversations/{conversation_id}/questions to receive follow-up questions
  • Proper error handling (with meaningful error messages from the backend)
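The backend tasks above could be wired together roughly as follows. This is a hedged sketch in TypeScript: the endpoint paths come from the checklist, but the bearer-token auth header, the NDJSON streaming format, and the request/response shapes are assumptions, not the documented Kyma Companion API.

```typescript
// Assumed streaming format: newline-delimited JSON events. This pure
// helper splits a raw chunk buffer into complete JSON lines and
// returns the parsed events plus any trailing partial line, which the
// caller keeps for the next chunk.
function parseNdjsonChunk(buffer: string): { events: Array<{ delta?: string }>; rest: string } {
  const lines = buffer.split('\n');
  const rest = lines.pop() ?? '';
  const events = lines
    .filter((l) => l.trim().length > 0)
    .map((l) => JSON.parse(l) as { delta?: string });
  return { events, rest };
}

// Sends a message to POST /conversations/{conversation_id}/messages
// and forwards streamed text deltas to the UI via onDelta.
async function sendMessage(
  baseUrl: string,
  token: string, // cluster authorization: assumed to be a bearer token
  conversationId: string,
  content: string,
  onDelta: (text: string) => void,
): Promise<void> {
  const res = await fetch(`${baseUrl}/conversations/${conversationId}/messages`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${token}` },
    body: JSON.stringify({ content }),
  });
  // Surface a meaningful error instead of failing silently.
  if (!res.ok || !res.body) throw new Error(`Companion backend error: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const { events, rest } = parseNdjsonChunk(buffer);
    buffer = rest;
    for (const e of events) onDelta(e.delta ?? '');
  }
}
```

Initializing the conversation (POST /conversations) and fetching follow-up questions (GET /conversations/{conversation_id}/questions) would follow the same fetch-plus-auth pattern, without the streaming reader.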

Frontend related:

  • Store the session_id securely on the client side
  • Open the AI assistant from the top toolbar; keep it open across resource/view changes
  • Remove the initial popup -> populate the chat with a welcome message and initial questions
  • Remove the Tabs component from the chat interface
  • Add an AI disclaimer below the input
  • Add "Setup in editor" button capabilities
  • Block input while waiting for a response, using the time icon
  • Update UI5 components and adjust the look to match the design: Kyma.Companion-compressed.pdf
  • Hide behind a feature flag
  • Adjust types
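Two of the frontend tasks above (the feature flag and blocking input while waiting) can be modeled with small pure functions. A hedged sketch, where the flag name KYMA_COMPANION and the flags config shape are illustrative assumptions rather than Busola's real feature-flag mechanism:

```typescript
// Assumed shape of the feature-flags config; the real Busola config
// format may differ.
type FeatureFlags = Record<string, { isEnabled: boolean } | undefined>;

// The chat panel renders only when the (hypothetical) flag is on,
// which also covers enabling/disabling it per user group or market.
function isCompanionEnabled(flags: FeatureFlags): boolean {
  return flags['KYMA_COMPANION']?.isEnabled === true;
}

// Input-field state machine: blocked (time icon shown) while a
// response is pending, re-enabled once it arrives or fails.
type ChatInputState = 'idle' | 'waiting';

function nextInputState(
  state: ChatInputState,
  event: 'send' | 'response' | 'error',
): ChatInputState {
  if (state === 'idle' && event === 'send') return 'waiting';
  if (state === 'waiting' && (event === 'response' || event === 'error')) return 'idle';
  return state; // ignore sends while waiting, stray events while idle
}
```

Keeping these as pure functions makes the blocked-input behavior trivially unit-testable, which feeds directly into the "Add unit tests" item below.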

Testing:

  • Add unit tests
  • Add integration tests

Business Value

This feature enhances user convenience by providing an AI chat that helps automate the analysis and troubleshooting of a Kyma cluster using an LLM (Large Language Model).

Reasons

Adopting this feature simplifies the user experience and increases efficiency, especially for users unfamiliar with Kubernetes and Kyma.

Dependencies

Depends on the Kyma Companion API requirements.

Non-functional Requirements

  • Security and privacy must be ensured for stored user data.
  • The system should maintain high availability and reliability.

Constraints

  • Must comply with data protection and privacy laws.
  • Should be designed for easy update and maintenance.

Additional Features

  • Settings and configurations persistence across sessions.
  • We should be able to enable and disable this feature for different users (experimental channel, restricted markets, etc.).

Notes and Comments

Size or Effort

High effort, considering the development and integration of new functionalities.

Mockups or Diagrams


Useful Links
