Slack AI Chatbot

This Slack chatbot app template offers a customizable solution for integrating AI-powered conversations into your Slack workspace. Here's what the app can do out of the box:

  • Interact with the bot by mentioning it in conversations and threads
  • Send direct messages to the bot for private interactions
  • Use the /ask-bolty command to communicate with the bot in channels where it hasn't been added
  • Utilize a custom function for integration with Workflow Builder to summarize messages in conversations
  • Select your preferred API/model from the app home to customize the bot's responses
  • Bring Your Own Language Model (BYO LLM) for customization
  • A custom FileStateStore creates a file in /data for each user to store their API/model preferences

Inspired by ChatGPT-in-Slack

Before getting started, make sure you have a development workspace where you have permissions to install apps. If you don't have one set up, go ahead and create one.

Installation

Prerequisites

  • To use the OpenAI and Anthropic models, you must have an account with sufficient credits.
  • To use the Vertex models, you must have a Google Cloud project with sufficient credits.

Create a Slack App

  1. Open https://api.slack.com/apps/new and choose "From an app manifest"
  2. Choose the workspace you want to install the application to
  3. Copy the contents of manifest.json into the text box that says *Paste your manifest code here* (within the JSON tab) and click Next
  4. Review the configuration and click Create
  5. Click Install to Workspace and Allow on the screen that follows. You'll then be redirected to the App Configuration dashboard.

Environment Variables

Before you can run the app, you'll need to store some environment variables.

  1. Open your app's configuration page from this list, click OAuth & Permissions in the left-hand menu, then copy the Bot User OAuth Token. You will store this in your environment as SLACK_BOT_TOKEN.
  2. Click Basic Information from the left-hand menu and follow the steps in the App-Level Tokens section to create an app-level token with the connections:write scope. Copy this token. You will store this in your environment as SLACK_APP_TOKEN.

Next, set the gathered tokens as environment variables using the following commands:

# MacOS/Linux
export SLACK_BOT_TOKEN=<your-bot-token>
export SLACK_APP_TOKEN=<your-app-token>
# Windows
set SLACK_BOT_TOKEN=<your-bot-token>
set SLACK_APP_TOKEN=<your-app-token>

Models from different AI providers become available when the corresponding environment variables are set, as shown in the sections below.
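
As a rough illustration of that behavior (the template's actual detection logic lives in ai/providers/__init__.py and may differ), provider availability could be checked against environment variables like this:

import os

# Hypothetical mapping of each provider to the environment variables it needs.
PROVIDER_ENV_VARS = {
    "anthropic": ["ANTHROPIC_API_KEY"],
    "openai": ["OPENAI_API_KEY"],
    "vertexai": ["VERTEX_AI_PROJECT_ID", "VERTEX_AI_LOCATION"],
}

def available_providers() -> list[str]:
    """Return the providers whose required environment variables are all set."""
    return [
        name
        for name, variables in PROVIDER_ENV_VARS.items()
        if all(os.environ.get(v) for v in variables)
    ]

print(available_providers())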

Anthropic Setup

To interact with Anthropic models, navigate to your Anthropic account dashboard to create an API key, then export the key as follows:

export ANTHROPIC_API_KEY=<your-api-key>
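
If you want to verify the key outside of Slack first, a minimal request with the official anthropic Python SDK looks roughly like this; the model name is only an example, and the template's own provider class may configure things differently:

import anthropic

# The client reads ANTHROPIC_API_KEY from the environment by default.
client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # example model; use any model your account can access
    max_tokens=256,
    messages=[{"role": "user", "content": "Say hello from my Slack bot setup."}],
)
print(message.content[0].text)
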
Google Cloud Vertex AI Setup

To use Google Cloud Vertex AI, follow this quick start to create a project for sending requests to the Gemini API, then gather Application Default Credentials using the strategy that matches your development environment.

Once your project and credentials are configured, export environment variables to select from Gemini models:

export VERTEX_AI_PROJECT_ID=<your-project-id>
export VERTEX_AI_LOCATION=<location-to-deploy-model>

The project location can be found under Region on the Vertex AI dashboard, along with more details about the available Gemini models.
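
To confirm the project and credentials work end to end, a quick standalone check with the Vertex AI Python SDK might look like this (a sketch, not the template's provider code; the model name is only an example):

import os

import vertexai
from vertexai.generative_models import GenerativeModel

# Uses Application Default Credentials plus the project and location exported above.
vertexai.init(
    project=os.environ["VERTEX_AI_PROJECT_ID"],
    location=os.environ["VERTEX_AI_LOCATION"],
)
model = GenerativeModel("gemini-1.5-flash")  # example model name
response = model.generate_content("Say hello from my Slack bot setup.")
print(response.text)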

OpenAI Setup

Unlock the OpenAI models from your OpenAI account dashboard by clicking Create new secret key, then export the key like so:

export OPENAI_API_KEY=<your-api-key>
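
As with the other providers, you can sanity-check the key with a short script using the official openai SDK (again, only a sketch with an example model name):

from openai import OpenAI

# The client reads OPENAI_API_KEY from the environment by default.
client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; use any chat model available to your account
    messages=[{"role": "user", "content": "Say hello from my Slack bot setup."}],
)
print(completion.choices[0].message.content)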

Set Up Your Local Project

# Clone this project onto your machine
git clone https://github.com/slack-samples/bolt-python-ai-chatbot.git

# Change into this project directory
cd bolt-python-ai-chatbot

# Set up your Python virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install the dependencies
pip install -r requirements.txt

# Start your local server
python3 app.py

Linting

# Run flake8 from root directory for linting
flake8 *.py && flake8 listeners/

# Run black from root directory for code formatting
black .

Project Structure

manifest.json

manifest.json is a configuration for Slack apps. With a manifest, you can create an app with a pre-defined configuration, or adjust the configuration of an existing app.

app.py

app.py is the entry point for the application and is the file you'll run to start the server. This project aims to keep this file as thin as possible, primarily using it as a way to route inbound requests.
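
For reference, a stripped-down Bolt for Python entry point using Socket Mode (which the app-level token with the connections:write scope suggests) might look like the sketch below; the template's actual app.py also registers the listener modules described next:

import os

from slack_bolt import App
from slack_bolt.adapter.socket_mode import SocketModeHandler

# Minimal sketch of a Socket Mode entry point, assuming the tokens exported earlier.
app = App(token=os.environ.get("SLACK_BOT_TOKEN"))

# Listener registration (commands, events, and so on) would happen here in the real app.

if __name__ == "__main__":
    SocketModeHandler(app, os.environ.get("SLACK_APP_TOKEN")).start()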

/listeners

Every incoming request is routed to a "listener". Inside this directory, we group each listener based on the Slack Platform feature used, so /listeners/commands handles incoming slash command requests, /listeners/events handles events, and so on.
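
As an illustration of that grouping (the exact registration code in this template may differ), a command listener and an event listener in Bolt for Python are wired up like this:

# Hypothetical listeners/commands module: handles the /ask-bolty slash command.
def register_commands(app):
    @app.command("/ask-bolty")
    def handle_ask_bolty(ack, command, say):
        ack()  # acknowledge the slash command within 3 seconds
        say(f"You asked: {command.get('text', '')}")

# Hypothetical listeners/events module: handles app mentions.
def register_events(app):
    @app.event("app_mention")
    def handle_app_mention(event, say):
        say(f"Hi <@{event['user']}>!", thread_ts=event.get("ts"))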

/ai

  • ai_constants.py: Defines constants used throughout the AI module.

ai/providers

This module contains classes for communicating with different API providers, such as Anthropic, OpenAI, and Vertex AI. To add your own LLM, create a new class for it using base_api.py as an example, then update ai/providers/__init__.py to include and use your new class for API communication (a sketch of what such a class might look like follows below).

  • __init__.py: This file contains utility functions for handling responses from the provider APIs and retrieving available providers.
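
A hypothetical custom provider might look something like the following; the class and method names here are illustrative assumptions, so mirror whatever interface base_api.py actually defines:

# Hypothetical sketch of a custom provider; match the real interface in base_api.py.
class MyCustomProvider:
    def __init__(self, api_key: str, model: str = "my-model-name"):
        self.api_key = api_key
        self.model = model

    def generate_response(self, prompt: str) -> str:
        # Call your own LLM's API here and return plain text for the bot to post.
        raise NotImplementedError("Wire this up to your LLM's API")

Once registered in ai/providers/__init__.py, the new provider can appear alongside the others in the app home model selector.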

/state_store

  • user_identity.py: This file defines the UserIdentity class for creating user objects. Each object represents a user with the user_id, provider, and model attributes.

  • user_state_store.py: This file defines the base class for FileStateStore.

  • file_state_store.py: This file defines the FileStateStore class which handles the logic for creating and managing files for each user.

  • set_user_state.py: This file creates a user object and uses a FileStateStore to save the user's selected provider to a JSON file.

  • get_user_state.py: This file retrieves a user's selected provider from the JSON file created with set_user_state.py.
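
Conceptually, the stored state is a small JSON document per user; a simplified sketch (not the exact layout used by FileStateStore) could look like this:

import json
from pathlib import Path

# Simplified sketch of per-user preference storage; the real FileStateStore
# in /state_store may use a different file layout and naming scheme.
DATA_DIR = Path("./data")

def save_user_state(user_id: str, provider: str, model: str) -> None:
    DATA_DIR.mkdir(parents=True, exist_ok=True)
    (DATA_DIR / user_id).write_text(
        json.dumps({"user_id": user_id, "provider": provider, "model": model})
    )

def get_user_state(user_id: str) -> dict:
    return json.loads((DATA_DIR / user_id).read_text())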

App Distribution / OAuth

Only implement OAuth if you plan to distribute your application across multiple workspaces. A separate app_oauth.py file can be found with relevant OAuth settings.

When using OAuth, Slack requires a public URL where it can send requests. In this template app, we've used ngrok. Check out this guide for setting it up.

Start ngrok to access the app on an external network and create a redirect URL for OAuth.

ngrok http 3000

This output should include a forwarding address for http and https (we'll use https). It should look something like the following:

Forwarding   https://3cb89939.ngrok.io -> http://localhost:3000

Navigate to OAuth & Permissions in your app configuration and click Add a Redirect URL. The redirect URL should be set to your ngrok forwarding address with the slack/oauth_redirect path appended. For example:

https://3cb89939.ngrok.io/slack/oauth_redirect
