This starter template lets you get up and running with the Bee Agent Framework in seconds.
📚 See the documentation to learn more.
- 🔒 Safely execute arbitrary Python code via Bee Code Interpreter.
- 🔎 Get complete visibility into agents' decisions using our MLflow integration, powered by Bee Observe.
- 🚀 Fully fledged TypeScript project setup with linting and formatting.
- JavaScript runtime Node.js > 18 (ideally installed via nvm).
- A container system such as Rancher Desktop, Podman (the VM must be a rootful machine), or Docker.
- An LLM provider: either an external service such as WatsonX (OpenAI, Groq, ...) or a local Ollama instance.
- Clone this repository or use it as a template.
- Install dependencies: `npm ci`
- Configure your project by filling in the missing values in the `.env` file (the default LLM provider is locally hosted Ollama).
- Run the agent: `npm run start src/agent.ts`
To run the agent with a custom prompt: `npm run start src/agent.ts <<< 'Hello Bee!'`
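For orientation, here is a minimal sketch of what an entry point like `src/agent.ts` might contain, based on the framework's published examples. The import paths, model name, and tool choice are assumptions that may differ between framework versions:

```ts
// Hypothetical minimal agent, assuming bee-agent-framework v0.x import paths.
import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { UnconstrainedMemory } from "bee-agent-framework/memory/unconstrainedMemory";
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch";

// Locally hosted Ollama is this template's default LLM provider.
const llm = new OllamaChatLLM({ modelId: "llama3.1" }); // model name is illustrative

const agent = new BeeAgent({
  llm,
  memory: new UnconstrainedMemory(), // keeps the full conversation history
  tools: [new DuckDuckGoSearchTool()],
});

// Run the agent on a single prompt and print its final answer.
const response = await agent.run({ prompt: "Hello Bee!" });
console.log(response.result.text);
```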
🧪 More examples can be found here.
Tip: To use the Bee agent with the Python Code Interpreter, refer to the Code Interpreter section.
Tip: To use the Bee agent with Bee Observe, refer to the Observability section.
Note: A Docker distribution with Compose support is required; Docker, Rancher Desktop, and Podman are all supported (see the requirements above).
The Bee Code Interpreter is a gRPC service that an agent uses to safely execute arbitrary Python code.
- Start all services related to the Code Interpreter: `npm run infra:start --profile=code_interpreter`
- Run the agent: `npm run start src/agent_code_interpreter.ts`
Note: The Code Interpreter runs on `http://127.0.0.1:50051`.
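As a sketch of how the Code Interpreter is typically wired into an agent, the framework's `PythonTool` can be pointed at the local gRPC endpoint. The storage class, its options, and the working directories below are illustrative assumptions; consult the framework docs for the exact API in your version:

```ts
// Hypothetical wiring of the Code Interpreter into an agent's tool list.
import { PythonTool } from "bee-agent-framework/tools/python/python";
import { LocalPythonStorage } from "bee-agent-framework/tools/python/storage";

const pythonTool = new PythonTool({
  codeInterpreter: { url: "http://127.0.0.1:50051" }, // endpoint from the note above
  storage: new LocalPythonStorage({
    // Illustrative paths: where files live locally vs. inside the interpreter.
    localWorkingDir: "./tmp/code_interpreter/local",
    interpreterWorkingDir: "./tmp/code_interpreter/target",
  }),
});

// Pass the tool to the agent alongside any others, e.g.:
// const agent = new BeeAgent({ llm, memory, tools: [pythonTool] });
```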
Get complete visibility into the agent's inner workings via our observability stack.
- MLflow is used as the UI for observability.
- Bee Observe is the observability service (API) that gathers traces from the Bee Agent Framework.
- Bee Observe Connector sends traces from the Bee Agent Framework to Bee Observe.
- Start all services related to Bee Observe: `npm run infra:start --profile=observe`
- Run the agent: `npm run start src/agent_observe.ts`
- View the visualized trace in the MLflow web application at `http://127.0.0.1:8080/#/experiments/0`.
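For illustration, the Bee Observe Connector is attached to an agent run as middleware. The base URL, port, and auth key below are assumptions, not confirmed values; check `infra/observe/.env.docker` for the actual configuration:

```ts
// Hypothetical sketch of sending a run's trace to Bee Observe.
import { createObserveConnector } from "bee-observe-connector";

// `agent` as constructed in the earlier sketch.
await agent.run({ prompt: "Hello Bee!" }).middleware(
  createObserveConnector({
    api: {
      baseUrl: "http://127.0.0.1:4002", // assumed Observe API address
      apiAuthKey: "testing:testing", // assumed key; see infra/observe/.env.docker
    },
  }),
);
```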
Tip: The configuration file is `infra/observe/.env.docker`.