- `wren-engine`: the engine service. Check out the example here: wren-engine example.
- `wren-ai-service`: the AI service. Check out the example here: wren-ai-service docker-compose example.
- `qdrant`: the vector store the AI service is using.
- `wren-ui`: the UI service.
- `bootstrap`: puts the required files into the volume for the engine service.
Shared data is stored in the `data` volume.
The path structure is as follows:
- `/mdl`
  - `*.json` (`sample.json` will be placed here during bootstrap)
- `accounts`
- `config.properties`
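As a rough illustration, the shared `data` volume above could be declared and mounted in `docker-compose.yaml` along these lines (a sketch only: the service names, image names, and mount paths here are assumptions, not the project's actual configuration):

```yaml
# illustrative sketch only; the real compose file may differ
services:
  bootstrap:
    image: busybox            # assumption: any image able to copy files into the volume
    volumes:
      - data:/app/data        # bootstrap writes sample.json under /app/data/mdl

  wren-engine:
    image: wren-engine        # assumption: the actual image name may differ
    volumes:
      - data:/app/data        # engine reads mdl/*.json, accounts, config.properties

volumes:
  data:                       # named volume shared by bootstrap and the engine
```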
- Check out the Network drivers overview to learn more about the `bridge` network driver.
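For reference, a user-defined bridge network is declared in Compose like the following (a generic sketch, with a hypothetical network name, not this project's actual file):

```yaml
# generic example: a service attached to a user-defined bridge network
services:
  wren-ui:
    networks:
      - wren

networks:
  wren:
    driver: bridge   # bridge is the default driver for user-defined networks
```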
- Copy `.env.example` to `.env.local` and modify the OpenAI API key.
- Start all services: `docker-compose --env-file .env.local up -d`
- Stop all services: `docker-compose --env-file .env.local down`
- Copy `.env.example` to `.env.local` and modify the OpenAI API key.
- Copy `.env.ai.example` to `.env.ai` and fill in the necessary information if you would like to use a custom LLM.
- Start all services (with a custom LLM): `docker-compose -f docker-compose.yaml -f docker-compose.llm.yaml --env-file .env.local --env-file .env.ai up -d`
- Stop all services (with a custom LLM): `docker-compose -f docker-compose.yaml -f docker-compose.llm.yaml --env-file .env.local --env-file .env.ai down`
Note: if port 3000 is already in use on your machine, you can modify `HOST_PORT` in `.env.local`.
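For example, to publish the UI on port 3001 instead, `.env.local` could contain the following (assuming `HOST_PORT` controls the host-side published port):

```
HOST_PORT=3001
```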