The codebase for assistant.bible: A Bible assistant.
An intelligent search engine for domain-specific data that not only provides relevant links to resources but also synthesizes and summarizes results into a succinct, relevant answer through an interactive interface. It connects to LLMs and other AI models that are trained on the ground-truth texts uploaded by the organization. This enables the software to provide highly accurate and relevant responses to user queries, based on the specific context of the organization's data. With the natural-language chat interface, users can easily interact with the data and gain insights that would otherwise be difficult or time-consuming to obtain.
- Python 3.10
- FastAPI
- LangChain(?)
- OpenAI APIs
- ChromaDB
- Postgres(?)
- Supabase
- Docker
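The description above amounts to a retrieval-augmented generation (RAG) pattern: relevant passages are fetched from a vector store and passed to an LLM along with the user's question. The snippet below is only a rough, hypothetical sketch of that pattern using the chromadb and openai (v1+) clients; it is not this repository's actual implementation, and the model name, prompt, and query are placeholders.

```python
# Rough RAG sketch -- illustrative only, not the app's real code.
import chromadb
from openai import OpenAI

question = "What does the uploaded text say about forgiveness?"

# 1. Retrieve the most relevant chunks from the local Chroma store.
chroma = chromadb.PersistentClient(path="chormadb_store")
collection = chroma.get_or_create_collection("adotbcollection")
results = collection.query(query_texts=[question], n_results=10)
context = "\n".join(results["documents"][0])

# 2. Ask the LLM to answer using only that context.
llm = OpenAI()  # reads OPENAI_API_KEY from the environment
response = llm.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {"role": "system", "content": "Answer using only this context:\n" + context},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```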
```bash
git pull
cd deployment/
docker build -t assistant.bible .
```
Or, using docker compose, you can run:

```bash
docker compose --env-file .env up --build -d
```
Then run the built image:

```bash
docker run \
  -v chroma-db-vol:../chromadb \
  -v ab-logs:../logs \
  -p 8000:8000 \
  -e OPENAI_API_KEY=<sk-...> \
  <imageid>
```
Environment variables and their default values:

- OPENAI_API_KEY
- CHROMA_DB_PATH (default 'chormadb_store'; if you change this, change the volume mount accordingly)
- CHROMA_DB_COLLECTION (default 'adotbcollection')
- CHROMA_DB_QUERY_LIMIT=10
- POSTGRES_DB_HOST=localhost
- POSTGRES_DB_PORT=5432
- POSTGRES_DB_NAME=adotbcollection
- POSTGRES_DB_USER=admin
- POSTGRES_DB_PASSWORD=secret
- POSTGRES_DB_QUERY_LIMIT=10
- DOMAIN=assistant.bible
- SUPABASE_URL
- SUPABASE_KEY
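For the docker compose route above, these variables can be collected in a .env file. A minimal example (all values illustrative, mirroring the defaults listed):

```
OPENAI_API_KEY=sk-...
CHROMA_DB_PATH=chormadb_store
CHROMA_DB_COLLECTION=adotbcollection
CHROMA_DB_QUERY_LIMIT=10
POSTGRES_DB_HOST=localhost
POSTGRES_DB_PORT=5432
POSTGRES_DB_NAME=adotbcollection
POSTGRES_DB_USER=admin
POSTGRES_DB_PASSWORD=secret
POSTGRES_DB_QUERY_LIMIT=10
DOMAIN=assistant.bible
SUPABASE_URL=https://<your_supabase_instance>.supabase.co
SUPABASE_KEY=<your_supabase_key>
```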
If using the default values, the app, once started, should be running at http://localhost:8000, with the dev UI available at http://localhost:8000/ui and API docs at http://localhost:8000/docs.
- Create virtual environment

```bash
python3.10 -m venv <ENV>
source <ENV>/bin/activate
pip install -r requirements.txt
```
- Set environment variables

```bash
export OPENAI_API_KEY=sk-...
export CHROMA_DB_PATH=../chromadb
export CHROMA_DB_COLLECTION=adotbcollection
```

These values need to be set in the current terminal session or saved in the .bashrc file in your home directory.
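For reference, variables like these are typically read in Python with os.environ / os.getenv. The snippet below is illustrative only; the app's actual configuration handling may differ, though the defaults shown mirror the ones documented above.

```python
# Illustrative only -- not the repository's configuration code.
import os

OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]  # required; no default
CHROMA_DB_PATH = os.getenv("CHROMA_DB_PATH", "chormadb_store")
CHROMA_DB_COLLECTION = os.getenv("CHROMA_DB_COLLECTION", "adotbcollection")
CHROMA_DB_QUERY_LIMIT = int(os.getenv("CHROMA_DB_QUERY_LIMIT", "10"))
```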
- Postgres DB

Have a postgres DB running with the pgvector extension enabled (refer), and provide the connection details as environment variables:

```bash
export POSTGRES_DB_HOST=localhost
export POSTGRES_DB_PORT=5432
export POSTGRES_DB_NAME=adotbcollection
export POSTGRES_DB_USER=postgres
export POSTGRES_DB_PASSWORD=secret
export SUPABASE_URL=https://<your_supabase_instance>.supabase.co
export SUPABASE_KEY=<your_supabase_key>
```

If you don't want to mess with the locally running postgres installation, running it as a docker container is a safer alternative:

```bash
docker run -e POSTGRES_USER=admin -e POSTGRES_PASSWORD=secret -e POSTGRES_DB=adotbcollection -p 5432:5432 ankane/pgvector
```

Note: if you use this container, make sure POSTGRES_DB_USER and POSTGRES_DB_PASSWORD match the POSTGRES_USER and POSTGRES_PASSWORD you pass to it (admin/secret in this example).
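To confirm the database is reachable and has pgvector available, a quick check along these lines can help (hypothetical helper script, assuming psycopg2 is installed; connection values mirror the docker command above):

```python
# Hypothetical sanity check -- not part of the repository.
import psycopg2  # e.g. pip install psycopg2-binary

conn = psycopg2.connect(
    host="localhost", port=5432,
    dbname="adotbcollection", user="admin", password="secret",
)
with conn.cursor() as cur:
    cur.execute("SELECT extname FROM pg_extension WHERE extname = 'vector';")
    print(cur.fetchone())  # ('vector',) if pgvector is enabled; None otherwise
    # If None, enable it with: CREATE EXTENSION IF NOT EXISTS vector;
conn.close()
```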
- Start the app

```bash
cd app
uvicorn main:app --port 8000
# Note: if you get ModuleNotFoundError, your system may be trying to use
# python > or < 3.10 despite the virtual environment. If this happens,
# try rerunning with the python version specified, e.g.:
# python3.10 -m uvicorn main:app --port 8000
```
If it started successfully, the app should be running at http://localhost:8000, with the dev UI available at http://localhost:8000/ui and API docs at http://localhost:8000/docs.
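A quick way to verify from Python that the server came up (assumes the requests package is installed; just a convenience check):

```python
# Simple smoke test against the locally running app.
import requests

resp = requests.get("http://localhost:8000/docs", timeout=10)
print(resp.status_code)  # expect 200 when the app has started correctly
```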
If you are running the app locally, trying to sign up from localhost will attempt to send a verification email. This may not be delivered, likely due to automated email certificate issues. To get around this, you could probably use the supabase CLI to create a user and then set the SUPABASE_KEY environment variable to the key generated by the CLI. However, the local development experience has not been extensively tested.
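As an alternative to the CLI, a development user could probably also be created with the supabase-py client (illustrative sketch, assuming a recent supabase-py; whether email confirmation is still required depends on your Supabase project's auth settings):

```python
# Hypothetical sketch -- not part of the repository.
import os
from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])
# Sign up a development user directly.
supabase.auth.sign_up({"email": "dev@example.com", "password": "a-strong-password"})
```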
From app/, with the virtual environment activated and environment variables set, run:

```bash
python -m pytest
```
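For orientation, a FastAPI test in this setup would typically look something like the sketch below (hypothetical example; the repository's actual tests live under app/tests/):

```python
# tests/test_smoke.py -- hypothetical example, not one of the repository's tests.
from fastapi.testclient import TestClient

from main import app  # assumes pytest is run from the app/ directory

client = TestClient(app)


def test_docs_page_is_served():
    # The interactive API docs should be reachable once the app is wired up.
    response = client.get("/docs")
    assert response.status_code == 200
```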
After activating the virtual environment, from the project root folder (assistant.bible/), run:

```bash
pylint --rcfile=.pylintrc app/*.py app/tests/*.py
```
In this repository, there are two long-lived branches: develop and main. Automated actions trigger deployment upon a "push" event to each branch, targeting the following respective environments: https://dev.assistant.bible/ (hosted in the SIL DO account) and https://assistant.bible/ (hosted in the BCS DO account).
To contribute, please follow these steps:
- Create a feature branch.
- Merge your changes into the develop branch for testing purposes.
- Open a pull request (PR) when merging into the main branch (production).