Welcome to a project that explores the synergy between SurrealDB and OpenAI.
Demo video: `surreal-chat-demo.mp4`
This project aims to demonstrate the potential of integrating SurrealDB with OpenAI by importing 25,000 Wikipedia articles, complete with vector embeddings generated by OpenAI. The ultimate goal is to develop a Retrieval-Augmented Generation (RAG) question-answering system.
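At a high level, the question-answering flow is: embed the user's question with OpenAI, retrieve the most similar articles from SurrealDB by vector similarity, and pass them to a chat model as context. The sketch below illustrates that flow only; the table name `wiki_article`, the field `content_vector`, the result unwrapping, and the model names are assumptions rather than the project's actual code.

```python
import os
from openai import OpenAI

openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

async def answer_question(db, question: str) -> str:
    """Answer a question from articles stored in SurrealDB (illustrative sketch).

    `db` is assumed to be an already-connected SurrealDB Python SDK client.
    """
    # 1. Embed the question with the same model used for the article embeddings.
    embedding = openai_client.embeddings.create(
        model="text-embedding-ada-002",
        input=question,
    ).data[0].embedding

    # 2. Retrieve the closest articles by cosine similarity (hypothetical schema).
    response = await db.query(
        "SELECT title, text FROM wiki_article "
        "ORDER BY vector::similarity::cosine(content_vector, $embedding) DESC "
        "LIMIT 5",
        {"embedding": embedding},
    )
    articles = response[0]["result"]  # exact result shape depends on the SDK version

    # 3. Ask the chat model to answer using only the retrieved context.
    context = "\n\n".join(f"{a['title']}: {a['text']}" for a in articles)
    completion = openai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return completion.choices[0].message.content
```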
The backend is powered by a FastAPI server, with Jinja2 handling template rendering and htmx enriching the frontend chat interface.
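As a rough sketch of how those pieces fit together (the routes, template names, and form field below are hypothetical): htmx posts the question to a FastAPI route, which renders a small Jinja2 fragment that htmx swaps into the chat window.

```python
from fastapi import FastAPI, Form, Request
from fastapi.templating import Jinja2Templates

app = FastAPI()
templates = Jinja2Templates(directory="templates")

async def answer_question(question: str) -> str:
    # Placeholder; the real handler would call OpenAI + SurrealDB as sketched above.
    return f"(answer to: {question})"

@app.get("/")
async def index(request: Request):
    # Full chat page; its form would use hx-post="/chat" and hx-swap to append replies.
    return templates.TemplateResponse("index.html", {"request": request})

@app.post("/chat")
async def chat(request: Request, question: str = Form(...)):
    answer = await answer_question(question)
    # Return only an HTML fragment; htmx swaps it into the page without a full reload.
    return templates.TemplateResponse(
        "message.html",
        {"request": request, "question": question, "answer": answer},
    )
```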
- SurrealDB: Ensure that SurrealDB is installed and configured on your machine. Installation instructions are available here.
- Python 3.11: We use `uv` to manage dependencies.
- SurrealDB Python SDK: The SurrealDB Python SDK is currently in beta. To install the dependency, please ensure you have `rust` installed on your system (a minimal connection sketch follows this list).
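For reference, connecting with the beta SDK looks roughly like this; the address, credentials, namespace, and database names below are placeholders for whatever your local instance uses:

```python
import asyncio
from surrealdb import Surreal

async def main():
    # The beta SDK exposes an async client over the WebSocket RPC endpoint.
    async with Surreal("ws://localhost:8000/rpc") as db:
        await db.signin({"user": "root", "pass": "root"})
        await db.use("test", "test")
        print(await db.query("INFO FOR DB;"))

asyncio.run(main())
```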
- Clone the repository.
- You will need an OpenAI API token for this project. If you don't have one, you can obtain it by following the OpenAI Developer Quickstart. Rename the `.env.example` file to `.env` and replace the placeholder with your key (a sketch of loading it follows below).
Note: This project is intended for experimental purposes only. Ensure that your OpenAI API key is kept secure.
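At startup the key can then be read from the `.env` file. The variable name `OPENAI_API_KEY` and the use of `python-dotenv` here are assumptions about the project's setup, so check `.env.example` for the actual name:

```python
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env in the working directory
api_key = os.environ["OPENAI_API_KEY"]  # assumed variable name; see .env.example
```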
To initialise SurrealDB, use the following `make` commands:
- Start SurrealDB with on-disk persistence: `make surreal-start`
- Initialise the database with the required tables and functions: `make surreal-init`
- If you need to reset the database: `make surreal-remove`
With `uv` installed, run:
`uv sync`
Activate the Python virtual environment:
`source .venv/bin/activate`
Download the Simple English Wikipedia dataset, which includes vector embeddings (~700MB compressed, ~1.7GB CSV):
`make download-data`
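If you want to inspect the data before inserting it, a few lines of pandas will do. The file path and column names below are guesses based on OpenAI's published Wikipedia-embeddings CSV and may not match what the download target actually fetches:

```python
import ast
import pandas as pd

# Hypothetical path; check the Makefile or data directory for the real file name.
df = pd.read_csv("data/vector_database_wikipedia_articles_embedded.csv", nrows=5)
print(df.columns.tolist())

# Embedding columns are stored as stringified lists of floats in the CSV.
first_vector = ast.literal_eval(df.loc[0, "content_vector"])
print(len(first_vector))  # 1536 for text-embedding-ada-002 embeddings
```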
Insert the dataset into SurrealDB:
`make surreal-insert`
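Under the hood, each row becomes a SurrealDB record containing the article text plus its embedding. A hand-rolled version of one insert might look like the following; the table and field names are illustrative, not necessarily what the make target uses:

```python
async def insert_article(db, row: dict) -> None:
    """Create one article record; `db` is a connected SurrealDB client."""
    await db.create("wiki_article", {
        "title": row["title"],
        "text": row["text"],
        "content_vector": row["content_vector"],  # list[float] embedding
    })
```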
To start the chat server:
`make server-start`
If you find this project helpful, consider supporting the development: