Create chat bots that know your data
Welcome to LlamaIndex Chat. You can create and share LLM chatbots that know your data (PDF or text documents).
Getting started with LlamaIndex Chat is a breeze. Visit https://chat.llamaindex.ai - a hosted version of LlamaIndex Chat with no user authentication that gets you started immediately.
LlamaIndex Chat is an example chatbot application for LlamaIndexTS featuring LlamaCloud.
You can:
- Create bots using prompt engineering and share them with other users.
- Modify the demo bots by using the UI or directly editing the `./app/bots/bot.data.ts` file (see the sketch after this list).
- Integrate your data by uploading documents or generating new data sources.
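As an illustration of what such an edit might look like, here is a minimal, hypothetical sketch of a bot entry; the field names below are assumptions for illustration, and the actual schema is defined in `./app/bots/bot.data.ts`:

```typescript
// Hypothetical sketch of a bot entry; the real schema lives in
// ./app/bots/bot.data.ts and may use different field names.
interface Bot {
  id: string; // unique identifier for the bot
  name: string; // display name shown in the UI
  context: string; // system prompt used for prompt engineering
  datasource?: string; // optional name of an attached LlamaCloud index
}

export const DEMO_BOTS: Bot[] = [
  {
    id: "my-docs-bot",
    name: "My Docs Bot",
    context: "You answer questions strictly based on the attached documents.",
    datasource: "my-docs",
  },
];
```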
Requirement: Node.js 18
- Clone the repository:
  `git clone https://github.com/run-llama/chat-llamaindex`
  `cd chat-llamaindex`
- Prepare the project:
  `pnpm install`
  `pnpm run create-llama`
  Note: The last step copies the chat UI component and file server route from the create-llama project; see `./create-llama.sh`.
- Set the environment variables:
  Edit the environment variables in `.env.development.local`. In particular, check your `OPENAI_API_KEY` and `LLAMA_CLOUD_API_KEY`, and set the LlamaCloud project to use (`LLAMA_CLOUD_PROJECT_NAME`).
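For orientation, the file might contain entries like the following; the variable names come from the step above, while the values are placeholders to replace with your own keys (`Default` matches the LlamaCloud project name used later in this document):

```shell
# .env.development.local -- placeholder values, replace with your own keys
OPENAI_API_KEY=sk-your-openai-key
LLAMA_CLOUD_API_KEY=llx-your-llamacloud-key
LLAMA_CLOUD_PROJECT_NAME=Default
```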
- Download the demo datasources:
  `pnpm run get-demo`
- Upload the demo datasources to your LlamaCloud account:
  `pnpm run generate-demo`
- Run the dev server:
  `pnpm dev`
Note: This section has not been used for a while and might be outdated.
You can use Docker for development and deployment of LlamaIndex Chat:
`docker build -t chat-llamaindex .`
`docker run -p 3000:3000 --env-file .env.development.local chat-llamaindex`
For those preferring Docker Compose, we've included a `docker-compose.yml` file. To run using Docker Compose:
`docker compose up`
Go to http://localhost:3000 in your web browser.
Note: By default, the Docker Compose setup maps the `cache` and `datasources` directories from your host machine to the Docker container, ensuring data persistence and accessibility between container restarts.
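As a rough illustration of that setup, a compose file could look like the sketch below; the repository's own `docker-compose.yml` is authoritative, and the container paths here are assumptions:

```yaml
# Hypothetical sketch of a compose setup matching the description above;
# the repository's own docker-compose.yml is authoritative.
services:
  chat-llamaindex:
    build: .
    ports:
      - "3000:3000"
    env_file:
      - .env.development.local
    volumes:
      - ./cache:/app/cache             # persist cache between restarts (container path assumed)
      - ./datasources:/app/datasources # keep datasources accessible from the host
```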
Deploying to Vercel is simple; click the button below and follow the instructions:
If you're deploying to a Vercel Hobby account, change the running time to 10 seconds, as this is the limit for the free plan.
If you want to use the sharing functionality, then you need to create a Vercel KV store and connect it to your project. Just follow this step from the quickstart. No further configuration is necessary as the app automatically uses a connected KV store.
LlamaIndex Chat supports sharing bots via URLs. Demo bots are read-only and can't be shared, but you can create new bots (or clone and modify a demo bot) and call the share functionality in the context menu. It creates a unique URL that you can share with others; anyone opening the URL can directly use the shared bot.
The app uses a `ChatEngine` for each bot with a `LlamaCloudIndex` attached.
To set which `LlamaCloudIndex` is used for a bot, change the `datasource` attribute in the bot's data (see the sketch below).
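To make the wiring concrete, here is a minimal sketch assuming the `llamaindex` package's `LlamaCloudIndex` and `ContextChatEngine` APIs; the index name, top-k value, and question are placeholders, and the app's actual server code may differ:

```typescript
import { LlamaCloudIndex, ContextChatEngine } from "llamaindex";

// Connect to an existing LlamaCloud index; `name` corresponds to the
// bot's `datasource` attribute (placeholder values throughout).
const index = new LlamaCloudIndex({
  name: "my-docs",
  projectName: process.env.LLAMA_CLOUD_PROJECT_NAME ?? "Default",
  apiKey: process.env.LLAMA_CLOUD_API_KEY,
});

// Chat engine that retrieves context from the index for every turn.
const chatEngine = new ContextChatEngine({
  retriever: index.asRetriever({ similarityTopK: 3 }),
});

const response = await chatEngine.chat({
  message: "What do the documents say about installation?",
});
console.log(response.message.content); // response shape may vary by llamaindex version
```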
Note: To use the changed bots, you have to clear your local storage; otherwise, the old bots are still used. You can clear your local storage by opening the developer tools, running `localStorage.clear()` in the console, and reloading the page.
To generate a new data source, create a new subfolder in the `datasources` directory and add the data files (e.g., PDFs).
Then, run the following command to create an index in the `Default` project on LlamaCloud:
`pnpm run generate <datasource-name>`
Where `<datasource-name>` is the name of the subfolder with your data files.
Note: On Windows, use `pnpm run generate:win <datasource-name>` instead.
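For example, to index a folder of PDFs (the folder name `my-docs` and the source path are placeholders):

```shell
# Create a new datasource folder and add documents (my-docs is a placeholder)
mkdir datasources/my-docs
cp ~/reports/*.pdf datasources/my-docs/

# Upload the files and create the index in your LlamaCloud project
pnpm run generate my-docs
```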
Thanks go to @Yidadaa for his ChatGPT-Next-Web project, which was used as a starter template for this project.