
llama-cpp-python-streamlit

A Streamlit app for using the llama-cpp-python high-level API

Index

  1. Installation
  2. Configuration
  3. Usage
    3.1 Deploy the Streamlit app
    3.2 Use
  4. Find Me
  5. License

1 Installation

  • install python3

```shell
apt install python3
```

  • install the Python requirements

```shell
pip install -r requirements.txt
```

2 Configuration

  • change api_url in src/config.json to the URL of your llama-cpp-python high-level API
  • set page_title to whatever you want
  • set n_ctx to the context size your API was started with
  • set default values for the model settings

src/config.json:

```json
{
    "api_url": "https://llama-cpp-python.mydomain.com",
    "page_title": "Llama-2-7b-Chat",
    "n_ctx": 2048,
    "enable_context": "True",
    "stream": "True",
    "max_tokens": "256",
    "temperature": "0.2",
    "top_p": "0.95",
    "top_k": "40",
    "repeat_penalty": "1.1",
    "stop": "###",
    "system_content": "User asks Questions to the AI. AI is helpful, kind, obedient, honest, and knows its own limits.",
    "prompt": "### Instructions:\n{prompt}\n\n### Response:\n"
}
```

  • to change the logo or favicon, replace the files inside the ./static folder
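Note that most of the model settings above are stored as JSON strings rather than native numbers or booleans. A minimal sketch of how such a config could be loaded and coerced into native Python types before use (the `CONFIG_JSON` sample and the `coerce` helper below are illustrative assumptions, not code from this repo):

```python
import json

# Sample mirroring the string-valued style of src/config.json above.
CONFIG_JSON = """
{
    "api_url": "https://llama-cpp-python.mydomain.com",
    "n_ctx": 2048,
    "stream": "True",
    "max_tokens": "256",
    "temperature": "0.2",
    "top_k": "40"
}
"""

def coerce(value):
    """Convert string-encoded settings into native Python types."""
    if not isinstance(value, str):
        return value            # already a number, leave it alone
    if value in ("True", "False"):
        return value == "True"  # string boolean -> bool
    try:
        return int(value)       # e.g. "256" -> 256
    except ValueError:
        pass
    try:
        return float(value)     # e.g. "0.2" -> 0.2
    except ValueError:
        return value            # plain string, e.g. the URL

config = {k: coerce(v) for k, v in json.loads(CONFIG_JSON).items()}
```

After coercion, `config["max_tokens"]` is the integer 256 and `config["stream"]` is a real boolean, so the values can be passed straight into API request parameters.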

3 Usage

3.1 Deploy the Streamlit app

  • run the Streamlit app

```shell
streamlit run streamlit_app.py
```

3.2 Use

  • browse to http://localhost:8501/
  • choose a supported endpoint
  • optional: adjust the model settings/parameters
  • enter your message

4 Find Me

E-Mail

5 License

License: GPL v3 - This project is licensed under the GNU General Public License v3.0 - see the gpl-3.0 file for details.
