A Streamlit app for using the llama-cpp-python high-level API.
- Install Python 3 from python.org or from your distribution's package repository:

  ```bash
  apt install python3
  ```

- Install the requirements:

  ```bash
  pip install -r requirements.txt
  ```
- Change `api_url` in `src/config.json` to the URL of your llama-cpp-python high-level API.
- Set `page_title` to whatever you want.
- Set `n_ctx` to the context size configured on your API.
- Set default values for the model settings.
`src/config.json`:

```json
{
  "api_url": "https://llama-cpp-python.mydomain.com",
  "page_title": "Llama-2-7b-Chat",
  "n_ctx": 2048,
  "enable_context": "True",
  "stream": "True",
  "max_tokens": "256",
  "temperature": "0.2",
  "top_p": "0.95",
  "top_k": "40",
  "repeat_penalty": "1.1",
  "stop": "###",
  "system_content": "User asks Questions to the AI. AI is helpful, kind, obedient, honest, and knows its own limits.",
  "prompt": "### Instructions:\n{prompt}\n\n### Response:\n"
}
```
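To check the API and these defaults outside the app, a minimal sketch of a direct request is shown below. It assumes the API is a standard llama-cpp-python OpenAI-compatible server exposing `/v1/completions` and accepting `top_k`/`repeat_penalty`; the example question is only an illustration.

```python
# Minimal sketch (not part of the app): send the config defaults straight to
# the API. Assumes a llama-cpp-python server with the OpenAI-compatible
# /v1/completions endpoint; adjust the route and fields to your deployment.
import json
import requests

with open("src/config.json") as f:
    config = json.load(f)

payload = {
    # Wrap an example user message in the configured prompt template.
    "prompt": config["prompt"].format(prompt="What is the capital of France?"),
    "max_tokens": int(config["max_tokens"]),
    "temperature": float(config["temperature"]),
    "top_p": float(config["top_p"]),
    "top_k": int(config["top_k"]),
    "repeat_penalty": float(config["repeat_penalty"]),
    "stop": [config["stop"]],
    "stream": False,
}

response = requests.post(f'{config["api_url"]}/v1/completions', json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["text"])
```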
- To change the logo or favicon, just replace the files inside the `./static` folder.
- Run the Streamlit app:

  ```bash
  streamlit run streamlit_app.py
  ```
- Browse to http://localhost:8501/.
- Choose a supported endpoint (see the request sketch after this list for the kind of call this maps to).
- Optional: adjust the model settings/parameters.
- Enter your message.
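For reference, a minimal sketch of a streaming chat request against such an endpoint is below. It assumes the API is a llama-cpp-python server with the OpenAI-compatible `/v1/chat/completions` route and reuses `system_content` from the config; the user message is only an illustration, and the requests the app actually sends may differ.

```python
# Minimal sketch (not the app's own code): stream a chat completion from the
# configured API. Assumes the OpenAI-compatible /v1/chat/completions route
# served by llama-cpp-python and server-sent "data: ..." chunks.
import json
import requests

with open("src/config.json") as f:
    config = json.load(f)

payload = {
    "messages": [
        {"role": "system", "content": config["system_content"]},
        {"role": "user", "content": "Hello, who are you?"},  # illustration only
    ],
    "max_tokens": int(config["max_tokens"]),
    "temperature": float(config["temperature"]),
    "stream": True,
}

url = f'{config["api_url"]}/v1/chat/completions'
with requests.post(url, json=payload, stream=True, timeout=60) as r:
    r.raise_for_status()
    for line in r.iter_lines():
        # Each streamed line looks like: data: {"choices": [{"delta": {...}}], ...}
        if not line or not line.startswith(b"data: "):
            continue
        chunk = line[len(b"data: "):]
        if chunk == b"[DONE]":  # sentinel some servers send when the stream ends
            break
        delta = json.loads(chunk)["choices"][0]["delta"]
        print(delta.get("content", ""), end="", flush=True)
```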
- This project is licensed under the GNU General Public License v3.0; see gpl-3.0 for details.