Local LLM Python Streamlit UI
Ollama v.x Streamlit Playground
This project demonstrates how to run and manage models locally using Ollama by building an interactive UI with Streamlit. The app has one page for running chat-based models and another for multimodal (vision) models such as LLaVA and BakLLaVA.

App in Action: https://github.com/tonykipkemboi/ollama_streamlit_demos
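As a minimal sketch of how a chat page can talk to a locally running Ollama server (assuming Ollama's default REST endpoint at `localhost:11434`; the helper names `build_payload` and `ask_ollama` are illustrative, not part of this repo):

```python
import json
import urllib.request

# Ollama's default local chat endpoint
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_payload(model, messages, stream=False):
    """Assemble the JSON body that Ollama's /api/chat endpoint expects."""
    return {"model": model, "messages": messages, "stream": stream}

def ask_ollama(model, prompt):
    """Send a single-turn chat request and return the assistant's reply text."""
    payload = build_payload(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # With stream=False, Ollama returns one JSON object containing the full reply
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

In a Streamlit page, a call like `ask_ollama("llama3", user_input)` (with a model you have pulled locally) would supply the text to render in the chat transcript.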