Local LLM Python Streamlit UI

From Chrysalis Archive
[[Category:python]]
[[Category:AI]]
[[Category:LLM]]
{{menuAIEngineering}}
== Ollama v.x Streamlit Playground ==
This project demonstrates how to run and manage models locally with Ollama through an interactive UI built with Streamlit.

Latest revision as of 20:30, 1 May 2024

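To give a feel for how such a chat page talks to a locally running Ollama server, here is a minimal sketch against Ollama's default local HTTP endpoint (`http://localhost:11434/api/chat`). This is an illustrative sketch, not code from the linked repository; the model name `llama2` is an assumption.

```python
import json
import urllib.request

# Ollama's default local chat endpoint (assumption: default install, default port).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model, history, stream=False):
    """Build the JSON body for Ollama's /api/chat endpoint from a
    chat history given as a list of {"role": ..., "content": ...} dicts."""
    return {"model": model, "messages": history, "stream": stream}

def ask_ollama(model, history):
    """POST the chat history to the local Ollama server and return
    the assistant's reply text."""
    body = json.dumps(build_chat_payload(model, history)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

In the Streamlit app itself, the `history` list would typically live in `st.session_state` and be rendered with `st.chat_message` / `st.chat_input`, with each reply from `ask_ollama` appended back onto the history.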
 
The app has a page for running chat-based models and another for multimodal vision models (llava and bakllava).
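For the vision page, Ollama's chat API accepts images as base64-encoded strings in a message's `images` field, which is how llava-style models receive the picture alongside the prompt. A hedged sketch (again assuming the default local endpoint; the prompt and image bytes are placeholders):

```python
import base64
import json
import urllib.request

def build_vision_message(prompt, image_bytes):
    """Attach a base64-encoded image to a user chat message, in the shape
    Ollama's vision-capable models (e.g. llava, bakllava) expect."""
    return {
        "role": "user",
        "content": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    }

def describe_image(model, prompt, image_bytes,
                   url="http://localhost:11434/api/chat"):
    """POST a prompt plus one image to a locally running Ollama server
    and return the model's textual reply."""
    body = json.dumps({
        "model": model,
        "messages": [build_vision_message(prompt, image_bytes)],
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

In the Streamlit UI the image bytes would come from an `st.file_uploader` widget, so the user can drop in a picture and ask questions about it.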
 
'''App in Action''' — https://github.com/tonykipkemboi/ollama_streamlit_demos