Notes on running local AIs
Latest revision as of 14:10, 28 April 2024
- an AI category
AI Engineering ∞ Notes on AI Machine Learning ☀ Notes on Self Hosting AI ☀ Hitchhiker's Guide to Local LLMs ☀ Notes on AI hardware vendors ☀ Local LLM Python Streamlit UI ☀ Graph Generalization using Functional Networks (GGUF) ☀ Compiling Python ☀ Steps to Build a Dialog-Wise Running AI RAG System ☀
YouTube: Seven Ways of Running LLMs Locally
Coding Models
From a Win10 WSL prompt: ollama run deepseek-coder (https://ollama.com/library/deepseek-coder)
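Once the model is pulled, the Ollama server also exposes it over HTTP on its default port 11434. A minimal sketch of calling deepseek-coder programmatically, assuming a locally running Ollama instance and its /api/generate endpoint (the prompt text here is only an illustration):

```python
import json
import urllib.request

# Default address of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "deepseek-coder") -> urllib.request.Request:
    """Construct a non-streaming generate request for the Ollama HTTP API."""
    payload = json.dumps({
        "model": model,    # model name as listed by `ollama list`
        "prompt": prompt,
        "stream": False,   # return one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Usage (requires the Ollama server to be running):
# with urllib.request.urlopen(build_request("Write a Python hello world")) as resp:
#     print(json.loads(resp.read())["response"])
```

Swapping the model name for any other entry in the Ollama library works the same way; only the initial `ollama run`/`ollama pull` step changes.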