Notes on running local AIs
{{menuAIEngineering}}
[[Category:local AI]]
=== Seven Ways of Running LLMs Locally ===
* https://kleiber.me/blog/2024/01/07/six-ways-running-llm-locally/
=== Coding Models ===
From a Windows 10 WSL prompt:
 ollama run deepseek-coder
https://ollama.com/library/deepseek-coder
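The <code>ollama run</code> command above opens an interactive chat; the same local model can also be queried programmatically through Ollama's REST API, which the server exposes on port 11434 by default. A minimal sketch in Python, assuming <code>ollama serve</code> (or the desktop app) is running and <code>deepseek-coder</code> has already been pulled; the helper and function names here are illustrative, not part of Ollama itself:

```python
import json
import urllib.request

# Default endpoint for Ollama's local generate API (assumes `ollama serve` is running).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming generate request body for the Ollama API."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "deepseek-coder") -> str:
    """Send a prompt to the local Ollama server and return the model's response text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Only runs when an Ollama server is actually listening on localhost:11434.
    print(generate("Write a Python function that reverses a string."))
```

With <code>"stream": False</code> the server returns one JSON object containing the whole completion; omitting it (the default is streaming) yields one JSON object per line instead.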
Latest revision as of 14:10, 28 April 2024