Notes on running local AIs

From Chrysalis Archive
[[Category:AI]]
{{menuAIEngineering}}
[[Category:local AI]]


=== Seven Ways of Running LLMs Locally ===
:* https://kleiber.me/blog/2024/01/07/six-ways-running-llm-locally/
=== Coding Models ===
From a Win10 WSL prompt, pull and run the model interactively:

 ollama run deepseek-coder

Model page: https://ollama.com/library/deepseek-coder
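Beyond the interactive CLI, a locally running Ollama server also exposes an HTTP API on port 11434. A minimal sketch in plain standard-library Python, assuming Ollama's documented <code>/api/generate</code> endpoint and the default local port; the model name and prompt are just illustrative placeholders:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one complete JSON reply instead of a token stream
    }).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the locally running Ollama server and return the completion."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage (requires the Ollama server running and the model already pulled): <code>generate("deepseek-coder", "Write a Python one-liner that reverses a string.")</code>.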

Latest revision as of 14:10, 28 April 2024