Run LLMs locally
Revision as of 14:33, 29 April 2024 by Planetoid (talk | contribs)

Run LLMs (Large language models) locally.

== List of LLMs software ==

[https://ollama.com/ Ollama]<ref>[https://blog.miniasp.com/post/2024/03/04/Useful-tool-Ollama Introducing a useful tool: Ollama, quickly launch and run large language models locally | The Will Will Web]</ref>
* Docker support: Available<ref>[https://ollama.com/blog/ollama-is-now-available-as-an-official-docker-image Ollama is now available as an official Docker image · Ollama Blog]</ref> {{Gd}}
* Supported OS: {{...
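As a sketch of the Docker support noted above: the image name <code>ollama/ollama</code> and port 11434 come from the official Ollama Docker image announcement linked in the reference; the container name, volume name, and model tag below are illustrative.

```shell
# Start the Ollama server in a container (CPU-only).
# The named volume persists downloaded models across container restarts.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and chat with a model inside the running container
# (the model tag "llama3" is an example, not a requirement).
docker exec -it ollama ollama run llama3
```

Exposing port 11434 also lets other local tools talk to the Ollama HTTP API on <code>http://localhost:11434</code>.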
Retrieved from "https://wiki.planetoid.info/index.php/Run_LLMs_locally"