Run LLMs locally
This page lists software for running LLMs (large language models) locally.

List of LLM software
Ollama - Get up and running with large language models[1]
- Docker support: Available[2]
- Supported OS: Win, Mac & Linux
- User Interface: CLI (Command-line interface)
- Support API endpoint: Available[3]
- License: OSS, MIT[4]
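Ollama's API endpoint is a local REST server on port 11434 (its documented default). A minimal sketch of calling it from the standard library, assuming a running Ollama server with a model already pulled; the model name "llama3" and the prompt are placeholders:

```python
import json
import urllib.request

# Ollama's local REST API listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a non-streaming /api/generate request for the Ollama server.

    The model name is an assumption; use any model you have pulled locally.
    """
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def generate(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to the local Ollama server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]
```

With `stream": False` the server returns one JSON object whose `response` field holds the full completion; omit it to receive a stream of JSON lines instead.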
LM Studio - Discover, download, and run local LLMs
- Docker support: N/A
- Supported OS: Win, Mac & Linux
- User Interface: GUI
- Support API endpoint: Available[5]
- License: Mixed; Apache License 2.0 for lmstudio.js[6], but the Terms of Use[7] restrict use of the Services to "your personal, non-commercial purposes."
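LM Studio's local server speaks an OpenAI-compatible API; port 1234 is its documented default. A minimal sketch, assuming the server has been started with a model loaded; the model name is a placeholder (LM Studio serves whichever model is loaded):

```python
import json
import urllib.request

# LM Studio's local server exposes OpenAI-compatible routes on port 1234.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"


def build_chat_request(user_message: str, model: str = "local-model") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the local server."""
    payload = {
        "model": model,  # placeholder; the loaded model answers regardless
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def chat(user_message: str) -> str:
    """Send one user message and return the assistant's reply."""
    with urllib.request.urlopen(build_chat_request(user_message)) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Because the route and payload follow the OpenAI chat-completions shape, existing OpenAI client code can usually be pointed at this endpoint by changing only the base URL.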
Jan - An open-source ChatGPT alternative that runs 100% offline
- Docker support: Available
- Supported OS: Win, Mac & Linux
- User Interface: GUI
- Support API endpoint: Available
- License: OSS, AGPL v3[8]
GPT4All - Run local LLMs on any device
- Docker support: N/A
- Supported OS: Win, Mac & Linux
- User Interface: GUI
- Support API endpoint: Available[9]
- License: OSS, MIT[10]
Text generation web UI (oobabooga) - A Gradio web UI for large language models
- Docker support: Available
- Supported OS: Win, Mac & Linux
- User Interface: GUI
- Support API endpoint: Available[11]
- License: OSS, AGPL v3[12]
PrivateGPT - Interact with your documents using the power of GPT, 100% privately
- Docker support: Available
- Supported OS: Win, Mac & Linux[13]
- User Interface: GUI
- Support API endpoint: Available[14]
- License: OSS, Apache v.2[15]
LLM: A CLI utility and Python library for interacting with Large Language Models (llm · PyPI)
- Docker support:
- Supported OS:
- User Interface: CLI
- Support API endpoint: N/A
- License:
(left blank intentionally)
Legend for each entry:
- Docker support: Available or N/A
- Supported OS: Win, Mac & Linux
- User Interface: GUI or CLI
- Support API endpoint: Available or N/A
- License: Mixed, OSS, or proprietary
Further reading
- 5 easy ways to run an LLM locally | InfoWorld
- Troyanovsky/Local-LLM-Comparison-Colab-UI: Compare the performance of different LLM that can be deployed locally on consumer hardware. Run yourself with Colab WebUI.
References
- ↑ Introducing a useful tool: Ollama, quickly launch and run large language models locally | The Will Will Web
- ↑ Ollama is now available as an official Docker image · Ollama Blog
- ↑ ollama/ollama: Get up and running with Llama 3, Mistral, Gemma, and other large language models.
- ↑ ollama/LICENSE at main · ollama/ollama
- ↑ Local LLM Server | LM Studio
- ↑ lmstudio.js/LICENSE at main · lmstudio-ai/lmstudio.js
- ↑ Terms of Use | LM Studio (https://lmstudio.ai/terms)
- ↑ jan/LICENSE at dev · janhq/jan
- ↑ Generation - GPT4All Documentation
- ↑ gpt4all/LICENSE.txt at main · nomic-ai/gpt4all
- ↑ 12 ‐ OpenAI API · oobabooga/text-generation-webui Wiki
- ↑ text-generation-webui/LICENSE at main · oobabooga/text-generation-webui
- ↑ Installation – PrivateGPT | Docs
- ↑ API Reference overview – PrivateGPT | Docs
- ↑ private-gpt/LICENSE at main · zylon-ai/private-gpt