Revision as of 21:20, 3 July 2024
Run LLMs (large language models) locally
List of LLM software
Ollama[1]
- Docker support: Available[2]
- Supported OS: Win, macOS & Linux
- User Interface: CLI (command-line interface)
- Support API endpoint: Available[3]
- Supported embedding models:
- License: OSS, MIT[4]
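Ollama's REST API listens on port 11434 by default. A minimal sketch of a non-streaming generation request, assuming the `ollama serve` daemon is running and a `llama3` model has been pulled (the actual send is left commented out):

```python
import json
from urllib import request

# Build a non-streaming request for Ollama's /api/generate endpoint.
payload = {
    "model": "llama3",                # any model tag pulled via `ollama pull`
    "prompt": "Why is the sky blue?",
    "stream": False,                  # one JSON object instead of a token stream
}
req = request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# With Ollama running, uncomment to send and print the completion:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```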
LM Studio - Discover, download, and run local LLMs
- Docker support: N/A
- Supported OS: Win, macOS & Linux
- User Interface: GUI
- Support API endpoint: Available[5]
- Supported embedding models:
- License: Mixed; Apache License 2.0 for lmstudio.js[6], but the Terms of Use[7] restrict usage to "your personal, non-commercial purposes."
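LM Studio's local server speaks an OpenAI-compatible protocol, by default on port 1234, once started from the app's Local Server tab. A sketch of a chat-completions request; the server answers with whichever model is currently loaded in the GUI:

```python
import json
from urllib import request

# OpenAI-style chat request against LM Studio's local server.
payload = {
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "temperature": 0.7,
}
req = request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# With the local server running, uncomment to send:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```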
Jan
- Docker support: Available
- Supported OS: Win, macOS & Linux
- User Interface: CLI (command-line interface)
- Support API endpoint: Available
- Supported embedding models:
- License: OSS, AGPL v3[8]
GPT4All
- Docker support: N/A
- Supported OS: Win, macOS & Linux
- User Interface: GUI
- Support API endpoint: Available[9]
- Supported embedding models: "SBert and Nomic Embed Text v1 & v1.5"[10]
- License: OSS, MIT[11]
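GPT4All's endpoint is exposed through its Python bindings (`pip install gpt4all`). A hedged sketch: the model name below is only an example, and the model file is downloaded on first use, so the work is wrapped in a function rather than run at import time:

```python
def ask_gpt4all(prompt: str) -> str:
    """Generate a reply via GPT4All's Python bindings.

    Requires `pip install gpt4all`; the model file is downloaded
    the first time this runs.
    """
    from gpt4all import GPT4All  # deferred: only needed when actually called

    model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # example model name
    with model.chat_session():
        return model.generate(prompt, max_tokens=64)

# Example (requires the package and a model download):
# print(ask_gpt4all("Name three colors."))
```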
Text generation web UI (oobabooga)
- Docker support: Available
- Supported OS: Win, macOS & Linux
- User Interface: GUI
- Support API endpoint: Available[12]
- Supported embedding models:
- License: OSS, GNU AGPL v3[13]
PrivateGPT
- Docker support: Available
- Supported OS: Win, macOS & Linux[14]
- User Interface: GUI
- Support API endpoint: Available[15]
- Supported embedding models:
- License: OSS, Apache v.2[16]
AnythingLLM | The ultimate AI business intelligence tool
- Docker support: Available[17]
- Supported OS: Win, macOS & Linux
- User Interface: GUI
- Support API endpoint:
- License: Mixed, OSS, MIT[18]
LLM: A CLI utility and Python library for interacting with Large Language Models (llm · PyPI)
- Docker support: Untested; the `llm` package should work when installed inside a Python-based Docker container.
- Supported OS:
- User Interface: CLI
- Support API endpoint: N/A
- Supported embedding models:
- License:
MLC LLM
- Docker support: Not available[19]
- Supported OS: Win, macOS & Linux[20]
- User Interface: CLI
- Support API endpoint: Available[21]
- Supported embedding models:
- License: OSS, Apache License Version 2.0[22]
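MLC LLM's REST server is likewise OpenAI-compatible per its REST API docs[21]; the sketch below assumes a server started with `mlc_llm serve <model>` on its default address (127.0.0.1:8000), with the model id as a placeholder:

```python
import json
from urllib import request

# Chat request against an MLC LLM server started with, e.g.:
#   mlc_llm serve HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC
payload = {
    "model": "Llama-3-8B-Instruct-q4f16_1-MLC",  # placeholder model id
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = request.Request(
    "http://127.0.0.1:8000/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# With the server running, uncomment to send:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```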
Template for new entries:
* Docker support: Available or not
* Supported OS: {{Win}}, {{Mac}} & {{Linux}}
* User Interface: GUI or CLI
* Support API endpoint: Available or not
* Supported embedding models:
* License: Mixed, OSS or copyright
Further reading
- 5 easy ways to run an LLM locally | InfoWorld
- Troyanovsky/Local-LLM-Comparison-Colab-UI: Compare the performance of different LLM that can be deployed locally on consumer hardware. Run yourself with Colab WebUI.
References
- ↑ Introducing a handy tool: Ollama, quickly launch and run large language models locally | The Will Will Web
- ↑ Ollama is now available as an official Docker image · Ollama Blog
- ↑ ollama/ollama: Get up and running with Llama 3, Mistral, Gemma, and other large language models.
- ↑ ollama/LICENSE at main · ollama/ollama
- ↑ Local LLM Server | LM Studio
- ↑ lmstudio.js/LICENSE at main · lmstudio-ai/lmstudio.js
- ↑ https://lmstudio.ai/terms
- ↑ jan/LICENSE at dev · janhq/jan
- ↑ Generation - GPT4All Documentation
- ↑ FAQ - GPT4All
- ↑ gpt4all/LICENSE.txt at main · nomic-ai/gpt4all
- ↑ 12 ‐ OpenAI API · oobabooga/text-generation-webui Wiki
- ↑ text-generation-webui/LICENSE at main · oobabooga/text-generation-webui
- ↑ Installation – PrivateGPT | Docs
- ↑ API Reference overview – PrivateGPT | Docs
- ↑ private-gpt/LICENSE at main · zylon-ai/private-gpt
- ↑ 🆚 Desktop Vs. Docker | AnythingLLM
- ↑ anything-llm/LICENSE at master · Mintplex-Labs/anything-llm
- ↑ Docker support · Issue #33 · mlc-ai/mlc-llm
- ↑ Install MLC LLM Python Package — mlc-llm 0.1.0 documentation
- ↑ REST API — mlc-llm 0.1.0 documentation
- ↑ mlc-llm/LICENSE at main · mlc-ai/mlc-llm