Run LLMs locally

From LemonWiki共筆

Revision as of 21:20, 3 July 2024

Run LLMs (large language models) locally

List of LLM software

Ollama[1]

  • Docker support: Available[2]
  • Supported OS: Windows, macOS & Linux
  • User Interface: CLI (Command-line interface)
  • Support API endpoint: Available[3]
  • Supported embedding models:
  • License: OSS, MIT[4]
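Ollama's REST endpoint can be exercised with nothing but the standard library. A minimal sketch, assuming the default port 11434 and an already-pulled model (the model name in the usage comment is illustrative):

```python
import json
import urllib.request

# Ollama's server listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests a single JSON response instead of
    newline-delimited streaming chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, url=OLLAMA_URL):
    """POST the prompt to a locally running Ollama server and return the text."""
    data = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires `ollama serve` running and a pulled model,
# e.g. `ollama pull llama3`):
# print(generate("llama3", "Why is the sky blue?"))
```
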

LM Studio - Discover, download, and run local LLMs

  • Docker support: N/A
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available[5]
  • Supported embedding models:
  • License: Mixed; Apache License 2.0 for lmstudio.js[6]. But Terms of Use[7] mentioned "using the Services for your personal, non-commercial purposes."
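LM Studio's local server exposes an OpenAI-style chat-completions endpoint. A minimal sketch, assuming the documented default port 1234; the model name is illustrative, and the same request shape works against the other OpenAI-compatible local servers on this page (only the port differs):

```python
import json
import urllib.request

# LM Studio's local server defaults to port 1234 and mimics the
# OpenAI /v1/chat/completions API.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_payload(model, user_message, temperature=0.7):
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": temperature,
    }

def chat(user_message, model="local-model", url=LMSTUDIO_URL):
    """Send one user message and return the assistant's reply text."""
    data = json.dumps(build_chat_payload(model, user_message)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]

# Usage (requires the server started from the LM Studio app):
# print(chat("Summarize what a GGUF file is."))
```
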

janhq/jan: Jan is an open source alternative to ChatGPT that runs 100% offline on your computer. Multiple engine support (llama.cpp, TensorRT-LLM)

  • Docker support: Available
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available
  • Supported embedding models:
  • License: AGPL v3[8]

GPT4All

  • Docker support: N/A
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available[9]
  • Supported embedding models: "SBert and Nomic Embed Text v1 & v1.5"[10]
  • License: OSS, MIT[11]
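The embedding models GPT4All bundles (SBert, Nomic Embed) return plain float vectors, which are typically compared with cosine similarity. A minimal sketch of that comparison; the `gpt4all` package calls are left as commented assumptions since they require installing the package and downloading a model:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hedged sketch of producing the vectors with the gpt4all package
# (assumption: requires `pip install gpt4all`, which fetches an
# embedding model on first use):
#
#   from gpt4all import Embed4All
#   embedder = Embed4All()
#   v1 = embedder.embed("local LLM runners")
#   v2 = embedder.embed("software for running models offline")
#   print(cosine_similarity(v1, v2))
```
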

oobabooga/text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

  • Docker support: Available
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available[12]
  • Supported embedding models:
  • License: OSS, AGPL v3[13]

PrivateGPT

  • Docker support: Available
  • Supported OS: Windows, macOS & Linux[14]
  • User Interface: GUI
  • Support API endpoint: Available[15]
  • Supported embedding models:
  • License: OSS, Apache 2.0[16]

AnythingLLM | The ultimate AI business intelligence tool

  • Docker support: Available[17]
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint:
  • Supported embedding models:
  • License: Mixed, OSS, MIT[18]

LLM: A CLI utility and Python library for interacting with Large Language Models (llm · PyPI)

  • Docker support: Possibly; the `llm` package should run inside a Python Docker container, but this has not been tested.
  • Supported OS:
  • User Interface: CLI
  • Support API endpoint: N/A
  • Supported embedding models:
  • License:
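Since `llm` is a CLI utility, one way to script it is through subprocess. A minimal sketch, assuming `llm` is on PATH and a model is configured; the model name in the usage comment is illustrative:

```python
import subprocess

def build_llm_command(prompt, model=None):
    """Assemble an argv list for the `llm` CLI (no shell quoting needed)."""
    cmd = ["llm"]
    if model:
        cmd += ["-m", model]  # select a non-default model
    cmd.append(prompt)
    return cmd

def run_llm(prompt, model=None):
    """Run the prompt through the llm CLI and return its stdout."""
    result = subprocess.run(
        build_llm_command(prompt, model),
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Usage (requires `pip install llm` plus an API key or a local-model plugin):
# print(run_llm("Three names for a pet pelican", model="gpt-4o-mini"))
```
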

MLC LLM

  • Docker support: Not available[19]
  • Supported OS: Windows, macOS & Linux[20]
  • User Interface: CLI
  • Support API endpoint: Available[21]
  • Supported embedding models:
  • License: OSS, Apache License Version 2.0[22]

Template for new entries:

* Docker support: Available or not
* Supported OS: {{Win}}, {{Mac}} & {{Linux}}
* User Interface: GUI or CLI
* Support API endpoint: Available or not
* Supported embedding models:
* License: Mixed, OSS or copyright

Further reading

References

3. ollama/ollama: Get up and running with Llama 3, Mistral, Gemma, and other large language models (https://github.com/ollama/ollama?tab=readme-ov-file#rest-api)
4. ollama/LICENSE at main · ollama/ollama (https://github.com/ollama/ollama/blob/main/LICENSE)
5. Local LLM Server | LM Studio (https://lmstudio.ai/docs/local-server)
6. lmstudio.js/LICENSE at main · lmstudio-ai/lmstudio.js (https://github.com/lmstudio-ai/lmstudio.js/blob/main/LICENSE)
7. https://lmstudio.ai/terms
8. jan/LICENSE at dev · janhq/jan (https://github.com/janhq/jan/blob/dev/LICENSE)
9. Generation - GPT4All Documentation (https://docs.gpt4all.io/gpt4all_python.html)
10. FAQ - GPT4All (https://docs.gpt4all.io/gpt4all_help/faq.html#what-are-the-system-requirements)
11. gpt4all/LICENSE.txt at main · nomic-ai/gpt4all (https://github.com/nomic-ai/gpt4all/blob/main/LICENSE.txt)
12. 12 ‐ OpenAI API · oobabooga/text-generation-webui Wiki (https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API#examples)
13. text-generation-webui/LICENSE at main · oobabooga/text-generation-webui (https://github.com/oobabooga/text-generation-webui/blob/main/LICENSE)
15. API Reference overview – PrivateGPT | Docs (https://docs.privategpt.dev/api-reference/overview/api-reference-overview)
16. private-gpt/LICENSE at main · zylon-ai/private-gpt (https://github.com/zylon-ai/private-gpt/blob/main/LICENSE)
21. REST API — mlc-llm 0.1.0 documentation (https://llm.mlc.ai/docs/deploy/rest.html)
22. mlc-llm/LICENSE at main · mlc-ai/mlc-llm (https://github.com/mlc-ai/mlc-llm/blob/main/LICENSE)