Revision as of 14:51, 9 June 2024

Run LLMs (Large Language Models) locally

List of local LLM software

Ollama[1]

  • Docker support: Available[2]
  • Supported OS: Windows, macOS & Linux
  • User Interface: CLI (Command-line interface)
  • Support API endpoint: Available[3]
  • License: OSS, MIT[4]
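The API endpoint above can be exercised with a quick sketch. This assumes the Ollama server is running on its default port (11434) and that a model (here "llama3", an example name) has already been pulled with `ollama pull`:

```shell
# Generate a completion via Ollama's local REST API.
# Assumes the server is running on the default port 11434
# and the "llama3" model has been pulled beforehand.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```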

LM Studio - Discover, download, and run local LLMs

  • Docker support: N/A
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available[5]
  • License: Mixed; lmstudio.js is under the Apache License 2.0[6], but the Terms of Use[7] restrict use of the Services to "your personal, non-commercial purposes."
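LM Studio's local server exposes an OpenAI-compatible API. A minimal sketch, assuming the server has been started from the app on its default port (1234) with a model already loaded:

```shell
# Chat completion against LM Studio's OpenAI-compatible local server.
# Assumes the server is enabled in the app (default port 1234)
# and a model is loaded.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```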

janhq/jan: Jan is an open source alternative to ChatGPT that runs 100% offline on your computer. Multiple engine support (llama.cpp, TensorRT-LLM)

  • Docker support: Available
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available
  • License: OSS, AGPL v3[8]

GPT4All

  • Docker support: N/A
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available[9]
  • License: OSS, MIT[10]
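GPT4All's desktop app can also serve a local API. A sketch, assuming the API server is enabled in the app's settings; the port (4891) and OpenAI-compatible path are assumptions here and should be checked against the installed version:

```shell
# Completion request to GPT4All's optional local API server.
# The server must be enabled in the desktop app's settings;
# the port (4891) and path are assumptions, verify locally.
curl http://localhost:4891/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```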

oobabooga/text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

  • Docker support: Available
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint: Available[11]
  • License: OSS, GNU GPL v3[12]
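A sketch of the API usage, assuming text-generation-webui was launched with its `--api` flag, which exposes an OpenAI-compatible endpoint (default port 5000 is an assumption, check your launch flags):

```shell
# Chat completion against text-generation-webui's API extension.
# Assumes the UI was started with --api; the default port 5000
# is an assumption, verify against your launch configuration.
curl http://localhost:5000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```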

PrivateGPT

  • Docker support: Available
  • Supported OS: Windows, macOS & Linux[13]
  • User Interface: GUI
  • Support API endpoint: Available[14]
  • License: OSS, Apache License 2.0[15]

AnythingLLM | The ultimate AI business intelligence tool

  • Docker support: Available[16]
  • Supported OS: Windows, macOS & Linux
  • User Interface: GUI
  • Support API endpoint:
  • License: Mixed, OSS, MIT[17]

LLM: A CLI utility and Python library for interacting with Large Language Models (llm · PyPI)

  • Docker support: Untested; the `llm` package should work inside a Python Docker container, but this has not been verified.
  • Supported OS:
  • User Interface: CLI
  • Support API endpoint: N/A
  • License:
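Basic usage of the `llm` CLI is a pip install followed by a prompt. Local models come via plugins; the plugin and model id below are examples, not a definitive setup:

```shell
# Install the llm CLI and a plugin for local models.
# The llm-gpt4all plugin and the model id are examples; run
# `llm models` after install to see what is actually available.
pip install llm
llm install llm-gpt4all
llm -m mistral-7b-instruct-v0 "Five creative names for a pet pelican"
```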

MLC LLM

  • Docker support: Not available[18]
  • Supported OS: Windows, macOS & Linux[19]
  • User Interface: CLI
  • Support API endpoint: Available[20]
  • License: OSS, Apache License Version 2.0[21]
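Per the cited docs[20], MLC LLM's REST API is OpenAI-compatible. A sketch; the `serve` subcommand, the model identifier, and the default port 8000 are all assumptions to verify against the installed version's documentation:

```shell
# Start MLC LLM's REST server and query it.
# The serve subcommand, model identifier, and port 8000 are
# assumptions; consult the installed version's docs.
mlc_llm serve HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC &
curl http://127.0.0.1:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```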


Template for new entries:

* Docker support: Available or not
* Supported OS: {{Win}}, {{Mac}} & {{Linux}}
* User Interface: GUI or CLI
* Support API endpoint: Available or not
* License: Mixed, OSS or copyright

Further reading

References