Run LLMs locally

Latest revision as of 11:28, 30 April 2024

Run LLMs (large language models) locally

List of LLM software

Ollama[1]

  • Docker support: Available[2]
  • Supported OS: Windows, Mac & Linux
  • User Interface: CLI (Command-line interface)
  • Support API endpoint: Available[3]
  • License: OSS, MIT[4]
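Ollama's API endpoint[3] can be exercised with a plain HTTP request. The sketch below builds a request for Ollama's native `/api/generate` endpoint; the port (11434) is Ollama's default, and the model name "llama3" is an assumption — it works only if that model has been pulled and `ollama serve` is running.

```python
# Sketch of calling Ollama's native REST API (default port 11434).
# Assumes `ollama serve` is running and the "llama3" model has been pulled.
import json
from urllib import request

payload = {
    "model": "llama3",           # any locally pulled model name
    "prompt": "Why is the sky blue?",
    "stream": False,             # one JSON object instead of a token stream
}

def generate(body: dict, url: str = "http://localhost:11434/api/generate") -> dict:
    """POST the payload and return Ollama's parsed JSON response."""
    req = request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# print(generate(payload)["response"])  # requires a running Ollama server
print(json.dumps(payload, indent=2))
```

With `"stream": False` the server answers with a single JSON object whose `response` field holds the full completion; omit it to receive newline-delimited JSON chunks instead.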

LM Studio - Discover, download, and run local LLMs

  • Docker support: N/A
  • Supported OS: Windows, Mac & Linux
  • User Interface: GUI
  • Support API endpoint: Available[5]
  • License: Mixed; Apache License 2.0 for lmstudio.js[6], but the Terms of Use[7] restrict you to "using the Services for your personal, non-commercial purposes."
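LM Studio's local server[5] speaks the OpenAI chat-completions wire format. The sketch below only assembles the request body; the port (1234) is LM Studio's default, and the `"local-model"` name is a placeholder — LM Studio serves whichever model is currently loaded in the app.

```python
# Sketch of an OpenAI-compatible chat request against LM Studio's local
# server (default http://localhost:1234/v1). Port and model name are
# assumptions from a default install.
import json
from urllib import request

body = {
    "model": "local-model",  # placeholder; LM Studio uses the loaded model
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what a GGUF file is."},
    ],
    "temperature": 0.7,
}

def chat(url: str = "http://localhost:1234/v1/chat/completions") -> str:
    """POST the chat request and return the assistant's reply text."""
    req = request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# print(chat())  # requires LM Studio's local server to be running
print(len(body["messages"]))
```

Because the format is OpenAI-compatible, the same request body also works against other tools on this list that expose an OpenAI-style endpoint, such as text-generation-webui[11], by changing only the URL.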

janhq/jan: Jan is an open source alternative to ChatGPT that runs 100% offline on your computer. Multiple engine support (llama.cpp, TensorRT-LLM)

  • Docker support: Available
  • Supported OS: Windows, Mac & Linux
  • User Interface: GUI
  • Support API endpoint: Available
  • License: OSS, AGPL v3[8]

GPT4All

  • Docker support: N/A
  • Supported OS: Windows, Mac & Linux
  • User Interface: GUI
  • Support API endpoint: Available[9]
  • License: OSS, MIT[10]

oobabooga/text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.

  • Docker support: Available
  • Supported OS: Windows, Mac & Linux
  • User Interface: GUI
  • Support API endpoint: Available[11]
  • License: OSS, GNU v.3[12]

PrivateGPT

  • Docker support: Available
  • Supported OS: Windows, Mac & Linux[13]
  • User Interface: GUI
  • Support API endpoint: Available[14]
  • License: OSS, Apache v.2[15]

AnythingLLM | The ultimate AI business intelligence tool

  • Docker support: Available[16]
  • Supported OS: Windows, Mac & Linux
  • User Interface: GUI
  • Support API endpoint:
  • License: Mixed, OSS, MIT[17]

LLM: A CLI utility and Python library for interacting with Large Language Models (llm · PyPI)

  • Docker support: The `llm` package should work when installed in a Python Docker container, but this has not been tested.
  • Supported OS:
  • User Interface: CLI
  • Support API endpoint: N/A
  • License:
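The `llm` tool is driven from the command line (e.g. `llm -m MODEL "PROMPT"`). The sketch below wraps that invocation from Python via subprocess; the model name "llama3" is hypothetical and assumes a plugin serving local models has been installed, and the script degrades gracefully when the CLI is absent.

```python
# Sketch of shelling out to the `llm` CLI from Python. The model name is a
# hypothetical example; it requires a configured model or plugin. The
# availability check lets the script run even without `llm` installed.
import shutil
import subprocess

def run_llm(prompt: str, model: str = "llama3") -> str:
    """Equivalent CLI: llm -m llama3 "PROMPT". Returns the completion text."""
    out = subprocess.run(
        ["llm", "-m", model, prompt],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if shutil.which("llm"):
    print(run_llm("Explain Docker in one sentence"))
else:
    print("llm CLI not found; install with: pip install llm")
```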


Template for a new software entry:

* Docker support: Available or not
* Supported OS: {{Win}}, {{Mac}} & {{Linux}}
* User Interface: GUI or CLI
* Support API endpoint: Available or not
* License: Mixed, OSS or copyright

Further reading

References