Run LLMs locally

Revision as of 30 April 2024
* Support API endpoint: Available<ref>[https://docs.privategpt.dev/api-reference/overview/api-reference-overview API Reference overview – PrivateGPT | Docs]</ref>
* License: OSS, Apache 2.0<ref>[https://github.com/zylon-ai/private-gpt/blob/main/LICENSE private-gpt/LICENSE at main · zylon-ai/private-gpt]</ref>
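To illustrate what "Support API endpoint: Available" means in practice, here is a minimal sketch of building a request to a locally running PrivateGPT server. The port (8001), route (/v1/completions), and payload fields are assumptions based on the API reference cited above — check your own install's docs before relying on them.

```python
import json
import urllib.request

# Assumed default address of a local PrivateGPT server.
BASE_URL = "http://localhost:8001"

def build_completion_request(prompt: str, use_context: bool = False) -> urllib.request.Request:
    """Build a POST request for the (assumed) /v1/completions route.

    use_context asks the server to answer using previously ingested
    documents instead of the bare model.
    """
    payload = {"prompt": prompt, "use_context": use_context}
    return urllib.request.Request(
        f"{BASE_URL}/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (requires a running PrivateGPT server):
# with urllib.request.urlopen(build_completion_request("Summarize my docs")) as resp:
#     print(json.load(resp))
```

Only the request construction is shown; the network call is left commented out since it needs a live server.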
[https://useanything.com/ AnythingLLM | The ultimate AI business intelligence tool]
* Docker support: Available<ref>[https://docs.useanything.com/desktop-vs-docker 🆚 Desktop Vs. Docker | AnythingLLM]</ref>
* Supported OS: {{Win}}, {{Mac}} & {{Linux}}
* User Interface: GUI
* Support API endpoint:
* License: Mixed, OSS, MIT<ref>[https://github.com/Mintplex-Labs/anything-llm/blob/master/LICENSE anything-llm/LICENSE at master · Mintplex-Labs/anything-llm]</ref>


[https://llm.datasette.io/en/stable/ LLM: A CLI utility and Python library for interacting with Large Language Models] ([https://pypi.org/project/llm/ llm · PyPI])
* Support API endpoint: N/A
* License:


