Run LLMs locally: Difference between revisions

109 bytes added ,  3 January 2025
* [https://github.com/Troyanovsky/Local-LLM-Comparison-Colab-UI Troyanovsky/Local-LLM-Comparison-Colab-UI: Compare the performance of different LLM that can be deployed locally on consumer hardware. Run yourself with Colab WebUI.]
* [https://abishekmuthian.com/how-i-run-llms-locally/ How I run LLMs locally - Abishek Muthian]
* [https://iamhlb.notion.site/HOWTO-Setup-Local-LLM-16757ba4def5805c9baad0c78d54d62f HOWTO: Setup Local LLM]


== References ==
