Gerbil lets you run LLMs locally, powered by KoboldCpp, itself a highly modified fork of llama.cpp.
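Once Gerbil has a model loaded, the underlying KoboldCpp server answers HTTP requests on localhost. Here is a minimal sketch of querying it from Python, assuming KoboldCpp's default port (5001) and its KoboldAI-style generate endpoint; check Gerbil's settings for the actual address:

```python
import json
import urllib.request

# Assumption: KoboldCpp (as launched by Gerbil) is listening on its
# default port, 5001, with the KoboldAI-style generate endpoint.
URL = "http://localhost:5001/api/v1/generate"

payload = {
    "prompt": "Explain what a quantized model is in one sentence.",
    "max_length": 80,    # tokens to generate
    "temperature": 0.7,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# The endpoint returns {"results": [{"text": "..."}]}.
print(body["results"][0]["text"])
```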
Elelem is a simple LLM client that connects seamlessly with OpenAI API-compatible services.
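"OpenAI API-compatible" means the service accepts the standard /v1/chat/completions request shape, so a client like Elelem can talk to any such backend. A minimal sketch of that request from Python; the base URL and model name below are placeholders for whatever server you point it at:

```python
import json
import urllib.request

# Assumption: any OpenAI-compatible server works here; this base URL is
# a placeholder for whatever service the client is configured to use.
BASE_URL = "http://localhost:8080/v1"

payload = {
    "model": "local-model",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Say hello in five words."},
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer not-needed",  # many local servers ignore the key
    },
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["choices"][0]["message"]["content"])
```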
Did you know that even a Raspberry Pi 5 can run LLMs locally? Even better, the developer of Pi-Apps has produced an optimized install script that makes the setup Raspberry Pi 5 friendly.
Ollama is early-stage software that lets you run and chat with Llama 2 and other models.
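Alongside its CLI, Ollama serves a REST API on port 11434 by default. A minimal sketch of a non-streaming generation request, assuming the daemon is running and the model has already been pulled with `ollama pull llama2`:

```python
import json
import urllib.request

# Assumption: the Ollama daemon is running locally on its default port
# (11434) and llama2 has already been pulled.
URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue? Answer briefly.",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# The non-streaming response carries the full completion in "response".
print(body["response"])
```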