Gerbil lets you run LLMs locally, powered by KoboldCpp, which is itself a heavily modified fork of llama.cpp.
Elelem is a simple LLM client that connects seamlessly with OpenAI API compatible services.
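Clients like Elelem can talk to any server that speaks the OpenAI chat completions wire format. A minimal sketch of the request shape such a client sends is below; the base URL and model name are assumptions (any compatible server, local or remote, exposes the same `/v1/chat/completions` route):

```python
import json

# Hypothetical base URL of an OpenAI API compatible server.
BASE_URL = "http://localhost:8080/v1"

# The standard chat completions payload: a model name and a
# list of role/content messages.
payload = {
    "model": "local-model",  # server-dependent model identifier
    "messages": [
        {"role": "user", "content": "Hello!"}
    ],
}

endpoint = BASE_URL + "/chat/completions"
request_body = json.dumps(payload)
print(endpoint)
print(request_body)
```

Because the endpoint path and payload schema are shared across compatible services, a client written against one server works unchanged against another by swapping the base URL.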
Ollama is software in an early stage of development that lets you run and chat with Llama 2 and other models.