
llm-chat

Here are 27 public repositories matching this topic...

A complete, menu-driven AI model interface for Windows that simplifies running local GGUF language models with llama.cpp. This tool automatically manages dependencies, provides multiple interaction modes, and prioritizes user privacy through fully offline operation.

  • Updated Jan 30, 2026
  • PowerShell
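The entry above describes a launcher that runs local GGUF models with llama.cpp fully offline. Before handing a file to the runtime, such a tool can cheaply verify it really is a GGUF file by checking the format's magic bytes. A minimal sketch of that check, assuming only the documented GGUF header layout (the helper name `is_gguf` is ours, not from the repository):

```python
import struct
from pathlib import Path

GGUF_MAGIC = b"GGUF"  # first four bytes of every GGUF file


def is_gguf(path: str) -> bool:
    """Return True if the file starts with the GGUF magic and a version field."""
    p = Path(path)
    # A valid header needs at least the 4-byte magic plus a uint32 version.
    if not p.is_file() or p.stat().st_size < 8:
        return False
    with p.open("rb") as f:
        magic = f.read(4)
        version = struct.unpack("<I", f.read(4))[0]  # little-endian uint32
    return magic == GGUF_MAGIC and version >= 1
```

Rejecting mislabeled files up front gives a clearer error message than letting llama.cpp fail mid-load.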

Local LLM chat: a CLI interface for GGUF and Transformers models with CUDA support. Runs Llama, Mistral, Gemma, Phi, and Qwen locally, with automatic model detection, system-message adaptation, RAG support, and more.

  • Updated Nov 5, 2025
  • Python
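The "automatic model detection" mentioned above can be as simple as scanning a models directory for `*.gguf` files and offering the newest one first. A hedged sketch under that assumption (the directory layout and the function name `discover_models` are illustrative, not taken from the repository):

```python
from pathlib import Path


def discover_models(models_dir: str) -> list[Path]:
    """Find GGUF model files under models_dir, newest first."""
    root = Path(models_dir)
    if not root.is_dir():
        return []
    # rglob descends into subdirectories, so models grouped in
    # per-family folders are still picked up.
    return sorted(root.rglob("*.gguf"),
                  key=lambda p: p.stat().st_mtime,
                  reverse=True)
```

Sorting by modification time is one reasonable default; a real tool might instead sort by name or let the user pick from a menu.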
