
llmix

llmix is a unified Python interface for multiple LLM providers.

It includes:

  • Python 3.9+ support
  • prompt and low-level messages APIs
  • request-level structured JSON output controls
  • OpenAI, Groq, Gemini, Claude/Anthropic, and Ollama adapters
  • Ollama native and OpenAI-compatible transport modes
  • lightweight local relevance-based RAG for code/docs repositories
  • normalized provider error metadata for UI integrations
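
The "lightweight local relevance-based RAG" feature can be pictured with a standalone sketch. The keyword-overlap scoring below is an assumption for illustration only, not llmix's actual retrieval logic:

```python
# Illustrative relevance-based retrieval; llmix's real indexer
# may score documents differently.
def score(query: str, doc: str) -> int:
    # Count how many distinct document words also appear in the query
    # (case-insensitive, whitespace tokenization).
    query_words = set(query.lower().split())
    return sum(1 for w in set(doc.lower().split()) if w in query_words)

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Return the k documents with the highest keyword overlap.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

docs = [
    "send a prompt to the default provider and read the reply",
    "installation instructions for poetry and pytest",
    "streaming responses from the ollama transport",
]
print(retrieve("how do I send a chat prompt", docs, k=1))
```

A real repository index would add chunking and persistence on top of a scorer like this; the retrieval step itself stays this simple.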

Installation

poetry install

For local development and tests:

poetry install -E dev

Quick Example

from __future__ import annotations

from llmix import LLMix


lm = LLMix(
    {
        "default_provider": "ollama",
        "default_model": "llama3.2:latest",
        "providers": {
            "ollama": {
                "base_url": "http://localhost:11434",
                "transport": "openai",
            }
        },
    }
)

response = lm.chat(
    "Return a JSON object with keys provider and summary.",
    expect_json=True,
)
print(response.content)
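
Even with `expect_json=True`, a model can occasionally emit malformed JSON, so parsing `response.content` defensively is worth sketching. This helper is plain stdlib and independent of llmix:

```python
import json

def parse_json_reply(content: str) -> dict:
    # Parse a model reply expected to be a JSON object; fall back to an
    # empty dict instead of raising on malformed or non-object output.
    try:
        data = json.loads(content)
    except json.JSONDecodeError:
        return {}
    return data if isinstance(data, dict) else {}

print(parse_json_reply('{"provider": "ollama", "summary": "ok"}'))
# {'provider': 'ollama', 'summary': 'ok'}
```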

Integration Guide

See docs/integration.md for:

  • one-shot JSON calls
  • streaming calls
  • low-level messages calls
  • Claude/Anthropic alias usage
  • Ollama OpenAI-compatible mode
  • indexing and retrieving from code/docs repositories
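
The "normalized provider error metadata" item from the feature list amounts to mapping heterogeneous provider error payloads onto one common shape. The field names below are an assumed illustration, not llmix's actual schema:

```python
from dataclasses import dataclass

@dataclass
class ProviderError:
    # Hypothetical normalized shape; llmix's real metadata fields may differ.
    provider: str
    status: int
    message: str

def normalize(provider: str, raw: dict) -> ProviderError:
    # Map provider-specific payloads onto the common shape,
    # tolerating the different key names providers use.
    return ProviderError(
        provider=provider,
        status=int(raw.get("status", raw.get("code", 0))),
        message=str(raw.get("message", raw.get("error", "unknown error"))),
    )

err = normalize("openai", {"code": 429, "error": "rate limited"})
print(err.status, err.message)  # 429 rate limited
```

A single shape like this is what lets a UI render errors from OpenAI, Groq, Gemini, Anthropic, and Ollama uniformly.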


Testing

poetry run pytest -q
