
Fix/custom local #13

Merged
Verdenroz merged 3 commits into master from fix/custom-local
Feb 19, 2026

Conversation

@Verdenroz
Owner

Summary

  • Adds CUSTOM = "custom" to the Provider enum so that the custom local endpoint registered via base_url= is a valid enum member.
  • Registers Provider.CUSTOM: _openai_format in _FORMAT_BUILDERS so that structured outputs (response_model=) work against any OpenAI-compatible local server (llama.cpp, llama-swap, Ollama, etc.).
  • Fixes the providers property: it iterates all registered providers and calls Provider(name), a call that previously crashed whenever a custom endpoint was configured.

Breaking Changes

None

Test Plan

  • Chimeric(base_url="http://127.0.0.1:12434/v1").list_models() returns models without error
  • client.generate(model=..., messages=..., response_model=MyModel) parses structured output from a local provider
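
The second test-plan item boils down to sending an OpenAI-compatible chat request with a `json_schema` response format, which llama.cpp, llama-swap, and Ollama all accept. A minimal sketch of that payload (the field names follow the OpenAI chat-completions API; the helper name and the model name are hypothetical, and the exact builder in this repo may differ):

```python
def build_structured_request(model, messages, schema_name, json_schema):
    """Assemble an OpenAI-compatible chat request asking for structured output.

    The response_format shape is what an OpenAI-compatible local server
    expects; the server constrains generation to the given JSON schema.
    """
    return {
        "model": model,
        "messages": messages,
        "response_format": {
            "type": "json_schema",
            "json_schema": {"name": schema_name, "schema": json_schema},
        },
    }


payload = build_structured_request(
    model="qwen2.5",  # hypothetical local model name
    messages=[{"role": "user", "content": "Give me a city."}],
    schema_name="City",
    json_schema={"type": "object", "properties": {"name": {"type": "string"}}},
)
```

POSTing `payload` to the local server's `/v1/chat/completions` endpoint returns a message whose content conforms to the schema, which the client can then parse into the `response_model`.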

Checklist

  • Tests pass (nox -s tests)
  • Coverage ≥ 80%
  • Docs updated (if public API changed)
  • No hardcoded secrets or credentials

@Verdenroz Verdenroz merged commit bcc6904 into master Feb 19, 2026
14 checks passed
@codecov

codecov bot commented Feb 19, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


