
Openrouter provider #6

Status: Closed

Johnson-f wants to merge 6 commits into Verdenroz:master from Johnson-f:openrouter-provider

Conversation

@Johnson-f

OpenRouter Provider Guide

OpenRouter is a unified API that provides access to 200+ AI models from multiple providers including OpenAI, Anthropic, Google, Meta, Microsoft, and more. The chimeric OpenRouter provider gives you seamless access to this vast ecosystem of models through a single, consistent interface.

🌟 Key Features

  • 200+ Models: Access models from OpenAI, Anthropic, Google, Meta, Microsoft, Perplexity, and more
  • Cost Optimization: Often cheaper than direct provider APIs with transparent pricing
  • Unified Interface: Same API for all models, no need to learn different SDKs
  • Model Fallbacks: Automatic failover if a model is unavailable
  • Free Models: Access to free models for testing and development
  • Higher Rate Limits: Better rate limits than individual provider APIs
  • Streaming Support: Real-time response streaming for all compatible models
  • Tool Calling: Function calling support for models that support it
  • Async Support: Full async/await support for high-performance applications
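The model-fallback behavior above is handled by OpenRouter's routing, but the same idea can also be orchestrated client-side. A minimal sketch, assuming `client.generate` raises an exception when a model is unavailable (the `generate_with_fallback` helper is hypothetical and not part of chimeric):

```python
def generate_with_fallback(client, models, messages):
    """Try each model ID in order and return the first successful response.

    Hypothetical helper, not part of chimeric; assumes client.generate
    raises an exception when a model fails or is unavailable.
    """
    last_err = None
    for model in models:
        try:
            return client.generate(model=model, messages=messages)
        except Exception as err:  # fall through to the next model
            last_err = err
    raise RuntimeError(f"All models failed: {models}") from last_err
```

This keeps the fallback order explicit in your own code, which is useful when you want deterministic behavior in tests.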

📦 Installation

# Install chimeric with OpenAI support (OpenRouter is OpenAI-compatible)
pip install "chimeric[openai]"

# Or with uv (recommended)
uv add "chimeric[openai]"

# For development with all extras
uv add "chimeric[all]"

🔐 Authentication

Get Your API Key

  1. Visit OpenRouter.ai
  2. Sign up for a free account
  3. Navigate to the API Keys section
  4. Generate a new API key

Set Up Environment Variable

# Set environment variable (recommended)
export OPENROUTER_API_KEY="your-api-key-here"

# Or add to your .env file
echo "OPENROUTER_API_KEY=your-api-key-here" >> .env
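The precedence between an explicitly passed key and the environment variable can be sketched like this (the `resolve_openrouter_key` helper is illustrative only, not part of chimeric's API):

```python
import os


def resolve_openrouter_key(explicit_key=None):
    # An explicitly passed key wins; otherwise fall back to the environment.
    key = explicit_key or os.environ.get("OPENROUTER_API_KEY")
    if not key:
        raise RuntimeError(
            "No OpenRouter API key found: pass one explicitly or set "
            "the OPENROUTER_API_KEY environment variable."
        )
    return key
```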

🚀 Quick Start

Basic Usage

from chimeric import Chimeric

# Initialize with environment variable
client = Chimeric()

# Or pass API key directly
client = Chimeric(openrouter_api_key="your-api-key")

# Simple text generation
response = client.generate(
    model="openai/gpt-4o-mini",
    messages="Hello! Explain quantum computing in simple terms."
)

print(response.content)

List Available Models

# Get all available models
models = client.list_models("openrouter")
print(f"Total models available: {len(models)}")

# Show first few models
for model in models[:5]:
    print(f"- {model.id}: {model.name}")

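Free-tier models on OpenRouter carry a `:free` suffix in their model ID, so the listing above can be narrowed with a plain string filter. A sketch that assumes only the `model.id` field shown above:

```python
def filter_free_models(model_ids):
    # OpenRouter marks free-tier variants with a ":free" ID suffix,
    # e.g. "meta-llama/llama-3.1-8b-instruct:free".
    return [mid for mid in model_ids if mid.endswith(":free")]
```

For example, `filter_free_models(m.id for m in models)` returns just the IDs you can call without spending credits.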
🎯 Model Selection

OpenRouter provides access to models from many providers. Here are some popular choices:

OpenAI Models

# GPT-4o models (latest and most capable)
response = client.generate(
    model="openai/gpt-4o",
    messages="Write a Python function to calculate fibonacci numbers"
)

# GPT-4o-mini (faster, cheaper)
response = client.generate(
    model="openai/gpt-4o-mini", 
    messages="Summarize the benefits of renewable energy"
)
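The same `provider/model` ID scheme selects models from the other providers. A few illustrative IDs follow; treat them as assumptions and verify against the live model list, since available IDs change over time:

```python
# Illustrative OpenRouter model IDs following the "provider/model" scheme.
# Verify against client.list_models("openrouter") before relying on them.
EXAMPLE_MODELS = {
    "anthropic": "anthropic/claude-3.5-sonnet",
    "google": "google/gemini-flash-1.5",
    "meta-llama": "meta-llama/llama-3.1-70b-instruct",
}


def provider_of(model_id):
    # The provider namespace is the segment before the first "/".
    return model_id.split("/", 1)[0]
```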

@Verdenroz (Owner) left a comment


I tried testing tools + streaming but got no chunks at all. Do you have the cassettes from the integration test, or a code snippet showing which model you used to test tools + streaming with your OpenRouterClient?

@Johnson-f (Author)

Will look into it

@Verdenroz (Owner)

Closing this as I have refactored the entire library to flatten dependencies. OpenRouter is included in #12.

@Verdenroz Verdenroz closed this Feb 18, 2026