Complete guide for using alternative AI models with Claude Code — including DeepSeek, Qwen, MiniMax, Kimi, GLM, MiMo, StepFun, and more. Pricing, configs, and coding plans.
Updated Apr 20, 2026
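Several of the tools below configure Claude Code to talk to an alternative provider through an Anthropic-compatible endpoint. As a minimal sketch: Claude Code honors the `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN` environment variables, so the swap is a matter of exporting them before launch. The endpoint URL, token, and model id below are placeholders, not values from any specific provider:

```shell
# Point Claude Code at an Anthropic-compatible endpoint (all values are placeholders).
export ANTHROPIC_BASE_URL="https://api.example-provider.com/anthropic"  # hypothetical endpoint
export ANTHROPIC_AUTH_TOKEN="sk-your-provider-key"                      # your provider API key
export ANTHROPIC_MODEL="provider-model-name"                            # hypothetical model id
claude  # launch Claude Code against the configured provider
```

Consult each provider's own docs for its actual base URL and model names; not every provider exposes an Anthropic-compatible API.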
Community-maintained registry of AI/LLM model configurations - pricing, features, and limits across 19 providers and 1000+ models
OpenRouter model information
Easy-to-use LLM APIs from state-of-the-art providers, with side-by-side comparison
Some cost estimates for using LLMs via API versus web UI services (like ChatGPT)
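A back-of-the-envelope API-vs-subscription comparison like the one this repo describes takes only a few lines. The token volumes and per-million-token prices below are illustrative assumptions, not real rates:

```python
# Rough monthly cost comparison: pay-per-token API vs. a flat web-UI subscription.
# All prices and volumes below are illustrative placeholders, not current rates.

def monthly_api_cost(input_tokens, output_tokens, in_price_per_m, out_price_per_m):
    """Cost in USD for one month of usage at per-million-token prices."""
    return input_tokens / 1e6 * in_price_per_m + output_tokens / 1e6 * out_price_per_m

# Example: 5M input + 1M output tokens/month at $0.50 in / $1.50 out per million (hypothetical)
api = monthly_api_cost(5_000_000, 1_000_000, 0.50, 1.50)
subscription = 20.00  # typical flat monthly web-UI price (assumption)
print(f"API: ${api:.2f}/mo vs subscription: ${subscription:.2f}/mo")  # → API: $4.00/mo vs subscription: $20.00/mo
```

At these (hypothetical) numbers, moderate API usage undercuts the flat subscription; heavy usage flips the comparison, which is exactly what the calculators in this list help estimate.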
119 AI models × 55 benchmarks with per-score freshness dates, auto-updated pricing, task routing. Every score has a date and source URL. Daily CI.
The most comprehensive LLM pricing comparison. 36+ models, 12 providers, CLI calculator. Community-maintained.
Rate card for common LLM APIs for use within TensorFoundry Products
Directory of AI and LLM pricing calculators. 8 calculators live, more in development. Fast cost estimates for Claude, OpenAI, Gemini, and other AI services.
LLM cost calculator and usage tracking. Monitor API spend across providers.
Claude Code Skill — estimate and compare LLM API costs across OpenAI, Anthropic, Google, DeepSeek, Mistral. Per-token pricing, batch/caching discounts, workload templates, cross-provider comparison.
Latest pricing and feature overview for large language models from major AI companies
Compare and track up-to-date LLM pricing to help you find the most cost-effective AI models and avoid overpaying.
MCP server for live AI tool and LLMs status, API pricing, and rate limits — powered by tickerr.ai
AI CLI tools: prompt censorship checker & bypass, model quality watchdog & degradation monitor, and API cost comparison for OpenAI, Claude, DeepSeek, and Gemini. Built from real Reddit user complaints.
🤖 Explore cost-effective AI models with our guide on Claude Code compatibility, featuring pricing, setup instructions, and examples for multiple providers.