Merged
21 commits
13ccc0f
feat(ai): add custom OpenAI-compatible provider
r-pedraza Apr 7, 2026
1e2d91d
feat: Enable streaming and HTTP/2 for custom AI providers
finxo Apr 13, 2026
1016f0d
Merge branch 'master' into feat/create-custom-AI-provider
finxo Apr 13, 2026
b1b0ee1
refactor: Rename AI provider configuration to AI connection
finxo Apr 13, 2026
39e9615
refactor: Rename AI provider to AI connection in the configuration wi…
finxo Apr 13, 2026
3778f6e
feat: Add OpenAI support and refactor AI config wizard to use Source/…
finxo Apr 13, 2026
ed3b120
refactor: Introduce dynamic AI provider factory and LiteLLM support
finxo Apr 13, 2026
cdb6175
feat: Add automatic installation of missing AI provider dependencies
finxo Apr 13, 2026
f42f0aa
refactor: Introduce LiteLLMClient to encapsulate OpenAI client logic
finxo Apr 13, 2026
6c672a3
refactor: Reorganize config migrations and persist migrated configura…
finxo Apr 13, 2026
f7d9203
fix: Improve PR generation error handling and disable LiteLLM streaming
finxo Apr 13, 2026
aede905
feat: Enhance AI configuration UI with improved labeling and layout
finxo Apr 13, 2026
5be9664
feat: Add support for LLM gateways and multiple AI connections with O…
finxo Apr 13, 2026
8ad8060
feat: Separate global and project config migration managers
finxo Apr 13, 2026
dbb2266
refactor: Remove unused config version field from project configuration
finxo Apr 14, 2026
f40284c
chore: Remove unused Jira plugin configuration section
finxo Apr 14, 2026
5848226
feat: Add PR status tracking and error handling to GitHub plugin agents
finxo Apr 14, 2026
edb7e6d
fix: Improve error handling in AI PR description generation
finxo Apr 14, 2026
e09f9c9
refactor: Replace string-based provider keys with enum types and reor…
finxo Apr 14, 2026
270b852
refactor: Rename AIConnectionKind to AIConnectionType and update rela…
finxo Apr 14, 2026
7f13816
feat: Normalize legacy connection IDs in v1 configuration migration
finxo Apr 14, 2026
5 changes: 1 addition & 4 deletions .titan/config.toml
@@ -20,7 +20,4 @@ pr_template_path = ".github/pull_request_template.md"
auto_assign_prs = true

[plugins.jira]
enabled = false

[plugins.jira.config]
default_project = "ECAPP"
enabled = false
85 changes: 45 additions & 40 deletions AGENTS.md
@@ -434,8 +434,8 @@ from titan_cli.core.config import TitanConfig

config = TitanConfig() # Loads and merges global + project
print(config.config.project.name)
print(config.config.ai.default) # Default provider ID
print(config.config.ai.providers) # Dict of all configured providers
print(config.config.ai.default_connection) # Default AI connection ID
print(config.config.ai.connections) # Dict of all configured AI connections

# Check enabled plugins
if config.is_plugin_enabled("github"):
@@ -566,7 +566,9 @@ class MyCoolPlugin(TitanPlugin):

## 🤖 AI Integration

Titan CLI includes a modular AI integration layer that allows for interaction with multiple AI providers (Anthropic, Gemini).
Titan CLI includes a modular AI integration layer that supports multiple AI connections.
Connections can be either direct providers (Anthropic, OpenAI, Gemini) or LLM gateways
that expose OpenAI-compatible endpoints such as LiteLLM.

### File Structure (`ai/`)

@@ -580,66 +582,69 @@ titan_cli/ai/
β”œβ”€β”€ exceptions.py # Custom AI-related exceptions
β”œβ”€β”€ models.py # Data models (AIRequest, AIResponse)
β”œβ”€β”€ oauth_helper.py # Helper for Google Cloud OAuth
β”œβ”€β”€ litellm_client.py # Shared OpenAI-compatible gateway client
└── providers/
β”œβ”€β”€ __init__.py
β”œβ”€β”€ base.py # AIProvider abstract base class
β”œβ”€β”€ anthropic.py
β”œβ”€β”€ gemini.py
β”œβ”€β”€ openai.py
└── litellm.py
```

### Core Components

- **`AIClient` (`ai/client.py`):** This is the main entry point for using AI functionality. It acts as a facade that reads the user's configuration, retrieves the necessary secrets via `SecretManager`, and instantiates the correct provider. Supports multiple providers with a `provider_id` parameter to select which one to use.
- **`AIProvider` (`ai/providers/base.py`):** This is an abstract base class that defines the interface for all AI providers. Each provider implements the `generate()` method to interact with its specific API.
- **`AIClient` (`ai/client.py`):** Main facade for AI usage. It reads the configured AI connection, retrieves secrets via `SecretManager`, and instantiates the correct direct provider or gateway adapter. Use `connection_id` to select a specific connection.
- **`AIProvider` (`ai/providers/base.py`):** Abstract base class implemented by direct providers and the LiteLLM/OpenAI-compatible gateway provider.
- **`LiteLLMClient` (`ai/litellm_client.py`):** Shared client for OpenAI-compatible gateways used for connection testing and model discovery.

### Configuration

AI configuration supports **multiple providers** simultaneously. Users can configure both corporate and personal providers, each with different models and endpoints.
AI configuration supports **multiple connections** simultaneously.
Each connection can be:
- **LLM Gateway**: one endpoint exposing multiple models through an OpenAI-compatible API
- **Direct Provider**: a direct vendor integration such as Anthropic, OpenAI, or Gemini

AI providers are configured via:
- Interactive command: `titan ai configure`
- Main menu option: "AI Configuration" β†’ "Configure AI Provider"
AI connections are configured from the TUI:
- Main menu option: `AI Configuration`
- Then create a new connection and optionally set it as default

The configuration workflow:
1. Select configuration type (Corporate or Individual)
2. Enter base URL (for corporate endpoints only)
3. Select provider (Anthropic, OpenAI, or Gemini)
1. Select connection type (`LLM Gateway` or `Direct Provider`)
2. For gateways, enter the base URL
3. Select the direct provider source when applicable
4. Provide API key (stored securely via `SecretManager`)
5. Select model with suggestions for popular models
6. Assign a friendly name to the provider
5. Select or enter the default model
6. Assign a friendly name to the connection
7. Optionally configure advanced settings (temperature, max_tokens)
8. Optionally mark as default provider
8. Optionally mark as default connection
9. Test the connection

Configuration is stored in the global `~/.titan/config.toml` file with support for multiple providers:
Configuration is stored in the global `~/.titan/config.toml` file:

```toml
[ai]
default = "corporate-gemini" # Default provider ID
default_connection = "work-gateway"

[ai.providers.corporate-gemini]
name = "Corporate Gemini"
type = "corporate"
provider = "gemini"
model = "gemini-2.0-flash-exp"
[ai.connections.work-gateway]
name = "Work Gateway"
kind = "gateway"
gateway_type = "openai_compatible"
base_url = "https://llm.company.com"
default_model = "gemini-2.5-pro"
temperature = 0.7
max_tokens = 4096

[ai.providers.personal-claude]
[ai.connections.personal-claude]
name = "Personal Claude"
type = "individual"
kind = "direct_provider"
provider = "anthropic"
model = "claude-3-5-sonnet-20241022"
default_model = "claude-sonnet-4-5"
temperature = 0.7
max_tokens = 4096
```

**Available commands:**
- `titan ai configure` - Configure a new AI provider
- `titan ai list` - List all configured providers
- `titan ai set-default [provider-id]` - Change default provider
- `titan ai test` - Test connection to default provider
Titan still loads legacy AI config and migrates it to this structure automatically.
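The legacy-config migration can be pictured with a sketch like this. The field mapping (`providers` → `connections`, `model` → `default_model`, legacy `type` → new `kind`) is inferred from the TOML examples in this document and is purely illustrative; the real migration logic lives in Titan's config migration managers.

```python
# Illustrative sketch: convert a legacy [ai.providers.*] config dict into
# the newer [ai.connections.*] shape. The "corporate" -> gateway mapping
# is an assumption for illustration, not Titan's actual migration code.

def migrate_legacy_ai(legacy: dict) -> dict:
    connections = {}
    for conn_id, prov in legacy.get("providers", {}).items():
        connections[conn_id] = {
            "name": prov.get("name", conn_id),
            "kind": "gateway" if prov.get("type") == "corporate" else "direct_provider",
            "provider": prov.get("provider"),
            "default_model": prov.get("model"),  # renamed from "model"
        }
    return {
        "default_connection": legacy.get("default"),
        "connections": connections,
    }

legacy = {
    "default": "default",
    "providers": {
        "default": {"name": "Claude", "type": "individual",
                    "provider": "anthropic", "model": "claude-sonnet-4-5"},
    },
}
migrated = migrate_legacy_ai(legacy)
print(migrated["default_connection"])                        # default
print(migrated["connections"]["default"]["default_model"])   # claude-sonnet-4-5
```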

### Usage

@@ -657,15 +662,15 @@ config = TitanConfig()
secrets = SecretManager()

# 2. Check if AI is configured
if not config.config.ai or not config.config.ai.providers:
print("No AI providers configured. Run: titan ai configure")
if not config.config.ai or not config.config.ai.connections:
print("No AI connections configured.")
return

# 3. Create the AI client (uses default provider)
# 3. Create the AI client (uses default connection)
try:
ai_client = AIClient(config.config.ai, secrets)
# Or specify a specific provider:
# ai_client = AIClient(config.config.ai, secrets, provider_id="corporate-gemini")
# Or specify a specific connection:
# ai_client = AIClient(config.config.ai, secrets, connection_id="work-gateway")
except AIConfigurationError as e:
print(f"AI not available: {e}")
return
@@ -686,12 +691,12 @@ if ai_client.is_available():
)
print(creative_response.content)

# 5. Using a specific provider
corporate_client = AIClient(config.config.ai, secrets, provider_id="corporate-gemini")
personal_client = AIClient(config.config.ai, secrets, provider_id="personal-claude")
# 5. Using specific connections
gateway_client = AIClient(config.config.ai, secrets, connection_id="work-gateway")
personal_client = AIClient(config.config.ai, secrets, connection_id="personal-claude")

# Each client uses its own provider configuration
corp_response = corporate_client.generate(messages)
# Each client uses its own connection configuration
corp_response = gateway_client.generate(messages)
personal_response = personal_client.generate(messages)
```

32 changes: 18 additions & 14 deletions CLAUDE.md
@@ -299,8 +299,9 @@ When creating new steps or refactoring existing ones:

- **Python 3.11+**
- **Textual**: TUI framework
- **Anthropic SDK**: Claude integration
- **Google GenAI SDK**: Gemini integration
- **Anthropic SDK**: Anthropic direct provider
- **Google GenAI SDK**: Gemini direct provider
- **OpenAI SDK**: OpenAI direct provider and LiteLLM/OpenAI-compatible gateways
- **PyGithub**: GitHub API client
- **Requests**: HTTP client for APIs

@@ -340,7 +341,7 @@ in another terminal.
Titan uses a two-level configuration system:

1. **Global Configuration** (`~/.titan/config.toml`):
- AI provider settings (shared across all projects)
- AI connection settings (shared across all projects)
- Global preferences

2. **Project Configuration** (`.titan/config.toml` at the git root):
@@ -356,14 +357,14 @@ Titan uses a two-level configuration system:

Example global config:
```toml
[ai.providers.default]
[ai]
default_connection = "default"

[ai.connections.default]
name = "My Claude"
type = "individual"
kind = "direct_provider"
provider = "anthropic"
model = "claude-sonnet-4-5"

[ai]
default = "default"
default_model = "claude-sonnet-4-5"
```

Example project config:
@@ -437,17 +438,19 @@ and run `textual console` in another terminal.
- ✅ Git plugin (commits, branches, etc.)
- ✅ GitHub plugin (PRs, issues with AI)
- ✅ Jira plugin (search, AI-powered analysis)
- ✅ Claude integration (Anthropic)
- ✅ Gemini integration (Google)
- ✅ Anthropic direct provider
- ✅ OpenAI direct provider
- ✅ Gemini direct provider
- ✅ LiteLLM/OpenAI-compatible gateway
- ✅ All workflow steps migrated to Textual

### Features

- **Declarative Workflows**: Define flows in YAML
- **Integrated AI**: Use Claude or Gemini to generate commit messages, PR descriptions, issue analysis
- **Integrated AI**: Use configured AI connections to generate commit messages, PR descriptions, and issue analysis
- **Interactive TUI**: Modern interface with Textual
- **Extensible**: Plugin system
- **Multi-Provider**: Supports multiple AI providers
- **Multi-Connection**: Supports direct providers and LLM gateways

## Recent Important Commits

@@ -469,7 +472,7 @@ Titan has been redesigned to work on a per-project basis:

- **Removed**: Global `project_root` and `active_project` settings from `[core]` configuration
- **New Flow**: Titan must be run from within a project directory
- **Global Config**: Now only stores AI provider settings (shared across projects)
- **Global Config**: Now only stores AI connection settings (shared across projects)
- **Project Config**: Each project has its own `.titan/config.toml` with:
- Project name
- Enabled plugins
@@ -499,6 +502,7 @@ Two new wizards guide users through initial setup:

- **Textual Documentation**: https://textual.textualize.io/
- **Anthropic API**: https://docs.anthropic.com/
- **OpenAI API**: https://platform.openai.com/docs/
- **Google GenAI**: https://ai.google.dev/

---
10 changes: 5 additions & 5 deletions README.md
@@ -9,7 +9,7 @@ Titan CLI is a powerful command-line orchestrator that automates Git, GitHub, JI
- 🔧 **Project Configuration** - Centralized `.titan/config.toml` for project-specific settings
- 🔌 **Plugin System** - Extend functionality with Git, GitHub, JIRA, and custom plugins
- 🎨 **Modern TUI** - Beautiful terminal interface powered by Textual
- 🤖 **AI Integration** - Optional AI assistance (Claude & Gemini) for commits, PRs, and analysis
- 🤖 **AI Integration** - Optional AI assistance through direct providers and LLM gateways
- ⚡ **Workflow Engine** - Compose atomic steps into powerful automated workflows
- 🔐 **Secure Secrets** - OS keyring integration for API tokens and credentials

@@ -59,7 +59,7 @@ titan
```

On first run, Titan will guide you through:
1. **Global Setup** - Configure AI providers (optional)
1. **Global Setup** - Configure AI connections (optional)
2. **Project Setup** - Enable plugins and configure project settings

### Basic Usage
@@ -82,10 +82,10 @@ Titan CLI v1.0.0 includes three core plugins:

## 🤖 AI Integration

Titan supports multiple AI providers:
Titan supports multiple AI connections:

- **Anthropic Claude** (Sonnet, Opus, Haiku)
- **Google Gemini** (Pro, Flash)
- **Direct Provider**: Anthropic, OpenAI, Gemini
- **LLM Gateway**: OpenAI-compatible endpoints such as LiteLLM

Configure during first setup or later via the TUI settings.

59 changes: 42 additions & 17 deletions docs/concepts/ai-integration.md
@@ -1,30 +1,55 @@
# AI Integration

!!! note "Coming soon"
Detailed AI integration documentation is being written.

---

## Overview

Titan supports two AI providers:
Titan supports AI through configurable **connections**:

- **Anthropic Claude** (Sonnet, Opus, Haiku)
- **Google Gemini** (Pro, Flash)
- **Direct Provider** connections for:
- Anthropic
- OpenAI
- Gemini
- **LLM Gateway** connections for OpenAI-compatible endpoints such as LiteLLM

Configure your provider in `~/.titan/config.toml`:
Configure AI in `~/.titan/config.toml`:

```toml
[ai.providers.default]
name = "Claude"
type = "individual"
provider = "anthropic"
model = "claude-sonnet-4-5"

[ai]
default = "default"
default_connection = "default"

[ai.connections.default]
name = "My Anthropic"
kind = "direct_provider"
provider = "anthropic"
default_model = "claude-sonnet-4-5"
```

Titan stores your API key securely in the OS keyring — you'll be prompted for it on first use.

AI is **optional**. All built-in workflows work without it; AI steps are simply skipped if no provider is configured.
AI is **optional**. All built-in workflows work without it; AI steps are skipped if no AI connection is configured.

## Connection Types

### Direct Provider

Use this when Titan talks directly to the vendor API.

```toml
[ai.connections.personal-openai]
name = "Personal OpenAI"
kind = "direct_provider"
provider = "openai"
default_model = "gpt-5"
```

### LLM Gateway

Use this when a single endpoint exposes one or more models through an OpenAI-compatible API.

```toml
[ai.connections.work-gateway]
name = "Work Gateway"
kind = "gateway"
gateway_type = "openai_compatible"
base_url = "https://llm.company.com"
default_model = "claude-sonnet-4-5"
```
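To make the gateway idea concrete, here is a minimal, self-contained sketch of the request shape an OpenAI-compatible endpoint accepts. The `/v1/chat/completions` path and payload follow the standard OpenAI chat-completions convention that gateways such as LiteLLM implement; the URL is the placeholder from the example above, not a real endpoint.

```python
# Minimal sketch of an OpenAI-compatible chat request, built by hand.
# A gateway speaking this protocol exposes the same endpoint regardless
# of which vendor model ultimately serves the request.

import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, str]:
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload)

url, body = build_chat_request(
    "https://llm.company.com", "claude-sonnet-4-5", "Summarize this diff"
)
print(url)  # https://llm.company.com/v1/chat/completions
```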
4 changes: 2 additions & 2 deletions docs/getting-started/installation.md
@@ -52,7 +52,7 @@ titan

On first launch, two setup wizards run automatically:

1. **Global setup** — Configure AI providers (Claude, Gemini). This is optional and can be skipped.
1. **Global setup** — Configure AI connections. This is optional and can be skipped.
2. **Project setup** β€” Choose a project name and enable the plugins you want (Git, GitHub, Jira).

After setup, the main menu appears and you're ready to run workflows.
@@ -63,7 +63,7 @@ After setup, the main menu appears and you're ready to run workflows.

| Path | Purpose |
|------|---------|
| `~/.titan/config.toml` | Global config: AI provider credentials |
| `~/.titan/config.toml` | Global config: AI connections and credentials |
| `.titan/config.toml` | Project config: enabled plugins and settings |

The project config lives at the **git repository root**, so it works correctly in monorepos.
14 changes: 7 additions & 7 deletions docs/getting-started/quick-start.md
@@ -68,14 +68,14 @@ enabled = false
Edit `~/.titan/config.toml`:

```toml
[ai.providers.default]
name = "Claude"
type = "individual"
provider = "anthropic"
model = "claude-sonnet-4-5"

[ai]
default = "default"
default_connection = "default"

[ai.connections.default]
name = "My Anthropic"
kind = "direct_provider"
provider = "anthropic"
default_model = "claude-sonnet-4-5"
```

Titan will prompt for your API key on first use and store it securely in your OS keyring.