
Add Native OpenAI-compatible API Support#118

Open
jchristman wants to merge 3 commits into KeygraphHQ:main from jchristman:main

Conversation

@jchristman

Adds a fully separate execution pathway parallel to the Anthropic Claude path, enabling OpenAI-compatible models without the clumsy CCR translation layer.

jchristman and others added 3 commits February 10, 2026 23:14
Introduce a native OpenAI API path that bypasses the Claude Agent SDK,
enabling Shannon to run with local models (minimax, ollama, vLLM) and
any OpenAI-compatible endpoint.

Summary of changes:

Provider infrastructure:
- src/ai/provider-config.ts: New provider selection (claude|openai)
  from AI_PROVIDER env var, with AI_BASE_URL/AI_MODEL/AI_API_KEY support
- src/ai/claude-executor.ts: Route to OpenAI executor when provider is
  openai, passing sourceDir and agentName for tool context
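As a rough illustration of the provider selection described above, the env-var handling might look like the following sketch; the actual `provider-config.ts` may differ in shape and defaults, and `loadProviderConfig` is a hypothetical name:

```typescript
// Hypothetical sketch of AI_PROVIDER selection; not the PR's actual code.
type Provider = "claude" | "openai";

interface ProviderConfig {
  provider: Provider;
  baseUrl?: string; // AI_BASE_URL, e.g. a local vLLM/Ollama endpoint
  model?: string;   // AI_MODEL
  apiKey?: string;  // AI_API_KEY
}

type Env = Record<string, string | undefined>;

function loadProviderConfig(env: Env): ProviderConfig {
  // The Claude path stays the default when AI_PROVIDER is unset.
  const raw = env.AI_PROVIDER ?? "claude";
  if (raw !== "claude" && raw !== "openai") {
    throw new Error(`Unsupported AI_PROVIDER: ${raw}`);
  }
  return {
    provider: raw,
    baseUrl: env.AI_BASE_URL,
    model: env.AI_MODEL,
    apiKey: env.AI_API_KEY,
  };
}
```

Keeping `claude` as the default preserves existing behavior for users who never set the new variables.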

OpenAI executor & tools:
- src/ai/openai-executor.ts: New OpenAI-compatible execution loop with
  streaming, tool calls, and turn limits; integrates Shannon helper tools
  and Playwright MCP
- src/ai/openai-tools.ts: Built-in tools (bash, read_file, write_file,
  edit_file, search_files, list_directory) in OpenAI function format
- src/ai/mcp-client.ts: MCP client that invokes save_deliverable and
  generate_totp directly (no MCP protocol) plus Playwright via stdio
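The core of an executor like this is a turn-limited loop: call the model, execute any tool calls it emits, feed results back, and stop when the model replies without tools or the turn budget runs out. The sketch below shows that shape with hypothetical types and injected callbacks; the real `openai-executor.ts` additionally streams tokens and wires in the Shannon helper tools and Playwright MCP:

```typescript
// Hypothetical turn-limited tool-call loop; simplified, non-streaming.
interface ToolCall { name: string; args: string }
interface ModelTurn { text: string; toolCalls: ToolCall[] }

type CallModel = (history: string[]) => Promise<ModelTurn>;
type RunTool = (call: ToolCall) => Promise<string>;

async function runLoop(
  prompt: string,
  callModel: CallModel,
  runTool: RunTool,
  maxTurns = 25, // turn limit guards against runaway tool loops
): Promise<string> {
  const history = [prompt];
  for (let turn = 0; turn < maxTurns; turn++) {
    const reply = await callModel(history);
    history.push(reply.text);
    // No tool calls means the model considers itself finished.
    if (reply.toolCalls.length === 0) return reply.text;
    for (const call of reply.toolCalls) {
      const result = await runTool(call);
      history.push(result); // feed tool output back to the model
    }
  }
  throw new Error("Turn limit exceeded");
}
```

Injecting `callModel` and `runTool` keeps the loop testable against any OpenAI-compatible endpoint.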

MCP server updates:
- mcp-server: Export createSaveDeliverableHandler and generateTotp for
  direct use by OpenAI executor; adjust tool-responses type for compatibility
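Invoking the handlers in-process rather than over the MCP stdio protocol amounts to a plain dispatch table of function calls. The handler names below come from the PR description; their signatures and the dispatch shape are assumptions for illustration only:

```typescript
// Hypothetical direct dispatch to MCP tool handlers, skipping the protocol.
type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

// Stand-ins for the handlers exported by mcp-server.
const createSaveDeliverableHandler =
  (outDir: string): ToolHandler =>
  async (args) => `saved ${String(args.name)} under ${outDir}`;

const generateTotp: ToolHandler = async (args) =>
  `totp for ${String(args.secret)}`;

// No MCP handshake or stdio transport: just a name-to-function map
// the OpenAI executor can call when the model requests a tool.
function buildDirectTools(outDir: string): Record<string, ToolHandler> {
  return {
    save_deliverable: createSaveDeliverableHandler(outDir),
    generate_totp: generateTotp,
  };
}
```

Playwright, by contrast, still runs as a real MCP server over stdio per the description above.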

CLI & config:
- shannon: Add PROVIDER env var passthrough for docker-compose
- .env.example: AI_PROVIDER, AI_BASE_URL, AI_MODEL, AI_API_KEY
- CLAUDE.md: Document native OpenAI provider usage
- docker-compose.yml: Propagate provider-related env vars
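Put together, the new `.env.example` entries might look like the following; the values are illustrative, not taken from the PR:

```shell
# Select the native OpenAI-compatible path instead of the Claude Agent SDK
AI_PROVIDER=openai
# Any OpenAI-compatible endpoint, e.g. a local vLLM or Ollama server
AI_BASE_URL=http://localhost:11434/v1
AI_MODEL=llama3
# Local servers often accept any placeholder key
AI_API_KEY=local-placeholder
```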

Other:
- .dockerignore: Additional ignores
- prompts: Minor pre-recon-code updates
- package.json: New dependencies for OpenAI/MCP integration

Co-authored-by: Cursor <cursoragent@cursor.com>
…doesn't end up exhausting resources on the graphics card. Also added more logging so that the values are output in workflow.log for easy viewing
