Real-time AI debugging for running applications
Capture logs automatically, search your codebase in natural language, and chat with an AI that understands your entire system.
🚧 Beta - OpenBug is actively developed and maintained. We ship updates regularly and welcome feedback.
```shell
npm install -g @openbug/cli
```

First time setup:
```shell
# Terminal 1: Start the AI assistant
debug
```

You'll be prompted to log in and paste an API key from app.openbug.ai.
Start debugging:
```shell
# Terminal 2: Run any command with debugging
debug npm run dev
debug python app.py
debug docker-compose up
```

Your application runs normally with logs visible. Behind the scenes, OpenBug captures logs, accesses your codebase locally, and makes everything available to the AI assistant running in Terminal 1.
New to OpenBug? See it debug 3 realistic bugs in under 5 minutes, no installation required.
What's included:
- 2 microservices with real bugs (schema drift, config errors, race conditions)
- Step-by-step walkthroughs of OpenBug investigating each issue
- See exactly how OpenBug correlates logs and searches code across services
Perfect for understanding OpenBug's capabilities before connecting your own services.
Stop context-switching between logs and code
Ask "why is the auth endpoint failing?" and get answers based on actual runtime logs plus relevant code from your codebase, not generic suggestions.
Debug across multiple services
No more grepping through logs in 5 different terminals. OpenBug sees logs from all connected services and can trace issues across your entire stack.
Understand unfamiliar codebases
Search in natural language: "where do we handle payment webhooks?" The AI searches your actual codebase, not the internet.
```
Terminal 1: AI Assistant          Terminal 2: Your Service
┌─────────────────────────┐       ┌──────────────────────────┐
│ $ debug                 │       │ $ debug npm run dev      │
│                         │       │ Server running on :3000  │
│ You: "Why is auth       │       │ [logs stream normally]   │
│ failing?"               │       │                          │
│                         │◄──────┤ Logs captured            │
│ AI: [analyzes logs +    │       │ Code accessed locally    │
│ searches codebase]      │       │ Connected to cluster     │
└─────────────────────────┘       └──────────────────────────┘
            ▲                                  ▲
            │                                  │
            └────────── Local Cluster ─────────┘
                  (ws://127.0.0.1:4466)
                           │
                           │ WebSocket
                           │
             ┌──────────────────────────────┐
             │      OpenBug AI Server       │
             │                              │
             │  ┌────────────────────────┐  │
             │  │      Agent Graph       │  │
             │  │  • Analyze logs        │  │
             │  │  • Search codebase     │  │
             │  │  • Generate insights   │  │
             │  └────────────────────────┘  │
             └──────────────────────────────┘
```
- `debug` starts the AI assistant and connects to a local cluster server
- `debug <command>` runs your service and streams logs to the cluster
- Local cluster connects to OpenBug AI server via WebSocket
- Agent Graph processes queries, searches code, and analyzes logs
- Responses flow back through cluster to your terminal
Run multiple services in different terminals: they all connect to the same cluster, so the AI can debug across your entire system.
Your code stays local
Your codebase is accessed locally and never uploaded. Only specific code snippets that the AI queries are sent to the server.
Selective log sharing
Logs are streamed to the server only when the AI needs them to answer your questions. You control what runs with `debug <command>`.
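In other words, log capture follows the wrapper, not the machine: only processes you launch through `debug` are visible to OpenBug. A sketch of a hypothetical two-process setup (the service commands are illustrative):

```shell
# Only the process wrapped in `debug` streams logs to the cluster;
# anything launched normally is invisible to OpenBug.
debug npm run dev    # logs captured, shareable with the AI on request
npm run worker       # runs outside OpenBug entirely
```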
API key authentication
All requests are authenticated with your personal API key from app.oncall.build.
To run your own OpenBug server:
- Clone the server repository
- Configure with your OpenAI API key
- Point the CLI to your server:
```shell
export WEB_SOCKET_URL=ws://localhost:3000/v2/ws
export API_BASE_URL=http://localhost:3000/v2/api
```

See the server README for full setup instructions.
Multi-service debugging:
```shell
# Terminal 1: AI Assistant
debug

# Terminal 2: Backend
cd backend
debug npm run dev

# Terminal 3: Frontend
cd frontend
debug npm start

# Back to Terminal 1 (AI)
> "Users can't log in, what's wrong?"
> "Show me logs from the last auth request"
> "Where do we validate JWT tokens?"
```

The AI sees logs from both services and can search code in both repos.
| Feature | OpenBug | Cursor/Copilot/Windsurf |
|---|---|---|
| Sees runtime logs | ✅ | ❌ |
| Multi-service debugging | ✅ | ❌ |
| Natural language log analysis | ✅ | ❌ |
| Works with running apps | ✅ | Static analysis only |
OpenBug coordinates debugging agents that see what's actually happening when your code runs.
Start AI assistant:

```shell
debug
```

Run with debugging:

```shell
debug <any-command>
```

Open browser UI:

```shell
debug studio
```

When you run `debug <command>` for the first time in a directory, OpenBug will:
- Prompt for a project description
- Create an `openbug.yaml` file
- Register the service with the local cluster
Example `openbug.yaml`:

```yaml
id: "my-api-service"
name: "api-service"
description: "Express API with PostgreSQL"
logs_available: true
code_available: true
```

On subsequent runs, OpenBug uses the existing configuration automatically.
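Since existing configuration is picked up automatically, you can pre-create the file to skip the interactive first-run prompt. A sketch using the field set from the example above (it assumes OpenBug treats a hand-written file the same as one it generated):

```shell
# Hypothetical: write openbug.yaml ahead of time so the first
# `debug <command>` in this directory finds it already configured.
cat > openbug.yaml <<'EOF'
id: "my-api-service"
name: "api-service"
description: "Express API with PostgreSQL"
logs_available: true
code_available: true
EOF

# Sanity-check the file we just wrote.
grep '^id:' openbug.yaml   # → id: "my-api-service"
```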
Override defaults by setting these in `~/.openbug/config` or as environment variables:

```shell
API_BASE_URL=https://api.oncall.build/v2/api
WEB_SOCKET_URL=wss://api.oncall.build/v2/ws
OPENBUG_CLUSTER_URL=ws://127.0.0.1:4466
```

Terminal UI:
- `Ctrl+C` → Exit
- `Ctrl+D` → Toggle chat/logs view
- `Ctrl+R` → Reconnect/reload
Browser UI:
- `Ctrl+O` → Toggle full/trimmed chat history
Current shortcuts are shown in the bottom status bar.
- Node.js 20+
- npm, yarn, or bun
For advanced usage, custom integrations, and troubleshooting:
The codebase is designed to be readable and hackable:
- Entry point: `bin/openbug.js`
- TUI components: `index.tsx` and `src/components/*`
- Utilities: `helpers/*` and `src/utils/*`
Pull requests and issues welcome!
