Connects to any Postgres database, captures slow queries via pg_stat_statements, runs EXPLAIN ANALYZE, and uses AI to suggest indexes, query rewrites, and schema changes — with confidence scores and ready-to-run SQL.
- Slow query detection via `pg_stat_statements`
- EXPLAIN ANALYZE runner with automatic `$1` placeholder substitution
- AI-powered suggestions: indexes, query rewrites, and schema changes with confidence scores
- Two AI backends: Anthropic Claude (paid) or Ollama (100% free, runs locally)
- HTML reports: dark-mode reports you can share
- Slack alerts: send analysis results to a Slack channel
- Zero agent: reads directly from Postgres system views
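The `$1` placeholder substitution step can be sketched roughly like this: `pg_stat_statements` stores normalized query text with `$1`, `$2`, ... in place of literals, so a sample value has to be swapped in before EXPLAIN ANALYZE can run. The helper below is a hypothetical illustration, not pgdoctor's actual code:

```python
import re

# Assumption: a naive constant substitute is enough for EXPLAIN to produce a plan;
# pgdoctor may pick smarter sample values based on column types.
def substitute_placeholders(query: str, sample: str = "1") -> str:
    """Replace $1, $2, ... with a sample literal so EXPLAIN ANALYZE can run."""
    return re.sub(r"\$\d+", sample, query)

print(substitute_placeholders("SELECT * FROM orders WHERE user_id = $1"))
# SELECT * FROM orders WHERE user_id = 1
```

A real implementation would also need type-appropriate samples (quoted strings, dates), since `1` is not valid for every parameter.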
```shell
pip install pgdoctor
```

Set your database connection:

```shell
export PGDOCTOR_DSN=postgresql://user:password@localhost:5432/mydb
```

```shell
# Install Ollama from ollama.com, then:
ollama pull llama3.2:3b

# Analyze slow queries
pgdoctor analyze-db --ollama

# Analyze a specific query
pgdoctor explain "SELECT * FROM orders WHERE user_id = 42" --ollama

# Generate HTML report
pgdoctor analyze-db --ollama --html --open

# Send to Slack
export SLACK_WEBHOOK_URL=https://hooks.slack.com/...
pgdoctor analyze-db --ollama --slack
```

To use Anthropic Claude instead:

```shell
export ANTHROPIC_API_KEY=sk-ant-...
pgdoctor analyze-db
```

| Command | Description |
|---|---|
| `pgdoctor connect` | Test the DB connection and verify `pg_stat_statements` |
| `pgdoctor top` | Show the slowest queries (no AI) |
| `pgdoctor analyze-db` | Fetch slow queries, run EXPLAIN, and get AI suggestions |
| `pgdoctor explain "SQL"` | Analyze one specific query |
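Because pgdoctor reads directly from Postgres system views, the data behind `pgdoctor top` is visible to any client. A sketch of the kind of query involved, assuming the stock `pg_stat_statements` columns on Postgres 13+ (`mean_exec_time` was `mean_time` before 13); the `fetch_top_queries` helper is hypothetical:

```python
# Assumption: stock pg_stat_statements column names (Postgres 13+).
TOP_QUERIES_SQL = """
SELECT query, calls, mean_exec_time, total_exec_time
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT %(limit)s;
"""

def fetch_top_queries(conn, limit=5):
    """Return the `limit` slowest normalized queries by mean execution time."""
    # Works with any DB-API connection, e.g. one from psycopg2.connect(dsn).
    with conn.cursor() as cur:
        cur.execute(TOP_QUERIES_SQL, {"limit": limit})
        return cur.fetchall()
```

The `query` column holds normalized text with `$1`-style placeholders, which is why the substitution step is needed before EXPLAIN ANALYZE.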
| Flag | Description |
|---|---|
| `--ollama` | Use the local Ollama model (free) |
| `--html` | Save results as an HTML report |
| `--open` | Open the HTML report in a browser |
| `--slack` | Send results to Slack |
| `--limit N` | Number of queries to analyze (default: 5) |
| `--no-ai` | Show EXPLAIN output only, skip AI |
| `--output FILE` | Custom HTML output path |
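As a rough illustration of how these flags fit together, here is a minimal `argparse` sketch; only the flag names and defaults come from the table above, and the actual CLI wiring inside pgdoctor is an assumption:

```python
import argparse

# Hypothetical reconstruction of the analyze-db subcommand's interface.
parser = argparse.ArgumentParser(prog="pgdoctor")
sub = parser.add_subparsers(dest="command")
analyze = sub.add_parser("analyze-db")
analyze.add_argument("--ollama", action="store_true", help="Use the local Ollama model (free)")
analyze.add_argument("--html", action="store_true", help="Save results as an HTML report")
analyze.add_argument("--open", action="store_true", help="Open the HTML report in a browser")
analyze.add_argument("--slack", action="store_true", help="Send results to Slack")
analyze.add_argument("--limit", type=int, default=5, metavar="N", help="Number of queries to analyze")
analyze.add_argument("--no-ai", action="store_true", help="Show EXPLAIN output only, skip AI")
analyze.add_argument("--output", metavar="FILE", help="Custom HTML output path")

args = parser.parse_args(["analyze-db", "--ollama", "--limit", "10"])
```

Here `args.ollama` is `True` and `args.limit` is `10`; unset boolean flags default to `False`.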
Add to `postgresql.conf`:

```
shared_preload_libraries = 'pg_stat_statements'
```

Restart Postgres, then:

```sql
CREATE EXTENSION pg_stat_statements;
```

| Variable | Description |
|---|---|
| `PGDOCTOR_DSN` | Postgres connection string |
| `ANTHROPIC_API_KEY` | Anthropic API key (optional) |
| `OLLAMA_MODEL` | Ollama model name (default: `llama3.2:3b`) |
| `OLLAMA_HOST` | Ollama host (default: `http://localhost:11434`) |
| `SLACK_WEBHOOK_URL` | Slack incoming webhook URL |
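These variables can be read with plain `os.environ` lookups; a minimal sketch using the documented defaults (the `load_config` helper itself is hypothetical):

```python
import os

def load_config(env=os.environ):
    """Collect pgdoctor settings from the environment, applying documented defaults."""
    return {
        "dsn": env.get("PGDOCTOR_DSN"),                                  # required for DB access
        "anthropic_api_key": env.get("ANTHROPIC_API_KEY"),               # optional
        "ollama_model": env.get("OLLAMA_MODEL", "llama3.2:3b"),
        "ollama_host": env.get("OLLAMA_HOST", "http://localhost:11434"),
        "slack_webhook_url": env.get("SLACK_WEBHOOK_URL"),               # optional
    }
```

With an empty environment, only the two Ollama settings fall back to defaults; everything else is `None`.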
MIT