A secure web application for tracking and managing leaked secrets from TruffleHog scans. This application helps security teams manage the lifecycle of leaked secrets by ingesting scan results, deduplicating findings, and tracking remediation status.
- Docker and Docker Compose installed
# 1. Start the application
docker compose up
# 2. Open your browser
# http://localhost:5000
# 3. Create your account
# Click "Sign up here" on the login page
# 4. Upload a test scan
# Use a file from the scans/ directory
That's it! Your secret tracker is running. 🎉
- 📤 File Upload: Web-based TruffleHog JSONL scan ingestion
- 🔍 Smart Deduplication: Content-based secret detection (SHA-256)
- 📊 Visual Dashboard: Real-time statistics and filtering
- 🔄 Status Tracking: New → In Progress → Resolved → False Positive
- ✨ Auto-Resolution: Automatically detects when secrets are remediated
- 🔐 Authentication: Secure user accounts with session management
- 👥 User Isolation: Each user sees only their own data
- 🏢 Multi-Repository: Track secrets across your entire organization
- 📝 Detailed Views: Complete finding history with file paths and timestamps
- 🎭 Secret Masking: Secrets hidden by default, explicit reveal required
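The masking behavior above can be sketched as a small helper (a hypothetical `mask_secret`, not necessarily the app's actual function):

```python
def mask_secret(value: str, visible: int = 4) -> str:
    """Mask a secret, keeping only the last few characters visible."""
    if len(value) <= visible:
        return "*" * len(value)
    return "*" * (len(value) - visible) + value[-visible:]
```

The dashboard would render the masked form by default and only fetch the full value on an explicit reveal action.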
- Architecture
- Usage Guide
- Configuration
- Security
- API Endpoints
- Development
- Troubleshooting
- Production Deployment
- Backend: Flask 3.0 (Python)
- Database: PostgreSQL 15 (Docker) / SQLite (local development)
- ORM: SQLAlchemy 2.0
- Frontend: HTML/CSS/JavaScript (Vanilla JS)
- Deployment: Docker + Docker Compose
- Security: bcrypt password hashing, secure sessions, OWASP compliant
┌─────────────────┐
│ Web Browser │
│ (Port 5000) │
└────────┬────────┘
│ HTTPS (in production)
↓
┌─────────────────┐
│ Flask App │
│ - Upload │
│ - Dashboard │
│ - Auth │
│ - API │
└────────┬────────┘
│ SQLAlchemy ORM
↓
┌─────────────────┐
│ PostgreSQL │
│ - Users │
│ - Secrets │
│ - Findings │
│ - Scans │
└─────────────────┘
- Users: Authentication and user management
- Secrets: Unique secrets (deduplicated by content hash)
- Findings: Individual occurrences of secrets in scans
- Scans: Upload history and metadata
- RepositoryShare: Future sharing feature (planned)
- Start the application with docker compose up
- Navigate to http://localhost:5000
- Click "Sign up here"
- Enter username and password (twice for confirmation)
- Log in with your new credentials
- Click "📤 Upload Scan" on the dashboard
- Select a TruffleHog JSONL file
- Click "Upload & Process"
- The system will:
- Parse all findings from the file
- Deduplicate secrets based on content (SHA-256)
- Update first/last seen timestamps
- Check for auto-resolved secrets
- Associate all data with your user account
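The ingestion steps above can be sketched roughly as follows. Field names like `Raw` and `DetectorName` follow TruffleHog's JSON output, but treat the exact schema (and this helper) as an illustrative assumption, not the app's actual code:

```python
import hashlib
import json


def ingest_jsonl(lines):
    """Parse JSONL findings and deduplicate secrets by SHA-256 of content."""
    secrets = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        finding = json.loads(line)
        raw = finding.get("Raw", "")
        digest = hashlib.sha256(raw.encode("utf-8")).hexdigest()
        # One entry per unique secret; each occurrence becomes a finding
        entry = secrets.setdefault(
            digest, {"detector": finding.get("DetectorName"), "findings": []}
        )
        entry["findings"].append(finding)
    return secrets
```

Two findings with the same raw content hash to the same key, so they collapse into one secret with two findings.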
- Dashboard shows all your secrets with filters
- Filter by status (New, In Progress, Resolved, False Positive)
- Filter by repository
- View statistics in real-time
Click "View" on any secret to see:
- Secret value (masked by default, reveal when needed)
- All findings (every location where it was found)
- File paths and line numbers
- Scan timestamps (first seen / last seen)
- Repository information
- Verification status from TruffleHog
- Click "View" on a secret
- Select new status from dropdown
- Changes are logged for audit trail
New → In Progress → Resolved
↓
False Positive
- New: Secret just discovered
- In Progress: Remediation work underway
- Resolved: Secret has been remediated (manually or auto-resolved)
- False Positive: Not a real secret
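The flow above can be expressed as an allowed-transitions table. This is a sketch of the pictured flow only; the application may permit other transitions:

```python
# Allowed status transitions (illustrative, based on the flow diagram)
ALLOWED = {
    "New": {"In Progress", "Resolved", "False Positive"},
    "In Progress": {"Resolved", "False Positive"},
    "Resolved": set(),
    "False Positive": set(),
}


def can_transition(current: str, new: str) -> bool:
    """Return True if moving from `current` to `new` follows the flow."""
    return new in ALLOWED.get(current, set())
```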
The system automatically marks secrets as "Resolved" when:
- A new scan is uploaded for a repository
- A secret from previous scans no longer appears
- The secret's status is still "New" or "In Progress"
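Those three conditions amount to a set difference between scans. A minimal sketch, assuming secrets are keyed by their content hash:

```python
def auto_resolve(previous_hashes, current_hashes, statuses):
    """Return hashes to mark Resolved: seen before, absent now, still open."""
    open_states = {"New", "In Progress"}
    return {
        h
        for h in previous_hashes - current_hashes  # disappeared this scan
        if statuses.get(h) in open_states          # and not already closed
    }
```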
Example Workflow:
# Upload initial scan
# - Upload scans/router/trufflehog_scan_2024-06-23.jsonl
# - 10 secrets found, all marked "New"
# Work on remediation
# - Mark some as "In Progress"
# - Remove secrets from code
# Upload new scan
# - Upload scans/router/trufflehog_scan_2024-09-09.jsonl
# - Remediated secrets automatically marked "Resolved"
# - Remaining secrets stay "In Progress"
Sample scan files are included in the scans/ directory:
- Initial Test: scans/client/trufflehog_scan_2024-11-27.jsonl
  - 13 unique secrets
  - Various types (AWS, GitHub, Stripe, etc.)
- Auto-Resolution Test:
  - Upload scans/router/trufflehog_scan_2024-06-23.jsonl
  - Upload scans/router/trufflehog_scan_2024-09-09.jsonl
  - Watch auto-resolution in action!
- Multi-Repository Test:
  - Upload scans from different subdirectories
  - Use repository filter to view each separately
Configure in docker-compose.yml or create a .env file:
# Database
DATABASE_URL=postgresql://user:pass@db:5432/secrets_db
# Security (REQUIRED for production)
SECRET_KEY=your-cryptographically-secure-secret-key-here
# Flask Environment
FLASK_ENV=production # Enables secure cookies, disables debug
# Session Configuration
SESSION_TIMEOUT=3600 # Session timeout in seconds (default: 1 hour)
Generate a secure SECRET_KEY with:
python -c "import secrets; print(secrets.token_hex(32))"
The application listens on http://localhost:5000 by default.
To change the port, edit docker-compose.yml:
ports:
  - "8080:5000" # Change 5000 to your preferred port
Docker (default):
- PostgreSQL 15 with persistent volumes
- Data survives container restarts
- Located in the postgres_data volume
Local development:
- SQLite database (secrets.db)
- No Docker required
- Use python app.py to run locally
- Authentication & Authorization
- Session-based authentication
- Secure cookies (HttpOnly, SameSite=Lax)
- HTTPS-only cookies in production
- Configurable session timeouts (default: 1 hour)
- Password hashing with bcrypt
- No default credentials
- Data Protection
- User data isolation (per-user access control)
- Secrets masked by default
- Explicit reveal action required
- SQL injection prevention (SQLAlchemy ORM)
- XSS prevention (Jinja2 auto-escaping + CSP headers)
- Input sanitization (username, control characters)
- JSONL structure validation before processing
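The JSONL structure validation could look roughly like this (a sketch; the required fields and limits the app actually enforces are an assumption):

```python
import json


def validate_jsonl(text, max_lines=100_000):
    """Check that every non-empty line is a JSON object before processing."""
    for lineno, line in enumerate(text.splitlines(), start=1):
        if not line.strip():
            continue  # tolerate blank lines
        if lineno > max_lines:
            return False, f"line {lineno}: too many lines"
        try:
            obj = json.loads(line)
        except ValueError:
            return False, f"line {lineno}: invalid JSON"
        if not isinstance(obj, dict):
            return False, f"line {lineno}: expected a JSON object"
    return True, "ok"
```

Rejecting malformed files up front keeps garbage out of the database and gives the uploader a line-level error.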
- Container Security
- Non-root user (appuser, UID 1000)
- Minimal base image (python:3.11-slim)
- Health checks configured
- No SSH access
- CIS Docker Benchmark compliant
- Application Security
- Rate limiting (Flask-Limiter)
- Login: 10/minute
- Signup: 10/hour
- Uploads: 100/hour
- Security headers (CSP, HSTS, X-Frame-Options, etc.)
- File type validation (.jsonl only)
- File size limits (16MB maximum)
- Secure filename handling
- Comprehensive audit logging
- Environment-based configuration
- Debug mode disabled in production
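For intuition, the kind of rate limiting Flask-Limiter provides can be sketched generically as a token bucket (this is a standalone illustration, not the app's Flask-Limiter configuration):

```python
import time


class TokenBucket:
    """Generic rate limiter: allow `rate` calls per `per` seconds."""

    def __init__(self, rate: int, per: float):
        self.rate = rate
        self.per = per
        self.allowance = float(rate)   # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the bucket size
        self.allowance = min(
            self.rate, self.allowance + (now - self.last) * self.rate / self.per
        )
        self.last = now
        if self.allowance < 1.0:
            return False  # rate exceeded
        self.allowance -= 1.0
        return True
```

With `TokenBucket(10, 60)` per client IP you would get roughly the "10/minute" login limit listed above.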
✅ Dependency Vulnerabilities: None (Snyk scan)
✅ Container Security: 77/77 checks passed (Checkov)
✅ OWASP Top 10: Compliant
See SECURITY.md for comprehensive security documentation and scan results.
Before deploying to production:
- Generate Secure Credentials (CRITICAL)
  python -c "import secrets; print(secrets.token_hex(32))"
- Enable HTTPS (CRITICAL)
- Deploy behind reverse proxy (Nginx, Traefik)
- Obtain SSL/TLS certificates
- Set
FLASK_ENV=production
- Encrypt Secrets at Rest (Recommended)
- Application-level encryption, or
- Database-level encryption, or
- Use key management service (AWS KMS, Azure Key Vault)
- Additional Security (Recommended)
- Implement 2FA
- Add rate limiting
- Set up monitoring and alerting
- Configure automated backups
- Regular security updates
See SECURITY.md for comprehensive production deployment requirements and security checklist.
All endpoints require authentication (except /login and /signup).
- POST /login - User login
- POST /signup - Create new account
- GET /logout - User logout
- GET /api/secrets - List all secrets (with optional filters)
  - Query params: status, repository
- GET /api/secrets/<id> - Get secret details with all findings
- PUT /api/secrets/<id>/status - Update secret status
  - Body: {"status": "In Progress"}
- POST /api/upload - Upload and process scan file
  - Form data: file (JSONL file)
- GET /api/scans - Get scan history
- GET /api/stats - Get dashboard statistics
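As an illustration, a status update against PUT /api/secrets/&lt;id&gt;/status can be built with the standard library. This sketch only constructs the request; sending it would additionally require the session cookie obtained from /login:

```python
import json
import urllib.request


def build_status_request(base_url: str, secret_id: int, status: str) -> urllib.request.Request:
    """Build (but don't send) a PUT request updating a secret's status."""
    body = json.dumps({"status": status}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/secrets/{secret_id}/status",
        data=body,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )
```

Once a session cookie header is attached, `urllib.request.urlopen(req)` would perform the call.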
Success responses:
{
"success": true,
"data": { ... },
"message": "Operation completed"
}
Error responses:
{
"success": false,
"error": "Error message"
}
# Install dependencies
pip install -r requirements.txt
# Create database
python recreate_db.py
# Run application
python app.py
# Access at http://localhost:5000
secret-remediation-tracker/
├── app.py # Main Flask application
├── init_db.py # Database initialization (Docker)
├── recreate_db.py # Database reset (local dev)
├── requirements.txt # Python dependencies
│
├── templates/ # HTML templates
│ ├── base.html # Base layout with navigation
│ ├── login.html # Login page
│ ├── signup.html # Signup page
│ └── dashboard.html # Main dashboard
│
├── static/ # Static assets
│ ├── css/
│ │ └── style.css # Application styles
│ └── js/
│ ├── main.js # Common utilities
│ └── dashboard.js # Dashboard functionality
│
├── scans/ # Sample TruffleHog scan files
│ ├── client/
│ ├── router/
│ └── server/
│
├── tests/ # Test suite
│ ├── conftest.py # Test fixtures
│ ├── test_auth.py # Authentication tests
│ ├── test_api.py # API endpoint tests
│ ├── test_models.py # Database model tests
│ └── test_security.py # Security feature tests
│
├── Dockerfile # Container image definition
├── docker-compose.yml # Multi-service orchestration
├── env.example # Environment variable template
├── .checkov.yaml # Security scan configuration
├── run_tests.sh # Test runner (Linux/Mac)
├── run_tests.ps1 # Test runner (Windows)
│
├── README.md # This file
├── SECURITY.md # Security documentation & scan results
└── PROJECT_SUMMARY.md # Technical overview
The project includes a comprehensive test suite to validate core functionality.
Quick Start:
# Linux/Mac
chmod +x run_tests.sh
./run_tests.sh
# Windows
.\run_tests.ps1
Manual Testing:
# Install test dependencies
pip install -r requirements.txt
# Run all tests
pytest -c tests/pytest.ini tests/
# Run with coverage report
pytest -c tests/pytest.ini tests/
# Then open: tests/htmlcov/index.html
# Run specific test file
pytest -c tests/pytest.ini tests/test_auth.py
# Run tests matching pattern
pytest -c tests/pytest.ini tests/ -k "test_login"
Test Coverage:
- ✅ Authentication (login, signup, logout)
- ✅ API endpoints (secrets, upload, stats)
- ✅ Database models and relationships
- ✅ Security features (sanitization, headers, isolation)
- ✅ Input validation
See tests/README.md for detailed testing documentation.
# Dependency vulnerabilities
snyk test --file=requirements.txt
# Infrastructure security
checkov -d . --compact
# Full security scan
snyk test && checkov -d .
# Check application logs
docker compose logs -f web
# Check database
docker compose exec db psql -U secretuser -d secrets_db
# Access database shell (local)
sqlite3 secrets.db
# Reset local database
python recreate_db.py
# Backup database (Docker)
docker compose exec db pg_dump -U secretuser secrets_db > backup.sql
# Restore database (Docker)
docker compose exec -T db psql -U secretuser secrets_db < backup.sql
Port already in use:
# Windows
netstat -ano | findstr :5000
# Linux/Mac
lsof -i :5000
# Change port in docker-compose.yml
ports:
  - "8080:5000"
Database connection errors:
# Wait longer - PostgreSQL takes ~10 seconds to initialize
# Check database logs
docker compose logs db
# Verify database is ready
docker compose exec db pg_isready -U secretuser
Upload failures:
- Ensure the file ends with .jsonl
- Check file size (maximum 16MB)
- Verify JSONL format (one valid JSON object per line)
- Check application logs: docker compose logs web
Login issues:
- Verify username and password
- Create a new account at /signup if needed
- Check logs for authentication errors: docker compose logs web
If you see "table has no column named user_id":
# For Docker
docker compose down -v
docker compose up
# For local development
python recreate_db.py
# Check container status
docker compose ps
# View detailed logs
docker compose logs -f
# Restart services
docker compose restart
# Full reset
docker compose down -v
docker compose up
- Generate a secure SECRET_KEY
- Change database password
- Set FLASK_ENV=production
- Configure HTTPS reverse proxy
- Set up database backups
- Configure monitoring and alerting
- Review SECURITY.md for hardening
- Test with production-like data
- Scan for vulnerabilities (Snyk, Checkov)
# 1. Update configuration
# Edit docker-compose.yml with production values
# 2. Start services
docker compose up -d
# 3. Verify services
docker compose ps
docker compose logs
# 4. Create first user account
# Navigate to /signup
# 5. Upload test scan
# Verify functionality
# 6. Set up backups
# Configure automated database backups
Monitor these metrics:
- Application health (Docker health checks)
- Failed login attempts
- Upload failures
- Database performance
- CPU and memory usage
# Daily automated backup (custom format, so pg_restore can verify it)
docker compose exec db pg_dump -Fc -U secretuser secrets_db > backup_$(date +%Y%m%d).dump
# Verify backup (pg_restore --list only works on custom/tar-format dumps)
pg_restore --list backup_$(date +%Y%m%d).dump
# Store backups securely off-site
See SECURITY.md for production deployment requirements and operational procedures.
- SECURITY.md - Comprehensive security documentation, scan results, and production deployment guide
- PROJECT_SUMMARY.md - Technical overview and architecture
- env.example - Environment variable template
This is an MVP implementation. Contributions welcome!
- Multi-factor authentication (2FA)
- Role-based access control (RBAC)
- Repository sharing between users
- Email notifications for new secrets
- Slack/Teams integrations
- API authentication tokens
- Bulk status updates
- Advanced search and filtering
- Secret rotation tracking
- Compliance reporting
- Integration with ticketing systems
This is an MVP implementation for evaluation purposes.
For issues or questions:
- Check this README
- Review SECURITY.md for security questions
- Check logs: docker compose logs -f
- Run the verification script: verify_setup.ps1 (Windows) or verify_setup.sh (Linux/Mac)
- See SECURITY.md for production deployment troubleshooting
Built with security, quality, and user experience in mind. 🔒
Ready to deploy and use immediately ✅