# knowever

RSS -> AI -> email digest pipeline as a tiny CLI.
- Overview
- Features
- Requirements
- Installation
- Configuration
- RSS sources
- Profiles
- Usage
- Data locations
- Development
- Troubleshooting
## Overview

knowever pulls RSS feeds, filters and scores entries, optionally enriches them with Codex/OpenAI, then ships a daily digest (or individual emails) via SMTP. It is designed to be crontab/systemd friendly and to keep all state inside the repo directory.
## Features

- Parallel feed download with de-duplication and similarity filtering.
- Scoring based on recency, engagement signals, and keyword hints from profiles.
- Optional AI enrichment (Codex/OpenAI) with a safe HTML email template.
- Digest or per-entry sending modes with configurable worker counts.
- Cache for failing domains to avoid hammering broken sources.
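The recency-plus-keywords scoring mentioned above can be sketched roughly. The function below is a hypothetical illustration: the field names, decay window, and weights are assumptions, not knowever's actual implementation.

```python
from datetime import datetime, timezone

# Hypothetical scoring sketch: recency decay plus keyword hints.
# Field names and weights are illustrative, not knowever's real code.
def score_entry(entry: dict, positive: list[str], negative: list[str]) -> float:
    age_hours = (datetime.now(timezone.utc) - entry["published"]).total_seconds() / 3600
    score = max(0.0, 1.0 - age_hours / 48)  # linear decay over two days
    text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
    score += 0.2 * sum(kw in text for kw in positive)  # reward positive hints
    score -= 0.3 * sum(kw in text for kw in negative)  # penalize negative hints
    return score
```

Profiles (see below) supply the keyword lists; entries below the profile's threshold are dropped.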
## Requirements

- Python 3.10+
- uv installed
## Installation

Install uv (Linux/macOS example):

```sh
curl -LsSf https://astral.sh/uv/install.sh | sh
```

On Windows/WSL follow the instructions in the uv repository.
From the project root:

```sh
uv sync
```

## Configuration

Create `.env` (or copy `.env.example`) and set SMTP and pipeline knobs. Key variables:

```ini
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=your_smtp_login
SMTP_PASS=your_smtp_password_or_app_password
SMTP_FROM="Your Name <you@example.com>"
SMTP_TO=recipient@example.com
MAX_POSTS_PER_DAY=10
SEND_MODE=digest            # or individual
INCLUDE_AI_CONTENT=true     # false -> summary/link without AI content
CLEAR_BUFFER_AFTER_SEND=true
AUTO_SEND_DIGEST=false
FEED_DOWNLOAD_WORKERS=4
PROCESS_WORKERS=2
SEND_WORKERS=3
FAIL_TTL_SECONDS=86400      # seconds; TTL for domain error cache
PROFILE_NAME=default
ENTRY_SCORE=0.0
LOG_LEVEL=INFO
```

## RSS sources

List your feeds in `sources.yaml`:
```yaml
- name: Hacker News Frontpage
  url: https://hnrss.org/frontpage
- name: Dev.to Top
  url: https://dev.to/feed
```

## Profiles

Copy `profile.example.yaml` to `profile.yaml` and adjust:
- `min_score`: threshold to keep an entry
- `max_per_source`: per-source daily cap
- `keywords_positive` / `keywords_negative`: scoring hints
- `send_time`: used in digest metadata
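A minimal `profile.yaml` covering the keys above might look like this (all values are illustrative, not defaults):

```yaml
min_score: 0.5                 # drop entries scoring below this
max_per_source: 3              # per-source daily cap
keywords_positive: [rust, compilers, databases]
keywords_negative: [webinar, sponsored]
send_time: "08:00"             # recorded in digest metadata
```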
## Usage

After `uv sync`, commands are available via `uv run -m knowever.cli ...` (or `./run.sh` for the full pipeline).

Common commands:
```sh
# Full pipeline with day.lock guard (once per day)
PYTHONPATH=src uv run -m knowever.cli run

# Download feeds only
PYTHONPATH=src uv run -m knowever.cli download

# Select + AI -> daily_buffer.jsonl
PYTHONPATH=src uv run -m knowever.cli process

# Send digest or individual emails
PYTHONPATH=src uv run -m knowever.cli send

# Clear domain error cache
PYTHONPATH=src uv run -m knowever.cli purge-cache

# Mark all entries in feeds/ as processed
PYTHONPATH=src uv run -m knowever.cli mark-all

# Inspect today's buffer (default limit 5)
PYTHONPATH=src uv run -m knowever.cli show-buffer --limit 5
```

If you prefer an editable install: `uv pip install -e .`, then call `knowever ...` directly.
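Because `run` is guarded by `day.lock`, a plain cron entry is enough for daily operation. The schedule and install path below are assumptions to adapt:

```cron
# Hypothetical crontab entry: run the full pipeline every morning at 07:30.
# Adjust the path to wherever you cloned the repo.
30 7 * * * cd /path/to/knowever && PYTHONPATH=src uv run -m knowever.cli run >> logs/cron.log 2>&1
```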
## Data locations

- Feeds: `feeds/*.jsonl`
- Digest buffer: `daily_buffer.jsonl`
- Processing history: `feeds_process_history.jsonl`
- Domain error cache: `tmp/fetch_failures.json`
- AI prompt: `prompt.md`
- Logs: `logs/`
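All of these files are plain JSON Lines or JSON, so they are easy to inspect without the CLI. A short sketch for peeking at the digest buffer (the `title`/`link` field names are assumptions about the buffer schema):

```python
import json
from pathlib import Path

# Read up to `limit` entries from a JSON Lines buffer file.
# Returns [] if the file does not exist yet.
def peek_buffer(path: str = "daily_buffer.jsonl", limit: int = 5) -> list[dict]:
    buf = Path(path)
    if not buf.exists():
        return []
    entries = []
    with buf.open(encoding="utf-8") as fh:
        for i, line in enumerate(fh):
            if i >= limit:
                break
            entries.append(json.loads(line))
    return entries

for e in peek_buffer():
    print(e.get("title", "?"), "->", e.get("link", "?"))
```

For day-to-day use, prefer the built-in `show-buffer` command.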
## Development

- Source lives in `src/knowever/`.
- Keep `PYTHONPATH=src` when running commands locally.
- Logging level can be tuned via `LOG_LEVEL` in `.env`.
## Troubleshooting

- Nothing sent today? Check `daily_buffer.jsonl` and `logs/knowever.log`.
- Hitting the daily cap: raise `MAX_POSTS_PER_DAY` or lower `min_score` in `profile.yaml`.
- Repeated domain failures: delete `tmp/fetch_failures.json` or run `purge-cache`.
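The domain error cache behaves like a simple TTL map: a failing domain is skipped until `FAIL_TTL_SECONDS` elapse, which is why deleting the file (or running `purge-cache`) unblocks a source immediately. A rough sketch of the idea; the real on-disk format of `tmp/fetch_failures.json` may differ:

```python
import json
import time
from pathlib import Path

# Sketch of a TTL-based failure cache keyed by domain.
# The actual knowever file format may differ from this illustration.
class FailureCache:
    def __init__(self, path: str, ttl_seconds: int = 86400):
        self.path = Path(path)
        self.ttl = ttl_seconds
        # Load previous failures if the cache file exists.
        self.failures = json.loads(self.path.read_text()) if self.path.exists() else {}

    def mark_failed(self, domain: str) -> None:
        # Record the failure timestamp and persist to disk.
        self.failures[domain] = time.time()
        self.path.write_text(json.dumps(self.failures))

    def should_skip(self, domain: str) -> bool:
        # Skip a domain only while its failure is younger than the TTL.
        ts = self.failures.get(domain)
        return ts is not None and (time.time() - ts) < self.ttl
```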