Turn any video – or your webcam feed – into a pulsing network of labelled squares that dance to the beat.
• Beat-driven spawning of squares with ORB keypoints
• LK optical-flow tracking with subtle jitter for organic motion
• Optional ambient "noise" spawns so visuals never fall silent
• Neighbor-link edges for a living graph aesthetic
• Per-square color-inversion, random alphanumeric labels, vertical text option
Requirements: Python 3.9+, FFmpeg on PATH.
python -m venv .venv
Windows: .venv\Scripts\activate
macOS/Linux: source .venv/bin/activate
pip install -r requirements.txt

python main.py \
--input sample_data/playing_dead.mp4 \
--output output/playing_dead_boxes.mp4 \
--life-frames 10 \
--pts-per-beat 20 \
--ambient-rate 5.0 \
--jitter-px 0.5 \
--neighbor-links 3

How it works:
- Extract the audio track and detect onsets with librosa.
- At each onset, ORB keypoints are sampled; a subset spawns squares.
- Squares are tracked across frames with Lucas-Kanade optical flow; small Gaussian jitter adds life.
- Edges connect each square to its nearest neighbors.
- Squares invert colors within their bounds, display a random label, and expire after life_frames.
- Ambient Poisson spawns keep visuals active during silence.
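The onset step above can be sketched as follows. The librosa calls match its public API; `detect_onset_frames` and `times_to_frames` are illustrative names, not the project's actual identifiers:

```python
# Sketch: map audio onsets to video frame indices (assumed helper names).
import numpy as np

def times_to_frames(onset_times, video_fps):
    """Map onset times (seconds) to nearest video frame indices, deduplicated."""
    return sorted({int(round(t * video_fps)) for t in onset_times})

def detect_onset_frames(audio_path, video_fps):
    """Decode audio and return the frame indices where onsets occur."""
    import librosa  # assumption: listed in requirements.txt
    y, sr = librosa.load(audio_path)
    onset_times = librosa.onset.onset_detect(y=y, sr=sr, units="time")
    return times_to_frames(onset_times, video_fps)
```

Each returned frame index is where a batch of ORB keypoints would be sampled and squares spawned.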
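The per-frame tracking step (Lucas-Kanade flow plus Gaussian jitter, as controlled by --jitter-px) could look like this. `cv2.calcOpticalFlowPyrLK` is the standard OpenCV call; the surrounding function names are assumptions:

```python
# Sketch: track square anchor points between frames, then add subtle jitter.
import numpy as np

def add_jitter(points, jitter_px, rng=None):
    """Add zero-mean Gaussian jitter (std = jitter_px) to an (N, 2) point array."""
    rng = rng or np.random.default_rng()
    return points + rng.normal(0.0, jitter_px, size=points.shape)

def track_points(prev_gray, gray, points, jitter_px=0.5):
    """Track points between two grayscale frames, keeping only good matches."""
    import cv2  # assumption: opencv-python is in requirements.txt
    p0 = points.astype(np.float32).reshape(-1, 1, 2)
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
    good = p1[status.ravel() == 1].reshape(-1, 2)  # drop lost points
    return add_jitter(good, jitter_px)
```

Squares whose points are lost by the tracker simply disappear, alongside the normal life_frames expiry.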
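Finally, the neighbor-link edges (k = --neighbor-links) can be derived from pairwise distances between square centers; this brute-force NumPy version is a sketch, and the real renderer may use a spatial index instead:

```python
# Sketch: connect each square center to its k nearest neighbors (undirected).
import numpy as np

def neighbor_links(centers, k):
    """Return sorted undirected edges (i, j) with i < j for k-nearest neighbors."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # a point is not its own neighbor
    edges = set()
    for i, row in enumerate(d):
        for j in np.argsort(row)[:k]:
            edges.add((min(i, int(j)), max(i, int(j))))
    return sorted(edges)
```

Deduplicating via (min, max) tuples keeps each edge drawn once even when two squares pick each other.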
Effect_v1
├── main.py # offline renderer
├── sample_data/ # demo video
├── output/ # rendered results go here
└── requirements.txt