waford is a zero-dependency, high-throughput webhook fan-out service written in Go.
It securely receives a single incoming webhook payload and asynchronously distributes it to multiple downstream destinations while natively handling partial failures, system backpressure, and graceful degradation.
This is an internal micro tool designed to solve a specific problem: most webhook providers only allow a single endpoint registration, but sometimes you want to deliver that webhook event to multiple endpoints, services, apps, or users. waford lets you fan out a single webhook across multiple destinations with a clean asynchronous pipeline.
📖 Read the full architectural deep-dive and build journey here!
- Micro-Job Fan-Out: Ingress payloads are instantly split into isolated micro-jobs, ensuring that a slow destination never blocks the delivery of a healthy destination.
- Exponential Backoff with Full Jitter: Protects recovering downstream servers from "Thundering Herd" DDoS attacks by mathematically desynchronizing retry attempts.
- Dead Letter Queue (DLQ): Jobs that exhaust their retry limits are safely flushed to a thread-safe `.jsonl` file on disk, capturing the exact `last_error` for later debugging.
- Load Shedding & Backpressure: Protects its own memory. When internal queues reach capacity, waford shifts from blocking connections to instantly shedding load with `HTTP 429 Too Many Requests`.
- Graceful Context Shutdowns: Uses `select` statements and `context.Context` to ensure background workers cleanly abort sleeping timers and flush buffers to disk on `SIGTERM`, guaranteeing zero panics and no lost data.
- Go 1.26 or higher
- Clone the repository:

```sh
git clone https://github.com/segfaultscribe/waford
cd waford
```

- Install dependencies:

```sh
go mod tidy
```

- Start the server:

```sh
go run main.go
```

waford exposes a single, lightning-fast ingress endpoint. It accepts your payload, drops it into the internal channel buffers, and instantly returns a `202 Accepted` to free up the client.
Send a Webhook:
```sh
curl -X POST http://localhost:3000/ingress \
  -H "Content-Type: application/json" \
  -d '{"event": "user.signup", "user_id": "12345", "plan": "pro"}'
```

You should see the delivery logs appear on your console.
NOTE: On Windows you might want to create a `.json` file (e.g. `payload.json`) and use it instead:

```powershell
curl -X POST http://localhost:3000/ingress `
  -H "Content-Type: application/json" `
  --data-binary "@payload.json"
```

waford is built to push the underlying OS's TCP limits. In local stress tests using `hey`, waford dynamically balanced background fan-out processing with active load shedding.
A basic test was run against a downstream chaos server, designed to be probabilistically hostile:

- 33% of the time: return `200 OK` instantly (simulates success)
- 33% of the time: return `500 Internal Server Error` (simulates failure)
- 34% of the time: sleep for 6 seconds and return nothing (simulates a slow downstream)
Configuration: Buffer: 100 | Workers: 100 fresh / 10 retry / 1 DLQ
Load: 5,000 concurrent webhooks (15,000 internal jobs) fired in 1 second across 100 concurrent connections using `hey`.
- Average ingress latency: 0.02 seconds
- Success rate: 100% (a mix of `202 Accepted` for buffered jobs and `429 Too Many Requests` for load successfully shed to prevent OOM)
Configuration: Buffer: 20,000 | Workers: 1,000 fresh / 200 retry / 1 DLQ
Load: 20,000 webhooks across 500 concurrent connections.
- OS bottleneck hit: Windows actively blocked ~1,700 requests due to ephemeral port exhaustion (`connectex` error) before they ever reached waford.
- waford flawlessly ingested the surviving ~18,200 requests in just 4.9 seconds (~4k RPS).
- It maintained an average ingress latency of 0.12 seconds under extreme duress.
NOTE: Testing on a proper Linux server is pending; the tests above were run on a Windows machine.