## Feature Description

Introduce a `RequestContext` object carrying `organizationId`, `projectId`, `userId`, `apiKeyId`, and `requestId` through the full request lifecycle: Fastify handlers, BullMQ job payloads, Kysely query builders, and outbound integrations. The context is created once at request entry (HTTP route or job consumer) and propagated implicitly via `AsyncLocalStorage` rather than threaded through every function signature.
## Problem/Use Case

Several existing and upcoming features need consistent access to "who is doing what right now":

- Audit logging needs actor + target on every mutation
- Per-tenant rate limiting needs the current org at any layer
- Debugging multi-tenant issues requires a request id correlated across logs, jobs, and database queries
- Future quota enforcement hooks need the org context at ingestion, query, and alert evaluation time

Today this information is partially available in Fastify request objects, partially passed as function arguments, and largely missing in BullMQ workers. Adding context retroactively to every callsite is painful; doing it once at v1.0 is much cheaper.
## Proposed Solution

A `context` module backed by `AsyncLocalStorage`:

```ts
// At HTTP entry (Fastify hook)
context.run({ organizationId, projectId, userId, apiKeyId, requestId }, async () => {
  await handler()
})

// Anywhere downstream
const ctx = context.current()           // typed RequestContext, throws if missing
const maybe = context.currentOrNull()   // returns undefined if outside a context
```
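
For concreteness, a minimal sketch of what the `context` module could look like on top of Node's built-in `AsyncLocalStorage`. Field optionality, the error message, and the object-literal export shape are assumptions here, not the final API:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks'

// Hypothetical shape; field names match the proposal above.
export interface RequestContext {
  organizationId: string
  projectId: string
  userId?: string
  apiKeyId?: string
  requestId: string
}

const storage = new AsyncLocalStorage<RequestContext>()

export const context = {
  // Establish a context for everything (sync or async) executed inside fn.
  run<T>(ctx: RequestContext, fn: () => T): T {
    return storage.run(ctx, fn)
  },
  // Strict accessor: throws if called outside context.run().
  current(): RequestContext {
    const ctx = storage.getStore()
    if (!ctx) throw new Error('RequestContext missing: code path entered outside context.run()')
    return ctx
  },
  // Lenient accessor for code that may legitimately run outside a request.
  currentOrNull(): RequestContext | undefined {
    return storage.getStore()
  },
}
```

Because `AsyncLocalStorage` follows the async continuation, every `await` inside `run()` still sees the same store, which is what makes the implicit propagation safe.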

Integration points:

- Fastify: a global `onRequest` hook resolves the context from auth and wraps the handler
- BullMQ producers: serialize the current context into the job payload's `_ctx` field
- BullMQ consumers: a wrapper deserializes `_ctx` and runs the worker inside `context.run()`
- Kysely: a plugin reads the request id from context and adds it as a SQL comment for query log correlation
- Outbound HTTP (webhooks, integrations): inject an `X-Logtide-Request-Id` header automatically
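
The two BullMQ bullets can be sketched as a pair of small wrappers. The names (`withCtx`, `contextAware`, `_ctx`) and the local `storage` instance are illustrative; in the real codebase the storage would come from the shared context module:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks'

// Minimal stand-ins for the shared context module; assumptions, not the final API.
interface RequestContext { organizationId: string; projectId: string; requestId: string }
const storage = new AsyncLocalStorage<RequestContext>()

// Producer side: stamp the current context (if any) into the payload's _ctx field.
function withCtx<T extends object>(payload: T): T & { _ctx?: RequestContext } {
  return { ...payload, _ctx: storage.getStore() }
}

// Consumer side: wrap a job processor so it runs inside the restored context.
type Processor<T> = (data: T) => Promise<void>
function contextAware<T extends { _ctx?: RequestContext }>(processor: Processor<T>): Processor<T> {
  return async (data) => {
    if (data._ctx) {
      return storage.run(data._ctx, () => processor(data))
    }
    return processor(data)
  }
}
```

With real BullMQ this would look roughly like `queue.add('ingest', withCtx(data))` on the producer and `new Worker(queueName, (job) => contextAware(handle)(job.data))` on the consumer; the serialized `_ctx` survives the Redis round-trip because it is plain JSON.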

## Alternatives Considered

- Threading context through every function call. Verbose, intrusive on every signature, easy to forget for new code paths. Rejected.
- Using Fastify's `request` object directly. Doesn't survive the boundary into BullMQ workers, doesn't help with library code that doesn't know about Fastify, and couples business logic to the HTTP layer.
- Continuation-local storage via `cls-hooked`. Works but is unmaintained; Node's built-in `AsyncLocalStorage` is the right tool today.

## Implementation Details (Optional)

- Build the primitive as a standalone module (`src/context/`) with no dependencies on Fastify or BullMQ. The integrations live in adapter files.
- Keep the `RequestContext` type small and serializable. Anything not JSON-safe stays out.
- Add a `withContext()` test helper that runs a callback inside a synthetic context for unit tests.
- Audit existing handlers and worker functions and migrate them to `context.current()` instead of arguments where it cleans up the signature. This is mechanical but should be done in the same milestone to avoid two parallel patterns.
- Document that any new code path must establish or inherit a context, and add a lint rule or runtime assertion in critical paths (ingestion, query, alert evaluation) that throws if context is missing.
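
The `withContext()` helper from the list above might look like this. The default field values, the `Partial` override shape, and the local `storage`/`current` stand-ins are assumptions for illustration:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks'

// Stand-ins; in the real module these come from the shared src/context/ primitive.
interface RequestContext { organizationId: string; projectId: string; requestId: string }
const storage = new AsyncLocalStorage<RequestContext>()
const current = (): RequestContext => {
  const ctx = storage.getStore()
  if (!ctx) throw new Error('RequestContext missing')
  return ctx
}

// Test helper: run a callback inside a synthetic context, overriding only what the test cares about.
function withContext<T>(overrides: Partial<RequestContext>, fn: () => T): T {
  const ctx: RequestContext = {
    organizationId: 'org-test',
    projectId: 'proj-test',
    requestId: 'req-test',
    ...overrides, // overrides win over the synthetic defaults
  }
  return storage.run(ctx, fn)
}
```

Tests then read like `withContext({ organizationId: 'org-42' }, () => service.ingest(event))` and never touch Fastify or BullMQ plumbing.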

## Priority

## Target Users

- Internal: maintainers and contributors writing new features that need org/project scoping
- Operators debugging multi-tenant issues via correlated request ids in logs and database query logs
- Downstream platforms built on `@logtide/backend` needing a stable extension point for tenant-aware middleware

## Contribution