Conversation
fanOutGet now returns { data, cacheControl } instead of a bare Map.
Services bubble up the upstream Cache-Control value and routes set it
on the response so downstream clients and the cache middleware can use
the TTL. Tests use MSW to cover header propagation, null header, and
partial upstream failure scenarios.
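The new return shape described above can be sketched as follows. This is a hypothetical reconstruction of `fanOutGet`, not the repo's actual code: the `fetchJson` parameter and `FanOutResult` name are assumptions made so the sketch is self-contained.

```typescript
// Hypothetical sketch of the { data, cacheControl } return shape.
type FanOutResult<T> = { data: Map<string, T>; cacheControl: string | null };

async function fanOutGet<T>(
  urls: string[],
  fetchJson: (url: string) => Promise<{ body: T; cacheControl: string | null }>,
): Promise<FanOutResult<T>> {
  const results = await Promise.allSettled(
    urls.map(async (url) => ({ url, ...(await fetchJson(url)) })),
  );
  const data = new Map<string, T>();
  let cacheControl: string | null = null;
  for (const result of results) {
    if (result.status !== "fulfilled") continue; // failed upstreams are excluded
    data.set(result.value.url, result.value.body);
    // keep the first upstream Cache-Control seen (the behavior later
    // flagged by review; see the aggregation suggestion below)
    if (cacheControl === null) cacheControl = result.value.cacheControl;
  }
  return { data, cacheControl };
}
```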
Adds a CacheStore interface and Redis client factory, plus a Hono middleware that caches GET responses using the upstream max-age as TTL. The middleware is opt-in (skipped when REDIS_URL is unset) and fails open on Redis errors to preserve availability. Tests cover HIT, fail-open reads/writes, non-2xx responses, and missing Cache-Control.
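A minimal sketch of the `CacheStore` interface and the fail-open behavior described above. The interface shape and the `safeGet` helper are assumptions for illustration, not the repo's exact API.

```typescript
// Assumed shape of the CacheStore abstraction over the Redis client.
interface CacheStore {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

// Fail-open read: any store error degrades to a cache miss so a Redis
// outage never fails the request itself.
async function safeGet(store: CacheStore, key: string): Promise<string | null> {
  try {
    return await store.get(key);
  } catch {
    return null; // fail open: treat the error as a miss
  }
}
```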
- 1hr: dao, delegation-percentage, token, token-metrics, event-relevance
- 60s: all remaining routes (previously 120s or 300s)
- 30s: last-update (unchanged)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
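The TTL tiers in that commit could be expressed as a simple lookup table. This is an illustrative sketch only; the `ROUTE_TTL_SECONDS` map and `ttlFor` helper are hypothetical names, not code from the repo.

```typescript
// Per-route TTLs tuned to data volatility, per the commit message.
const ROUTE_TTL_SECONDS: Record<string, number> = {
  "dao": 3600,
  "delegation-percentage": 3600,
  "token": 3600,
  "token-metrics": 3600,
  "event-relevance": 3600,
  "last-update": 30,
};

const DEFAULT_TTL_SECONDS = 60; // all remaining routes

function ttlFor(route: string): number {
  return ROUTE_TTL_SECONDS[route] ?? DEFAULT_TTL_SECONDS;
}
```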
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 20c725aa8b
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
💡 Codex Review
Reviewed commit: d354456d51
💡 Codex Review
Reviewed commit: 4347f4482a
Move setCacheControl from an inline context helper to a proper Hono middleware factory using createMiddleware, and wire it through each route's middleware array instead of calling it inside handlers. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
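The factory pattern that commit describes can be sketched as below. To keep the sketch self-contained, framework-free stand-ins are used for Hono's `Context` and middleware types; in the real code this would be built with `createMiddleware` from `hono/factory` and passed in each route's middleware array.

```typescript
// Minimal stand-ins for Hono's context and middleware shapes.
type Ctx = { headers: Map<string, string>; header: (name: string, value: string) => void };
type Middleware = (c: Ctx, next: () => Promise<void>) => Promise<void>;

// Middleware factory: each route states its TTL once, declaratively,
// instead of calling a helper inside the handler body.
function setCacheControl(seconds: number): Middleware {
  return async (c, next) => {
    await next(); // let the handler produce the response first
    c.header("Cache-Control", `public, max-age=${seconds}`);
  };
}
```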
💡 Codex Review
Reviewed commit: 09603ae261
const entry = safeParse<CachedEntry>(raw);
if (!entry) return next();
return new Response(entry.body, {
Treat malformed cache entries as cache misses
When a Redis value exists but fails deserialization, return next() exits before the response-phase write-back logic runs, so the bad key is never repaired. In this path every request for that URL keeps bypassing cache and re-hitting upstream until external eviction or key expiry, which defeats cache availability for that route. Let this branch fall through the miss path (or delete the key) so successful responses can repopulate Redis.
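The suggested fix can be sketched as a read helper in which a malformed value is indistinguishable from a miss, so the response phase repopulates the key. The `safeParse`/`readEntry` helpers and `CachedEntry` fields here are assumptions modeled on the snippet above, not the repo's exact code.

```typescript
// Assumed entry shape, modeled on the reviewed snippet.
type CachedEntry = { body: string; status: number };

function safeParse<T>(raw: string): T | null {
  try {
    return JSON.parse(raw) as T;
  } catch {
    return null;
  }
}

// Returns the cached entry or null; null means "miss", so the caller
// runs the handler and writes back a fresh entry, repairing bad keys.
function readEntry(raw: string | null): CachedEntry | null {
  if (raw === null) return null;           // true miss
  const entry = safeParse<CachedEntry>(raw);
  if (!entry) return null;                 // malformed: fall through as a miss
  return entry;
}
```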
if (cacheControl === null) {
  cacheControl = result.value.cacheControl;
}
Derive aggregate TTL from all upstream cache headers
This keeps only the first fulfilled upstream cacheControl and ignores the rest, but the aggregation routes forward that value unchanged and the cache middleware uses it as Redis TTL. If DAO APIs return different max-age values, the aggregated response can be cached too long for fresher DAOs (or too short for others), producing stale cross-DAO data. Compute a conservative policy from all successful upstreams (for example minimum max-age, or skip caching when policies differ).
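One conservative policy along the lines Codex suggests: take the minimum `max-age` across all fulfilled upstreams, and skip caching entirely when any upstream omits or garbles the header. The helper name is hypothetical.

```typescript
// Aggregate upstream Cache-Control values conservatively: the shortest
// max-age wins, and any missing/unparsable policy disables caching.
function aggregateCacheControl(headers: (string | null)[]): string | null {
  let minMaxAge: number | null = null;
  for (const header of headers) {
    const match = header?.match(/max-age=(\d+)/);
    if (!match) return null; // an upstream without a TTL disables caching
    const maxAge = Number(match[1]);
    if (minMaxAge === null || maxAge < minMaxAge) minMaxAge = maxAge;
  }
  return minMaxAge === null ? null : `public, max-age=${minMaxAge}`;
}
```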
💡 Codex Review
Reviewed commit: a2bcd1b747
await next();
c.header("Cache-Control", `public, max-age=${seconds}`);
Avoid caching non-2xx API responses
setCacheControl unconditionally sets Cache-Control: public, max-age=... after await next(), so validation and error responses also become publicly cacheable. In this codebase, apps/api/cmd/index.ts uses defaultHook to emit 400s for invalid queries, so those 400 payloads will carry the same TTL and can be cached by shared intermediaries/CDNs, causing stale client errors even after request/input fixes. Restrict this middleware to successful responses (for example 2xx) before setting the header.
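The status guard that review asks for can be sketched framework-free; in the real middleware this check would inspect `c.res.status` after `await next()`. The helper name is an assumption for the sketch.

```typescript
// Only successful (2xx) responses get a public TTL; error responses
// (e.g. the 400s emitted by the defaultHook) stay uncacheable.
function cacheControlFor(status: number, seconds: number): string | null {
  if (status < 200 || status >= 300) return null; // never cache errors
  return `public, max-age=${seconds}`;
}
```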
Summary
Adds a Redis-backed cache layer to the API Gateway (gateful) to reduce upstream load and improve response latency. API controllers now emit appropriate Cache-Control headers, which the gateway uses as the TTL signal to populate the cache.

Changes
- apps/gateful/src/middlewares/cache.ts — new cache-aside Hono middleware that reads from Redis on cache hit and writes on cache miss, driven by the upstream Cache-Control: max-age header; fails open on Redis errors
- apps/gateful/src/middlewares/cache.test.ts — unit tests covering HIT, MISS, Redis read/write errors (fail-open), non-2xx suppression, and missing Cache-Control
- apps/gateful/src/cache/redis.ts — thin Redis client factory with non-blocking connect and structured event logging
- apps/gateful/src/index.ts — wires the cache middleware into the gateway; Redis is optional via REDIS_URL env var
- apps/gateful/src/config.ts — adds optional REDIS_URL config entry
- apps/gateful/src/shared/fan-out.ts — refactored to return { data, cacheControl } so the upstream Cache-Control header propagates through the fan-out layer
- apps/gateful/src/shared/fan-out.test.ts — integration tests for fanOutGet using MSW, covering header propagation and failed-upstream exclusion
- apps/gateful/src/resolvers/daos/service.ts + route.ts — forwards the aggregated cacheControl value to the response header
- apps/gateful/src/resolvers/delegation/service.ts + route.ts — same pattern for the delegation resolver
- apps/api/src/controllers/** — all API route handlers now set Cache-Control: public, max-age=<N> with TTLs tuned per data volatility (30 s – 1800 s)
- apps/gateful/package.json — adds redis and msw dependencies
- docs/specs/2026-04-07-gateway-cache-design.md — design spec for the cache layer
apps/gateful/src/middlewares/cache.ts— new cache-aside Hono middleware that reads from Redis on cache hit and writes on cache miss, driven by the upstreamCache-Control: max-ageheader; fails open on Redis errorsapps/gateful/src/middlewares/cache.test.ts— unit tests covering HIT, MISS, Redis read/write errors (fail-open), non-2xx suppression, and missingCache-Controlapps/gateful/src/cache/redis.ts— thin Redis client factory with non-blocking connect and structured event loggingapps/gateful/src/index.ts— wires the cache middleware into the gateway; Redis is optional viaREDIS_URLenv varapps/gateful/src/config.ts— adds optionalREDIS_URLconfig entryapps/gateful/src/shared/fan-out.ts— refactored to return{ data, cacheControl }so the upstreamCache-Controlheader propagates through the fan-out layerapps/gateful/src/shared/fan-out.test.ts— integration tests forfanOutGetusing MSW, covering header propagation and failed-upstream exclusionapps/gateful/src/resolvers/daos/service.ts+route.ts— forwards the aggregatedcacheControlvalue to the response headerapps/gateful/src/resolvers/delegation/service.ts+route.ts— same pattern for the delegation resolverapps/api/src/controllers/**— all API route handlers now setCache-Control: public, max-age=<N>with TTLs tuned per data volatility (30 s – 1800 s)apps/gateful/package.json— addsredisandmswdependenciesdocs/specs/2026-04-07-gateway-cache-design.md— design spec for the cache layer