
[VW-216] use one-time-tokens for webhook responses instead of api keys#105

Open
0xcad wants to merge 6 commits into main from VW-216

Conversation


@0xcad 0xcad commented Mar 27, 2026

Previously, a user had to configure an API key for the integrated service on VIPER, and configure a VIPER API key on the integrated service. So, for example, a user had to put a Blueflow API key in VIPER and a VIPER API key in Blueflow.

Now we use one-time tokens to create unique URLs. We pass these to external services, which get authenticated via the unique URL instead of via an API key.
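As a rough illustration of the mechanism (the helper name and base-path layout here are assumptions, not taken from the codebase), the tokenized webhook URL handed to the external service could be built like this:

```typescript
// Hypothetical sketch: embed a one-time token in the webhook URL that the
// external service calls back. The route shape follows the PR's
// /{resource}/integrationUpload/{token} pattern; everything else is assumed.
function buildUploadUrl(
  baseUrl: string,
  resource: string, // e.g. "assets" or "vulnerabilities"
  token: string, // raw one-time token; only its hash is stored server-side
): string {
  return `${baseUrl}/api/v1/${resource}/integrationUpload/${token}`;
}
```

The external service then needs no stored credential at all; possession of the URL is the credential.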

Summary by CodeRabbit

  • New Features

    • Introduced token-based authentication for integration uploads, replacing API keys with secure, expiring tokens.
    • Added automatic cleanup of expired authentication tokens.
    • Made integration sync frequency configurable via new constant.
  • Removed Features

    • Removed API key rotation UI and functionality from integrations.

0xcad added 4 commits March 26, 2026 12:47
what needs to change:
* integrations currently link to an api key, that's not right. remove that from the schema - DONE
* integration create -> don't create an api key - DONE
* integration rotate -> don't rotate an api key - DONE
* integration apikeyconnector -> don't connect api key to apikeyconnector. - DONE

* Now, how do I find out when an integration was last used? just reuse `lastSuccessfulSync`
    * so probably need to add a prisma client extension for this? or put the code in the integrationUpload endpoint, that might be best
* sync-integrations: do create a user token with appropriate permission for resource type
* need to modify `integrationUpload` urls to contain a token in the url, and do auth to get integrationUser with that
    * so they aren't a protected procedure any more
    * also make sure to check permission
* double check that prisma delete event still works...
what needs to change:
* integrations currently link to an api key, that's not right. remove that from the schema - DONE
* integration create -> don't create an api key - DONE
* integration rotate -> don't rotate an api key - DONE
* integration apikeyconnector -> don't connect api key to apikeyconnector. - DONE
* Now, how do I find out when an integration was last used?
    * so probably need to add a prisma client extension for this? or put the code in the integrationUpload endpoint, that might be best - DONE
* sync-integrations: do create a user token with appropriate permission for resource type - DONE
* need to modify `integrationUpload` urls to contain a token in the url, and do auth to get integrationUser with that - DONE
    * so they aren't a protected procedure any more - DONE
    * also make sure to check permission - DONE

TODO: can I use user tokens again for AI authentication, rather than API keys? that seems smarter here. - DONE

vercel bot commented Mar 27, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

  • viper: Ready (Preview, Comment), updated Mar 27, 2026 8:08pm UTC

1 Skipped Deployment:

  • viper-demo: Ignored, updated Mar 27, 2026 8:08pm UTC



coderabbitai bot commented Mar 27, 2026

📝 Walkthrough


This pull request replaces API-key-based authentication for integration webhooks with a cryptographically-hashed, one-time token system. It introduces a new UserToken database model, updates integration endpoints to accept tokens via URL paths, removes API key rotation features, and modifies the sync workflow to generate and consume tokens instead of managing persistent API keys.

Changes

  • Database Schema & Migrations (prisma/schema.prisma, prisma/migrations/20260327193952_user_tokens/migration.sql):
    Added new UserToken model with SHA-256 hashed tokenHash, userId relation, expiration tracking, and optional permissions. Made integrationUserId required in Integration. Removed apiKeyId and the API key relation from the Integration and Apikey models.
  • Token Library (src/lib/tokens.ts):
    New token management module exporting createUserToken, consumeUserToken, and purgeExpiredTokens functions. Implements SHA-256 hashing, TTL-based expiration (15 min default), and a one-time-use pattern via deletion on consumption.
  • Integration Upload Endpoints (src/features/assets/server/routers.ts, src/features/device-artifacts/server/routers.ts, src/features/remediations/server/routers.ts, src/features/vulnerabilities/server/routers.ts):
    Migrated processIntegrationCreate from protectedProcedure to baseProcedure. Updated OpenAPI routes from fixed paths to the /.../{token} format. Replaced the ctx.auth-based integration lookup with a processIntegrationToken() call for token validation.
  • Integration Management Features (src/features/integrations/server/routers.ts, src/features/integrations/hooks/use-integrations.ts, src/features/integrations/components/columns.tsx, src/features/integrations/components/integrations-layout.tsx, src/features/integrations/components/integrations.tsx):
    Removed the rotateKey mutation and API key creation/persistence logic. Removed UI components for API key rotation and success modals. Updated the default sync interval to use the INTEGRATION_SYNC_EVERY_MIN constant.
  • Sync & Workflow (src/inngest/functions/sync-integrations.ts, src/inngest/functions/purge-expired-user-tokens.ts, n8n_workflows/AI_Sync_Workflow.json):
    Added a scheduled Inngest function to purge expired tokens every 6 hours. Updated the sync-integrations cron to use a dynamic interval. Replaced API key generation with tokenized webhook URLs in the response config. Updated the n8n workflow header authentication and node positioning.
  • Integration Upload Tests (src/app/api/v1/__tests__/assets.test.ts, src/app/api/v1/__tests__/deviceArtifacts.test.ts, src/app/api/v1/__tests__/remediations.test.ts, src/app/api/v1/__tests__/vulnerabilities.test.ts, src/app/api/v1/__tests__/test-config.ts):
    Updated the test setup to generate per-test integration tokens via createIntegrationToken(). Changed request paths from /... with an Authorization header to the /.../${token} format. Modified setupMockIntegration to return a structured object with integration and apiKey.
  • Router Utilities & Configuration (src/lib/router-utils.ts, src/lib/schemas.ts, src/app/api/inngest/route.ts, src/config/constants.ts, src/features/integrations/types.ts, src/lib/url-utils.ts):
    Added the processIntegrationToken() helper for token validation. Updated upsertSyncStatus to track apiKeyConnector.lastRequest. Added the INTEGRATION_SYNC_EVERY_MIN constant. Updated the schema to include the token field. Registered purgeExpiredTokensFn in Inngest. Improved getBaseUrl() to distinguish Vercel preview vs production.
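Based on the summary above, a minimal in-memory sketch of what such a module could look like. This is an illustration only: the real implementation persists tokens in the UserToken table via Prisma, and the Map, field names, and guard details here are assumptions.

```typescript
import { createHash, randomBytes } from "crypto";

// In-memory stand-in for the UserToken table (the real module uses Prisma).
interface StoredToken {
  userId: string;
  permissions?: string;
  expiresAt: Date;
}
const tokens = new Map<string, StoredToken>();

const DEFAULT_TOKEN_TTL_SECONDS = 15 * 60; // 15-minute default, per the PR

const hashToken = (raw: string): string =>
  createHash("sha256").update(raw).digest("hex");

function createUserToken(
  userId: string,
  permissions?: string,
  ttlSeconds: number = DEFAULT_TOKEN_TTL_SECONDS,
): string {
  if (!Number.isFinite(ttlSeconds) || ttlSeconds <= 0) {
    throw new Error("invalid ttlSeconds");
  }
  const raw = randomBytes(32).toString("hex"); // 64-char lowercase hex
  tokens.set(hashToken(raw), {
    userId,
    permissions,
    expiresAt: new Date(Date.now() + ttlSeconds * 1000),
  });
  return raw; // only the raw token leaves the server; the store keeps the hash
}

function consumeUserToken(
  raw: string,
  expectedPermissions?: string,
): string | null {
  const hash = hashToken(raw);
  const stored = tokens.get(hash);
  if (!stored) return null;
  if (stored.expiresAt <= new Date()) return null;
  if (expectedPermissions && stored.permissions !== expectedPermissions) {
    return null; // wrong scope: reject without burning the token
  }
  tokens.delete(hash); // one-time use: deleted on successful consumption
  return stored.userId;
}
```

Note that only the hash is ever stored, so a database leak does not expose usable tokens, and deletion on consumption is what makes the token one-time.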

Sequence Diagram

```mermaid
sequenceDiagram
    actor Client
    participant Endpoint as Integration Upload<br/>(e.g., /assets/{token})
    participant TokenService as processIntegrationToken()
    participant TokenDB as UserToken DB
    participant IntegrationDB as Integration DB
    participant Processor as processIntegrationSync()

    Client->>Endpoint: POST /assets/integrationUpload/{token}
    Endpoint->>TokenService: processIntegrationToken(token, ResourceType)

    TokenService->>TokenDB: consumeUserToken(raw)
    TokenDB-->>TokenService: userId (if hash match & not expired)

    TokenService->>IntegrationDB: findFirst by integrationUserId
    IntegrationDB-->>TokenService: integration (or null)

    alt Token Invalid or Expired
        TokenService-->>Endpoint: FORBIDDEN error
        Endpoint-->>Client: 403 Forbidden
    else Token Valid & Integration Found
        TokenService-->>Endpoint: {userId, integrationId}
        TokenDB->>TokenDB: Delete consumed token
        Endpoint->>Processor: processIntegrationSync(userId, integrationId, ...)
        Processor->>IntegrationDB: Update sync status & lastRequest
        Processor-->>Endpoint: Success
        Endpoint-->>Client: 200 OK
    end
```

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs

  • PR #57: Introduces API key rotation and automatic API key creation for integrations—this PR removes those features in favor of token-based auth.
  • PR #20: Implements the initial API-key workflow that this PR substantially reverses by replacing persistent API keys with one-time hashed user tokens.
  • PR #58: Updates assets integration upload flow with token-based endpoints and related test changes that align with this PR's broader auth refactor.

Suggested reviewers

  • trummelhadron
  • timrcm
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): docstring coverage is 77.78%, below the required 80.00% threshold. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
  • Description Check (✅ Passed): check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): the title clearly and specifically describes the main change: replacing API key authentication with one-time tokens for webhook responses.


/*
Warnings:

- You are about to drop the column `apiKeyId` on the `integration` table. All the data in the column will be lost.
Collaborator Author

drop it >:)

Collaborator Author

Remove the API key from integration; create a UserToken model.

UserToken stores a hash of a token, an expiration time, a user, and optional permissions.

Permissions exist so that user tokens can be used more generally on VIPER if needed, and to make sure that token users "stay in their lanes"; it's cheap RBAC. For integration uploads we currently set a permission on the user token for just that integration endpoint, and when we use a token we make sure it has the expected permission. If it doesn't, the token is invalid. So Helm, which gives us vulnerabilities, can't use its token to give us assets instead.

Collaborator Author

Use a constant to enforce how often integrations sync. This constant is used in the frontend form validation and controls how often the backend Inngest event runs.

Changed from synchronizing every minute to every 5 minutes, just to get fewer events on Inngest in prod.
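A hedged sketch of how a single constant can back both the form check and the cron schedule (the constant's value, units, and helper name here are assumptions):

```typescript
// Assumed value; the PR moves integrations to sync at most every 5 minutes.
const INTEGRATION_SYNC_EVERY_MIN = 5;

// Frontend form validation: syncEvery is in seconds in this sketch.
function isValidSyncEvery(seconds: number): boolean {
  return seconds >= INTEGRATION_SYNC_EVERY_MIN * 60;
}

// Backend: deriving the Inngest cron from the same constant means the form
// floor and the scheduler can never drift apart.
const syncCron = `*/${INTEGRATION_SYNC_EVERY_MIN} * * * *`;
```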

Collaborator Author

get rid of rotate api key function on integrations

userId: true,
integrationUserId: true,
apiKeyConnector: true,
const integration = await prisma.$transaction(async (tx) => {
Collaborator Author

moved creating integration user and integration into a prisma transaction, since we no longer also have to create api keys at the same time

Collaborator Author

Inngest fn deletes expired tokens from the DB; runs every 6 hrs (could later relax to daily if event count is a concern)
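The purge step itself is just a conditional delete. Sketched here against an in-memory store; the real Inngest function deletes UserToken rows whose expiresAt is in the past (the exact Prisma query and cron wiring are assumptions):

```typescript
// In-memory stand-in for the UserToken table. The real scheduled job would
// do the equivalent of a deleteMany on expiresAt < now.
const tokenStore = new Map<string, { expiresAt: Date }>();

function purgeExpiredTokens(now: Date = new Date()): number {
  let deleted = 0;
  for (const [hash, record] of tokenStore) {
    if (record.expiresAt <= now) {
      tokenStore.delete(hash); // safe: Map iteration tolerates deleting the current entry
      deleted++;
    }
  }
  return deleted;
}
```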

coderabbitai bot left a comment

Actionable comments posted: 7

🧹 Nitpick comments (5)
n8n_workflows/AI_Sync_Workflow.json (1)

41-46: Unused variable: viperAPIKey is assigned but never used.

The viperAPIKey assignment appears to be leftover from the previous API key-based authentication. Since the "Send Data to VIPER API" node no longer uses bearer credentials (authentication is now via URL tokens), this variable is dead code.

♻️ Proposed fix: Remove unused assignment
             {
               "id": "684395a7-ceb1-4524-8eaf-a3970c6fb934",
               "name": "responsePath",
               "value": "={{ $json.body.responsePath }}",
               "type": "string"
-            },
-            {
-              "id": "a4e97f8d-527d-4500-8fd5-a52a38b260b2",
-              "name": "viperAPIKey",
-              "value": "={{ $json.body.responseApiKey }}",
-              "type": "string"
             }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@n8n_workflows/AI_Sync_Workflow.json` around lines 41 - 46, Remove the dead
assignment for the credential variable "viperAPIKey" which is set to "={{
$json.body.responseApiKey }}" but never used; locate the JSON node with "name":
"viperAPIKey" (id "a4e97f8d-527d-4500-8fd5-a52a38b260b2") and delete that entire
object entry so no unused API key variable remains—confirm the "Send Data to
VIPER API" node now relies on URL tokens and tests still pass.
src/inngest/functions/purge-expired-user-tokens.ts (1)

3-3: Use the source-root alias for the Inngest client import.

../client makes this new file an outlier against the rest of the app and is harder to keep stable through moves or renames.

♻️ Suggested change
-import { inngest } from "../client";
+import { inngest } from "@/inngest/client";

As per coding guidelines "Use @ import alias consistently to map to src/ directory".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/inngest/functions/purge-expired-user-tokens.ts` at line 3, The import in
purge-expired-user-tokens.ts should use the project source-root alias instead of
a relative path: update the import that brings in the inngest symbol (currently
from "../client") to use the configured src alias (the same alias used across
the codebase for the client module) so the file matches project conventions and
remains stable under moves/renames.
src/lib/schemas.ts (1)

113-115: Validate the token shape at the API boundary.

src/lib/tokens.ts only issues 64-character lowercase hex tokens. Accepting any string here widens the public contract and still spends work hashing obviously invalid values. Tightening this schema also makes the path param contract more precise.

♻️ Suggested change
   return pagesWithLinksSchema.extend({
-    token: z.string(), // the user token calling this endpoint
+    token: z
+      .string()
+      .length(64)
+      .regex(/^[a-f0-9]+$/, "Invalid token"), // the user token calling this endpoint
   });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/lib/schemas.ts` around lines 113 - 115, The token field in the response
schema currently accepts any string; tighten the API boundary by updating the
schema returned by pagesWithLinksSchema.extend (the object that adds token) to
validate the token shape as the 64-character lowercase hex produced by
src/lib/tokens.ts—replace the loose z.string() with a string schema that
enforces /^[0-9a-f]{64}$/ (or equivalent Zod regex) so only valid tokens pass
through and unnecessary hashing of invalid values is avoided.
prisma/schema.prisma (1)

94-100: Make token scope an enum.

permissions is part of the auth boundary, but as String? it can drift from the ResourceType values consumed in src/lib/tokens.ts. Storing ResourceType? here lets Prisma reject bad scopes and gives the token helpers a real type to share.

♻️ Suggested change
 model UserToken {
   id        String   @id @default(cuid())
   userId    String
   user      User     @relation(fields: [userId], references: [id], onDelete: Cascade)
   tokenHash String   @unique
-  permissions String? // what can the token be used for?
+  permissions ResourceType? // what can the token be used for?
   expiresAt DateTime
   createdAt DateTime @default(now())
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@prisma/schema.prisma` around lines 94 - 100, The permissions field on the
Prisma model UserToken is a freeform String? and should be converted to a Prisma
enum so schema-level types align with your ResourceType in src/lib/tokens.ts.
Add a Prisma enum (e.g., enum ResourceType) with the same members used by
ResourceType in src/lib/tokens.ts, then change UserToken.permissions from
String? to ResourceType? (or ResourceType if non-nullable), update any code that
creates/reads tokens to use the enum values, and run prisma migrate and prisma
generate to apply the schema change.
src/inngest/functions/sync-integrations.ts (1)

67-71: Consider token TTL vs external service latency.

Tokens are created with a 15-minute TTL (DEFAULT_TOKEN_TTL_SECONDS). If N8N or partner services batch requests or experience delays beyond 15 minutes before calling back to the upload endpoint, the token will have expired, causing silent failures.

For most scenarios this should be adequate, but if external services are known to have high latency or batching behavior, consider:

  1. Increasing the TTL for sync tokens specifically
  2. Adding monitoring/alerting for expired token rejections
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/inngest/functions/sync-integrations.ts` around lines 67 - 71, The current
createUserToken call uses DEFAULT_TOKEN_TTL_SECONDS which may expire before
external services (e.g., N8N) call back; change this by introducing a longer TTL
constant (e.g., DEFAULT_SYNC_TOKEN_TTL_SECONDS) and use that in the
createUserToken call for sync flows (replace DEFAULT_TOKEN_TTL_SECONDS in the
invocation that passes integrationUserId and resourceType), and update any
token-creation docs/tests; additionally emit a log/metric when generating sync
tokens (in the same function) so expired-token issues can be monitored.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@n8n_workflows/AI_Sync_Workflow.json`:
- Around line 88-89: The "name" fallback in the node's templated field uses the
incorrect header "Authentication"; update the ternary expression that sets the
"name" property (the expression checking $json.body.authType === "Header") to
return "Authorization" instead of "Authentication" so Bearer tokens use the
standard Authorization header; ensure the paired "value" expression that
constructs "Bearer " + $json.body.authentication.token remains unchanged.

In `@src/features/integrations/types.ts`:
- Around line 22-26: Existing integrations with syncEvery below the new floor
(INTEGRATION_SYNC_EVERY_MIN * 60) will be rejected by integrationInputSchema;
either backfill existing records to meet the minimum or only enforce the minimum
on creation/updates moving forward. Implement one of two fixes: (A) write a data
migration that finds records with syncEvery < INTEGRATION_SYNC_EVERY_MIN * 60
and updates them to the floor (referencing the syncEvery field and
INTEGRATION_SYNC_EVERY_MIN constant), or (B) relax validation in
integrationInputSchema so the .min(...) constraint is applied only for creation
flows (e.g., apply the min check in the create handler/validateCreate path
instead of the schema used for general updates), and keep the stricter .min for
new integrations. Ensure integrationInputSchema and any code paths that call it
(create/update handlers) are updated accordingly so existing stored integrations
remain editable.

In `@src/inngest/functions/sync-integrations.ts`:
- Around line 161-164: The code uses a non-null assertion on
integration.integrationUserId when calling getResponseConfig, which can throw at
runtime; update the partner integration sync to guard against a missing
integration.integrationUserId (e.g., check for undefined/null before calling
getResponseConfig), and handle the error case by logging and skipping or
throwing a clear error instead of using the "!" operator; ensure the call that
produces responsePath uses validated values for integration.integrationUserId
and integration.resourceType (refer to getResponseConfig,
integration.integrationUserId, integration.resourceType, and responsePath) so
runtime failures are prevented.
- Around line 114-118: Add an explicit guard that checks
integration.integrationUserId before calling getResponseConfig to avoid the
non-null assertion and provide a clear error; inside both syncAiIntegration and
syncPartnerIntegration, if (!integration.integrationUserId) return { success:
false, errorMessage: `Integration ${integration.id} is missing integrationUserId
and cannot sync` }; then proceed to call
getResponseConfig(integration.integrationUserId, integration.resourceType) only
when the guard passes.

In `@src/lib/router-utils.ts`:
- Around line 222-228: The call to tx.apiKeyConnector.update inside
upsertSyncStatus can throw if an integration has no ApiKeyConnector; change this
to use tx.apiKeyConnector.updateMany with the same where:{ integrationId } and
data:{ lastRequest: lastSynced } (or alternatively add a null-check for the
related apiKeyConnector like the pattern used in the other router) so the
operation is safe when no connector exists and will silently affect zero rows
instead of raising a Prisma error.

In `@src/lib/tokens.ts`:
- Line 1: This module needs the server-only boundary to prevent client imports:
add import 'server-only' as the very first statement in src/lib/tokens.ts so the
file is treated as server-only; leave the existing prisma import and any
one-time-token functions (e.g., symbols like prisma, createOneTimeToken,
verifyOneTimeToken or similar) intact — just prepend import 'server-only' at the
top of the file.
- Around line 49-67: The findUnique()+delete() race should be replaced with an
atomic deleteMany() call: remove the prisma.userToken.findUnique and subsequent
delete and instead call prisma.userToken.deleteMany({ where: { tokenHash,
expiresAt: { gt: new Date() }, ...(expectedPermissions ? { permissions:
expectedPermissions } : {}) } }); then check the returned count (if count === 0
return null, else treat token as consumed). This ensures a single atomic
operation (use prisma.userToken.deleteMany and remove the separate
prisma.userToken.delete calls) and encodes the expiry and expectedPermissions
into the where clause so concurrent requests cannot both succeed.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 8ffad9e9-bd98-4bf7-87a6-a3ba410a085b

📥 Commits

Reviewing files that changed from the base of the PR and between 19f5a20 and d438cc2.

📒 Files selected for processing (26)
  • n8n_workflows/AI_Sync_Workflow.json
  • prisma/migrations/20260326153511_user_tokens/migration.sql
  • prisma/schema.prisma
  • src/app/api/inngest/route.ts
  • src/app/api/v1/__tests__/assets.test.ts
  • src/app/api/v1/__tests__/deviceArtifacts.test.ts
  • src/app/api/v1/__tests__/remediations.test.ts
  • src/app/api/v1/__tests__/test-config.ts
  • src/app/api/v1/__tests__/vulnerabilities.test.ts
  • src/config/constants.ts
  • src/features/assets/server/routers.ts
  • src/features/device-artifacts/server/routers.ts
  • src/features/integrations/components/columns.tsx
  • src/features/integrations/components/integrations-layout.tsx
  • src/features/integrations/components/integrations.tsx
  • src/features/integrations/hooks/use-integrations.ts
  • src/features/integrations/server/routers.ts
  • src/features/integrations/types.ts
  • src/features/remediations/server/routers.ts
  • src/features/vulnerabilities/server/routers.ts
  • src/inngest/functions/purge-expired-user-tokens.ts
  • src/inngest/functions/sync-integrations.ts
  • src/lib/auth.ts
  • src/lib/router-utils.ts
  • src/lib/schemas.ts
  • src/lib/tokens.ts

coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (2)
src/lib/url-utils.ts (1)

16-22: Update the JSDoc for the new production env var.

The docs still say Vercel uses VERCEL_URL, but production now reads VERCEL_PROJECT_PRODUCTION_URL. Please keep the comment/example aligned so debugging points at the right variable.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/lib/url-utils.ts` around lines 16 - 22, Update the JSDoc/comment in
src/lib/url-utils.ts that documents environment variables: change any
mention/example of VERCEL using VERCEL_URL for production to instead reference
VERCEL_PROJECT_PRODUCTION_URL so the comment matches the code path that returns
`https://${process.env.VERCEL_PROJECT_PRODUCTION_URL}` (also keep the preview
note for VERCEL_URL tied to the preview branch that returns
`https://${process.env.VERCEL_URL}`).
src/lib/router-utils.ts (1)

278-295: Redact the {token} path segment everywhere.

This bearer credential now lives in the request URL, so default access logging, tracing, and error telemetry will capture it unless /integrationUpload/:token is scrubbed at the proxy/app/telemetry layers.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/lib/router-utils.ts` around lines 278 - 295, The bearer token in the URL
must be redacted before it reaches logs/traces/telemetry: add a middleware that
detects the /integrationUpload/:token path and replaces the token segment
(req.url or req.path) with a constant placeholder like
"/integrationUpload/[REDACTED]" and also overwrite req.params.token (and any
request context used by tracing) so downstream logging/tracing/telemetry and
error handlers never see the raw token; ensure processIntegrationToken still
receives the token value from the original request body/headers (or pass it
explicitly) but never allow it to propagate into logs, error messages, or span
attributes.
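One minimal way to implement the scrubbing suggested above (the helper name and its placement are hypothetical; in practice this belongs in the logging/telemetry layer, before the URL is recorded anywhere):

```typescript
// Replace the secret path segment after /integrationUpload/ so the raw
// token never reaches logs, traces, or error messages.
function redactUploadToken(url: string): string {
  return url.replace(/(\/integrationUpload\/)[^/?#]+/g, "$1[REDACTED]");
}
```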
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/lib/router-utils.ts`:
- Around line 282-295: The current flow calls consumeUserToken(token,
expectedPermissions) immediately which burns the one-time token before the route
confirms the integration exists and before the upload/DB transaction completes;
change it to a two-step flow: first validate the token without consuming (e.g.,
add or use a validateUserToken/tokenLookup function that returns userId and
permissions) and use that to look up the integration via
prisma.integration.findUnique and requireExistence(integration, "Integration");
then, after the upload/transaction commits successfully, call the existing
consumeUserToken(token, expectedPermissions) (or a dedicated markTokenUsed
function) to finalize consumption so retries on transient failures still work.

In `@src/lib/tokens.ts`:
- Around line 23-30: createUserToken must validate the ttlSeconds API input
before proceeding: at the top of the function (before
generateRawToken()/hashToken()/expiresAt) add a guard that ensures ttlSeconds is
a finite number > 0 (reject 0, negatives, and NaN) and throw a clear Error (e.g.
"invalid ttlSeconds") when the check fails; reference DEFAULT_TOKEN_TTL_SECONDS
only as the default value, but still validate the effective ttlSeconds passed
in.

---

Nitpick comments:
In `@src/lib/router-utils.ts`:
- Around line 278-295: The bearer token in the URL must be redacted before it
reaches logs/traces/telemetry: add a middleware that detects the
/integrationUpload/:token path and replaces the token segment (req.url or
req.path) with a constant placeholder like "/integrationUpload/[REDACTED]" and
also overwrite req.params.token (and any request context used by tracing) so
downstream logging/tracing/telemetry and error handlers never see the raw token;
ensure processIntegrationToken still receives the token value from the original
request body/headers (or pass it explicitly) but never allow it to propagate
into logs, error messages, or span attributes.

In `@src/lib/url-utils.ts`:
- Around line 16-22: Update the JSDoc/comment in src/lib/url-utils.ts that
documents environment variables: change any mention/example of VERCEL using
VERCEL_URL for production to instead reference VERCEL_PROJECT_PRODUCTION_URL so
the comment matches the code path that returns
`https://${process.env.VERCEL_PROJECT_PRODUCTION_URL}` (also keep the preview
note for VERCEL_URL tied to the preview branch that returns
`https://${process.env.VERCEL_URL}`).
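For reference, a minimal sketch of the precedence the updated comment should describe. The `VERCEL_*` variable names match the diff; the function signature and the localhost fallback are assumptions for illustration.

```typescript
// Sketch: resolve the app's base URL from Vercel environment variables.
type Env = Record<string, string | undefined>;

function getBaseUrl(env: Env): string {
  // Preview deployments use the per-deployment URL.
  if (env.VERCEL_ENV === "preview") return `https://${env.VERCEL_URL}`;
  // Production uses the stable project domain, not VERCEL_URL.
  if (env.VERCEL_ENV === "production")
    return `https://${env.VERCEL_PROJECT_PRODUCTION_URL}`;
  // Local development: fall back to an explicitly configured URL.
  return env.NEXT_PUBLIC_APP_URL ?? "http://localhost:3000";
}
```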

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 87345d23-37a9-467e-aa8e-f7ea7c814c3e

📥 Commits

Reviewing files that changed from the base of the PR and between d438cc2 and 1711b2f.

📒 Files selected for processing (6)
  • prisma/migrations/20260327193952_user_tokens/migration.sql
  • prisma/schema.prisma
  • src/inngest/functions/sync-integrations.ts
  • src/lib/router-utils.ts
  • src/lib/tokens.ts
  • src/lib/url-utils.ts
🚧 Files skipped from review as they are similar to previous changes (1)
  • src/inngest/functions/sync-integrations.ts

Comment on lines +282 to +295
  const userId = await consumeUserToken(token, expectedPermissions);
  // TODO: probably need better error messages than this tbh
  if (!userId) {
    throw new TRPCError({
      code: "FORBIDDEN",
      message: `Invalid token (couldn't find it, expired, or invalid permissions)`,
    });
  }
  const integration = await prisma.integration.findUnique({
    where: { integrationUserId: userId },
    select: { id: true },
  });
  const integrationId = requireExistence(integration, "Integration").id;
  return { userId, integrationId };

⚠️ Potential issue | 🟠 Major

Don't burn the token before the upload is durable.

Line 282 consumes the one-time token before Line 290 even confirms the backing integration still exists, and long before the route's writes succeed. If anything later fails with a transient 5xx, the sender's retry will hit FORBIDDEN and that sync response is lost. Split this into a validate step plus a final consume/mark-used step once the upload transaction commits.
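A minimal in-memory sketch of that split, assuming a `validateUserToken`/`consumeUserToken` pair. The `Map`-backed store, the numeric clock, and the record shape are illustrative stand-ins for the PR's Prisma-backed tokens, not its actual code; the point is only that the token is marked used after the upload commits.

```typescript
// Sketch: non-consuming validation up front, consumption only after commit.
type TokenRecord = { userId: string; expiresAt: number; usedAt?: number };

function validateUserToken(
  store: Map<string, TokenRecord>,
  tokenHash: string,
  now: number,
): string | null {
  const rec = store.get(tokenHash);
  // Reject missing, already-used, or expired tokens.
  if (!rec || rec.usedAt !== undefined || rec.expiresAt <= now) return null;
  return rec.userId; // non-consuming: safe to call before the upload
}

function consumeUserToken(
  store: Map<string, TokenRecord>,
  tokenHash: string,
  now: number,
): boolean {
  const userId = validateUserToken(store, tokenHash, now);
  if (userId === null) return false;
  store.get(tokenHash)!.usedAt = now; // finalize only after the transaction commits
  return true;
}
```

With this shape, a transient 5xx between validation and commit leaves the token unconsumed, so the sender's retry can still succeed.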

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/lib/router-utils.ts` around lines 282 - 295, The current flow calls
consumeUserToken(token, expectedPermissions) immediately which burns the
one-time token before the route confirms the integration exists and before the
upload/DB transaction completes; change it to a two-step flow: first validate
the token without consuming (e.g., add or use a validateUserToken/tokenLookup
function that returns userId and permissions) and use that to look up the
integration via prisma.integration.findUnique and requireExistence(integration,
"Integration"); then, after the upload/transaction commits successfully, call
the existing consumeUserToken(token, expectedPermissions) (or a dedicated
markTokenUsed function) to finalize consumption so retries on transient failures
still work.

Comment on lines +23 to +30
export async function createUserToken(
  userId: string,
  ttlSeconds = DEFAULT_TOKEN_TTL_SECONDS,
  permissions?: string,
): Promise<string> {
  const raw = generateRawToken();
  const tokenHash = await hashToken(raw);
  const expiresAt = new Date(Date.now() + ttlSeconds * 1000);

⚠️ Potential issue | 🟡 Minor

Reject invalid TTLs up front.

ttlSeconds is exported API surface here; 0, negatives, or NaN currently persist tokens that are unusable from birth and only fail later as “invalid token”. Failing fast makes bad config obvious.

🛡️ Proposed guard
 export async function createUserToken(
   userId: string,
   ttlSeconds = DEFAULT_TOKEN_TTL_SECONDS,
   permissions?: string,
 ): Promise<string> {
+  if (!Number.isFinite(ttlSeconds) || ttlSeconds <= 0) {
+    throw new Error("ttlSeconds must be a positive number");
+  }
+
   const raw = generateRawToken();
   const tokenHash = await hashToken(raw);
   const expiresAt = new Date(Date.now() + ttlSeconds * 1000);
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
export async function createUserToken(
  userId: string,
  ttlSeconds = DEFAULT_TOKEN_TTL_SECONDS,
  permissions?: string,
): Promise<string> {
  if (!Number.isFinite(ttlSeconds) || ttlSeconds <= 0) {
    throw new Error("ttlSeconds must be a positive number");
  }

  const raw = generateRawToken();
  const tokenHash = await hashToken(raw);
  const expiresAt = new Date(Date.now() + ttlSeconds * 1000);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/lib/tokens.ts` around lines 23 - 30, createUserToken must validate the
ttlSeconds API input before proceeding: at the top of the function (before
generateRawToken()/hashToken()/expiresAt) add a guard that ensures ttlSeconds is
a finite number > 0 (reject 0, negatives, and NaN) and throw a clear Error (e.g.
"invalid ttlSeconds") when the check fails; reference DEFAULT_TOKEN_TTL_SECONDS
only as the default value, but still validate the effective ttlSeconds passed
in.

Comment on lines +16 to +21
// Vercel preview deployment
if (process.env.VERCEL_ENV === "preview")
  return `https://${process.env.VERCEL_URL}`;

// Vercel production deployment
if (process.env.VERCEL_ENV === "production")
Collaborator Author
Use the preview env URL if we're in a preview environment. I think this will fix several issues going forward, but in this PR it means that the webhook_url is the preview env URL, not viper-xi.vercel.app. This makes testing new integrations in preview environments viable, since we don't have to reconfigure the API key.

body: JSON.stringify({
  // If you're testing this locally and need webhooks, use NEXT_PUBLIC_APP_URL
  baseApiUrl: `${getBaseUrl()}/api/v1`,
  responseApiKey: apiKey.key, // TODO: eventually switch to tokens or webhook secrets
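Under the new flow this PR introduces, the `responseApiKey` field above goes away: the external service instead receives a unique upload URL carrying a one-time token. A hypothetical sketch of building that URL (the helper name and exact path are assumptions; the `/integrationUpload/:token` shape comes from the review comments):

```typescript
// Sketch: embed a freshly minted one-time token in the webhook upload URL.
function buildUploadUrl(baseUrl: string, token: string): string {
  return `${baseUrl}/api/v1/integrationUpload/${encodeURIComponent(token)}`;
}
```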
Contributor
😎

Contributor
@trummelhadron trummelhadron left a comment
Well done!
