what needs to change:
* integrations currently link to an api key, that's not right. remove that from the schema - DONE
* integration create -> don't create an api key - DONE
* integration rotate -> don't rotate an api key - DONE
* integration apikeyconnector -> don't connect api key to apikeyconnector - DONE
* find out when an integration was last used: just reuse `lastSuccessfulSync`, updated in the integrationUpload endpoint rather than via a prisma client extension - DONE
* sync-integrations: do create a user token with appropriate permission for resource type - DONE
* modify `integrationUpload` urls to contain a token in the url, and do auth to get integrationUser with that - DONE
  * so they aren't a protected procedure any more - DONE
  * also make sure to check permission - DONE
* double check that prisma delete event still works...
* TODO: can I use user tokens again for AI authentication, rather than API keys? that seems smarter here. - DONE
📝 Walkthrough

This pull request replaces API-key-based authentication for integration webhooks with a cryptographically hashed, one-time token system. It introduces a new UserToken database model, updates integration endpoints to accept tokens via URL paths, removes API key rotation features, and modifies the sync workflow to generate and consume tokens instead of managing persistent API keys.
Sequence Diagram

```mermaid
sequenceDiagram
    actor Client
    participant Endpoint as Integration Upload<br/>(e.g., /assets/{token})
    participant TokenService as processIntegrationToken()
    participant TokenDB as UserToken DB
    participant IntegrationDB as Integration DB
    participant Processor as processIntegrationSync()

    Client->>Endpoint: POST /assets/integrationUpload/{token}
    Endpoint->>TokenService: processIntegrationToken(token, ResourceType)
    TokenService->>TokenDB: consumeUserToken(raw)
    TokenDB-->>TokenService: userId (if hash match & not expired)
    TokenService->>IntegrationDB: findFirst by integrationUserId
    IntegrationDB-->>TokenService: integration (or null)
    alt Token Invalid or Expired
        TokenService-->>Endpoint: FORBIDDEN error
        Endpoint-->>Client: 403 Forbidden
    else Token Valid & Integration Found
        TokenService-->>Endpoint: {userId, integrationId}
        TokenDB->>TokenDB: Delete consumed token
        Endpoint->>Processor: processIntegrationSync(userId, integrationId, ...)
        Processor->>IntegrationDB: Update sync status & lastRequest
        Processor-->>Endpoint: Success
        Endpoint-->>Client: 200 OK
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
```sql
/*
  Warnings:

  - You are about to drop the column `apiKeyId` on the `integration` table. All the data in the column will be lost.
*/
```
Remove api key from integration; create UserToken model.

UserToken stores a hash of a token, an expiration time, the user, and optional permissions.

Permissions are there so that user tokens can be used more generally on VIPER if needed, and to make sure that token users "stay in their lanes": it's cheap RBAC. For integration uploads we currently set a permission on the user token for just that integration endpoint, and when we use a token we make sure it has the expected permission. If it doesn't, the token is invalid. So Helm, which gives us vulnerabilities, can't use its token to give us assets instead.
Use a constant to enforce how often integrations sync. This constant is used in the frontend form validation, and controls how often the backend inngest event runs.

Changed from synchronizing every minute to every 5 minutes, just to generate fewer events on inngest in prod.
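Sharing one constant between both sides could look like the sketch below. The constant name and its home (`src/config/constants.ts`) are assumptions; the real code may differ.

```typescript
// Hypothetical shared constant; the real name/location may differ from this PR.
const INTEGRATION_SYNC_EVERY_MIN = 5;

// Backend: derive the inngest cron expression from the constant, so the sync
// floor changes in exactly one place.
const syncCron = `*/${INTEGRATION_SYNC_EVERY_MIN} * * * *`;

// Frontend: the smallest syncEvery (in seconds) a form should accept.
const minSyncEverySeconds = INTEGRATION_SYNC_EVERY_MIN * 60;
```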
Get rid of the rotate-api-key function on integrations.
```ts
userId: true,
integrationUserId: true,
apiKeyConnector: true,
```

```ts
const integration = await prisma.$transaction(async (tx) => {
```
Moved creating the integration user and the integration into a Prisma transaction, since we no longer also have to create api keys at the same time.
Inngest fn deletes expired tokens from the db; runs every 6 hrs (could later drop to daily if event count is a concern).
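With Prisma the purge is essentially one `deleteMany({ where: { expiresAt: { lt: new Date() } } })` call; the sketch below shows the same predicate as a pure function over an in-memory list (names illustrative, not the real function).

```typescript
// Minimal model of a token row for the purge predicate.
interface TokenRow {
  tokenHash: string;
  expiresAt: Date;
}

// Keep only tokens that have not yet expired, as of `now`.
function purgeExpired(rows: TokenRow[], now: Date = new Date()): TokenRow[] {
  return rows.filter((row) => row.expiresAt.getTime() > now.getTime());
}
```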
Actionable comments posted: 7
🧹 Nitpick comments (5)
n8n_workflows/AI_Sync_Workflow.json (1)

41-46: Unused variable: `viperAPIKey` is assigned but never used.

The `viperAPIKey` assignment appears to be leftover from the previous API-key-based authentication. Since the "Send Data to VIPER API" node no longer uses bearer credentials (authentication is now via URL tokens), this variable is dead code.

♻️ Proposed fix: Remove unused assignment

```diff
   {
     "id": "684395a7-ceb1-4524-8eaf-a3970c6fb934",
     "name": "responsePath",
     "value": "={{ $json.body.responsePath }}",
     "type": "string"
-  },
-  {
-    "id": "a4e97f8d-527d-4500-8fd5-a52a38b260b2",
-    "name": "viperAPIKey",
-    "value": "={{ $json.body.responseApiKey }}",
-    "type": "string"
   }
```

src/inngest/functions/purge-expired-user-tokens.ts (1)
3-3: Use the source-root alias for the Inngest client import.

`../client` makes this new file an outlier against the rest of the app and is harder to keep stable through moves or renames.

♻️ Suggested change

```diff
-import { inngest } from "../client";
+import { inngest } from "@/inngest/client";
```

As per coding guidelines: "Use @ import alias consistently to map to src/ directory".

src/lib/schemas.ts (1)
113-115: Validate the token shape at the API boundary.

`src/lib/tokens.ts` only issues 64-character lowercase hex tokens. Accepting any string here widens the public contract and still spends work hashing obviously invalid values. Tightening this schema also makes the path-param contract more precise.

♻️ Suggested change

```diff
 return pagesWithLinksSchema.extend({
-  token: z.string(), // the user token calling this endpoint
+  token: z
+    .string()
+    .length(64)
+    .regex(/^[a-f0-9]+$/, "Invalid token"), // the user token calling this endpoint
 });
```

prisma/schema.prisma (1)
94-100: Make token scope an enum.

`permissions` is part of the auth boundary, but as `String?` it can drift from the `ResourceType` values consumed in `src/lib/tokens.ts`. Storing `ResourceType?` here lets Prisma reject bad scopes and gives the token helpers a real type to share.

♻️ Suggested change

```diff
 model UserToken {
   id          String        @id @default(cuid())
   userId      String
   user        User          @relation(fields: [userId], references: [id], onDelete: Cascade)
   tokenHash   String        @unique
-  permissions String?       // what can the token be used for?
+  permissions ResourceType? // what can the token be used for?
   expiresAt   DateTime
   createdAt   DateTime      @default(now())
```

src/inngest/functions/sync-integrations.ts (1)
67-71: Consider token TTL vs external service latency.

Tokens are created with a 15-minute TTL (`DEFAULT_TOKEN_TTL_SECONDS`). If N8N or partner services batch requests or experience delays beyond 15 minutes before calling back to the upload endpoint, the token will have expired, causing silent failures.

For most scenarios this should be adequate, but if external services are known to have high latency or batching behavior, consider:

- Increasing the TTL for sync tokens specifically
- Adding monitoring/alerting for expired token rejections
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@n8n_workflows/AI_Sync_Workflow.json`:
- Around line 88-89: The "name" fallback in the node's templated field uses the
incorrect header "Authentication"; update the ternary expression that sets the
"name" property (the expression checking $json.body.authType === "Header") to
return "Authorization" instead of "Authentication" so Bearer tokens use the
standard Authorization header; ensure the paired "value" expression that
constructs "Bearer " + $json.body.authentication.token remains unchanged.
In `@src/features/integrations/types.ts`:
- Around line 22-26: Existing integrations with syncEvery below the new floor
(INTEGRATION_SYNC_EVERY_MIN * 60) will be rejected by integrationInputSchema;
either backfill existing records to meet the minimum or only enforce the minimum
on creation/updates moving forward. Implement one of two fixes: (A) write a data
migration that finds records with syncEvery < INTEGRATION_SYNC_EVERY_MIN * 60
and updates them to the floor (referencing the syncEvery field and
INTEGRATION_SYNC_EVERY_MIN constant), or (B) relax validation in
integrationInputSchema so the .min(...) constraint is applied only for creation
flows (e.g., apply the min check in the create handler/validateCreate path
instead of the schema used for general updates), and keep the stricter .min for
new integrations. Ensure integrationInputSchema and any code paths that call it
(create/update handlers) are updated accordingly so existing stored integrations
remain editable.
In `@src/inngest/functions/sync-integrations.ts`:
- Around line 161-164: The code uses a non-null assertion on
integration.integrationUserId when calling getResponseConfig, which can throw at
runtime; update the partner integration sync to guard against a missing
integration.integrationUserId (e.g., check for undefined/null before calling
getResponseConfig), and handle the error case by logging and skipping or
throwing a clear error instead of using the "!" operator; ensure the call that
produces responsePath uses validated values for integration.integrationUserId
and integration.resourceType (refer to getResponseConfig,
integration.integrationUserId, integration.resourceType, and responsePath) so
runtime failures are prevented.
- Around line 114-118: Add an explicit guard that checks
integration.integrationUserId before calling getResponseConfig to avoid the
non-null assertion and provide a clear error; inside both syncAiIntegration and
syncPartnerIntegration, if (!integration.integrationUserId) return { success:
false, errorMessage: `Integration ${integration.id} is missing integrationUserId
and cannot sync` }; then proceed to call
getResponseConfig(integration.integrationUserId, integration.resourceType) only
when the guard passes.
In `@src/lib/router-utils.ts`:
- Around line 222-228: The call to tx.apiKeyConnector.update inside
upsertSyncStatus can throw if an integration has no ApiKeyConnector; change this
to use tx.apiKeyConnector.updateMany with the same where:{ integrationId } and
data:{ lastRequest: lastSynced } (or alternatively add a null-check for the
related apiKeyConnector like the pattern used in the other router) so the
operation is safe when no connector exists and will silently affect zero rows
instead of raising a Prisma error.
In `@src/lib/tokens.ts`:
- Line 1: This module needs the server-only boundary to prevent client imports:
add import 'server-only' as the very first statement in src/lib/tokens.ts so the
file is treated as server-only; leave the existing prisma import and any
one-time-token functions (e.g., symbols like prisma, createOneTimeToken,
verifyOneTimeToken or similar) intact — just prepend import 'server-only' at the
top of the file.
- Around line 49-67: The findUnique()+delete() race should be replaced with an
atomic deleteMany() call: remove the prisma.userToken.findUnique and subsequent
delete and instead call prisma.userToken.deleteMany({ where: { tokenHash,
expiresAt: { gt: new Date() }, ...(expectedPermissions ? { permissions:
expectedPermissions } : {}) } }); then check the returned count (if count === 0
return null, else treat token as consumed). This ensures a single atomic
operation (use prisma.userToken.deleteMany and remove the separate
prisma.userToken.delete calls) and encodes the expiry and expectedPermissions
into the where clause so concurrent requests cannot both succeed.
---
Nitpick comments:
In `@n8n_workflows/AI_Sync_Workflow.json`:
- Around line 41-46: Remove the dead assignment for the credential variable
"viperAPIKey" which is set to "={{ $json.body.responseApiKey }}" but never used;
locate the JSON node with "name": "viperAPIKey" (id
"a4e97f8d-527d-4500-8fd5-a52a38b260b2") and delete that entire object entry so
no unused API key variable remains—confirm the "Send Data to VIPER API" node now
relies on URL tokens and tests still pass.
In `@prisma/schema.prisma`:
- Around line 94-100: The permissions field on the Prisma model UserToken is a
freeform String? and should be converted to a Prisma enum so schema-level types
align with your ResourceType in src/lib/tokens.ts. Add a Prisma enum (e.g., enum
ResourceType) with the same members used by ResourceType in src/lib/tokens.ts,
then change UserToken.permissions from String? to ResourceType? (or ResourceType
if non-nullable), update any code that creates/reads tokens to use the enum
values, and run prisma migrate and prisma generate to apply the schema change.
In `@src/inngest/functions/purge-expired-user-tokens.ts`:
- Line 3: The import in purge-expired-user-tokens.ts should use the project
source-root alias instead of a relative path: update the import that brings in
the inngest symbol (currently from "../client") to use the configured src alias
(the same alias used across the codebase for the client module) so the file
matches project conventions and remains stable under moves/renames.
In `@src/inngest/functions/sync-integrations.ts`:
- Around line 67-71: The current createUserToken call uses
DEFAULT_TOKEN_TTL_SECONDS which may expire before external services (e.g., N8N)
call back; change this by introducing a longer TTL constant (e.g.,
DEFAULT_SYNC_TOKEN_TTL_SECONDS) and use that in the createUserToken call for
sync flows (replace DEFAULT_TOKEN_TTL_SECONDS in the invocation that passes
integrationUserId and resourceType), and update any token-creation docs/tests;
additionally emit a log/metric when generating sync tokens (in the same
function) so expired-token issues can be monitored.
In `@src/lib/schemas.ts`:
- Around line 113-115: The token field in the response schema currently accepts
any string; tighten the API boundary by updating the schema returned by
pagesWithLinksSchema.extend (the object that adds token) to validate the token
shape as the 64-character lowercase hex produced by src/lib/tokens.ts—replace
the loose z.string() with a string schema that enforces /^[0-9a-f]{64}$/ (or
equivalent Zod regex) so only valid tokens pass through and unnecessary hashing
of invalid values is avoided.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 8ffad9e9-bd98-4bf7-87a6-a3ba410a085b
📒 Files selected for processing (26)
- n8n_workflows/AI_Sync_Workflow.json
- prisma/migrations/20260326153511_user_tokens/migration.sql
- prisma/schema.prisma
- src/app/api/inngest/route.ts
- src/app/api/v1/__tests__/assets.test.ts
- src/app/api/v1/__tests__/deviceArtifacts.test.ts
- src/app/api/v1/__tests__/remediations.test.ts
- src/app/api/v1/__tests__/test-config.ts
- src/app/api/v1/__tests__/vulnerabilities.test.ts
- src/config/constants.ts
- src/features/assets/server/routers.ts
- src/features/device-artifacts/server/routers.ts
- src/features/integrations/components/columns.tsx
- src/features/integrations/components/integrations-layout.tsx
- src/features/integrations/components/integrations.tsx
- src/features/integrations/hooks/use-integrations.ts
- src/features/integrations/server/routers.ts
- src/features/integrations/types.ts
- src/features/remediations/server/routers.ts
- src/features/vulnerabilities/server/routers.ts
- src/inngest/functions/purge-expired-user-tokens.ts
- src/inngest/functions/sync-integrations.ts
- src/lib/auth.ts
- src/lib/router-utils.ts
- src/lib/schemas.ts
- src/lib/tokens.ts
Actionable comments posted: 2
🧹 Nitpick comments (2)
src/lib/url-utils.ts (1)

16-22: Update the JSDoc for the new production env var.

The docs still say Vercel uses `VERCEL_URL`, but production now reads `VERCEL_PROJECT_PRODUCTION_URL`. Please keep the comment/example aligned so debugging points at the right variable.

src/lib/router-utils.ts (1)
278-295: Redact the `{token}` path segment everywhere.

This bearer credential now lives in the request URL, so default access logging, tracing, and error telemetry will capture it unless `/integrationUpload/:token` is scrubbed at the proxy/app/telemetry layers.
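One way the scrubbing could look, sketched as a pure string transform. The `/integrationUpload/<64-hex-token>` shape is assumed from this PR's URL scheme, and `redactIntegrationUploadUrl` is a hypothetical name.

```typescript
// Replace any 64-char lowercase-hex token segment following /integrationUpload/
// with a constant placeholder, so logs and traces never see the raw credential.
function redactIntegrationUploadUrl(url: string): string {
  return url.replace(/(\/integrationUpload\/)[a-f0-9]{64}/g, "$1[REDACTED]");
}
```

URLs without a token segment pass through unchanged, so the function is safe to apply to every logged URL.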
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/lib/router-utils.ts`:
- Around line 282-295: The current flow calls consumeUserToken(token,
expectedPermissions) immediately which burns the one-time token before the route
confirms the integration exists and before the upload/DB transaction completes;
change it to a two-step flow: first validate the token without consuming (e.g.,
add or use a validateUserToken/tokenLookup function that returns userId and
permissions) and use that to look up the integration via
prisma.integration.findUnique and requireExistence(integration, "Integration");
then, after the upload/transaction commits successfully, call the existing
consumeUserToken(token, expectedPermissions) (or a dedicated markTokenUsed
function) to finalize consumption so retries on transient failures still work.
In `@src/lib/tokens.ts`:
- Around line 23-30: createUserToken must validate the ttlSeconds API input
before proceeding: at the top of the function (before
generateRawToken()/hashToken()/expiresAt) add a guard that ensures ttlSeconds is
a finite number > 0 (reject 0, negatives, and NaN) and throw a clear Error (e.g.
"invalid ttlSeconds") when the check fails; reference DEFAULT_TOKEN_TTL_SECONDS
only as the default value, but still validate the effective ttlSeconds passed
in.
---
Nitpick comments:
In `@src/lib/router-utils.ts`:
- Around line 278-295: The bearer token in the URL must be redacted before it
reaches logs/traces/telemetry: add a middleware that detects the
/integrationUpload/:token path and replaces the token segment (req.url or
req.path) with a constant placeholder like "/integrationUpload/[REDACTED]" and
also overwrite req.params.token (and any request context used by tracing) so
downstream logging/tracing/telemetry and error handlers never see the raw token;
ensure processIntegrationToken still receives the token value from the original
request body/headers (or pass it explicitly) but never allow it to propagate
into logs, error messages, or span attributes.
In `@src/lib/url-utils.ts`:
- Around line 16-22: Update the JSDoc/comment in src/lib/url-utils.ts that
documents environment variables: change any mention/example of VERCEL using
VERCEL_URL for production to instead reference VERCEL_PROJECT_PRODUCTION_URL so
the comment matches the code path that returns
`https://${process.env.VERCEL_PROJECT_PRODUCTION_URL}` (also keep the preview
note for VERCEL_URL tied to the preview branch that returns
`https://${process.env.VERCEL_URL}`).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 87345d23-37a9-467e-aa8e-f7ea7c814c3e
📒 Files selected for processing (6)
- prisma/migrations/20260327193952_user_tokens/migration.sql
- prisma/schema.prisma
- src/inngest/functions/sync-integrations.ts
- src/lib/router-utils.ts
- src/lib/tokens.ts
- src/lib/url-utils.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- src/inngest/functions/sync-integrations.ts
```ts
const userId = await consumeUserToken(token, expectedPermissions);
// TODO: probably need better error messages than this tbh
if (!userId) {
  throw new TRPCError({
    code: "FORBIDDEN",
    message: `Invalid token (couldn't find it, expired, or invalid permissions)`,
  });
}

const integration = await prisma.integration.findUnique({
  where: { integrationUserId: userId },
  select: { id: true },
});
const integrationId = requireExistence(integration, "Integration").id;
return { userId, integrationId };
```
Don't burn the token before the upload is durable.
Line 282 consumes the one-time token before Line 290 even confirms the backing integration still exists, and long before the route's writes succeed. If anything later fails with a transient 5xx, the sender's retry will hit FORBIDDEN and that sync response is lost. Split this into a validate step plus a final consume/mark-used step once the upload transaction commits.
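A minimal in-memory sketch of that validate-then-consume split. Names (`peekToken`, `finalizeToken`) and the map are illustrative, not the real helpers in `src/lib/tokens.ts`.

```typescript
// In-memory stand-in for the token store, keyed by token hash.
const store = new Map<string, { userId: string; expiresAt: Date }>();

// Phase 1: validate without consuming, so a failed upload can be retried.
function peekToken(hash: string): string | null {
  const row = store.get(hash);
  if (!row || row.expiresAt.getTime() <= Date.now()) return null;
  return row.userId;
}

// Phase 2: burn the token only once the upload transaction has committed.
function finalizeToken(hash: string): void {
  store.delete(hash);
}
```

The token stays valid across transient 5xx retries and is removed exactly once, after the write is durable.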
```ts
export async function createUserToken(
  userId: string,
  ttlSeconds = DEFAULT_TOKEN_TTL_SECONDS,
  permissions?: string,
): Promise<string> {
  const raw = generateRawToken();
  const tokenHash = await hashToken(raw);
  const expiresAt = new Date(Date.now() + ttlSeconds * 1000);
```
Reject invalid TTLs up front.
ttlSeconds is exported API surface here; 0, negatives, or NaN currently persist tokens that are unusable from birth and only fail later as “invalid token”. Failing fast makes bad config obvious.
🛡️ Proposed guard

```diff
 export async function createUserToken(
   userId: string,
   ttlSeconds = DEFAULT_TOKEN_TTL_SECONDS,
   permissions?: string,
 ): Promise<string> {
+  if (!Number.isFinite(ttlSeconds) || ttlSeconds <= 0) {
+    throw new Error("ttlSeconds must be a positive number");
+  }
+
   const raw = generateRawToken();
   const tokenHash = await hashToken(raw);
   const expiresAt = new Date(Date.now() + ttlSeconds * 1000);
```
```ts
// Vercel preview deployment
if (process.env.VERCEL_ENV === "preview")
  return `https://${process.env.VERCEL_URL}`;

// Vercel production deployment
if (process.env.VERCEL_ENV === "production")
```
Use the preview env URL if we're in a preview environment. I think this will fix several issues going forward, but in this PR it means that the webhook_url is the preview env URL, not viper-xi.vercel.app. This makes testing new integrations in preview environments viable, since we don't have to reconfigure the API key.
```ts
body: JSON.stringify({
  // If you're testing this locally and need webhooks, use NEXT_PUBLIC_APP_URL
  baseApiUrl: `${getBaseUrl()}/api/v1`,
  responseApiKey: apiKey.key, // TODO: eventually switch to tokens or webhook secrets
```
Previously, a user had to configure an API key for the integrated service on VIPER, and configure a VIPER API key on the integrated service. So, for example, a user had to put a Blueflow API key in VIPER and a VIPER API key in Blueflow.

Now, we use one-time tokens to create unique URLs. We pass these to external services, which get authenticated via the unique URL instead of via an API key.
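Assembling such a unique URL is simple string composition; the sketch below assumes the path shape discussed in this PR, and `buildUploadUrl` is a hypothetical helper name.

```typescript
// Build the one-time upload URL handed to the external service.
// The raw token in the path is what authenticates the caller.
function buildUploadUrl(baseApiUrl: string, resource: string, rawToken: string): string {
  return `${baseApiUrl}/${resource}/integrationUpload/${rawToken}`;
}
```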
Summary by CodeRabbit
New Features
Removed Features