feat(types): M8 IRONCLAD — eliminate all wildcard casts#40
feat(types): M8 IRONCLAD — eliminate all wildcard casts#40flyingrobots merged 13 commits intomainfrom
Conversation
…n + infra Remove all TODO(ts-cleanup) tagged @type {*} / @type {any} casts from src/domain/warp/, src/domain/services/, and src/infrastructure/adapters/. Replaced with role-specific persistence types (PersistenceReader, PersistenceWriter, CheckpointPersistence, IndexStorage), error narrowing helpers (isError, hasErrorCode, hasMessage), and properly typed aliases. Adds Zod validation to HttpSyncServer constructor, assertion typing to SyncAuthService._validateKeys, ambient Bun/Deno types in globals.d.ts, consumer smoke test in test/type-check/, and type-surface manifest in contracts/. All gates pass: typecheck, typecheck:policy, typecheck:consumer, lint, 3799/3799 unit tests including no-coordination regression suite.
Tier 1 quick wins (5 files): - scripts/setup-hooks.js: error narrowing via instanceof - BunHttpAdapter.js: BunServer typedef for server state - roaring.js: RoaringModule typedef for lazy singleton - WarpGraph.js: CheckpointPersistence cast for audit service - text.js: unknown/Record types for formatDelta/formatOpSummaryPlain Tier 2 catch clauses (6 files): - doctor/checks.js: 7 catch clauses + internalError param + buildHookFinding param - doctor/index.js: 1 catch clause + error narrowing - verify-audit.js: catch clause + instanceof Error narrowing - path.js: catch clause + code property narrowing - view.js: catch clause + typed errObj intermediate - install-hooks.js: catch clause + ENOENT narrowing
… casts Zero TODO(ts-cleanup) tags, @type {*}, @type {any}, @returns {*}, and @returns {any} remain in src/, bin/, or test/. Narrowed CLI types (QueryBuilderLike, WarpGraphInstance, Persistence), made MinHeap generic, typed presenters/json.js, and replaced test mock wildcards with proper vitest Mock types. All 3,799 tests pass.
|
Note Reviews pausedIt looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the Use the following commands to manage reviews:
Use the checkboxes below for quick actions:
📝 WalkthroughWalkthroughLarge type-safety and policy enforcement update: replaces many wildcard ( Changes
Estimated code review effort🎯 5 (Critical) | ⏱️ ~120 minutes Possibly related PRs
Poem
🚥 Pre-merge checks | ✅ 3✅ Passed checks (3 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings. ✨ Finishing Touches🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out. Comment |
There was a problem hiding this comment.
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
src/domain/services/SyncAuthService.js (1)
172-185:⚠️ Potential issue | 🟡 MinorJSDoc marks
options.keysoptional but runtime requires it.
This can mislead consumers into thinkingkeysis optional. Consider keepingoptionsoptional butoptions.keysrequired when provided.✍️ Suggested JSDoc fix
- * `@param` {Record<string, string>} [options.keys] - Key-id to secret mapping + * `@param` {Record<string, string>} options.keys - Key-id to secret mapping🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncAuthService.js` around lines 172 - 185, The JSDoc incorrectly marks options.keys as optional while the constructor calls _validateKeys(keys) and requires it; update the docs and add an explicit runtime check: in SyncAuthService.constructor ensure you validate presence of keys before calling _validateKeys (throw a clear error like "options.keys is required" if missing) and update the `@param` for options.keys to remove the optional brackets (make it required) so the JSDoc matches runtime behavior; reference SyncAuthService, constructor, and _validateKeys when making these changes.bin/cli/commands/verify-audit.js (1)
49-50:⚠️ Potential issue | 🟡 MinorMissed wildcard in
@returnsJSDoc.
{payload: *, exitCode: number}still carries a*cast — at odds with this PR's stated goal of zero wildcards insrc/bin. The payload type can be narrowed toRecord<string, unknown>to match the declared type ofpayloadat line 62.🛠️ Proposed fix
- * `@returns` {Promise<{payload: *, exitCode: number}>} + * `@returns` {Promise<{payload: Record<string, unknown>, exitCode: number}>}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/verify-audit.js` around lines 49 - 50, Update the JSDoc for the function in verify-audit.js so the `@returns` payload type is not a wildcard; replace the "*" with "Record<string, unknown>" to match the runtime/declared type of payload (the payload variable used later in the function) and keep exitCode as number so the `@returns` reads {payload: Record<string, unknown>, exitCode: number}.bin/cli/commands/tree.js (1)
83-83:⚠️ Potential issue | 🟡 MinorReplace all
Record<string, *>withRecord<string, unknown>and fix return type wildcardLines 83 and 105 use
Record<string, *>which should beRecord<string, unknown>to align with the codebase cleanup pattern seen throughoutbin/directory. Additionally, line 154's return type usespayload: *which also needs the same replacement.♻️ Suggested replacements
/** - * `@param` {Record<string, *>} nodeProps + * `@param` {Record<string, unknown>} nodeProps- * `@param` {Map<string, Record<string, *>>} params.propsMap + * `@param` {Map<string, Record<string, unknown>>} params.propsMap- * `@returns` {Promise<{payload: *, exitCode: number}>} + * `@returns` {Promise<{payload: unknown, exitCode: number}>}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/tree.js` at line 83, Replace all occurrences of the JSDoc wildcard type `Record<string, *>` with `Record<string, unknown>` and change any `payload: *` return-type wildcards to `payload: unknown`; specifically update the JSDoc for the `nodeProps` parameter (the `@param {Record<string, *>} nodeProps` entry) and the function return/type annotation that currently uses `payload: *` so they read `Record<string, unknown>` and `payload: unknown` respectively to match the codebase convention.
🧹 Nitpick comments (28)
src/domain/services/MessageCodecInternal.js (2)
69-70: Type annotation is documentary only — effective type isanyBecause the import on line 16 is guarded by
@ts-expect-error(no.d.tsfor@git-stunts/trailer-codec),TrailerCodecresolves toanyin the type checker, making@type {TrailerCodec|null}equivalent toanyfor enforcement purposes. The annotation is still worthwhile as intent documentation and will gain enforcement value if/when the library ships declarations. No action required now.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/MessageCodecInternal.js` around lines 69 - 70, The JSDoc type on the module-scoped variable _codec (/** `@type` {TrailerCodec|null} */) is only documentary because the import of TrailerCodec is guarded with `@ts-expect-error` and thus resolves to any in the checker; no code change is required—leave the annotation in place for intent documentation and future enforcement once `@git-stunts/trailer-codec` provides declarations.
76-82:getCodec()return will become a TS error once the library ships declarationsTypeScript does not narrow module-level mutable variables — after the
if (!_codec)guard,_codecis still typed asTrailerCodec|nullat thereturnsite, conflicting with the@returns {TrailerCodec}promise. This is currently masked because the import resolves toany, but it will surface the moment@git-stunts/trailer-codecships a.d.ts.♻️ Forward-compatible fix using a local variable
export function getCodec() { if (!_codec) { const service = new TrailerCodecService(); _codec = new TrailerCodec({ service }); } - return _codec; + const codec = _codec; + return codec;Or, more tersely, with a non-null assertion JSDoc cast:
- return _codec; + return /** `@type` {TrailerCodec} */ (_codec);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/MessageCodecInternal.js` around lines 76 - 82, getCodec() returns _codec which is typed as TrailerCodec|null at module scope, causing a TS declaration error; fix by assigning the newly created instance to a local non-null-typed variable and return that (or cast `_codec` to non-null via a JSDoc non-null assertion) so the return type is guaranteed TrailerCodec. Specifically, inside getCodec() use a local const (e.g. const codec = new TrailerCodec({ service })) then set _codec = codec and return codec, referencing getCodec, _codec, TrailerCodecService, and TrailerCodec to locate the change.src/domain/services/LogicalTraversal.js (1)
155-155: Consider extracting the long inline cast to a@typedef.The inline type assertion on
this._graphis correct and a clear improvement over a wildcard cast, but at ~200 characters it makes the line hard to read. Extracting it to a named@typedefwould make the assumed shape of_materializeGraph's return value easier to audit and update if the implementation changes.♻️ Suggested refactor
Add near the top of the file (after the
TraversalErrorimport):+/** + * `@typedef` {{ adjacency: { outgoing: Map<string, Array<{neighborId: string, label: string}>>, incoming: Map<string, Array<{neighborId: string, label: string}>> } }} MaterializedGraphShape + */Then simplify line 155:
- const materialized = await /** `@type` {{ _materializeGraph: () => Promise<{adjacency: {outgoing: Map<string, Array<{neighborId: string, label: string}>>, incoming: Map<string, Array<{neighborId: string, label: string}>>}}> }} */ (this._graph)._materializeGraph(); + const materialized = await /** `@type` {{ _materializeGraph: () => Promise<MaterializedGraphShape> }} */ (this._graph)._materializeGraph();Note: This cast is the sole source of truth for the return shape TypeScript expects here. Since
_materializeGraph's actual implementation lacks strict typing, keep the cast in sync with its real return structure if it changes.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/LogicalTraversal.js` at line 155, Extract the long inline JSDoc cast into a named typedef and use it where the cast is applied: add a typedef (e.g., typedef MaterializedGraphReturn) after the TraversalError import that describes the Promise-resolved shape returned by _materializeGraph (including adjacency.outgoing and adjacency.incoming Maps of neighborId/label arrays), then replace the inline cast on (this._graph)._materializeGraph() with a concise /** `@type` {MaterializedGraphReturn} */ cast so the const materialized assignment uses the typedef instead of the ~200-char inline assertion.src/infrastructure/adapters/WebCryptoAdapter.js (1)
39-43: Buffer branch is unreachable — dead codeThe
Bufferclass is a subclass ofUint8Array, so anyBufferinstance satisfiesdata instanceof Uint8Arrayon line 39 and is returned there, before execution ever reaches theBuffer.isBufferguard on line 41. The same applies to Bun, where the Node.js Buffer class extendsUint8Array.This makes lines 41–43 dead code in every supported runtime. Note that returning the
Bufferdirectly via theinstanceof Uint8Arraypath is functionally equivalent — the Buffer's ownbyteOffsetandbyteLengthare preserved as part of theTypedArrayinterface, so the Web Crypto API receives the correct byte range either way.If the intent is to always produce a plain
Uint8Arrayview (rather than aBuffersubclass), the check order needs to be swapped:♻️ Suggested fix: check `Buffer.isBuffer` before `instanceof Uint8Array`
function toUint8Array(data) { - if (data instanceof Uint8Array) { return data; } - if (typeof data === 'string') { return new TextEncoder().encode(data); } if (typeof Buffer !== 'undefined' && Buffer.isBuffer(data)) { const buf = /** `@type` {Buffer} */ (data); return new Uint8Array(buf.buffer, buf.byteOffset, buf.byteLength); } + if (data instanceof Uint8Array) { return data; } + if (typeof data === 'string') { return new TextEncoder().encode(data); } throw new Error('WebCryptoAdapter: data must be string, Buffer, or Uint8Array'); }Alternatively, if there is no semantic reason to distinguish a plain
Uint8Arrayfrom aBuffersubclass for this use-case, theBuffer.isBufferbranch can simply be removed.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/infrastructure/adapters/WebCryptoAdapter.js` around lines 39 - 43, The Buffer-specific branch (Buffer.isBuffer) is dead because Buffer extends Uint8Array so the instanceof Uint8Array check returns true first; update the conversion logic in the WebCryptoAdapter byte-conversion routine to either (a) remove the Buffer.isBuffer branch entirely if a Buffer subclass can be treated the same as a plain Uint8Array, or (b) if you must produce a plain Uint8Array view, swap the checks so you call Buffer.isBuffer(data) before the instanceof Uint8Array branch and then create a new Uint8Array from the Buffer's buffer/byteOffset/byteLength; keep the existing string handling (TextEncoder) unchanged and ensure Buffer.isBuffer and instanceof Uint8Array are the referenced guards.src/domain/utils/roaring.js (1)
37-39:[default]property type is missingisNativelyInstalled?: booleanfor the CJS/1.x surface.The
[default]property (line 38) models the unwrapped CJS module as{ RoaringBitmap32: ... }. In roaring 1.x the CJS module also exposesisNativelyInstalled: booleanat the top level (the path exercised at line 177). AfterroaringModule = roaringModule.default(line 91), TypeScript widens back toRoaringModule | nullvia the variable's declared type, soisNativelyInstalledremains accessible with the correct type — no runtime impact.Aligning the
[default]type with the full 1.x module surface would make the typedef self-documenting:🔧 Suggested typedef refinement
- * `@property` {{ RoaringBitmap32: typeof import('roaring').RoaringBitmap32 }} [default] + * `@property` {{ RoaringBitmap32: typeof import('roaring').RoaringBitmap32, isNativelyInstalled?: boolean }} [default]🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/utils/roaring.js` around lines 37 - 39, The JSDoc typedef for the unwrapped CJS module is incomplete: update the `[default]` property to include isNativelyInstalled?: boolean so the CJS/1.x surface is fully modeled; modify the `@property` that currently reads `{ RoaringBitmap32: typeof import('roaring').RoaringBitmap32 } [default]` to include `isNativelyInstalled?: boolean` alongside `RoaringBitmap32` (this ensures the typedef aligns with the runtime usage where roaringModule.default may expose isNativelyInstalled, and references like RoaringBitmap32 remain correct).bin/cli/commands/path.js (1)
82-87: Optional: eliminate the remaining JSDoc cast with'code' in errornarrowing.After
instanceof Error,'code' in errornarrowserrortoError & Record<"code", unknown>, makingerror['code'] === 'NODE_NOT_FOUND'valid without any cast — staying consistent with the PR's goal of replacing type assertions with type guards.♻️ Proposed refactor
- } catch (error) { - if (error instanceof Error && /** `@type` {{code?: string}} */ (error).code === 'NODE_NOT_FOUND') { + } catch (error) { + if (error instanceof Error && 'code' in error && error['code'] === 'NODE_NOT_FOUND') { throw notFoundError(error.message); } throw error; }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/path.js` around lines 82 - 87, Replace the JSDoc cast in the catch block by using a type guard: after checking "error instanceof Error", check "'code' in error" and then compare error.code === 'NODE_NOT_FOUND' before throwing notFoundError(error.message); update the condition currently using /** `@type` {{code?: string}} */ (error).code to use the 'code' in error narrowing with the same notFoundError(error.message) and falling back to rethrowing error.bin/cli/commands/doctor/types.js (1)
9-9:unknownis a solid upgrade over*; recursive self-reference would be more precise.TypeScript supports recursive JSDoc typedefs, so the inner types can reference
JsonValuedirectly, constraining nested arrays/objects to valid JSON values rather thanunknown.♻️ Fully recursive definition
-/** `@typedef` {null | boolean | number | string | Array<unknown> | {[k:string]: unknown}} JsonValue */ +/** `@typedef` {null | boolean | number | string | Array<JsonValue> | {[k:string]: JsonValue}} JsonValue */This ensures that consumers who read nested values get
JsonValue(notunknown) and benefit from the same type guards transitively.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/doctor/types.js` at line 9, Update the JSDoc typedef for JsonValue so nested arrays and objects reference JsonValue recursively instead of using unknown; modify the existing /** `@typedef` {null | boolean | number | string | Array<unknown> | {[k:string]: unknown}} JsonValue */ declaration so Array and object value types point to JsonValue (e.g., Array<JsonValue> and {[k:string]: JsonValue}) to ensure nested values are typed as JsonValue throughout.bin/cli/commands/install-hooks.js (1)
21-22: Remaining@param {*}/ embedded*wildcards still present after this PR's cleanup pass.Lines 21, 40 (
@param {*} classification) and Line 95 (@returns {Promise<{payload: *, exitCode: number}>}) weren't addressed.classifyExistingHookalready has a precise return type, soclassificationcan be typed accordingly.♻️ Proposed cleanup
/** - * `@param` {*} classification + * `@param` {{ kind: 'none'|'ours'|'foreign', version?: string, appended?: boolean }} classification * `@param` {{force: boolean}} hookOptions */ async function resolveStrategy(classification, hookOptions) { /** - * `@param` {*} classification + * `@param` {{ kind: 'ours', version?: string }} classification */ async function promptForOursStrategy(classification) {For the return type of
handleInstallHooks, a dedicated payload typedef (or a union of the three shapes returned) would replace the embedded*.Also applies to: 40-41, 95-95
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/install-hooks.js` around lines 21 - 22, Update the JSDoc types to remove wildcard asterisks: replace "@param {*} classification" with the concrete type returned by classifyExistingHook (reference classifyExistingHook's return shape) for both occurrences, and replace the "@returns {Promise<{payload: *, exitCode: number}>}" on handleInstallHooks with a proper typedef (or a union) describing the exact payload shapes returned by handleInstallHooks (create a named typedef like InstallHooksResult or a union of the three payload shapes and use "@returns {Promise<InstallHooksResult>}"). Ensure the JSDoc param name matches the function signature for handleInstallHooks and reference classifyExistingHook and handleInstallHooks in the typedef to keep types consistent.src/domain/services/PatchBuilderV2.js (1)
451-458: Snapshot the version vector in patch context to avoid accidental mutation.
build()returns a patch for inspection; if the builder is reused, the shared_vvreference can drift. Consider cloning the vector when embedding it in the patch context.♻️ Suggested adjustment
- context: /** `@type` {import('../types/WarpTypesV2.js').VersionVector} */ (/** `@type` {unknown} */ (this._vv)), + context: /** `@type` {import('../types/WarpTypesV2.js').VersionVector} */ (/** `@type` {unknown} */ (vvClone(this._vv))),- context: /** `@type` {import('../types/WarpTypesV2.js').VersionVector} */ (/** `@type` {unknown} */ (this._vv)), + context: /** `@type` {import('../types/WarpTypesV2.js').VersionVector} */ (/** `@type` {unknown} */ (vvClone(this._vv))),Also applies to: 543-550
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/PatchBuilderV2.js` around lines 451 - 458, The build() method embeds the internal version-vector reference _vv directly into the patch context causing accidental mutations if the builder is reused; when calling createPatchV2 (and the other similar builder method around the 543-550 region), clone the version vector (deep copy) instead of passing this._vv so the patch.context contains an immutable snapshot — update build() and the analogous function to pass a cloned object (e.g., shallow/deep clone depending on VV structure) into the context parameter when calling createPatchV2 to prevent shared-state drift.bin/cli/commands/verify-audit.js (1)
91-100: Static fallback discards thrown value for non-Errorexceptions.If
evaluateTrust(or a dependency) throws a non-Errorvalue — a string, a custom status object, etc. — the'Trust evaluation failed'fallback completely hides the cause.String(err)would preserve it without leaking unexpected structure.🔧 Suggested improvement
- error: err instanceof Error ? err.message : 'Trust evaluation failed', + error: err instanceof Error ? err.message : String(err),🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/verify-audit.js` around lines 91 - 100, The catch block in the verify-audit flow currently replaces non-Error thrown values with the fixed string "Trust evaluation failed"; update the payload.trustAssessment assignment so the error field preserves the original thrown value by using String(err) for non-Error cases (e.g., set error to err instanceof Error ? err.message : String(err)). Locate the catch handling around evaluateTrust (the catch(err) that builds payload.trustAssessment) and change only the error fallback there to use String(err) so non-Error throws (strings, status objects) are captured.src/infrastructure/adapters/GitGraphAdapter.js (1)
709-714:this._getExitCodeis inconsistent withnodeExistsusing the module-levelgetExitCodedirectly.
_getExitCodeis a one-line delegate (lines 774–776). BothnodeExists(line 634) andisAncestor(line 710) are performing identical exit-code checks; using the instance method in one place and the module helper in the other adds noise without benefit.♻️ Suggested alignment
- if (this._getExitCode(/** `@type` {GitError} */ (err)) === 1) { + if (getExitCode(/** `@type` {GitError} */ (err)) === 1) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/infrastructure/adapters/GitGraphAdapter.js` around lines 709 - 714, The isAncestor catch block mixes instance and module helpers—replace the call to this._getExitCode with the module-level getExitCode so both isAncestor and nodeExists use the same helper; update the check in isAncestor (currently calling this._getExitCode) to call getExitCode(/** `@type` {GitError} */ (err)) === 1 to match nodeExists and remove the unnecessary one-line delegate usage.bin/presenters/json.js (1)
52-53:Record<string, unknown> | unknownsimplifies tounknown.Since
unknownis the top type,Record<string, unknown>is already a subtype ofunknownand the union is redundant. If the intent is to document that callers will commonly pass a record, consider using an overloaded JSDoc or justunknownwith a prose note.♻️ Simplification
- * `@param` {Record<string, unknown> | unknown} payload - * `@returns` {Record<string, unknown> | unknown} + * `@param` {unknown} payload + * `@returns` {unknown}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/presenters/json.js` around lines 52 - 53, The JSDoc type "Record<string, unknown> | unknown" is redundant because it simplifies to "unknown"; update the JSDoc for the affected function in bin/presenters/json.js by replacing the union type with "unknown" (or, if you want to emphasize callers commonly pass an object, use a JSDoc overload or add a prose note) — search for the JSDoc lines containing "Record<string, unknown> | unknown" (the param and `@returns` annotations) and change them to use "unknown" or add an overload documenting Record<string, unknown> explicitly.src/domain/services/HttpSyncServer.js (2)
227-241: Stale duplicate JSDoc block oninitAuth.There are two consecutive JSDoc blocks before
initAuth. Only the second one (lines 232–235) is effective; most tooling will use the JSDoc immediately preceding the function declaration, making the first block (lines 227–231) dead documentation that could confuse readers.♻️ Suggested fix — remove the stale JSDoc
-/** - * `@param` {{ keys: Record<string, string>, mode: 'enforce'|'log-only', crypto?: *, logger?: *, wallClockMs?: () => number }|undefined} auth - * `@param` {string[]} [allowedWriters] - * `@returns` {{ auth: SyncAuthService|null, authMode: string|null }} - * `@private` - */ /** * `@param` {z.infer<typeof authSchema>} [auth] * `@param` {string[]} [allowedWriters] */ function initAuth(auth, allowedWriters) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/HttpSyncServer.js` around lines 227 - 241, Remove the stale duplicate JSDoc block that precedes the effective one for the initAuth function; keep only the second JSDoc (the z.infer<typeof authSchema> variant) immediately above function initAuth to avoid confusing tooling and readers, ensuring the function signature and description remain documented by the single correct comment.
21-54: Zod schema validation looks well-structured overall.The schemas provide clear validation with sensible defaults and the
superRefinerule correctly enforces thatallowedWritersrequiresauth. A couple of observations:
httpPortandgraph(lines 33–39) still usez.any().refine(...)— this is pragmatic for port interfaces that Zod can't deeply validate, but consider documenting that these are intentional residualz.any()usages (e.g., a short comment) so future cleanup sweeps don't re-flag them.
cryptoandloggerinauthSchema(lines 27–28) similarly usez.any().optional().Given the PR's stated goal of eliminating wildcasts, a brief inline comment (e.g.,
// z.any(): port interface, not schema-validatable) would clarify intent.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/HttpSyncServer.js` around lines 21 - 54, Add short inline comments explaining the intentional use of z.any() for non-schema-validatable interfaces: annotate optionsSchema.httpPort and optionsSchema.graph with a comment like "// z.any(): port/graph interface not schema-validatable" and annotate authSchema.crypto and authSchema.logger with "// z.any(): opaque runtime objects (crypto/logger) intentionally left untyped"; keep the existing z.any().optional() and superRefine logic unchanged so future maintainers won't remove these wildcasts accidentally.bin/cli/types.js (1)
86-94: Consider tightening thewherecallback type.The
wheremethod usesFunctionwhich is still a loose type. Since this type is consumed byapplyQueryStepinquery.jswhere the callback receives a{props?: Record<string, unknown>}node, a more precise signature would be:`@property` {(fn: (node: {id: string, props?: Record<string, unknown>}) => boolean) => QueryBuilderLike} whereThis would provide better type safety at the call sites. That said, if the underlying implementation accepts arbitrary predicates beyond this shape,
Functionis acceptable.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/types.js` around lines 86 - 94, The typedef for QueryBuilderLike uses a loose Function type for the where method; tighten it by changing the where signature to accept a predicate that takes a node shaped like {id: string, props?: Record<string, unknown>} and returns a boolean (matching how applyQueryStep in query.js invokes the predicate). Update the `@property` for where in the QueryBuilderLike JSDoc to use (fn: (node: {id: string, props?: Record<string, unknown>}) => boolean) => QueryBuilderLike so call sites get stronger type information while remaining compatible with applyQueryStep.src/domain/services/SyncProtocol.js (1)
401-408: Divergence detection error narrowing is correct but slightly redundant.The
err instanceof Errorcheck appears in both sides of the||. A minor readability improvement would be to hoist it:if (err instanceof Error && ( ('code' in err && /** `@type` {{ code: string }} */ (err).code === 'E_SYNC_DIVERGENCE') || err.message?.includes('Divergence detected') )) {The message-based fallback (
'Divergence detected') is inherently fragile, but given it's a safety net for potential legacy paths, it's acceptable.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 401 - 408, Refactor the divergence check inside the catch block in SyncProtocol.js to avoid repeating "err instanceof Error": first ensure err is an Error, then test the two divergence conditions (error.code === 'E_SYNC_DIVERGENCE' on the code property and the fallback err.message?.includes('Divergence detected')) as a single combined conditional; update the existing catch block that currently checks both sides of the || to use this hoisted instanceof Error check while preserving the continue behavior for detected divergence and rethrowing otherwise.src/domain/services/ProvenanceIndex.js (1)
279-280: Redundant double-annotation — remove the standalone@typeline.Line 279's
/**@type{...} */and line 280's inline cast both set identical types. One is sufficient; the rest of the codebase (e.g.,Frontier.jsline 76,CheckpointSerializerV5.jsline 99) uses only the inline cast pattern.♻️ Proposed simplification
- /** `@type` {{ version?: number, entries?: Array<[string, string[]]> }} */ - const obj = /** `@type` {{ version?: number, entries?: Array<[string, string[]]> }} */ (c.decode(buffer)); + const obj = /** `@type` {{ version?: number, entries?: Array<[string, string[]]> }} */ (c.decode(buffer));🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/ProvenanceIndex.js` around lines 279 - 280, Remove the redundant standalone JSDoc type annotation applied before the declaration of obj; keep only the inline cast used with c.decode(buffer) (the inline /** `@type` {...} */ cast) so the variable obj (result of c.decode) has the type without the duplicate /** `@type` {...} */ line—update the ProvenanceIndex.js declaration that assigns obj from c.decode(buffer) to use only the inline cast form.src/domain/services/ObserverView.js (1)
131-131: Consider extracting the_materializeGraphreturn shape to a typedef.The inline cast spans the entire line. Extracting the internal type makes the method body scannable.
♻️ Proposed simplification
Add a local typedef (top of file or in a types file):
/** * `@typedef` {{ _materializeGraph: () => Promise<{ * state: import('./JoinReducer.js').WarpStateV5, * stateHash: string, * adjacency: { * outgoing: Map<string, Array<{neighborId: string, label: string}>>, * incoming: Map<string, Array<{neighborId: string, label: string}>> * } * }> }} MaterializableGraph */Then at line 131:
- const materialized = await /** `@type` {{ _materializeGraph: () => Promise<{state: import('./JoinReducer.js').WarpStateV5, stateHash: string, adjacency: {outgoing: Map<string, Array<{neighborId: string, label: string}>>, incoming: Map<string, Array<{neighborId: string, label: string}>>}}> }} */ (this._graph)._materializeGraph(); + const materialized = await /** `@type` {MaterializableGraph} */ (this._graph)._materializeGraph();🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/ObserverView.js` at line 131, Extract the long inline JSDoc cast into a named typedef (e.g., MaterializableGraph) that describes the _materializeGraph return shape (including state: import('./JoinReducer.js').WarpStateV5, stateHash: string, and adjacency with outgoing/incoming maps of {neighborId, label}). Add that typedef near the top of the file (or in a shared types file) and then replace the inline cast on the materialized assignment with a concise cast to MaterializableGraph (e.g., /** `@type` {MaterializableGraph} */ (this._graph)). Ensure you reference the existing _materializeGraph symbol and preserve the exact nested types from the original inline annotation.src/domain/warp/provenance.methods.js (1)
264-265: Residualanyin_sortPatchesCausallyparameter type.The JSDoc
@param {Array<{patch: any, sha: string}>}still usesanyfor thepatchfield. Given this PR's goal of eliminating wildcard/any types, consider narrowing this to a more specific patch shape (e.g.,{lamport?: number, writer?: string}), consistent with the fields actually accessed in the sort comparator (lines 270, 276).♻️ Suggested type narrowing
-/** - * `@param` {Array<{patch: any, sha: string}>} patches - Unsorted patch entries - * `@returns` {Array<{patch: any, sha: string}>} Sorted patch entries - */ +/** + * `@param` {Array<{patch: {lamport?: number, writer?: string}, sha: string}>} patches - Unsorted patch entries + * `@returns` {Array<{patch: {lamport?: number, writer?: string}, sha: string}>} Sorted patch entries + */🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/provenance.methods.js` around lines 264 - 265, Update the JSDoc for _sortPatchesCausally to replace the residual any with a narrowed patch shape that matches what the comparator reads (e.g., use `@param` {Array<{patch: {lamport?: number, writer?: string}, sha: string}>} or similar), so the patch object documents lamport and writer (optional if they can be missing) and aligns with the comparator logic in _sortPatchesCausally; adjust any other references to the previous typedef if present to keep types consistent.src/domain/warp/patch.methods.js (1)
268-270: Outer cast usesObject(capital-O) — consider using the precise type.Lines 268 and 270 cast through
unknown, which is the correct pattern to replace wildcards. However, the outer target type on line 270 uses{patch: Object, sha: string}—Objectis nearly as loose asany. Since the inner lambda already typesoptswith the precisePatchV2import, the outer cast could match:- onCommitSuccess: /** `@type` {(result: {patch: Object, sha: string}) => void} */ (/** `@type` {unknown} */ ((/** `@type` {{patch?: import('../types/WarpTypesV2.js').PatchV2, sha?: string}} */ opts) => this._onPatchCommitted(resolvedWriterId, opts))), + onCommitSuccess: /** `@type` {(result: {patch: import('../types/WarpTypesV2.js').PatchV2, sha: string}) => void} */ (/** `@type` {unknown} */ ((/** `@type` {{patch?: import('../types/WarpTypesV2.js').PatchV2, sha?: string}} */ opts) => this._onPatchCommitted(resolvedWriterId, opts))),Same issue exists on line 328 in
createWriter(). This is a minor leftover from the wildcard-elimination effort.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/patch.methods.js` around lines 268 - 270, The outer cast for onCommitSuccess uses a loose `{patch: Object, sha: string}` type; update the outer cast to match the precise inner signature (use `{patch?: import('../types/WarpTypesV2.js').PatchV2, sha?: string}` or the exact optionality used in the inner lambda) so the cast is not effectively `Object`, and make the same change in the matching cast inside createWriter() (line ~328) so both onCommitSuccess and the createWriter cast use the exact PatchV2 import and correct optional properties.src/domain/services/CommitDagTraversalService.js (1)
57-167: Facade delegation methods still use@param {*} options.The delegating methods (
bfs,dfs,ancestors, etc.) retain@param {*} options— effectively wildcards. These aren't part of this PR's changed lines, but they represent residual loose types in the facade. A follow-up could type them to match the sub-service signatures (or useParameters<...>[0]like thejoinPatchpattern inpatch.methods.js).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CommitDagTraversalService.js` around lines 57 - 167, The facade methods (bfs, dfs, ancestors, descendants, isReachable, findPath, shortestPath, weightedShortestPath, aStarSearch, bidirectionalAStar, commonAncestors, topologicalSort) use a loose JSDoc param `@param {*} options`; update each method's JSDoc to reflect the actual parameter type by referencing the delegate signature (e.g. use a TypeScript-style JSDoc type like `@param {Parameters<import('./DagTraversal').default['bfs']>[0]} options` or `Parameters<typeof this._traversal.bfs>[0]` for traversal methods and analogous `Parameters<...>[0]` for _pathFinding and _topology methods) so the facade accurately mirrors the sub-service types (match the patterns used in patch.methods.js's joinPatch).src/domain/services/IndexRebuildService.js (1)
392-392: Intersection cast forloadIndexFrontierstorage parameter.The cast
/**@type{IndexStoragePort & BlobPort} */is needed becausethis.storageis typed asIndexStoragePortbutloadIndexFrontierrequires blob-reading capabilities too. This works but is a signal that the constructor's storage type could be widened to includeBlobPort(or use theIndexStoragealias fromWarpPersistence.jswhich already includes it).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` at line 392, The explicit intersection cast on this.storage when calling loadIndexFrontier indicates the class constructor type is too narrow; update the service to accept a storage type that includes both IndexStoragePort and BlobPort (or use the existing IndexStorage alias from WarpPersistence.js) so you can remove the inline cast. Concretely, change the constructor/field type for the service (the this.storage declaration) to IndexStorage or an intersection of IndexStoragePort & BlobPort, then remove the /** `@type` {...} */ cast at the loadIndexFrontier call; keep function name loadIndexFrontier and this.storage so reviewers can find the change.src/domain/services/BitmapIndexReader.js (1)
93-98: Constructor still uses a typed cast on the empty default, unlike other files.Other constructors in this PR (e.g.,
IndexRebuildService,DagPathFinding) use plain= {}. This one retains= /**@type{{ storage: IndexStoragePort, ... }} */ ({}). Both patterns work, but it's an inconsistency. Minor nit — no functional impact since the runtime guard on line 94 catches missingstorage.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BitmapIndexReader.js` around lines 93 - 98, The constructor for BitmapIndexReader uses a JSDoc typed cast on its empty default parameter (the "constructor" in class BitmapIndexReader) which is inconsistent with other constructors like IndexRebuildService and DagPathFinding; change the parameter default from the JSDoc-cast form (/** `@type` {…} */ ({})) to a plain = {} default and keep the existing runtime storage guard/throw, so behavior is unchanged but style matches the rest of the codebase.src/domain/services/StreamingBitmapIndexBuilder.js (1)
11-11: Dual type references for storage:IndexStoragePort(param) vsIndexStorage(property).The constructor param is typed as
IndexStoragePort(line 86), but the property is typed asIndexStoragefromWarpPersistence.js(line 111–112). These should be structurally identical, but having two separate type aliases for the same concept could cause confusion if they drift apart.Consider using the same type alias consistently — either
IndexStoragePorteverywhere orIndexStorageeverywhere.Also applies to: 111-112
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/StreamingBitmapIndexBuilder.js` at line 11, The code mixes two type aliases for the same concept: the constructor parameter annotated as IndexStoragePort and the instance property annotated as IndexStorage; unify them to a single alias to avoid drift. Pick one alias (either IndexStoragePort or IndexStorage), update the top typedef import and all JSDoc annotations so the constructor parameter (in the constructor signature) and the property (the instance field assigned in StreamingBitmapIndexBuilder) use the same type name, and remove the alternate typedef reference so both the param and the property consistently reference the identical type alias.src/domain/types/WarpPersistence.js (1)
15-33:PersistenceReader,PersistenceWriter, andCheckpointPersistenceare structurally identical.All three are
CommitPort & BlobPort & TreePort & RefPort. The doc comments imply different access patterns (read-side vs. write-side vs. no-config), but the types don't actually enforce that distinction. If these ports expose both read and write methods, aPersistenceWriterreference can still call read methods and vice versa.This is fine as semantic documentation today, but consider splitting the ports themselves into finer-grained read/write interfaces if you want compile-time enforcement in the future.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/types/WarpPersistence.js` around lines 15 - 33, The typedefs PersistenceReader, PersistenceWriter, and CheckpointPersistence currently all alias the same full-port types (CommitPort, BlobPort, TreePort, RefPort), so they do not enforce read-only vs write-only semantics; to fix this, introduce separate read/write variants for each port (e.g., CommitPortRead/CommitPortWrite, BlobPortRead/BlobPortWrite, TreePortRead/TreePortWrite, RefPortRead/RefPortWrite) that only expose the appropriate methods, then update PersistenceReader to combine the *Read variants, PersistenceWriter to combine the *Write variants, and leave CheckpointPersistence as the precise combination you intend (e.g., reads+writes as needed); update the import typedefs in WarpPersistence.js to reference these new symbol names so the types enforce access patterns.src/domain/warp/_wiredMethods.d.ts (2)
205-213:serve()useshttpPort: unknownandauth?: unknown— weaker than the public surface.
index.d.tstypes these asHttpServerPortandSyncAuthServerOptionsrespectively. The looser typing here means internal callers ofthis.serve(...)won't get compile-time checks on these parameters.This is acceptable if importing those port types would create a circular dependency, but worth documenting with a brief comment.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/_wiredMethods.d.ts` around lines 205 - 213, The serve method currently types httpPort and auth as unknown which weakens compile-time checks; update the signature of serve(httpPort: HttpServerPort, auth?: SyncAuthServerOptions) so internal callers get proper types (use the same HttpServerPort and SyncAuthServerOptions from the public surface), and if importing those types would cause a circular dependency instead add a brief comment above serve explaining that unknown was chosen intentionally due to circular-import risk and reference the canonical types (HttpServerPort, SyncAuthServerOptions) located in the public index.d.ts so future maintainers understand the rationale.
137-143: AlignWormholeEdge.payloadtype between internal and public API declarations.The internal
WormholeEdgeinterface in_wiredMethods.d.tsdeclarespayload: unknown, but the public API inindex.d.tsexports it aspayload: ProvenancePayload(line 2416). This inconsistency means code using the internalcreateWormhole()method cannot safely access payload properties without explicit casts.Import
ProvenancePayloadfrom../services/ProvenancePayload.jsand update the type to match the public declaration.♻️ Suggested alignment
+import type { ProvenancePayload } from '../services/ProvenancePayload.js'; interface WormholeEdge { fromSha: string; toSha: string; writerId: string; - payload: unknown; + payload: ProvenancePayload; patchCount: number; }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/_wiredMethods.d.ts` around lines 137 - 143, The internal WormholeEdge interface in _wiredMethods.d.ts declares payload as unknown but must match the public API; import ProvenancePayload from ../services/ProvenancePayload.js and change the WormholeEdge.payload type to ProvenancePayload so createWormhole and any callers can use the payload without casts; update the import statement and the interface declaration (WormholeEdge, referenced by createWormhole) to use ProvenancePayload.test/unit/cli/doctor.test.js (1)
28-31: Considervi.mocked()over the double-cast pattern.The
unknown → Mockdouble-cast works but is the JSDoc workaround for whatvi.mocked()solves directly and more readably:♻️ Proposed refactor using `vi.mocked()`
-const _shared = await import('../../../bin/cli/shared.js'); -const createPersistence = /** `@type` {import('vitest').Mock} */ (/** `@type` {unknown} */ (_shared.createPersistence)); -const resolveGraphName = /** `@type` {import('vitest').Mock} */ (/** `@type` {unknown} */ (_shared.resolveGraphName)); -const createHookInstaller = /** `@type` {import('vitest').Mock} */ (/** `@type` {unknown} */ (_shared.createHookInstaller)); +const _shared = await import('../../../bin/cli/shared.js'); +const createPersistence = vi.mocked(_shared.createPersistence); +const resolveGraphName = vi.mocked(_shared.resolveGraphName); +const createHookInstaller = vi.mocked(_shared.createHookInstaller);
vi.mockis hoisted and always executed before imports, so the mock is in place when theawait import(...)resolves — the timing is correct either way.vi.mocked()is available in Vitest 2.1.8 and just removes the noise.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/cli/doctor.test.js` around lines 28 - 31, Replace the double-cast JSDoc mock conversion for the imported _shared functions with Vitest's vi.mocked helper: after importing _shared (the result of await import('../../../bin/cli/shared.js')), call vi.mocked(_shared.createPersistence), vi.mocked(_shared.resolveGraphName), and vi.mocked(_shared.createHookInstaller) to get typed mocks instead of using the /** `@type` {unknown} */ → /** `@type` {import('vitest').Mock} */ casts; this removes the noisy double-cast and relies on vi.mocked() to produce properly typed mocks for createPersistence, resolveGraphName, and createHookInstaller.
| ### Changed | ||
|
|
||
| - **M8 IRONCLAD Wave 3: cast elimination** — Removed ~107 wildcard casts (`@type {*}` / `@type {any}`) across ~40 files in `src/domain/warp/`, `src/domain/services/`, and `src/infrastructure/adapters/`. All casts replaced with role-specific persistence types (`PersistenceReader`, `PersistenceWriter`, `CheckpointPersistence`, `IndexStorage`), error narrowing helpers (`isError`, `hasErrorCode`, `hasMessage`), and properly typed aliases. Zero `TODO(ts-cleanup)` tags remain in Wave 3 scope. | ||
| - **HttpSyncServer Zod validation** — Constructor now uses Zod schema validation; removed manual mode validation and wildcard casts. `initAuth` typed via schema inference. `z.function()` replaced with `z.custom()` for correct `wallClockMs` TypeScript inference. | ||
| - **HookInstaller** — Constructor deps parameter changed from optional to required (all callers already provided all fields). | ||
| - **SyncAuthService** — `_validateKeys` now typed as assertion function for proper post-validation narrowing. | ||
| - **WarpErrors** — `hasErrorCode`/`hasMessage` helpers upgraded from wildcard casts to `Record<string, unknown>` narrowing. | ||
| - **WarpPersistence types** — Added `IndexStorage` typedef (`BlobPort & TreePort & RefPort`). | ||
|
|
There was a problem hiding this comment.
Verify Wave 3 metrics and scope in the changelog.
The entry claims ~107 casts removed in Wave 3, but the PR summary indicates that figure for Wave 1. Please reconcile the counts/paths to avoid misleading release notes.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@CHANGELOG.md` around lines 10 - 18, Update the "M8 IRONCLAD Wave 3: cast
elimination" changelog line to match the actual PR scope: verify whether the
~107 wildcard casts and the listed paths (`src/domain/warp/`,
`src/domain/services/`, `src/infrastructure/adapters/`) belong to Wave 3 or Wave
1, correct the numeric count and/or wave label accordingly, and ensure the
sentence accurately reflects which files and waves were changed (edit the "~107"
and/or "Wave 3" text in the header and the file path list to match the PR
summary).
There was a problem hiding this comment.
Not applicable — the Wave 3 content this referenced has been superseded by later CHANGELOG entries (11.3.1, 11.3.2). Current entries are accurate.
There was a problem hiding this comment.
Rate Limit Exceeded
@flyingrobots have exceeded the limit for the number of chat messages per hour. Please wait 1 minutes and 44 seconds before sending another message.
| const currentFrontier = /** @type {Map<string, string>} */ (/** @type {unknown} */ (this._cachedState.observedFrontier)); | ||
| const result = /** @type {{state: import('../services/JoinReducer.js').WarpStateV5, frontier: Map<string, string>, applied: number}} */ (applySyncResponseImpl(response, this._cachedState, currentFrontier)); |
There was a problem hiding this comment.
🧩 Analysis chain
🏁 Script executed:
rg -nP --type=js -C5 'observedFrontier' -g '!node_modules/**' | head -120Repository: git-stunts/git-warp
Length of output: 9102
🏁 Script executed:
# Check the type declaration in index.d.ts
cat -n index.d.ts | sed -n '2190,2210p'Repository: git-stunts/git-warp
Length of output: 767
🏁 Script executed:
# Search for vvSerialize and vvDeserialize to understand what they do
rg -nP 'export.*(?:vvSerialize|vvDeserialize|createVersionVector)' --type=js -A2Repository: git-stunts/git-warp
Length of output: 979
🏁 Script executed:
# Check applySyncResponseImpl signature
rg -nP 'function applySyncResponseImpl|export.*applySyncResponseImpl' --type=js -A10Repository: git-stunts/git-warp
Length of output: 45
🏁 Script executed:
rg -nP 'applySyncResponseImpl' --type=js -B2 -A10Repository: git-stunts/git-warp
Length of output: 2476
🏁 Script executed:
# Also search for where frontier is used in response handling
rg -nP 'frontier.*string|frontier.*number' --type=js -C3Repository: git-stunts/git-warp
Length of output: 10801
🏁 Script executed:
rg -nP 'export.*function applySyncResponse|function applySyncResponse' src/domain/services/SyncProtocol.js -A25Repository: git-stunts/git-warp
Length of output: 1383
🏁 Script executed:
# Also check the Frontier.js module to understand what createFrontier and updateFrontier do
rg -nP 'export.*(?:createFrontier|updateFrontier|Frontier)' src/domain/services/Frontier.js -A8Repository: git-stunts/git-warp
Length of output: 2271
🏁 Script executed:
# Check the full sync response handling to see if there's a frontier being constructed
rg -nP 'const currentFrontier|_lastFrontier' src/domain/warp/sync.methods.js -B3 -A3Repository: git-stunts/git-warp
Length of output: 1787
🏁 Script executed:
# Check the return type and usage of applySyncResponse more carefully
rg -nP 'result\.frontier|newFrontier' src/domain/services/SyncProtocol.js -B2 -A2Repository: git-stunts/git-warp
Length of output: 720
🏁 Script executed:
# Check initialization and documentation of observedFrontier in checkpoint/state creation
rg -nP 'observedFrontier.*=' src/domain/services/JoinReducer.js -B2 -A2Repository: git-stunts/git-warp
Length of output: 426
🏁 Script executed:
# Check what vvSerialize really contains and how observedFrontier is used after being loaded
rg -nP 'vvDeserialize|appliedVV' src/domain/services/CheckpointService.js -B2 -A5Repository: git-stunts/git-warp
Length of output: 4753
observedFrontier is a version vector (Map<string, number>), not a frontier (Map<string, string>)—passing it to applySyncResponseImpl is incorrect.
observedFrontier is a CRDT version vector (lamport timestamps by writerId) created via createVersionVector(), serialized with vvSerialize(), and merged via vvMerge(). The cast through unknown masks a critical semantic mismatch: you are passing version vector data where applySyncResponseImpl expects writer frontier data (current patch SHAs). This is not just a type bypass—it passes the wrong data structure entirely, which will cause incorrect state merging in the sync protocol.
Separate the concerns: either load or reconstruct the actual frontier (writerId → patchSha) from a source like the checkpoint's frontier.cbor, or track it separately in the instance state, then pass that to applySyncResponseImpl.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/warp/sync.methods.js` around lines 261 - 262, The code wrongly
casts this._cachedState.observedFrontier (a version vector of Map<string,
number>) to a writer frontier Map<string, string> and passes it into
applySyncResponseImpl; instead, obtain the actual frontier (writerId→patchSha)
and pass that. Replace the unsafe cast of observedFrontier with logic that
reconstructs/loads the real frontier: e.g., parse/deserialize the checkpoint
frontier (frontier.cbor) from this._cachedState.checkpoint or use an explicit
cached writer-frontier field on this instance, and pass that Map<string,string>
(not the VV) into applySyncResponseImpl; keep observedFrontier as the version
vector (created by createVersionVector/vvSerialize/vvMerge) and do not reuse it
for writer SHAs.
There was a problem hiding this comment.
Acknowledged — this is a pre-existing architectural concern, not introduced by this PR. The double cast masks a semantic mismatch between version vectors and frontier maps that predates the M8 work. Filed as a separate investigation item.
There was a problem hiding this comment.
Rate Limit Exceeded
@flyingrobots have exceeded the limit for the number of chat messages per hour. Please wait 1 minutes and 45 seconds before sending another message.
| /** | ||
| * @typedef {{ hostname: string, port: number, stop: () => void }} BunServer | ||
| */ |
There was a problem hiding this comment.
🧩 Analysis chain
🌐 Web query:
Bun Server stop() method signature return type Promise
💡 Result:
server.stop is typed as:
stop(closeActiveConnections?: boolean): Promise<void>;So yes—its return type is Promise<void>. [1]
🏁 Script executed:
# Check the actual code in BunHttpAdapter.js
cat -n src/infrastructure/adapters/BunHttpAdapter.js | head -180 | tail -60Repository: git-stunts/git-warp
Length of output: 1835
🏁 Script executed:
# Search for BunServer typedef and stopServer usage
rg "BunServer|stopServer" --type js -B 2 -A 2Repository: git-stunts/git-warp
Length of output: 2195
stop: () => void mismatches Bun's actual async API — stopServer silently swallows errors and fires its callback early.
Bun's Server.stop() signature is stop(closeActiveConnections?: boolean): Promise<void> — not () => void. The inaccurate typedef has two downstream effects in stopServer (lines 157–171):
- Callback fires before the server is fully stopped —
state.server.stop()is not awaited, socallback()on line 164 is called while the stop is still in-flight. - Rejection is silently swallowed — the
try/catchon lines 166–169 won't catch aPromise<void>rejection; the error never reachescallback(err).
🔧 Proposed fix
Update the typedef to reflect the real signature:
-/** `@typedef` {{ hostname: string, port: number, stop: () => void }} BunServer */
+/** `@typedef` {{ hostname: string, port: number, stop: (closeActiveConnections?: boolean) => Promise<void> }} BunServer */Then update stopServer to await stop() and handle the async error properly:
-function stopServer(state, callback) {
- try {
- if (state.server) {
- state.server.stop();
- state.server = null;
- }
- if (callback) {
- callback();
- }
- } catch (err) {
- if (callback) {
- callback(err);
- }
- }
-}
+async function stopServer(state, callback) {
+ try {
+ if (state.server) {
+ await state.server.stop();
+ state.server = null;
+ }
+ if (callback) {
+ callback();
+ }
+ } catch (err) {
+ if (callback) {
+ callback(err);
+ }
+ }
+}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/infrastructure/adapters/BunHttpAdapter.js` around lines 128 - 130, The
typedef for BunServer is wrong and stopServer misuses the async stop API: update
the BunServer typedef to declare stop(closeActiveConnections?: boolean):
Promise<void>, then change stopServer to await state.server.stop(...) and wrap
that await in try/catch so any rejection is passed to callback(err) and
callback() is only invoked after the awaited stop completes; refer to the
BunServer typedef and the stopServer function and the call to state.server.stop
in your edits.
There was a problem hiding this comment.
Rate Limit Exceeded
@flyingrobots have exceeded the limit for the number of chat messages per hour. Please wait 1 minutes and 41 seconds before sending another message.
…ialize, upgrade policy checker
- Replace 48 embedded wildcards (Array<*>, Map<string, *>, LWWRegister<*>,
Promise<{payload: *}>, etc.) with unknown or proper specific types across
31 files in src/, bin/, and infrastructure/
- Restore vvSerialize() in PatchBuilderV2 (Map→Object serialization bug)
- Collapse PersistenceReader/Writer/CheckpointPersistence → CorePersistence
- Replace z.any() with z.custom() in HttpSyncServer
- Upgrade ts-policy-check.js: now bans embedded wildcards and z.any()
- Add AuditReceipt and CasStore typedefs for proper return types
- Fix constructor JSDoc: required params no longer marked optional
- Bump to 11.3.1
There was a problem hiding this comment.
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (13)
src/domain/types/TickReceipt.js (1)
278-289:⚠️ Potential issue | 🟡 MinorTwo embedded wildcard annotations in
sortedReplacerremain unfixed despite PR intent.Lines 280 and 282 contain
@type {{ [x: string]: * }}— the same embedded wildcard pattern this PR targets for elimination. Thets-policy-check.jsscript's Rule 3 explicitly scans for and bans this pattern via the regex\{\[[\w:]+\]:\s*\*\}, making these violations that should have been caught before merge.♻️ Proposed fix
- /** `@type` {{ [x: string]: * }} */ + /** `@type` {{ [x: string]: unknown }} */ const sorted = {}; - const obj = /** `@type` {{ [x: string]: * }} */ (value); + const obj = /** `@type` {{ [x: string]: unknown }} */ (value);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/types/TickReceipt.js` around lines 278 - 289, The JSDoc wildcard annotations inside function sortedReplacer still use the banned pattern ({ [x: string]: * }); update both annotations to a non-wildcard type such as Record<string, unknown> (or { [x: string]: unknown }) and adjust the cast for the obj variable accordingly so the lines declaring sorted and obj use /** `@type` {Record<string, unknown>} */ instead of the wildcard form; ensure sortedReplacer, sorted, and obj references are updated consistently.src/domain/services/DagTopology.js (1)
79-85:⚠️ Potential issue | 🟠 MajorPre-existing:
this._traversalis not guarded before use in theshas.length === 1branch.
traversalon line 79 is a bare type-cast ofthis._traversal, which isundefinedwhen the constructor is called without atraversaloption. The cast suppresses the type warning but does not prevent aTypeErrorat line 82 whentraversal.ancestors(...)is called. The multi-SHA path at line 96 has the same exposure.Since
traversalis documented as optional, a guard (or an early-throw in the constructor whentraversalis needed) would prevent a confusing runtime crash.🛡️ Proposed fix — guard before use
+ if (!this._traversal) { + throw new Error('DagTopology.commonAncestors requires a traversal service'); + } const traversal = /** `@type` {import('./DagTraversal.js').default} */ (this._traversal);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/DagTopology.js` around lines 79 - 85, The code casts this._traversal to traversal and calls traversal.ancestors (and other traversal.* calls in the multi-SHA branch) without ensuring this._traversal is defined, leading to a runtime TypeError; add a guard that checks for this._traversal before use (either throw a clear TypeError in the DagTopology constructor when traversal is required, or check before calling traversal.ancestors/commonAncestor methods and throw a descriptive error like "DagTopology requires a traversal to perform ancestor lookups"), updating the code paths that reference traversal (the shas.length === 1 branch that uses traversal.ancestors and the multi-SHA branch that calls traversal methods) so they never call methods on an undefined this._traversal.src/domain/services/JoinReducer.js (1)
397-399:⚠️ Potential issue | 🟡 MinorRemaining wildcard cast
value: *missed in the receipt-enabledPropSetpath.Line 398 still uses
/**@type{{node: string, key: string, value: *}} */while every otherop.valueannotation in this file (lines 84, 297, 350, 533) was updated tounknownas part of this PR.🔧 Proposed fix
- outcome = propSetOutcome(state.prop, /** `@type` {{node: string, key: string, value: *}} */ (op), eventId); + outcome = propSetOutcome(state.prop, /** `@type` {{node: string, key: string, value: unknown}} */ (op), eventId);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/JoinReducer.js` around lines 397 - 399, The JSDoc cast for the PropSet case still uses a wildcard type for op.value; update the cast in the 'PropSet' branch so op is typed like the other updated annotations (use value: unknown) — specifically edit the cast applied when calling propSetOutcome(state.prop, /** `@type` {{node: string, key: string, value: *}} */ (op), eventId) to use value: unknown so it matches the annotations used elsewhere in this file.src/domain/services/IndexRebuildService.js (1)
269-273:⚠️ Potential issue | 🟡 MinorThe type cast is necessary; improve the source method's return type instead.
StreamingBitmapIndexBuilder.getMemoryStats()declares a generic@returns {Object}with properties documented via@propertytags. This provides no type information to type checkers, making the cast at line 269 load-bearing rather than redundant. Instead of casting at the call site, define a proper typedef in the source method:/** * `@typedef` {Object} MemoryStats * `@property` {number} estimatedBitmapBytes * `@property` {number} estimatedMappingBytes * `@property` {number} totalFlushedBytes * `@property` {number} flushCount * `@property` {number} nodeCount * `@property` {number} bitmapCount */ /** * Returns current memory statistics for monitoring and debugging. * `@returns` {MemoryStats} */ getMemoryStats() { ... }This eliminates the need for the cast in
IndexRebuildService.jsand provides proper type information across the codebase.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` around lines 269 - 273, Add a proper typedef and update the return JSDoc on StreamingBitmapIndexBuilder.getMemoryStats so it returns a strongly-typed MemoryStats object (including estimatedBitmapBytes, estimatedMappingBytes, totalFlushedBytes, flushCount, nodeCount, bitmapCount) and change the method JSDoc `@returns` to {`@returns` {MemoryStats}}; then remove the inline cast in IndexRebuildService.js around builder.getMemoryStats() and rely on the new return type so onProgress receives currentMemoryBytes from stats.estimatedBitmapBytes without a type assertion.src/domain/errors/QueryError.js (1)
33-33:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Line 33 still uses
Object; line 38 was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object with error details + * `@property` {Record<string, unknown>} context - Serializable context object with error details🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/QueryError.js` at line 33, Update the JSDoc for the QueryError class to use the tightened type for the context property: change the `@property` declaration that currently reads "{Object} context" to "{Record<string, unknown>} context" so it matches the tightened type used elsewhere (e.g., the other annotation on line 38) in the QueryError class.src/domain/errors/OperationAbortedError.js (1)
13-13:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Same inconsistency as flagged in
SchemaUnsupportedError— the class-level@property(line 13) still usesObjectwhile the constructor param (line 18) was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/OperationAbortedError.js` at line 13, Update the class JSDoc in OperationAbortedError to match the tightened constructor parameter type: change the class-level `@property` for "context" from {Object} to {Record<string, unknown>} so it aligns with the constructor signature in the OperationAbortedError class; ensure the `@property` name remains "context" and no other wording changes are made.src/domain/errors/IndexError.js (1)
11-11:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Line 11 still uses
Object; line 22 was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/IndexError.js` at line 11, The JSDoc for the IndexError class still declares "@property {Object} context" but the actual type was tightened to "Record<string, unknown>" elsewhere; update the class-level JSDoc in IndexError (the "@property context" entry) to use "{Record<string, unknown>}" so the comment matches the implementation and the tightened type used later for the context property.src/domain/errors/TraversalError.js (1)
11-11:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Same pattern as other error classes — line 11 still uses the loose
Objecttype while line 22 was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/TraversalError.js` at line 11, Update the class JSDoc for TraversalError so the context property uses the tightened type: change the `@property` annotation for "context" from {Object} to {Record<string, unknown>} to match the pattern used elsewhere (e.g., the later annotation at line 22); locate the TraversalError class JSDoc and replace the type token for the "context" property accordingly.src/domain/errors/SchemaUnsupportedError.js (1)
11-11:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated to match the tightened constructor type.The constructor param was updated to
Record<string, unknown>(line 16), but the class-level@property {Object} context(line 11) still uses the looseObjecttype — contradicting the PR's goal. The same pattern applies toOperationAbortedError,TraversalError,QueryError,IndexError, andTrustError.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/SchemaUnsupportedError.js` at line 11, Update the class-level JSDoc `@property` for context to match the tightened constructor type: change the `@property` {Object} context to `@property` {Record<string, unknown>} context in SchemaUnsupportedError (and apply the same change to OperationAbortedError, TraversalError, QueryError, IndexError, TrustError) so the class docblock aligns with the constructor signature that now accepts Record<string, unknown>; locate the class-level JSDoc above each error class (e.g., SchemaUnsupportedError) and replace the type there.src/domain/errors/TrustError.js (1)
19-19:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextdoes not match constructor parameter type.Line 19 has the stale
@property {Object} context, while line 24 correctly declares the parameter asRecord<string, unknown>. The base classWarpErroralready usesRecord<string, unknown>for this property, so the subclass JSDoc should match.📝 Proposed fix
- * `@property` {Object} context - Serializable context object with error details + * `@property` {Record<string, unknown>} context - Serializable context object with error details🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/TrustError.js` at line 19, Update the TrustError JSDoc so its context property type matches the constructor and the base class: change the class-level `@property` for context from {Object} to {Record<string, unknown>} (the same type used in the TrustError constructor and WarpError) so the documented type aligns with the actual constructor parameter and base-class definition.src/domain/warp/query.methods.js (1)
69-89:⚠️ Potential issue | 🟡 MinorLeftover
@type {Record<string, any>}ingetEdgePropsbody contradicts the updated return type.The public return type at line 69 was narrowed to
Record<string, unknown>, but the internalpropsobject at line 88 still carries a@type {Record<string, any>}annotation — a missedanythat the PR set out to eliminate.🛠️ Proposed fix
- /** `@type` {Record<string, any>} */ + /** `@type` {Record<string, unknown>} */ const props = {};🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/query.methods.js` around lines 69 - 89, The local JSDoc type for the internal props object in getEdgeProps is still annotated as `@type` {Record<string, any>} which contradicts the function's public return type; change the annotation on the props variable (in getEdgeProps) from Record<string, any> to Record<string, unknown> (or remove the inline `@type` and let the function return type stand) and ensure any subsequent assignments to props conform to unknown typing (adjust casts or checks if needed), referencing the props variable and getEdgeProps function to locate the change.bin/cli/commands/patch.js (1)
122-141:⚠️ Potential issue | 🟡 Minor
extractNodeIdsreturn typestring[]is unsound given the updated parameter type.With
ops: Array<Record<string, unknown>>,op.node,op.from, andop.toare eachunknown. The untypednew Set()is inferred asSet<unknown>, so[...ids].sort()producesunknown[], notstring[]. The declared return type is therefore incorrect, and any truthy non-string values would silently propagate through.🛠️ Proposed fix — add `typeof` guards before `ids.add()`
function extractNodeIds(ops) { if (!Array.isArray(ops)) { return []; } - const ids = new Set(); + /** `@type` {Set<string>} */ + const ids = new Set(); for (const op of ops) { - if (op.node) { + if (typeof op.node === 'string') { ids.add(op.node); } - if (op.from) { + if (typeof op.from === 'string') { ids.add(op.from); } - if (op.to) { + if (typeof op.to === 'string') { ids.add(op.to); } } return [...ids].sort(); }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/patch.js` around lines 122 - 141, The function extractNodeIds currently treats op.node/op.from/op.to as unknowns and returns string[], so change it to use a Set<string> and only add values after confirming they're strings: initialize ids as new Set<string>() and for each op check e.g. const v = op.node; if (typeof v === 'string') ids.add(v); (and likewise for op.from and op.to) so the returned [...ids].sort() is actually string[] and no non-string values propagate.src/domain/services/HttpSyncServer.js (1)
108-109:⚠️ Potential issue | 🟡 MinorResidual
@param {*}wildcards injsonResponseandisValidSyncRequest
jsonResponse(line 108) andisValidSyncRequest(line 123) still carry@param {*}annotations. These were not touched in this PR, but they are inconsistent with the PR's stated goal of eliminating all wildcard types insrc/.Suggested narrowings:
jsonResponse:@param {unknown} dataisValidSyncRequest:@param {unknown} parsedAlso applies to: 122-125
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/HttpSyncServer.js` around lines 108 - 109, Update the JSDoc param types for jsonResponse and isValidSyncRequest by replacing the wildcard {*} with the narrower unknown type: change the `@param` of jsonResponse to "@param {unknown} data" and change the `@param` of isValidSyncRequest to "@param {unknown} parsed"; ensure the corresponding JSDoc blocks for the functions jsonResponse and isValidSyncRequest in HttpSyncServer.js are updated accordingly so other annotations remain unchanged.
🧹 Nitpick comments (18)
src/domain/services/WormholeService.js (1)
324-324: Redundant double-cast ontypedJson.Both a left-side JSDoc annotation and a right-side JSDoc cast are applied — only one is needed. Either form alone is sufficient for TypeScript's type checker in a
.jsfile.♻️ Proposed simplification
- const /** `@type` {Record<string, unknown>} */ typedJson = /** `@type` {Record<string, unknown>} */ (json); + const typedJson = /** `@type` {Record<string, unknown>} */ (json);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/WormholeService.js` at line 324, The local variable declaration uses a redundant double-cast for typedJson in WormholeService.js; remove one of the annotations so only a single JSDoc type is applied (either the left-side /** `@type` {Record<string, unknown>} */ typedJson or the right-side cast (json) remains). Update the declaration for typedJson to use a single JSDoc/type cast so TypeScript still understands the type but there is no duplicate cast.src/domain/services/DagTopology.js (1)
40-40: Type cast on the default{}is inaccurate and contradicts the PR's accuracy goal.
/**@type{{ indexReader: import('./BitmapIndexReader.js').default }} */ ({})asserts that an empty object satisfies a type with a non-optionalindexReader, which it plainly does not. This is still a lie to the type system, just a narrower one than a wildcard. It also creates an inconsistency: the inline@typeon the parameter itself marksindexReaderas required, yet the default value proves it can be absent at call time.Two cleaner approaches:
Option A — Drop the default entirely (recommended if
indexReaderis always required, which the runtime check on line 41–43 implies):♻️ Proposed refactor — remove the misleading default
- constructor(/** `@type` {{ indexReader: import('./BitmapIndexReader.js').default, logger?: import('../../ports/LoggerPort.js').default, traversal?: import('./DagTraversal.js').default }} */ { indexReader, logger = nullLogger, traversal } = /** `@type` {{ indexReader: import('./BitmapIndexReader.js').default }} */ ({})) { + constructor({ indexReader, logger = nullLogger, traversal }) {The
@paramJSDoc on lines 35–39 already covers the types; the inline@typeon the destructured parameter is entirely redundant.Option B — Keep the default but use an honest optional type:
♻️ Proposed refactor — honest optional type on the default
- constructor(/** `@type` {{ indexReader: import('./BitmapIndexReader.js').default, logger?: import('../../ports/LoggerPort.js').default, traversal?: import('./DagTraversal.js').default }} */ { indexReader, logger = nullLogger, traversal } = /** `@type` {{ indexReader: import('./BitmapIndexReader.js').default }} */ ({})) { + constructor({ indexReader, logger = nullLogger, traversal } = /** `@type` {{ indexReader?: import('./BitmapIndexReader.js').default, logger?: import('../../ports/LoggerPort.js').default, traversal?: import('./DagTraversal.js').default }} */ ({})) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/DagTopology.js` at line 40, The inline type cast on the default empty object in the DagTopology constructor is incorrect; remove the misleading default and redundant inline JSDoc cast so the constructor signature requires the parameter (constructor({ indexReader, logger = nullLogger, traversal }) { ... }) and rely on the existing runtime checks around indexReader (lines validating presence) and the existing JSDoc param block for types; alternatively, if you want to keep a default, change the inline cast to an optional shape (e.g., /** `@type` {{ indexReader?: import('./BitmapIndexReader.js').default }} */) — but prefer removing the default per the PR suggestion so indexReader remains required.test/unit/domain/services/PatchBuilderV2.test.js (1)
611-641: Consider adding a negative test for the race-detection failure path.The
expectedParentShamechanism is the TOCTOU guard for concurrent writers, but only the happy path (SHA matches) is covered. A test wherereadRefreturns a different SHA thanexpectedParentShawould verify that the guard actually rejects/throws as intended rather than silently proceeding with a stale parent.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/PatchBuilderV2.test.js` around lines 611 - 641, Add a negative test for the TOCTOU guard by creating a case using PatchBuilderV2 where persistence.readRef returns a different SHA than the provided expectedParentSha; call builder.commit() and assert it rejects/throws (and that persistence.commitNodeWithTree was not invoked) to verify the race-detection path actually prevents committing with a stale parent. Use the same setup as the existing happy-path test (mock persistence, addNode('x'), etc.) but change the mocked readRef value to differ from expectedParentSha and assert commit() throws and no commitNodeWithTree call was made.src/domain/crdt/ORSet.js (1)
314-318: Internal@type {Array<[any, string[]]>}is inconsistent with the updated public return type.Line 314 now documents the return as
Array<[unknown, string[]]>, but the local variable on line 318 is still cast toArray<[any, string[]]>.anydisables typechecking on the element slot whileunknownenforces narrowing; mixing them undermines the improvement.-/** `@type` {Array<[any, string[]]>} */ +/** `@type` {Array<[unknown, string[]]>} */🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/crdt/ORSet.js` around lines 314 - 318, Update the internal JSDoc type annotation in the orsetSerialize function to match the public return type by replacing the local "@type {Array<[any, string[]]>}" with "@type {Array<[unknown, string[]]>}" so the element slot uses unknown (not any); this keeps the local variable's type consistent with the documented return signature and preserves stricter type narrowing for elements when serializing the Map-to-array conversion in orsetSerialize.src/domain/services/IndexRebuildService.js (2)
392-392: Root-cause fix: widenstoragefield type to eliminate all double-casts.This new intersection cast (
IndexStoragePort & BlobPort) follows the same forced-cast pattern already present at lines 305, 308, and 387, all becausethis.storageis declared only asIndexStoragePort. Widening the constructor'sstorageparameter (and the corresponding field) to the full intersection it must satisfy would remove every cast throughunknownacross the file.♻️ Proposed fix — widen `storage` at declaration
In the constructor JSDoc, change the
storagetype to the full intersection:- * `@param` {import('../../ports/IndexStoragePort.js').default} options.storage - Storage adapter + * `@param` {import('../../ports/IndexStoragePort.js').default & import('../../ports/BlobPort.js').default & import('../../ports/TreePort.js').default} options.storage - Storage adapterThen remove the individual casts in
_persistIndex,load, and_rebuildStreaming:- const oid = await /** `@type` {import('../../ports/BlobPort.js').default} */ (/** `@type` {unknown} */ (this.storage)).writeBlob(buffer); + const oid = await this.storage.writeBlob(buffer);- return await /** `@type` {import('../../ports/TreePort.js').default} */ (/** `@type` {unknown} */ (this.storage)).writeTree(flatEntries); + return await this.storage.writeTree(flatEntries);- const shardOids = await /** `@type` {import('../../ports/TreePort.js').default} */ (/** `@type` {unknown} */ (this.storage)).readTreeOids(treeOid); + const shardOids = await this.storage.readTreeOids(treeOid);- const indexFrontier = await loadIndexFrontier(shardOids, /** `@type` {import('../../ports/IndexStoragePort.js').default & import('../../ports/BlobPort.js').default} */ (this.storage), { codec: this._codec }); + const indexFrontier = await loadIndexFrontier(shardOids, this.storage, { codec: this._codec });🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` at line 392, The storage field is typed too narrowly, causing repeated forced intersection casts; update the constructor JSDoc so the constructor parameter and the this.storage field are declared with the full intersection type (IndexStoragePort & BlobPort) and then remove the ad-hoc casts around this.storage (e.g., in usages inside _persistIndex, load, and _rebuildStreaming such as the loadIndexFrontier(...) call) so the code uses this.storage directly with the correct widened type.
53-53: Consider extracting the inline default-parameter type to a@typedef.The inline
@typeannotation in the default parameter is ~240 characters long and only covers two of the five destructured fields, making the constructor signature nearly unreadable. A named@typedefat the top of the file would be cleaner and reusable.♻️ Proposed refactor
Add above the class:
+/** + * `@typedef` {{ graphService: { iterateNodes: (opts: { ref: string, limit: number }) => AsyncIterable<{ sha: string, parents: string[] }> }, storage: import('../../ports/IndexStoragePort.js').default }} IndexRebuildServiceOptions + */Then simplify the constructor signature:
- constructor({ graphService, storage, logger = nullLogger, codec, crypto } = /** `@type` {{ graphService: { iterateNodes: (opts: { ref: string, limit: number }) => AsyncIterable<{ sha: string, parents: string[] }> }, storage: import('../../ports/IndexStoragePort.js').default }} */ ({})) { + constructor({ graphService, storage, logger = nullLogger, codec, crypto } = /** `@type` {IndexRebuildServiceOptions} */ ({})) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` at line 53, Extract the long inline JSDoc for the constructor default parameter into a named typedef (e.g., IndexRebuildServiceDeps) placed at the top of the file, declaring the five properties: graphService, storage, logger, codec, and crypto (with the appropriate types for graphService.iterateNodes and storage). Then update the IndexRebuildService constructor signature to use the new typedef in its JSDoc (replace the inline `@type`(...) with `@type` {IndexRebuildServiceDeps} or similar) and keep the same destructured defaults (logger = nullLogger) so behavior is unchanged; ensure the typedef name is referenced in the constructor JSDoc and that the class name IndexRebuildService and constructor remain unchanged.scripts/ts-policy-check.js (1)
50-86:check()has no I/O error handling — unreadable files produce an unhandled rejection.
readdir/readFilefailures will crash the process with a raw Node.js stack trace instead of a clear policy-check error message. Wrapping in a top-level try/catch (or adding a.catchat the call site) gives meaningful output in CI.♻️ Proposed fix
-check(); +check().catch((err) => { + console.error(`TS policy check failed unexpectedly: ${err.message}`); + process.exit(2); +});🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/ts-policy-check.js` around lines 50 - 86, The check() function currently has no I/O error handling so readFile/walkJs errors cause unhandled rejections; wrap the body of check() (or at least the for-await file loop and the readFile call) in a try/catch to catch filesystem errors from readFile/walkJs, then convert them into a clear policy-check error (e.g., push a formatted violation or log a descriptive message including err.message and the affected path using the same violations array) and ensure the process exits non-zero or the caller receives the failure; reference function check(), the readFile call and the for-await loop over walkJs so the catch is placed around those operations.src/domain/errors/ForkError.js (1)
24-29:@property {Object} contextnot updated to match the constructor and parent class.Line 24 still documents
contextas{Object}, while the constructor param (line 29) now usesRecord<string, unknown>and the parentWarpErroris annotated the same way.♻️ Proposed fix
-* `@property` {Object} context - Serializable context object with error details +* `@property` {Record<string, unknown>} context - Serializable context object with error details🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/ForkError.js` around lines 24 - 29, Update the JSDoc for the ForkError class so the `@property` type for "context" matches the constructor and parent WarpError (use Record<string, unknown> instead of Object); locate the ForkError class and its JSDoc block and change the context annotation to "@property {Record<string, unknown>} context" so the documentation and the constructor signature (the constructor param in ForkError) are consistent with WarpError.bin/cli/commands/path.js (1)
82-87: Optional: preserveErrorin the intersection cast.After
error instanceof Error, the subsequent cast/**@type{{code?: string}} */ (error)narrows away theErrortype. Using an intersection type retains both:♻️ Proposed refinement
- if (error instanceof Error && /** `@type` {{code?: string}} */ (error).code === 'NODE_NOT_FOUND') { + if (error instanceof Error && /** `@type` {Error & {code?: string}} */ (error).code === 'NODE_NOT_FOUND') {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/path.js` around lines 82 - 87, The catch block narrows away the Error type by casting error to {{code?: string}}; preserve the original Error by using an intersection type when checking the code property. Replace the cast /** `@type` {{code?: string}} */ (error) with an intersection cast that keeps Error (for example /** `@type` {Error & {code?: string}} */ (error) or the TS equivalent) so the instance check (error instanceof Error) and the use of notFoundError(error.message) remain type-safe.bin/cli/commands/check.js (1)
140-140:buildCheckPayload(line 98) retains multiple*wildcard annotations.The return type on
handleCheckwas tightened to{payload: unknown, …}, but thebuildCheckPayloadJSDoc parameter still useshealth: *, checkpoint: *, gcMetrics: *, hook: *|null, status: *|null. This leaves several unresolved wildcards in the same call chain, contrary to the PR's stated goal.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/check.js` at line 140, The JSDoc for buildCheckPayload still uses wildcard "*" types which conflicts with the tightened handleCheck return type; update buildCheckPayload's parameter and return annotations to concrete types (e.g., use unknown for health, checkpoint, gcMetrics and use unknown|null for hook and status) and ensure its `@returns` matches {Promise<{payload: unknown, exitCode: number}>} (or the synchronous equivalent) so the call chain between buildCheckPayload and handleCheck has consistent, non-wildcard typing; adjust annotations on the buildCheckPayload function signature and any related param tags to reference buildCheckPayload, handleCheck, and their payload/exitCode shapes.src/domain/services/BoundaryTransitionRecord.js (1)
395-414:deserializeBTRfield validation doesn't check individual field types before casting toBTR.The required-fields loop (lines 400–404) only verifies field presence (
field in obj), not types. For example,versioncould be a string,h_incould be an array, and the cast on line 406 would still succeed silently. This is pre-existing behavior, but now that you've typed the decoded object asRecord<string, unknown>, the gap is more visible.Consider reusing
validateBTRStructure(which already checksbtr.version !== BTR_VERSION) or adding lightweight type guards before the cast to prevent malformed BTRs from propagating.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BoundaryTransitionRecord.js` around lines 395 - 414, deserializeBTR only checks that REQUIRED_FIELDS exist but doesn't validate their types before casting to BTR, so malformed values can slip through; update deserializeBTR to perform lightweight type checks (or call the existing validateBTRStructure) after decoding and before returning: use the same validation logic as validateBTRStructure (including checking btr.version === BTR_VERSION and validating primitives/arrays for h_in, h_out, U_0, P, t, kappa) or implement small type guards for each field, and ensure you still use codec/defaultCodec as currently done.src/domain/services/CheckpointSerializerV5.js (2)
220-237:valproperty access inelsebranch lacks a type guard.In the
elsebranch (line 232–233),valisunknown(coming from the now-Record<string, unknown>obj), but.lamport,.writerId, etc. are accessed without narrowing. This is pre-existing and works at runtime because CBOR decoded data has the expected shape, but it's inconsistent with the stricter typing introduced elsewhere in this file.A minimal guard like
typeof val === 'object' && val !== nullwould align this path with the PR's goals.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CheckpointSerializerV5.js` around lines 220 - 237, In deserializeEdgeBirthEvent, the else branch accesses properties on val (val.lamport, val.writerId, val.patchSha, val.opIndex) without narrowing its unknown type; add a runtime type guard (e.g. check typeof val === 'object' && val !== null and that expected keys exist/are of correct primitives) before reading those properties in deserializeEdgeBirthEvent, then shallow-copy from the validated object (or skip/handle the entry if validation fails) so you don't access properties on non-objects and remain consistent with the stricter typing.
109-113: Cast tostringis technically unsound for the error diagnostic.
obj.versionpassed the!== undefined && !== 'full-v5'check, so it could be a number, boolean, or object — not necessarily a string. The/**@type{string} */cast is incorrect in that case. The template literal will still calltoString()so runtime behavior is fine, but the cast is misleading.💡 Suggested alternative
- const ver = /** `@type` {string} */ (obj.version); + const ver = String(obj.version);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CheckpointSerializerV5.js` around lines 109 - 113, Remove the incorrect JSDoc string cast on obj.version in the error path of CheckpointSerializerV5 and instead coerce the value to a safe string for the message (e.g., via String(...) or JSON.stringify(...)) so any type (number, boolean, object) is represented reliably; update the throw in the same block that currently checks if (obj.version !== undefined && obj.version !== 'full-v5') to interpolate the coerced value when constructing the Error to replace the /** `@type` {string} */ (obj.version) cast.test/unit/domain/trust/TrustEvaluator.test.js (1)
121-123: Two@param {*}wildcards remain on lines 121–122.These
.mapcallbacks still use/**@param{*} e */, which is inconsistent with the PR's goal of eliminating wildcard types. They could be updated toRecord<string, unknown>(matching lines 171–172) or to the actual explanation shape.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/trust/TrustEvaluator.test.js` around lines 121 - 123, Replace the JSDoc wildcard param in the map callbacks used on a1.trust.explanations and a2.trust.explanations (the callbacks that extract .writerId) so they no longer use /** `@param` {*} e */; change the type to a non-wildcard such as /** `@param` {Record<string, unknown>} e */ or the actual explanation shape used elsewhere in the test (matching the type used on lines 171–172) so both .map callbacks have a concrete param type instead of {*} and remain consistent with the rest of the file.bin/cli/commands/verify-audit.js (1)
55-57: Double cast throughunknowntoCorePersistencemay mask type drift.The
/**@type{unknown} */→/**@type{CorePersistence} */pattern bypasses all structural checks. IfcreatePersistencereturns a type that diverges fromCorePersistencein the future, this cast will silently hide the mismatch.Consider adding an adapter or a narrowing assertion function (e.g.,
assertCorePersistence(persistence)) that validates the required methods exist at runtime, consistent with the duck-typing patterns elsewhere in the PR.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/verify-audit.js` around lines 55 - 57, Replace the unsafe double cast of persistence to CorePersistence by adding a runtime narrowing/assertion or adapter and using that in the AuditVerifierService constructor: implement a function like assertCorePersistence(persistence) (or build an adapter wrapCorePersistence(persistence)) that checks the required methods/properties (duck-typed) and either throws or returns a properly-typed object, then pass the asserted/adapter value to new AuditVerifierService instead of using /** `@type` {unknown} */ casts; reference the persistence variable and the AuditVerifierService constructor call when applying this change.src/domain/trust/TrustStateBuilder.js (1)
50-55: The/**@type{string} */cast makes the??fallback invisible to the type checker.Casting
record.recordIdtostringtells TypeScript it's always a string, so?? '(unknown)'appears unreachable from TS's perspective. At runtime it works correctly, but if the intent is to handle a missingrecordId, a more accurate approach would be:💡 Suggestion
- recordId: /** `@type` {string} */ (record.recordId) ?? '(unknown)', + recordId: typeof record.recordId === 'string' ? record.recordId : '(unknown)',🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/trust/TrustStateBuilder.js` around lines 50 - 55, The inline JSDoc cast /** `@type` {string} */ on record.recordId hides the possibility of null/undefined from the type checker; change to compute a local recordId first (e.g., const recordId = record.recordId ?? '(unknown)') and then use that variable in the errors.push call so the fallback is visible to TypeScript and runtime behavior is preserved (update the code around parsed.success handling in TrustStateBuilder.js where errors.push references record.recordId).src/domain/trust/TrustRecordService.js (1)
262-262: String casts are safe here but rely on upstream schema validation — consider a defensive guard.Both
record.recordTypeandrecord.recordIdare cast tostringvia JSDoc. This is safe because_persistRecordis only called after schema validation (line 55) and recordId integrity checks (line 64) inappendRecord. However, since_persistRecordis aprivatemethod that could theoretically be called from a new code path in the future, you might consider a lightweight assertion or fallback (e.g.,String(record.recordId)).This is a minor observation — the current code is correct for all existing call sites.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/trust/TrustRecordService.js` at line 262, The template literal building the log `message` in TrustRecordService::_persistRecord currently relies on JSDoc casts for record.recordType and record.recordId; add a defensive guard so unexpected types won't break future callers — coerce both values with String(...) or a fallback like (record.recordType ?? 'unknown') and (record.recordId ?? '') before slicing, or assert types at the top of _persistRecord; adjust the `message` construction to use those guarded/coerced values so appendRecord's upstream validation remains sufficient but _persistRecord is resilient to new call paths.src/domain/services/AuditVerifierService.js (1)
319-319: Consider typingprevReceiptasAuditReceipt|nullinstead ofRecord<string, unknown>|null.
prevReceiptis assigned fromreceipt(line 446), which is cast toAuditReceipt. UsingAuditReceipt|nullwould provide stronger type information for_validateChainLinkcallers and is more precise.Suggested change
- /** `@type` {Record<string, unknown>|null} */ let prevReceipt = null; + /** `@type` {AuditReceipt|null} */ let prevReceipt = null;🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/AuditVerifierService.js` at line 319, The declaration for prevReceipt uses a loose type; change its JSDoc from Record<string, unknown>|null to AuditReceipt|null so callers of _validateChainLink get stronger typing—update the JSDoc where prevReceipt is declared and ensure any assignment (e.g., assigning receipt cast as AuditReceipt) remains compatible with the new AuditReceipt|null annotation.
| * | ||
| * @param {ORSet} set - The ORSet | ||
| * @returns {Array<*>} Array of present elements | ||
| * @returns {string[]} Array of present elements |
There was a problem hiding this comment.
🧩 Analysis chain
🏁 Script executed:
# First, let's read the ORSet.js file to see the actual annotations and structure
cat -n src/domain/crdt/ORSet.js | head -200Repository: git-stunts/git-warp
Length of output: 7755
🏁 Script executed:
# Continue reading the file to see the serialize/deserialize functions around lines 310-355
sed -n '300,360p' src/domain/crdt/ORSet.js | cat -nRepository: git-stunts/git-warp
Length of output: 2139
🏁 Script executed:
# Now let's search for all call sites of orsetAdd to see what types of elements are actually passed
rg -n --type=js 'orsetAdd\(' -A 2 -B 2Repository: git-stunts/git-warp
Length of output: 50375
🏁 Script executed:
# Let me check if there are any non-string elements being added. Let me search for what gets passed to orsetAdd more carefully
rg 'orsetAdd\(state\.(nodeAlive|edgeAlive),' -A 1 | head -50Repository: git-stunts/git-warp
Length of output: 4068
🏁 Script executed:
# Let me check what encodeEdgeKey returns
rg 'export.*encodeEdgeKey|function encodeEdgeKey' -A 3Repository: git-stunts/git-warp
Length of output: 700
🏁 Script executed:
# Let me read the exact lines mentioned to verify the inconsistency between line 318 and 314
sed -n '309,325p' src/domain/crdt/ORSet.js | cat -nRepository: git-stunts/git-warp
Length of output: 834
string[] return type on line 171 is too narrow for orsetElements.
The ORSet typedef declares entries: Map<*, Set<string>> (line 94), allowing elements of any type. The orsetElements function returns elements from set.entries.keys(), which are typed as *, so the return type should be unknown[] to match the actual element type:
-* `@returns` {string[]} Array of present elements
+* `@returns` {unknown[]} Array of present elementsAdditionally, there is a type consistency issue in orsetSerialize: line 314's public return type uses Array<[unknown, string[]]>, but line 318's internal variable annotation uses Array<[any, string[]]>. The internal annotation should use unknown instead of any for consistency.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| * @returns {string[]} Array of present elements | |
| * `@returns` {unknown[]} Array of present elements |
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/crdt/ORSet.js` at line 171, Update the JSDoc/type annotations to
use unknown for ORSet element types: change the return type of orsetElements
from string[] to unknown[] to match the ORSet typedef entries: Map<*,
Set<string>> and the values returned by set.entries.keys(); and in
orsetSerialize replace the internal variable annotation Array<[any, string[]]>
with Array<[unknown, string[]]> so its internal type matches the declared public
return type Array<[unknown, string[]]> and remains consistent with ORSet's
element typing.
There was a problem hiding this comment.
False positive — the ORSet typedef at line 94 declares entries: Map<string, Set<string>>, so elements are always strings. The * in the comment's analysis was from an outdated view. string[] is correct.
There was a problem hiding this comment.
Rate Limit Exceeded
@flyingrobots have exceeded the limit for the number of chat messages per hour. Please wait 1 minutes and 45 seconds before sending another message.
Prevents ANY new TypeScript drift from reaching remote. Zero tolerance for new `any` or `*` wildcards — the ratchet locks the count at 117 and blocks any increase. Gate stack (all must pass): - ESLint: no-explicit-any bans the `any` keyword in all type annotations - TypeScript: strict mode compiler check - Policy checker: upgraded with ratchet mechanism + contracts/any-fence.json - Consumer type test: verifies public API surface - Pre-commit: runs ESLint + full policy scan before every commit - Pre-push: all 4 gates in parallel + unit tests - CI: dedicated type-firewall job (configure as required status check) Policy checker zero-tolerance rules: @ts-ignore, z.any(), TODO(ts-cleanup), any in index.d.ts. Ratcheted rules: any/star in JSDoc types (count locked, can only decrease).
…dService port bugs
Eliminates all remaining wildcards and sets the fence to 0:
- Add RoaringBitmapSubset structural typedef replacing 7 any/wildcards
across BitmapIndexBuilder, StreamingBitmapIndexBuilder, BitmapIndexReader
- Fix TrustRecordService: writeTree now uses mktree-format string[],
createCommit replaced with commitNodeWithTree (matching CommitPort)
- Wrap bitmap.serialize() in Buffer.from() for Uint8Array portability
- Type TrustRecordService constructor with CommitPort & BlobPort &
TreePort & RefPort intersection (was bare {*})
- Ratchet any-fence.json from 9 → 0
There was a problem hiding this comment.
Actionable comments posted: 12
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (6)
src/domain/services/SyncProtocol.js (2)
307-316:⚠️ Potential issue | 🟡 Minor
SyncResponsetypedef also usesObjectforpatch— same loose type.Line 311 declares
patch: Objectin theSyncResponsetypedef. For consistency with the newDecodedPatchtype and the PR's wildcard-elimination goal, this should be narrowed.📝 Proposed fix
- * `@property` {Array<{writerId: string, sha: string, patch: Object}>} patches - Patches + * `@property` {Array<{writerId: string, sha: string, patch: DecodedPatch}>} patches - Patches🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 307 - 316, The SyncResponse typedef currently uses a loose Object for patch; update the typedef for SyncResponse (in SyncProtocol.js) to reference the precise DecodedPatch type instead of Object — i.e. change the patches entry from `{writerId: string, sha: string, patch: Object}` to use the existing DecodedPatch typedef (e.g. `patch: DecodedPatch`) so the patch shape is strictly typed and consistent with the new wildcard-elimination types.
147-150:⚠️ Potential issue | 🟡 MinorReturn type still uses
Objectforpatch— should beDecodedPatch.
loadPatchRangedelegates toloadPatchFromCommitwhich now returnsPromise<DecodedPatch>, but the JSDoc on line 147 still declares{patch: Object, sha: string}. This contradicts the PR's goal of eliminating loose types.📝 Proposed fix
- * `@returns` {Promise<Array<{patch: Object, sha: string}>>} Array of patch objects in + * `@returns` {Promise<Array<{patch: DecodedPatch, sha: string}>>} Array of patch objects in🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 147 - 150, Update the JSDoc for loadPatchRange to use the concrete DecodedPatch type instead of a loose Object: change the return annotation from Promise<Array<{patch: Object, sha: string}>> to Promise<Array<{patch: DecodedPatch, sha: string}>> and ensure DecodedPatch is available in this file (either via a local typedef or by referencing the existing typedef/import used by loadPatchFromCommit). Also update any inline param/variable JSDoc in loadPatchRange that references the patch type to DecodedPatch so types remain consistent with loadPatchFromCommit.test/unit/domain/properties/Join.property.test.js (1)
12-17:⚠️ Potential issue | 🟡 Minor
@type {any}aliases suppress all downstream type checking on these functions.The three
@type {any}casts are intentional escape hatches, and the PR's stated metrics explicitly targetsrc/bin(nottest/unit/), so these are out of scope for the current wave. Worth noting for a future cleanup pass: these aliases disable type checking on every call throughjoinStates,reduceV5, andcomputeStateHashV5within this file. Narrower suppression (e.g. a single//@ts-ignore`` at a specific call site, or using typed overloads for the test helpers) would preserve checking elsewhere.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/properties/Join.property.test.js` around lines 12 - 17, The file-wide JSDoc casts using /** `@type` {any} */ for joinStates, reduceV5, and computeStateHashV5 (aliases _joinStates, _reduceV5, _computeStateHashV5) disables all downstream type checking; replace the broad casts with narrower suppression or proper typings: remove the /** `@type` {any} */ aliases and either add targeted // `@ts-ignore` comments at individual test call sites where type mismatches are expected, or declare precise JSDoc/TypeScript types/overloads for the test helper functions (joinStates, reduceV5, computeStateHashV5) so other calls in the file retain full type checking.src/domain/services/BoundaryTransitionRecord.js (1)
396-415:⚠️ Potential issue | 🟡 Minor
deserializeBTRskips the BTR version check thatvalidateBTRStructureperforms.
validateBTRStructure(lines 206–208) rejects records whererec.version !== BTR_VERSION.deserializeBTRonly validates field presence (lines 401–405) and then casts toBTR, so a future-version or version-corrupted record is deserialized without a version gate. Callers that bypassverifyBTR(e.g., inspection tooling or migration paths) would silently receive an unsupported-versionBTR.The simplest fix is to reuse
validateBTRStructurehere rather than duplicating the presence loop:🐛 Proposed fix — reuse `validateBTRStructure`
export function deserializeBTR(bytes, { codec } = {}) { const c = codec || defaultCodec; const obj = /** `@type` {Record<string, unknown>} */ (c.decode(bytes)); - // Validate structure (reuse module-level constant for consistency with validateBTRStructure) - for (const field of REQUIRED_FIELDS) { - if (!(field in obj)) { - throw new Error(`Invalid BTR: missing field ${field}`); - } - } + const structureError = validateBTRStructure(obj); + if (structureError) { + throw new Error(`Invalid BTR: ${structureError}`); + } return /** `@type` {BTR} */ ({🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BoundaryTransitionRecord.js` around lines 396 - 415, deserializeBTR currently only checks REQUIRED_FIELDS and returns a BTR even if obj.version differs from BTR_VERSION; replace the manual presence loop with a call to validateBTRStructure(obj) (the same function used by validateBTRStructure which enforces rec.version === BTR_VERSION) before casting/returning the BTR so that unsupported or corrupted versions are rejected; keep using codec/defaultCodec and then call validateBTRStructure(obj) and only after it returns proceed to build and return the typed BTR object in deserializeBTR.src/domain/services/PatchBuilderV2.js (1)
74-89:⚠️ Potential issue | 🟡 Minor
@paramtype foroptions.persistenceis narrower than the field intersection it's cast toThe
@paramat line 74 declaresoptions.persistenceasGraphPersistencePort.defaultonly, but line 89 asserts the field as the full 5-way intersection. TypeScript validates call sites against the@paramtype, not the field type — so a caller passing an object that only satisfiesGraphPersistencePort.default(missingreadRef,writeBlob,writeTree,commitNodeWithTree,updateRef,showNode,readBlob) will not get a type error, yetcommit()will fail at runtime when it tries to use those methods.🛠️ Proposed fix
- * `@param` {import('../../ports/GraphPersistencePort.js').default} options.persistence - Git adapter - * (uses CommitPort + RefPort + BlobPort + TreePort methods) + * `@param` {import('../../ports/GraphPersistencePort.js').default & import('../../ports/RefPort.js').default & import('../../ports/CommitPort.js').default & import('../../ports/BlobPort.js').default & import('../../ports/TreePort.js').default} options.persistence - Git adapter + * (CommitPort + RefPort + BlobPort + TreePort methods must all be present)🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/PatchBuilderV2.js` around lines 74 - 89, The JSDoc for the constructor parameter options.persistence is too narrow (only GraphPersistencePort) while the class stores it as a 5-way intersection (GraphPersistencePort & RefPort & CommitPort & BlobPort & TreePort) and methods like commit() rely on those extra ports; update the `@param` annotation for options.persistence to the full intersection type used in the assignment (GraphPersistencePort & RefPort & CommitPort & BlobPort & TreePort) so callers are type-checked correctly against all required methods (see constructor and this._persistence).src/domain/trust/TrustRecordService.js (1)
118-134:⚠️ Potential issue | 🟡 MinorMissing guard for absent
record.cborkey inreadRecords— inconsistent with_readTip.In
_readTip(lines 236–239) there is an explicit null guard before callingreadBlob:const blobOid = entries['record.cbor']; if (!blobOid) { return { tipSha, recordId: null }; }
readRecordshas no equivalent guard. If a tree ever lacksrecord.cbor(data corruption, schema migration, unexpected tree format),readBlob(undefined)is called, producing a confusing "Blob not found: undefined" error with no indication of which commit or ref is involved.🛡️ Proposed fix — add a guard and a descriptive error
while (current) { const info = await this._persistence.getNodeInfo(current); + const treeEntries = await this._persistence.readTreeOids( + await this._persistence.getCommitTree(current), + ); + const blobOid = treeEntries['record.cbor']; + if (!blobOid) { + throw new TrustError( + `Trust record tree at commit ${current} is missing record.cbor entry`, + { code: 'E_TRUST_RECORD_CORRUPT' }, + ); + } const record = /** `@type` {Record<string, unknown>} */ (this._codec.decode( - await this._persistence.readBlob( - (await this._persistence.readTreeOids( - await this._persistence.getCommitTree(current), - ))['record.cbor'], - ), + await this._persistence.readBlob(blobOid), ));🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/trust/TrustRecordService.js` around lines 118 - 134, The loop in readRecords lacks the null guard present in _readTip and currently calls readBlob with undefined when entries['record.cbor'] is missing; update readRecords to first extract const blobOid = (await this._persistence.readTreeOids(await this._persistence.getCommitTree(current)))['record.cbor'], check if (!blobOid) and either continue/skip this commit or throw a descriptive error that includes the current commit SHA (the local variable current) and the recordId/ref context, and only call this._persistence.readBlob(blobOid) when blobOid is present; mirror the null-guard pattern from _readTip to ensure errors identify which commit/ref is missing record.cbor.
|
|
||
| ▐▛███▜▌ Claude Code v2.1.42 | ||
| ▝▜█████▛▘ Sonnet 4.5 · Claude Max | ||
| ▘▘ ▝▝ ~/git/git-stunts/git-warp | ||
|
|
||
| ❯ /clear | ||
| ⎿ (no content) | ||
|
|
||
| ❯ We just pushed this branch and opened its code review. Be your own | ||
| critic: give yourself a DEEPLY STRICT and PEDANTIC and HARD AS FUCK code | ||
| review by diffing against origin/main. Pretend to be Code Rabbit. Give me | ||
| teh overview and then drop your comments and all that. Then let's fix | ||
| the issues you self-discover before we move on to addressing the REAL | ||
| code rabbit's comments. IGNORE THEM FOR NOW! | ||
|
|
||
| ⏺ Let me get the full picture of what's changed on this branch. | ||
|
|
||
| ⏺ Bash(git log --oneline origin/main..HEAD) | ||
| ⎿ 0a4d13e feat(types): M8 IRONCLAD Tiers 3-4 — eliminate all remaining w | ||
| ildcard casts | ||
| 631ef96 feat(types): M8 IRONCLAD Tiers 1-2 — eliminate ~20 wildcard ca | ||
| … +3 lines (ctrl+o to expand) | ||
|
|
||
| ⏺ Bash(git diff --stat origin/main..HEAD) | ||
| ⎿ .github/workflows/ci.yml | 2 + | ||
| CHANGELOG.md | 9 + | ||
| GRAVEYARD.md | 16 + | ||
| … +71 lines (ctrl+o to expand) | ||
|
|
||
| ⏺ 83 files, ~1864 insertions. Let me get the full diff and review it thoroughly. | ||
|
|
||
| ⏺ Bash(git diff origin/main..HEAD -- '*.js' '*.ts' '*.json' '*.yml' | head | ||
| -8000) | ||
| ⎿ diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml | ||
| index 92611c8..30d4d18 100644 | ||
| --- a/.github/workflows/ci.yml | ||
| … +2678 lines (ctrl+o to expand) | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ Good, the full diff is ~4158 lines. Let me read it in chunks along with the | ||
| new files. | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ Now let me also check the new files more closely — especially the ones with | ||
| remaining any/* types and the type surface contract. | ||
|
|
||
| ⏺ Bash(git diff origin/main..HEAD -- '*.md' | head -500) | ||
| ⎿ diff --git a/CHANGELOG.md b/CHANGELOG.md | ||
| index b2022be..2c32e1f 100644 | ||
| --- a/CHANGELOG.md | ||
| … +350 lines (ctrl+o to expand) | ||
|
|
||
| ⏺ Searched for 10 patterns (ctrl+o to expand) | ||
|
|
||
| ⏺ Good — zero @type {*}, zero TODO(ts-cleanup), zero @type {any} in src/. Let me | ||
| check bin/ too and look at the remaining any in the new .d.ts. | ||
|
|
||
| ⏺ Searched for 4 patterns (ctrl+o to expand) | ||
|
|
||
| ⏺ Now let me check a few specific things that jumped out. | ||
|
|
||
| ⏺ Searched for 5 patterns (ctrl+o to expand) | ||
|
|
||
| ⏺ Bash(npm run lint 2>&1 | tail -30) | ||
| ⎿ > @git-stunts/git-warp@11.3.0 lint | ||
| > eslint . | ||
| ⎿ (timeout 2m) | ||
|
|
||
| ⏺ Good, lint passes. Let me check typecheck too. | ||
|
|
||
| ⏺ Bash(npm run typecheck 2>&1 | tail -30) | ||
| ⎿ > @git-stunts/git-warp@11.3.0 typecheck | ||
| > tsc --noEmit | ||
| ⎿ (timeout 2m) | ||
|
|
||
| ⏺ Bash(npm run typecheck:consumer 2>&1 | tail -30) | ||
| ⎿ > @git-stunts/git-warp@11.3.0 typecheck:consumer | ||
| > tsc --noEmit -p test/type-check/tsconfig.json | ||
| ⎿ (timeout 2m) | ||
|
|
||
| ⏺ All clean. Let me do a quick test run. | ||
|
|
||
| ⏺ Bash(npm run test:local 2>&1 | tail -20) | ||
| ⎿ ✓ WarpGraph.subscribe() (PL/SUB/1) > edge cases > handles unsubscri | ||
| be called during handler execution 877ms | ||
| ✓ WarpGraph.subscribe() with replay option (PL/SUB/2) > replay: tru | ||
| … +26 lines (ctrl+o to expand) | ||
| ⎿ (timeout 5m) | ||
|
|
||
| ⏺ All 3799 tests pass. Now I have everything I need. Here's the review. | ||
|
|
||
| --- | ||
| Code Review: repo-clean — M8 IRONCLAD Tiers 1-4 | ||
|
|
||
| PR: 3 commits, 83 files, +1864 / -487 | ||
| Scope: Eliminate ~107 wildcard casts across domain/infra/CLI, add type surface | ||
| contract, consumer type test, HttpSyncServer Zod validation | ||
|
|
||
| Summary | ||
|
|
||
| This PR systematically replaces @type {*} / @type {any} casts with proper | ||
| types across the codebase. It also introduces src/globals.d.ts for Deno/Bun | ||
| ambient types, new persistence type aliases in WarpPersistence.js, error | ||
| narrowing helpers in WarpErrors.js, Zod schema validation for HttpSyncServer, | ||
| a 441-line type surface JSON contract, a TypeScript consumer smoke test, and a | ||
| reorganized ROADMAP.md. | ||
|
|
||
| Lint, typecheck, typecheck:consumer, and all 3799 unit tests pass. Zero @type | ||
| {*}, @type {any}, or TODO(ts-cleanup) remaining in src/. | ||
|
|
||
| --- | ||
| Nitpicks & Issues | ||
|
|
||
| 1. PersistenceReader === PersistenceWriter (Bug-level) | ||
|
|
||
| File: src/domain/types/WarpPersistence.js:17-22 | ||
|
|
||
| PersistenceReader and PersistenceWriter are defined as identical types: | ||
|
|
||
| /** @typedef {CommitPort & BlobPort & TreePort & RefPort} PersistenceReader */ | ||
| /** @typedef {CommitPort & BlobPort & TreePort & RefPort} PersistenceWriter */ | ||
|
|
||
| Both are CommitPort & BlobPort & TreePort & RefPort. The names imply a | ||
| read/write distinction that doesn't exist. This is misleading — someone reads | ||
| "PersistenceReader" and assumes it excludes write operations. Either make them | ||
| actually different (reader = only read methods) or collapse them into a | ||
| single type. Also, CheckpointPersistence is identical too, making 3 aliases | ||
| for the same type. | ||
|
|
||
| 2. z.any() x4 in HttpSyncServer Zod schema | ||
|
|
||
| File: src/domain/services/HttpSyncServer.js:27-37 | ||
|
|
||
| crypto: z.any().optional(), | ||
| logger: z.any().optional(), | ||
| httpPort: z.any().refine(...) | ||
| graph: z.any().refine(...) | ||
|
|
||
| Four uses of z.any() in a PR whose purpose is eliminating wildcards. httpPort | ||
| and graph should at least be z.object({}) or z.custom() — currently they'd | ||
| accept a string, number, or array and pass the .refine() check. crypto and | ||
| logger can be z.custom() too. Using z.any() here defeats the point of adding | ||
| Zod validation. | ||
|
|
||
| 3. Behavioral change: HttpSyncServer path normalization removed | ||
|
|
||
| File: src/domain/services/HttpSyncServer.js:43 + test changes | ||
|
|
||
| Previously: path: 'custom' → auto-normalized to /custom. | ||
| Now: .startsWith('/') validation → throws on 'custom'. | ||
|
|
||
| This is a breaking behavioral change buried in a "type cleanup" PR. The old | ||
| test normalizes path without leading slash was deleted and replaced with | ||
| rejects path without leading slash. If any downstream code passes a path | ||
| without /, it breaks. Should be called out in CHANGELOG or be behind a | ||
| deprecation warning. | ||
|
|
||
| 4. Record<string, *> still scattered in trust/ modules | ||
|
|
||
| File: src/domain/trust/*.js — 20 instances of Record<string, *> remain | ||
|
|
||
| The PR claims zero wildcard casts in scope, but Record<string, *> IS a | ||
| wildcard cast — it's JSDoc for Record<string, any>. These should be | ||
| Record<string, unknown> for consistency with the rest of the PR. The trust | ||
| modules appear to have been scoped out, but the changelog says "zero | ||
| TODO(ts-cleanup) tags remain in Wave 3 scope" — the scope boundary should be | ||
| explicit. | ||
|
|
||
| 5. vvSerialize removed from import but behavior changed | ||
|
|
||
| File: src/domain/services/PatchBuilderV2.js:16,546 | ||
|
|
||
| The import of vvSerialize was removed (line 16), and the call in commit() line | ||
| 546 changed from: | ||
| context: vvSerialize(this._vv), | ||
| to: | ||
| context: /** @type {VersionVector} */ (/** @type {unknown} */ (this._vv)), | ||
|
|
||
| This replaces an actual function call (serialization step) with a type-only | ||
| cast. If vvSerialize was doing real work (e.g., converting Map to plain | ||
| object), this is a behavioral regression. If it was a no-op, the removal is | ||
| fine — but it warrants a comment or test verification. | ||
|
|
||
| 6. Overly long inline type casts (readability) | ||
|
|
||
| Files: Multiple — check.js:14, doctor/checks.js:54, shared.js:115, | ||
| LogicalTraversal.js:155, ObserverView.js:129 | ||
|
|
||
| Examples: | ||
| const healthService = new HealthCheckService({ persistence: /** @type | ||
| {import('../../../src/domain/types/WarpPersistence.js').CheckpointPersistence} | ||
| */ (/** @type {unknown} */ (persistence)), clock }); | ||
|
|
||
| const materialized = await /** @type {{ _materializeGraph: () => | ||
| Promise<{state: import('./JoinReducer.js').WarpStateV5, stateHash: string, | ||
| adjacency: {outgoing: Map<string, Array<{neighborId: string, label: string}>>, | ||
| incoming: Map<string, Array<{neighborId: string, label: string}>>}}> }} */ | ||
| (this._graph)._materializeGraph(); | ||
|
|
||
| These are 200+ character single-line type casts. They're technically correct | ||
| but destroy readability. Consider extracting a typedef or using intermediate | ||
| variables with type annotations. | ||
|
|
||
| 7. onCommitSuccess callback type in patch.methods.js is byzantine | ||
|
|
||
| File: src/domain/warp/patch.methods.js:270 | ||
|
|
||
| onCommitSuccess: /** @type {(result: {patch: Object, sha: string}) => void} */ | ||
| (/** @type {unknown} */ ((/** @type {{patch?: | ||
| import('../types/WarpTypesV2.js').PatchV2, sha?: string}} */ opts) => | ||
| this._onPatchCommitted(resolvedWriterId, opts))), | ||
|
|
||
| This is a triple-nested cast of a lambda. Three @type annotations, two unknown | ||
| hops, and an inline arrow function parameter annotation. This needs a named | ||
| type or intermediate variable. | ||
|
|
||
| 8. return await in StreamingBitmapIndexBuilder | ||
|
|
||
| File: src/domain/services/StreamingBitmapIndexBuilder.js:633 | ||
|
|
||
| - return /** @type {any} */ (this.storage).writeBlob(serialized); | ||
| + return await this.storage.writeBlob(serialized); | ||
|
|
||
| Adding await before return in an async function. Functionally equivalent but | ||
| adds an extra microtask hop. Minor — but this is a change in behavior, not | ||
| just a type fix. The original was returning the promise directly. | ||
|
|
||
| 9. src/globals.d.ts declares ambient Deno namespace without declare global | ||
|
|
||
| File: src/globals.d.ts | ||
|
|
||
| The file uses bare declare namespace Deno and declare namespace Bun. This | ||
| works because the file has no imports/exports (making it a script file), but | ||
| it's fragile. If someone adds an import to this file later, it becomes a | ||
| module and the ambient declarations silently stop working. Consider adding an | ||
| explicit declare global { } wrapper for safety. | ||
|
|
||
| 10. contracts/type-surface.m8.json — massive file, no validation | ||
|
|
||
| File: contracts/type-surface.m8.json (441 lines) | ||
|
|
||
| This is a hand-maintained JSON contract with no automated verification that it | ||
| actually matches index.d.ts. If someone adds a method to WarpGraph but | ||
| forgets this file, it drifts. The file's existence implies authority, but | ||
| there's no CI enforcing it. Either add validation or add a comment making | ||
| clear this is aspirational/documentation, not enforced. | ||
|
|
||
| 11. HttpSyncServer test error message assertion weakened | ||
|
|
||
| File: test/unit/domain/services/HttpSyncServer.auth.test.js:340 | ||
|
|
||
| - }))).toThrow(/invalid auth\.mode/i); | ||
| + }))).toThrow(/HttpSyncServer config/); | ||
|
|
||
| The old test checked for a specific, descriptive error message. The new one | ||
| checks for a generic prefix. This reduces the diagnostic value of the | ||
| assertion — any Zod schema violation will match HttpSyncServer config. | ||
|
|
||
| 12. serve() JSDoc marks port and httpPort as optional but body requires them | ||
|
|
||
| File: src/domain/warp/sync.methods.js:518-533 | ||
|
|
||
| The JSDoc changed @param {number} options.port to @param {number} | ||
| [options.port] (optional), but the function body still does if (typeof port | ||
| !== 'number') { throw ... }. The JSDoc says "optional", the runtime says | ||
| "required". The HttpSyncServer Zod schema has no default for port, so this | ||
| will still throw. JSDoc should not be loosened just to avoid type errors when | ||
| the runtime behavior is unchanged. | ||
|
|
||
| 13. Constructor JSDoc parameters loosened unnecessarily | ||
|
|
||
| Files: CommitDagTraversalService.js, DagPathFinding.js, DagTopology.js, | ||
| DagTraversal.js, IndexRebuildService.js, SyncAuthService.js | ||
|
|
||
| These files changed required parameters to optional in JSDoc (e.g., @param | ||
| {Object} options → @param {Object} [options]) just to make = {} default | ||
| destructuring typecheck without cast. But the constructors immediately throw | ||
| if the required params are missing. The JSDoc now lies — it says the parameter | ||
| is optional when it isn't. | ||
|
|
||
| 14. TickReceipt.js still has Record<string, *> | ||
|
|
||
| File: src/domain/types/TickReceipt.js:70 | ||
|
|
||
| const entry = /** @type {Record<string, *>} */ (op); | ||
|
|
||
| This is inside src/ and was not cleaned up. Missed in the sweep. | ||
|
|
||
| 15. Missing InMemoryGraphAdapter in consumer test | ||
|
|
||
| File: test/type-check/consumer.ts | ||
|
|
||
| InMemoryGraphAdapter is imported but never used beyond the import. The PR adds | ||
| 8 new methods to its index.d.ts interface — none of them are exercised by the | ||
| consumer test. The consumer test validates ~40 methods on WarpGraph but skips | ||
| the adapter entirely. | ||
|
|
||
| --- | ||
| Praise | ||
|
|
||
| - Zero @type {*} / @type {any} in src/ is a legit achievement across ~40 files | ||
| - Consumer type test with @ts-expect-error negative cases is excellent | ||
| practice | ||
| - The GRAVEYARD.md + ROADMAP reorganization is clean and well-reasoned | ||
| - Error narrowing pattern (err instanceof Error ? err.message : String(err)) | ||
| is consistently applied | ||
| - All 3799 tests pass, lint clean, typecheck clean | ||
|
|
||
| --- | ||
| Issues to fix before merge | ||
|
|
||
| #: 1 | ||
| Severity: Medium | ||
| Issue: PersistenceReader === PersistenceWriter === CheckpointPersistence — | ||
| three identical types | ||
| ──────────────────────────────────────── | ||
| #: 5 | ||
| Severity: Medium | ||
| Issue: vvSerialize removed — verify this isn't a behavioral change | ||
| ──────────────────────────────────────── | ||
| #: 3 | ||
| Severity: Medium | ||
| Issue: Path normalization removal is a breaking change, not documented as such | ||
| ──────────────────────────────────────── | ||
| #: 2 | ||
| Severity: Low | ||
| Issue: 4x z.any() in a "kill all wildcards" PR | ||
| ──────────────────────────────────────── | ||
| #: 4 | ||
| Severity: Low | ||
| Issue: 20+ Record<string, *> in trust modules (scope clarification needed) | ||
| ──────────────────────────────────────── | ||
| #: 13 | ||
| Severity: Low | ||
| Issue: JSDoc says optional, runtime says required, in 6 constructors | ||
| ──────────────────────────────────────── | ||
| #: 12 | ||
| Severity: Low | ||
| Issue: serve() JSDoc loosened but runtime unchanged | ||
| ──────────────────────────────────────── | ||
| #: 14 | ||
| Severity: Low | ||
| Issue: TickReceipt.js:70 has Record<string, *> inside src/ | ||
|
|
||
| Want me to start fixing these? | ||
|
|
||
| ✻ Sautéed for 3m 29s | ||
|
|
||
| ❯ for sure... i thought we had somem kind of CI/CD ZERO TOLERANCE for this | ||
| stuff in place already? | ||
|
|
||
| ⏺ Let me check what policy enforcement exists. | ||
|
|
||
| ⏺ Searched for 1 pattern, read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ There it is. The policy check only enforces Rule 2: "every wildcard cast must | ||
| have a TODO(ts-cleanup) tag." It doesn't ban wildcards outright — it just | ||
| requires a TODO tag on them. So the old code was "compliant" because every | ||
| @type {*} had its TODO(ts-cleanup) tag. Now that this PR removed the tags AND | ||
| the casts, the policy passes trivially — but it also doesn't catch | ||
| Record<string, *> at all because the regex only matches @type {*} and @type | ||
| {any}, not Record<string, *>. | ||
|
|
||
| And there's no policy for z.any() either. | ||
|
|
||
| Let me fix everything. I'll upgrade the policy check and fix all the code | ||
| issues. | ||
|
|
||
| ⏺ Bash(npm run typecheck:policy 2>&1) | ||
| ⎿ > @git-stunts/git-warp@11.3.0 typecheck:policy | ||
| > node scripts/ts-policy-check.js | ||
|
|
||
| TS policy check passed. | ||
|
|
||
| ⏺ Searched for 1 pattern (ctrl+o to expand) | ||
|
|
||
| ⏺ Yeah — the policy script catches @type {*} and @type {any} but not | ||
| Record<string, *>, not Array<*>, and not z.any(). It's full of holes. Let me | ||
| verify the behavioral issue first, then fix everything in parallel. | ||
|
|
||
| ⏺ Explore(Verify vvSerialize removal safety) | ||
| ⎿ Done (9 tool uses · 65.6k tokens · 18s) | ||
| (ctrl+o to expand) | ||
|
|
||
| ⏺ Behavioral regression confirmed. Let me verify the details and scope the full | ||
| fix list. | ||
|
|
||
| ⏺ Searched for 3 patterns, read 3 files (ctrl+o to expand) | ||
|
|
||
| ⏺ There's the bug. VersionVector in WarpTypesV2.js is Object.<string, number> (a | ||
| plain object), but this._vv is a Map<string, number> from the CRDT module. | ||
| The old code called vvSerialize() to convert Map → Object. The new code skips | ||
| that conversion. | ||
|
|
||
| The context field on the patch will now be a Map instead of a plain Object. | ||
| When CBOR-encoded this might survive because cbor-x handles Maps — but | ||
| downstream code that does Object.entries(patch.context) will break. | ||
|
|
||
| This is a real bug. Let me now fix everything. | ||
|
|
||
| ⏺ Fix #1: Restore vvSerialize — this is the critical behavioral bug. | ||
|
|
||
| ⏺ Update(src/domain/services/PatchBuilderV2.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 13 | ||
| 14 import defaultCodec from '../utils/defaultCodec.js'; | ||
| 15 import nullLogger from '../utils/nullLogger.js'; | ||
| 16 -import { vvIncrement, vvClone } from '../crdt/VersionVector.js' | ||
| -; | ||
| 16 +import { vvIncrement, vvClone, vvSerialize } from '../crdt/Vers | ||
| +ionVector.js'; | ||
| 17 import { orsetGetDots, orsetContains, orsetElements } from '../ | ||
| crdt/ORSet.js'; | ||
| 18 import { | ||
| 19 createNodeAddV2, | ||
|
|
||
| ⏺ Update(src/domain/services/PatchBuilderV2.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 454 schema, | ||
| 455 writer: this._writerId, | ||
| 456 lamport: this._lamport, | ||
| 457 - context: /** @type {import('../types/WarpTypesV2.js').Ve | ||
| -rsionVector} */ (/** @type {unknown} */ (this._vv)), | ||
| 457 + context: vvSerialize(this._vv), | ||
| 458 ops: this._ops, | ||
| 459 reads: [...this._reads].sort(), | ||
| 460 writes: [...this._writes].sort(), | ||
|
|
||
| ⏺ Update(src/domain/services/PatchBuilderV2.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 546 schema, | ||
| 547 writer: this._writerId, | ||
| 548 lamport, | ||
| 549 - context: /** @type {import('../types/WarpTypesV2.js').Ve | ||
| -rsionVector} */ (/** @type {unknown} */ (this._vv)), | ||
| 549 + context: vvSerialize(this._vv), | ||
| 550 ops: this._ops, | ||
| 551 reads: [...this._reads].sort(), | ||
| 552 writes: [...this._writes].sort(), | ||
|
|
||
| ⏺ Now let me fix all the remaining issues in parallel. I'll dispatch the | ||
| mechanical replacement and the policy check, then handle the targeted fixes | ||
| myself. | ||
|
|
||
| ⏺ Task(Replace Record<string, *> wildcards) | ||
| ⎿ Backgrounded agent (shift+↑ to manage · ctrl+o to expand) | ||
|
|
||
| ⏺ Now let me fix the other issues while that runs. | ||
|
|
||
| ⏺ Fix #4: Collapse the identical persistence types. | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ PersistenceReader, PersistenceWriter, and CheckpointPersistence are identical. | ||
| Let me check what each is actually used for to see if we can differentiate or | ||
| should collapse. | ||
|
|
||
| ⏺ Searched for 1 pattern (ctrl+o to expand) | ||
|
|
||
| ⏺ The roles are correct semantically — sync and materialize use "Reader", patch | ||
| uses "Writer", checkpoint uses "Checkpoint". The problem is they're all the | ||
| same intersection type. Since these ports don't actually separate read/write | ||
| methods (each port has both), the distinction is conceptual documentation | ||
| only. The honest fix: collapse PersistenceReader / PersistenceWriter / | ||
| CheckpointPersistence into one type since they're identical, and document that | ||
| read/write separation is aspirational. | ||
|
|
||
| ⏺ Write(src/domain/types/WarpPersistence.js) | ||
| ⎿ Added 11 lines, removed 12 lines | ||
| 4 * Instead of casting to `any` when accessing persistence metho | ||
| ds, | ||
| 5 * use these narrow types to document which port methods are ac | ||
| tually needed. | ||
| 6 * | ||
| 7 + * NOTE: CommitPort, BlobPort, TreePort, and RefPort each conta | ||
| +in both | ||
| 8 + * read and write methods. True read/write separation would req | ||
| +uire | ||
| 9 + * splitting each port, which is deferred. For now, the role-na | ||
| +med | ||
| 10 + * aliases below are identical — they exist to document *intent | ||
| +* at | ||
| 11 + * each call site, not to enforce access restrictions. | ||
| 12 + * | ||
| 13 * @module domain/types/WarpPersistence | ||
| 14 */ | ||
| 15 | ||
| ... | ||
| 19 */ | ||
| 20 | ||
| 21 /** | ||
| 16 - * Read-side persistence — commit reads, blob reads, tree reads | ||
| -, ref reads. | ||
| 17 - * @typedef {import('../../ports/CommitPort.js').default & impo | ||
| -rt('../../ports/BlobPort.js').default & import('../../ports/Tre | ||
| -ePort.js').default & import('../../ports/RefPort.js').default} | ||
| -PersistenceReader | ||
| 22 + * Commit + blob + tree + ref (no config). | ||
| 23 + * Used by sync readers, checkpoint creators, patch writers, an | ||
| +d | ||
| 24 + * materialize paths. Identical to CheckpointPersistence by des | ||
| +ign | ||
| 25 + * (see module-level note). | ||
| 26 + * @typedef {import('../../ports/CommitPort.js').default & impo | ||
| +rt('../../ports/BlobPort.js').default & import('../../ports/Tre | ||
| +ePort.js').default & import('../../ports/RefPort.js').default} | ||
| +CorePersistence | ||
| 27 */ | ||
| 28 | ||
| 29 /** | ||
| 21 - * Write-side persistence — commit creation, blob writes, tree | ||
| -writes, ref updates. | ||
| 22 - * @typedef {import('../../ports/CommitPort.js').default & impo | ||
| -rt('../../ports/BlobPort.js').default & import('../../ports/Tre | ||
| -ePort.js').default & import('../../ports/RefPort.js').default} | ||
| -PersistenceWriter | ||
| 23 - */ | ||
| 24 - | ||
| 25 -/** | ||
| 30 * Ref-only persistence — ref reads, writes, CAS, listing. | ||
| 31 * @typedef {import('../../ports/RefPort.js').default} RefPersi | ||
| stence | ||
| 32 */ | ||
| 33 | ||
| 34 /** | ||
| 31 - * Checkpoint persistence — commit + blob + tree + ref (no conf | ||
| -ig). | ||
| 32 - * @typedef {import('../../ports/CommitPort.js').default & impo | ||
| -rt('../../ports/BlobPort.js').default & import('../../ports/Tre | ||
| -ePort.js').default & import('../../ports/RefPort.js').default} | ||
| -CheckpointPersistence | ||
| 33 - */ | ||
| 34 - | ||
| 35 -/** | ||
| 35 * Index storage — blob reads/writes, tree reads/writes, ref re | ||
| ads/writes. | ||
| 36 * Matches the dynamically-composed IndexStoragePort interface. | ||
| 37 * @typedef {import('../../ports/BlobPort.js').default & import | ||
| ('../../ports/TreePort.js').default & import('../../ports/RefPo | ||
| rt.js').default} IndexStorage | ||
|
|
||
| ⏺ Now update all usage sites to use CorePersistence. | ||
|
|
||
| ⏺ Update(src/domain/warp/sync.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 33 import { signSyncRequest, canonicalizePath } from '../services/ | ||
| SyncAuthService.js'; | ||
| 34 import { isError } from '../types/WarpErrors.js'; | ||
| 35 | ||
| 36 -/** @typedef {import('../types/WarpPersistence.js').Persistence | ||
| -Reader} PersistenceReader */ | ||
| 36 +/** @typedef {import('../types/WarpPersistence.js').CorePersist | ||
| +ence} CorePersistence */ | ||
| 37 | ||
| 38 // ── Private helpers ───────────────────────────────────────── | ||
| ──────────────── | ||
| 39 | ||
|
|
||
| ⏺ Update(src/domain/warp/sync.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 224 */ | ||
| 225 export async function processSyncRequest(request) { | ||
| 226 const localFrontier = await this.getFrontier(); | ||
| 227 - /** @type {PersistenceReader} */ | ||
| 227 + /** @type {CorePersistence} */ | ||
| 228 const persistence = this._persistence; | ||
| 229 return await processSyncRequestImpl( | ||
| 230 request, | ||
|
|
||
| ⏺ Update(src/domain/warp/materializeAdvanced.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 19 import { materializeIncremental } from '../services/CheckpointS | ||
| ervice.js'; | ||
| 20 import { createFrontier, updateFrontier } from '../services/Fro | ||
| ntier.js'; | ||
| 21 | ||
| 22 -/** @typedef {import('../types/WarpPersistence.js').Persistence | ||
| -Reader} PersistenceReader */ | ||
| 22 +/** @typedef {import('../types/WarpPersistence.js').CorePersist | ||
| +ence} CorePersistence */ | ||
| 23 import { buildWriterRef } from '../utils/RefLayout.js'; | ||
| 24 import { decodePatchMessage, detectMessageKind } from '../servi | ||
| ces/WarpMessageCodec.js'; | ||
| 25 | ||
|
|
||
| ⏺ Update(src/domain/warp/materializeAdvanced.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 328 }; | ||
| 329 | ||
| 330 // 4. Call materializeIncremental with the checkpoint and ta | ||
| rget frontier | ||
| 331 - /** @type {PersistenceReader} */ | ||
| 331 + /** @type {CorePersistence} */ | ||
| 332 const persistence = this._persistence; | ||
| 333 const state = await materializeIncremental({ | ||
| 334 persistence, | ||
|
|
||
| ⏺ Update(src/domain/warp/checkpoint.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 16 import { collectGCMetrics } from '../services/GCMetrics.js'; | ||
| 17 import { computeAppliedVV } from '../services/CheckpointSeriali | ||
| zerV5.js'; | ||
| 18 | ||
| 19 -/** @typedef {import('../types/WarpPersistence.js').CheckpointP | ||
| -ersistence} CheckpointPersistence */ | ||
| 19 +/** @typedef {import('../types/WarpPersistence.js').CorePersist | ||
| +ence} CorePersistence */ | ||
| 20 | ||
| 21 /** | ||
| 22 * Creates a checkpoint of the current graph state. | ||
|
|
||
| ⏺ Update(src/domain/warp/checkpoint.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 61 } | ||
| 62 | ||
| 63 // 4. Call CheckpointService.create() with provenance index | ||
| if available | ||
| 64 - /** @type {CheckpointPersistence} */ | ||
| 64 + /** @type {CorePersistence} */ | ||
| 65 const persistence = this._persistence; | ||
| 66 const checkpointSha = await createCheckpointCommit({ | ||
| 67 persistence, | ||
|
|
||
| ⏺ Update(src/domain/warp/patch.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 18 import { Writer } from './Writer.js'; | ||
| 19 import { generateWriterId, resolveWriterId } from '../utils/Wri | ||
| terId.js'; | ||
| 20 | ||
| 21 -/** @typedef {import('../types/WarpPersistence.js').Persistence | ||
| -Writer} PersistenceWriter */ | ||
| 21 +/** @typedef {import('../types/WarpPersistence.js').CorePersist | ||
| +ence} CorePersistence */ | ||
| 22 | ||
| 23 /** | ||
| 24 * Creates a new PatchBuilderV2 for this graph. | ||
|
|
||
| ⏺ Update(src/domain/warp/patch.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 258 configSet, | ||
| 259 }); | ||
| 260 | ||
| 261 - /** @type {PersistenceWriter} */ | ||
| 261 + /** @type {CorePersistence} */ | ||
| 262 const persistence = this._persistence; | ||
| 263 return new Writer({ | ||
| 264 persistence, | ||
|
|
||
| ⏺ Now fix the second PersistenceWriter usage in patch.methods.js. | ||
|
|
||
| ⏺ Update(src/domain/warp/patch.methods.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 316 await this._persistence.configSet(configKey, freshWriterId | ||
| ); | ||
| 317 } | ||
| 318 | ||
| 319 - /** @type {PersistenceWriter} */ | ||
| 319 + /** @type {CorePersistence} */ | ||
| 320 const writerPersistence = this._persistence; | ||
| 321 return new Writer({ | ||
| 322 persistence: writerPersistence, | ||
|
|
||
| ⏺ Now update the CLI files that used CheckpointPersistence. | ||
|
|
||
| ⏺ Searched for 2 patterns (ctrl+o to expand) | ||
|
|
||
| ⏺ Update(bin/cli/commands/doctor/checks.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 43 export async function checkRepoAccessible(ctx) { | ||
| 44 try { | ||
| 45 const clock = ClockAdapter.global(); | ||
| 46 - const svc = new HealthCheckService({ persistence: /** @type | ||
| - {import('../../../../src/domain/types/WarpPersistence.js').Che | ||
| -ckpointPersistence} */ (/** @type {unknown} */ (ctx.persistence | ||
| -)), clock }); | ||
| 46 + const svc = new HealthCheckService({ persistence: /** @type | ||
| + {import('../../../../src/domain/types/WarpPersistence.js').Cor | ||
| +ePersistence} */ (/** @type {unknown} */ (ctx.persistence)), cl | ||
| +ock }); | ||
| 47 const health = await svc.getHealth(); | ||
| 48 if (health.components.repository.status === 'unhealthy') { | ||
| 49 return { | ||
|
|
||
| ⏺ Update(bin/cli/commands/check.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 11 /** @param {Persistence} persistence */ | ||
| 12 async function getHealth(persistence) { | ||
| 13 const clock = ClockAdapter.global(); | ||
| 14 - const healthService = new HealthCheckService({ persistence: / | ||
| -** @type {import('../../../src/domain/types/WarpPersistence.js' | ||
| -).CheckpointPersistence} */ (/** @type {unknown} */ (persistenc | ||
| -e)), clock }); | ||
| 14 + const healthService = new HealthCheckService({ persistence: / | ||
| +** @type {import('../../../src/domain/types/WarpPersistence.js' | ||
| +).CorePersistence} */ (/** @type {unknown} */ (persistence)), c | ||
| +lock }); | ||
| 15 return await healthService.getHealth(); | ||
| 16 } | ||
| 17 | ||
|
|
||
| ⏺ Update(bin/cli/commands/verify-audit.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 53 const { persistence } = await createPersistence(options.repo) | ||
| ; | ||
| 54 const graphName = await resolveGraphName(persistence, options | ||
| .graph); | ||
| 55 const verifier = new AuditVerifierService({ | ||
| 56 - persistence: /** @type {import('../../../src/domain/types/W | ||
| -arpPersistence.js').CheckpointPersistence} */ (/** @type {unkno | ||
| -wn} */ (persistence)), | ||
| 56 + persistence: /** @type {import('../../../src/domain/types/W | ||
| +arpPersistence.js').CorePersistence} */ (/** @type {unknown} */ | ||
| + (persistence)), | ||
| 57 codec: defaultCodec, | ||
| 58 }); | ||
| 59 | ||
|
|
||
| ⏺ Update(bin/cli/commands/trust.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 104 const graphName = await resolveGraphName(persistence, option | ||
| s.graph); | ||
| 105 | ||
| 106 const recordService = new TrustRecordService({ | ||
| 107 - persistence: /** @type {import('../../../src/domain/types/ | ||
| -WarpPersistence.js').CheckpointPersistence} */ (/** @type {unk | ||
| -nown} */ (persistence)), | ||
| 107 + persistence: /** @type {import('../../../src/domain/types/ | ||
| +WarpPersistence.js').CorePersistence} */ (/** @type {unknown} | ||
| +*/ (persistence)), | ||
| 108 codec: defaultCodec, | ||
| 109 }); | ||
| 110 | ||
|
|
||
| ⏺ Also update the WarpGraph.js that referenced CheckpointPersistence. | ||
|
|
||
| ⏺ Searched for 2 patterns (ctrl+o to expand) | ||
|
|
||
| ⏺ Update(src/domain/WarpGraph.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 285 // Initialize audit service if enabled | ||
| 286 if (graph._audit) { | ||
| 287 graph._auditService = new AuditReceiptService({ | ||
| 288 - persistence: /** @type {import('./types/WarpPersistenc | ||
| -e.js').CheckpointPersistence} */ (persistence), | ||
| 288 + persistence: /** @type {import('./types/WarpPersistenc | ||
| +e.js').CorePersistence} */ (persistence), | ||
| 289 graphName, | ||
| 290 writerId, | ||
| 291 codec: graph._codec, | ||
|
|
||
| ⏺ Also need to update the shared.js in CLI that used FsAdapter type from | ||
| HookInstaller. | ||
|
|
||
| ⏺ Fix #5: Replace z.any() in HttpSyncServer. | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ Update(src/domain/services/HttpSyncServer.js) | ||
| ⎿ Added 2 lines, removed 2 lines | ||
| 24 (obj) => Object.keys(obj).length > 0, | ||
| 25 'auth.keys must not be empty', | ||
| 26 ), | ||
| 27 - crypto: z.any().optional(), | ||
| 28 - logger: z.any().optional(), | ||
| 27 + crypto: z.custom((v) => v === undefined || (typeof v === 'obj | ||
| +ect' && v !== null)).optional(), | ||
| 28 + logger: z.custom((v) => v === undefined || (typeof v === 'obj | ||
| +ect' && v !== null)).optional(), | ||
| 29 wallClockMs: /** @type {z.ZodType<() => number>} */ (z.custom | ||
| ((v) => v === undefined || typeof v === 'function')).optional() | ||
| , | ||
| 30 }).strict(); | ||
| 31 | ||
|
|
||
| ⏺ Update(src/domain/services/HttpSyncServer.js) | ||
| ⎿ Added 2 lines, removed 2 lines | ||
| 30 }).strict(); | ||
| 31 | ||
| 32 const optionsSchema = z.object({ | ||
| 33 - httpPort: z.any().refine( | ||
| 33 + httpPort: z.custom( | ||
| 34 (v) => v !== null && v !== undefined && typeof v === 'objec | ||
| t', | ||
| 35 'httpPort is required', | ||
| 36 ), | ||
| 37 - graph: z.any().refine( | ||
| 37 + graph: z.custom( | ||
| 38 (v) => v !== null && v !== undefined && typeof v === 'objec | ||
| t', | ||
| 39 'graph is required', | ||
| 40 ), | ||
|
|
||
| ⏺ Fix #6: Revert the JSDoc that falsely marks required params as optional. The | ||
| actual fix is to type the default = {} correctly instead of loosening the | ||
| param. | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ The pattern across all 6 files is the same: the constructor has = {} default | ||
| for destructuring, and the old code had = /** @type {*} */ ({}) to suppress | ||
| the type error. The new code made the JSDoc params optional, but they throw at | ||
| runtime if missing. | ||
|
|
||
| The correct fix: revert JSDoc to required, and cast the default {} to the | ||
| options type via /** @type {any} */ — wait, that defeats the purpose. The real | ||
| solution: use /** @type {Partial<...>} */ on the default, but that's complex. | ||
| The simplest correct fix: revert to required JSDoc and use = /** @type {{ | ||
| indexReader?: ..., logger?: ... }} */ ({}) on the default. | ||
|
|
||
| Actually, the simplest approach that works with tsc: mark options itself as | ||
| optional (it IS optional — you can call new CommitDagTraversalService() and | ||
| it'll throw) but mark indexReader within it as required. But tsc/JSDoc doesn't | ||
| support "optional parameter with required property" well. | ||
|
|
||
| The cleanest fix: just make options optional but indexReader required in the | ||
| JSDoc. This accurately reflects the runtime: options can be omitted (= | ||
| throws), but if provided, indexReader is required. | ||
|
|
||
| ⏺ Update(src/domain/services/CommitDagTraversalService.js) | ||
| ⎿ Added 3 lines, removed 3 lines | ||
| 35 /** | ||
| 36 * Creates a new CommitDagTraversalService. | ||
| 37 * | ||
| 38 - * @param {Object} [options] | ||
| 39 - * @param {import('./BitmapIndexReader.js').default} [options | ||
| -.indexReader] - Index reader for O(1) lookups | ||
| 38 + * @param {Object} options | ||
| 39 + * @param {import('./BitmapIndexReader.js').default} options. | ||
| +indexReader - Index reader for O(1) lookups | ||
| 40 * @param {import('../../ports/LoggerPort.js').default} [opti | ||
| ons.logger] - Logger instance | ||
| 41 */ | ||
| 42 - constructor({ indexReader, logger = nullLogger } = {}) { | ||
| 42 + constructor({ indexReader, logger = nullLogger } = /** @type | ||
| +{{ indexReader: import('./BitmapIndexReader.js').default, logge | ||
| +r?: import('../../ports/LoggerPort.js').default }} */ ({})) { | ||
| 43 if (!indexReader) { | ||
| 44 throw new Error('CommitDagTraversalService requires an in | ||
| dexReader'); | ||
| 45 } | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ This one has the inline type annotation on the parameter. Let me fix the same | ||
| pattern across all remaining files. | ||
|
|
||
| ⏺ Update(src/domain/services/DagPathFinding.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 41 * @param {import('./BitmapIndexReader.js').default} options. | ||
| indexReader - Index reader for O(1) lookups | ||
| 42 * @param {import('../../ports/LoggerPort.js').default} [opti | ||
| ons.logger] - Logger instance | ||
| 43 */ | ||
| 44 - constructor(/** @type {{ indexReader?: import('./BitmapIndexR | ||
| -eader.js').default, logger?: import('../../ports/LoggerPort.js' | ||
| -).default }} */ { indexReader, logger = nullLogger } = {}) { | ||
| 44 + constructor(/** @type {{ indexReader: import('./BitmapIndexRe | ||
| +ader.js').default, logger?: import('../../ports/LoggerPort.js') | ||
| +.default }} */ { indexReader, logger = nullLogger } = /** @type | ||
| + {{ indexReader: import('./BitmapIndexReader.js').default }} */ | ||
| + ({})) { | ||
| 45 if (!indexReader) { | ||
| 46 throw new Error('DagPathFinding requires an indexReader') | ||
| ; | ||
| 47 } | ||
|
|
||
| ⏺ I see the linter reverted several of my changes (CommitDagTraversalService, | ||
| WarpPersistence.js, WarpGraph.js, PatchBuilderV2.js, patch.methods.js, | ||
| checkpoint.methods.js, etc). This is the pre-commit hook. Let me check what's | ||
| happening — the linter likely reverted my file because the background agent is | ||
| also editing these files. Let me wait for the background agent to finish | ||
| first, then re-apply my changes. | ||
|
|
||
| ⏺ Task Output afae531 | ||
| ⎿ Task is still running… | ||
|
|
||
| ⏺ The background agent is hitting typecheck errors after its Record<string, *> → | ||
| Record<string, unknown> replacements. This is expected — unknown is stricter | ||
| than *. I need to let it finish resolving those errors, then verify myself. | ||
| Let me stop waiting and handle the remaining tasks while it works. | ||
|
|
||
| Let me fix the DagTopology and DagTraversal constructors plus the serve() | ||
| JSDoc (task #6) while the agent finishes. | ||
|
|
||
| ⏺ Read 3 files (ctrl+o to expand) | ||
|
|
||
| ⏺ For these constructors, the inline type annotation on the destructured | ||
| parameter makes indexReader optional with ?. Let me fix indexReader back to | ||
| required in the inline types and add a proper cast on the default {}. | ||
|
|
||
| ⏺ Update(src/domain/services/DagTopology.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 37 * @param {import('../../ports/LoggerPort.js').default} [opti | ||
| ons.logger] - Logger instance | ||
| 38 * @param {import('./DagTraversal.js').default} [options.trav | ||
| ersal] - Traversal service for ancestor enumeration | ||
| 39 */ | ||
| 40 - constructor(/** @type {{ indexReader?: import('./BitmapIndexR | ||
| -eader.js').default, logger?: import('../../ports/LoggerPort.js' | ||
| -).default, traversal?: import('./DagTraversal.js').default }} * | ||
| -/ { indexReader, logger = nullLogger, traversal } = {}) { | ||
| 40 + constructor(/** @type {{ indexReader: import('./BitmapIndexRe | ||
| +ader.js').default, logger?: import('../../ports/LoggerPort.js') | ||
| +.default, traversal?: import('./DagTraversal.js').default }} */ | ||
| + { indexReader, logger = nullLogger, traversal } = /** @type {{ | ||
| + indexReader: import('./BitmapIndexReader.js').default }} */ ({ | ||
| +})) { | ||
| 41 if (!indexReader) { | ||
| 42 throw new Error('DagTopology requires an indexReader'); | ||
| 43 } | ||
|
|
||
| ⏺ Update(src/domain/services/DagTraversal.js) | ||
| ⎿ Added 1 line, removed 1 line | ||
| 43 * @param {import('./BitmapIndexReader.js').default} options. | ||
| indexReader - Index reader for O(1) lookups | ||
| 44 * @param {import('../../ports/LoggerPort.js').default} [opti | ||
| ons.logger] - Logger instance | ||
| 45 */ | ||
| 46 - constructor(/** @type {{ indexReader?: import('./BitmapIndexR | ||
| -eader.js').default, logger?: import('../../ports/LoggerPort.js' | ||
| -).default }} */ { indexReader, logger = nullLogger } = {}) { | ||
| 46 + constructor(/** @type {{ indexReader: import('./BitmapIndexRe | ||
| +ader.js').default, logger?: import('../../ports/LoggerPort.js') | ||
| +.default }} */ { indexReader, logger = nullLogger } = /** @type | ||
| + {{ indexReader: import('./BitmapIndexReader.js').default }} */ | ||
| + ({})) { | ||
| 47 if (!indexReader) { | ||
| 48 throw new Error('DagTraversal requires an indexReader'); | ||
| 49 } | ||
|
|
||
| ⏺ Update(src/domain/services/IndexRebuildService.js) | ||
| ⎿ Added 3 lines, removed 3 lines | ||
| 39 /** | ||
| 40 * Creates an IndexRebuildService instance. | ||
| 41 * | ||
| 42 - * @param {Object} [options] - Configuration options | ||
| 43 - * @param {{ iterateNodes: (opts: { ref: string, limit: numbe | ||
| -r }) => AsyncIterable<{ sha: string, parents: string[] }> }} [o | ||
| -ptions.graphService] - Graph service providing node iteration. | ||
| 44 - * @param {import('../../ports/IndexStoragePort.js').default} | ||
| - [options.storage] - Storage adapter | ||
| 42 + * @param {Object} options - Configuration options | ||
| 43 + * @param {{ iterateNodes: (opts: { ref: string, limit: numbe | ||
| +r }) => AsyncIterable<{ sha: string, parents: string[] }> }} op | ||
| +tions.graphService - Graph service providing node iteration. | ||
| 44 + * @param {import('../../ports/IndexStoragePort.js').default} | ||
| + options.storage - Storage adapter | ||
| 45 * for persisting index blobs and trees. Typically GitGraph | ||
| Adapter. | ||
| 46 * @param {import('../../ports/LoggerPort.js').default} [opti | ||
| ons.logger] - Logger for | ||
| 47 * structured logging. Defaults to null logger (no logging) | ||
| . | ||
|
|
||
| ⏺ Now fix SyncAuthService and the serve() function. | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ Update(src/domain/services/SyncAuthService.js) | ||
| ⎿ Added 2 lines, removed 2 lines | ||
| 171 | ||
| 172 export default class SyncAuthService { | ||
| 173 /** | ||
| 174 - * @param {Object} [options] | ||
| 175 - * @param {Record<string, string>} [options.keys] - Key-id t | ||
| -o secret mapping | ||
| 174 + * @param {Object} options | ||
| 175 + * @param {Record<string, string>} options.keys - Key-id to | ||
| +secret mapping | ||
| 176 * @param {'enforce'|'log-only'} [options.mode='enforce'] - | ||
| Auth enforcement mode | ||
| 177 * @param {number} [options.nonceCapacity] - Nonce LRU capac | ||
| ity | ||
| 178 * @param {number} [options.maxClockSkewMs] - Max clock skew | ||
| tolerance | ||
|
|
||
| ⏺ Now fix serve() in sync.methods.js — port and httpPort should be required. | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ Update(src/domain/warp/sync.methods.js) | ||
| ⎿ Added 3 lines, removed 3 lines | ||
| 518 * Starts a built-in sync server for this graph. | ||
| 519 * | ||
| 520 * @this {import('../WarpGraph.js').default} | ||
| 521 - * @param {Object} [options] | ||
| 522 - * @param {number} [options.port] - Port to listen on | ||
| 521 + * @param {Object} options | ||
| 522 + * @param {number} options.port - Port to listen on | ||
| 523 * @param {string} [options.host='127.0.0.1'] - Host to bind | ||
| 524 * @param {string} [options.path='/sync'] - Path to handle syn | ||
| c requests | ||
| 525 * @param {number} [options.maxRequestBytes=4194304] - Max req | ||
| uest size in bytes | ||
| 526 - * @param {import('../../ports/HttpServerPort.js').default} [o | ||
| -ptions.httpPort] - HTTP server adapter (required) | ||
| 526 + * @param {import('../../ports/HttpServerPort.js').default} op | ||
| +tions.httpPort - HTTP server adapter | ||
| 527 * @param {{ keys: Record<string, string>, mode?: 'enforce'|'l | ||
| og-only' }} [options.auth] - Auth configuration | ||
| 528 * @returns {Promise<{close: () => Promise<void>, url: string} | ||
| >} Server handle | ||
| 529 * @throws {Error} If port is not a number | ||
|
|
||
| ⏺ Now let me work on task #2 — upgrading the policy check — while waiting for | ||
| the background agent. | ||
|
|
||
| ⏺ Read 1 file (ctrl+o to expand) | ||
|
|
||
| ⏺ Now I need to upgrade this. The new rules: | ||
| 1. Ban @ts-ignore (existing) | ||
| 2. Ban @type {*} and @type {any} outright (upgraded from "require tag" to | ||
| "ban") | ||
| 3. Ban Record<string, *>, Array<*>, {[k:string]: *} — embedded wildcards in | ||
| JSDoc types | ||
| 4. Ban z.any() in source files | ||
|
|
||
| ⏺ Write(scripts/ts-policy-check.js) | ||
| ⎿ Added 33 lines, removed 7 lines | ||
| 1 #!/usr/bin/env node | ||
| 2 | ||
| 3 /** | ||
| 4 - * TS policy checker — enforces two rules in source files (src/ | ||
| -, bin/, scripts/): | ||
| 4 + * TS policy checker — enforces type safety rules in source fil | ||
| +es (src/, bin/, scripts/): | ||
| 5 * | ||
| 6 * 1. Ban @ts-ignore — use @ts-expect-error instead. | ||
| 7 - * 2. Require TODO(ts-cleanup) tag on every inline wildcard cas | ||
| -t. | ||
| 7 + * 2. Ban wildcard casts: @type {*}, @type {any}. | ||
| 8 + * 3. Ban embedded wildcards in JSDoc type params: Record<strin | ||
| +g, *>, Array<*>, etc. | ||
| 9 + * 4. Ban z.any() — use z.custom() or z.unknown() instead. | ||
| 10 * | ||
| 11 * Exit 0 when clean, 1 when violations found. | ||
| 12 */ | ||
| ... | ||
| 31 } | ||
| 32 } | ||
| 33 | ||
| 32 -/* eslint-disable no-control-regex */ | ||
| 34 +// ── Rule patterns ─────────────────────────────────────────── | ||
| +──────────────── | ||
| 35 + | ||
| 36 +// Rule 1: @ts-ignore | ||
| 37 const TS_IGNORE_RE = /@ts-ignore\b/; | ||
| 38 + | ||
| 39 +// Rule 2: Bare wildcard casts — @type {*} or @type {any} | ||
| 40 const WILDCARD_CAST_RE = /@type\s+\{(\*|any)\}/; | ||
| 35 -const TAG_RE = /TODO\(ts-cleanup\)/; | ||
| 36 -/* eslint-enable no-control-regex */ | ||
| 41 | ||
| 42 +// Rule 3: Embedded wildcards in generic JSDoc types | ||
| 43 +// Catches: Record<string, *>, Array<*>, Map<string, *>, {[k: | ||
| +string]: *} | ||
| 44 +// Does NOT match inside import() paths or comments unrelated | ||
| + to @type. | ||
| 45 +const EMBEDDED_WILDCARD_RE = /(?:@type|@param|@returns|@typedef | ||
| +)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/; | ||
| 46 + | ||
| 47 +// Rule 4: z.any() in Zod schemas | ||
| 48 +const ZOD_ANY_RE = /z\.any\(\)/; | ||
| 49 + | ||
| 50 async function check() { | ||
| 51 const violations = []; | ||
| 52 | ||
| ... | ||
| 61 const lines = content.split('\n'); | ||
| 62 for (let i = 0; i < lines.length; i++) { | ||
| 63 const line = lines[i]; | ||
| 64 + | ||
| 65 + // Rule 1 | ||
| 66 if (TS_IGNORE_RE.test(line)) { | ||
| 67 violations.push(`${rel}:${i + 1}: error: use @ts-expe | ||
| ct-error instead of @ts-ignore`); | ||
| 68 } | ||
| 55 - if (WILDCARD_CAST_RE.test(line) && !TAG_RE.test(line)) | ||
| -{ | ||
| 56 - violations.push(`${rel}:${i + 1}: error: wildcard cas | ||
| -t missing TODO(ts-cleanup) tag`); | ||
| 69 + | ||
| 70 + // Rule 2 | ||
| 71 + if (WILDCARD_CAST_RE.test(line)) { | ||
| 72 + violations.push(`${rel}:${i + 1}: error: wildcard cas | ||
| +t @type {*}/@type {any} is banned`); | ||
| 73 } | ||
| 74 + | ||
| 75 + // Rule 3 | ||
| 76 + if (EMBEDDED_WILDCARD_RE.test(line)) { | ||
| 77 + violations.push(`${rel}:${i + 1}: error: embedded wil | ||
| +dcard in JSDoc type (use 'unknown' instead of '*')`); | ||
| 78 + } | ||
| 79 + | ||
| 80 + // Rule 4 | ||
| 81 + if (ZOD_ANY_RE.test(line)) { | ||
| 82 + violations.push(`${rel}:${i + 1}: error: z.any() is b | ||
| +anned — use z.custom() or z.unknown()`); | ||
| 83 + } | ||
| 84 } | ||
| 85 } | ||
| 86 } | ||
|
|
||
| ⏺ Let me check the policy check against the current state to see remaining | ||
| violations. | ||
|
|
||
| ⏺ Bash(node /Users/james/git/git-stunts/git-warp/scripts/ts-policy-check.js 2>&1 | ||
| | head -60) | ||
| ⎿ src/domain/crdt/ORSet.js:314: error: embedded wildcard in JSDoc type ( | ||
| use 'unknown' instead of '*') | ||
| src/domain/crdt/ORSet.js:352: error: embedded wildcard in JSDoc type ( | ||
| … +72 lines (ctrl+o to expand) | ||
|
|
||
| ⏺ 48 violations. That's the background agent's territory — it's handling the | ||
| Record<string, *> fixes. But there are also many more in files it wasn't | ||
| touching (JoinReducer, ORSet, etc). Let me wait for the agent to finish, then | ||
| sweep up anything remaining. | ||
|
|
||
| ⏺ Task Output afae531 | ||
| ⎿ Task is still running… | ||
|
|
||
| ⏺ The agent is nearly done. It's confirmed zero Record<string, *>, Array<*>, and | ||
| {[k:string]: *} remaining. Lint is clean. But it introduced typecheck errors | ||
| in 3 source files from the JSDoc changes I also made (IndexRebuildService, | ||
| SyncAuthService, sync.methods.js = {} defaults now rejected). These are the | ||
| same files I just fixed in task #6, so there's a conflict. Let me wait for the | ||
| agent to complete, then do a full reconciliation pass. | ||
|
|
||
| ⏺ Task Output afae531 | ||
| ⎿ Read output (ctrl+o to expand) | ||
| ⎿ Context limit reached · /compact or /clear to continue | ||
|
|
There was a problem hiding this comment.
Session transcript should not be committed to the repository.
This file is a raw Claude Code interactive session log — a development artifact with no canonical value in source control. It adds noise to git log, may confuse contributors about what is authoritative, and is not a substitute for a PR description, CHANGELOG entry, or ADR. The deliberation and decisions captured here belong in the CHANGELOG/PR body, not committed as a file.
Add a matching pattern (e.g., *-local-command-*.txt) to .gitignore and remove this file from the commit.
🧰 Tools
🪛 LanguageTool
[style] ~12-~12: Consider using a different verb for a more formal wording.
Context: ... your comments and all that. Then let's fix the issues you self-discover befo...
(FIX_RESOLVE)
[grammar] ~19-~19: Ensure spelling is correct
Context: ...LAD Tiers 3-4 — eliminate all remaining w ildcard casts 631ef96 feat(types): M8 IRON...
(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)
[uncategorized] ~25-~25: The official name of this software platform is spelled with a capital “H”.
Context: ...t diff --stat origin/main..HEAD) ⎿ .github/workflows/ci.yml ...
(GITHUB)
[uncategorized] ~34-~34: The official name of this software platform is spelled with a capital “H”.
Context: ... | head -8000) ⎿ diff --git a/.github/workflows/ci.yml b/.github/workflows/ci...
(GITHUB)
[uncategorized] ~34-~34: The official name of this software platform is spelled with a capital “H”.
Context: ...iff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 92611c8..30...
(GITHUB)
[uncategorized] ~36-~36: The official name of this software platform is spelled with a capital “H”.
Context: ...dex 92611c8..30d4d18 100644 --- a/.github/workflows/ci.yml … +2678 lines (ct...
(GITHUB)
[grammar] ~88-~88: Ensure spelling is correct
Context: ...with replay option (PL/SUB/2) > replay: tru … +26 lines (ctrl+o to expand) ⎿ (tim...
(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)
[style] ~157-~157: To form a complete sentence, be sure to include a subject.
Context: ...e passes a path without /, it breaks. Should be called out in CHANGELOG or be behind...
(MISSING_IT_THERE)
[grammar] ~351-~351: Ensure spelling is correct
Context: ... 3m 29s ❯ for sure... i thought we had somem kind of CI/CD ZERO TOLERANCE for this ...
(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)
[style] ~450-~450: Consider using a different verb for a more formal wording.
Context: ...anage · ctrl+o to expand) ⏺ Now let me fix the other issues while that runs. ⏺ Fi...
(FIX_RESOLVE)
[grammar] ~477-~477: Ensure spelling is correct
Context: ...t, BlobPort, TreePort, and RefPort each conta +in both ...
(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)
[style] ~871-~871: In American English, abbreviations like “etc.” require a period.
Context: ...ch.methods.js, checkpoint.methods.js, etc). This is the pre-commit hook. Let me c...
(ETC_PERIOD)
[grammar] ~1034-~1034: Ensure spelling is correct
Context: ...mbedded wildcards in JSDoc type params: Record<strin +g, >, Array<>, etc. ...
(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)
[style] ~1118-~1118: In American English, abbreviations like “etc.” require a period.
Context: ... wasn't touching (JoinReducer, ORSet, etc). Let me wait for the agent to finish, ...
(ETC_PERIOD)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@2026-02-17-131249-local-command-caveatcaveat-the-messages-below-w.txt` around
lines 1 - 1134, Summary: remove the interactive session log from source and stop
it being committed by adding a gitignore pattern. Fix: add a glob to .gitignore
that matches these ephemeral Claude/interactive logs (e.g.
*-local-command-*.txt), remove the file
2026-02-17-131249-local-command-caveatcaveat-the-messages-below-w.txt from the
index so it’s no longer part of the commit (unstage/remove from repository), and
amend the branch with a follow-up commit that only updates .gitignore and
deletes the tracked session file; verify with git status and git diff
origin/main..HEAD to ensure the session file is gone from history on this branch
before pushing. Ensure the commit message explains “remove interactive session
log and ignore ephemeral local-command files.”
There was a problem hiding this comment.
Not applicable — the transcript file was already removed in a prior commit (see CHANGELOG 11.3.2 Review Fixes). The issues referenced in this file no longer exist in the repository.
There was a problem hiding this comment.
Rate Limit Exceeded
@flyingrobots have exceeded the limit for the number of chat messages per hour. Please wait 1 minutes and 35 seconds before sending another message.
| 3. Behavioral change: HttpSyncServer path normalization removed | ||
|
|
||
| File: src/domain/services/HttpSyncServer.js:43 + test changes | ||
|
|
||
| Previously: path: 'custom' → auto-normalized to /custom. | ||
| Now: .startsWith('/') validation → throws on 'custom'. | ||
|
|
||
| This is a breaking behavioral change buried in a "type cleanup" PR. The old | ||
| test normalizes path without leading slash was deleted and replaced with | ||
| rejects path without leading slash. If any downstream code passes a path | ||
| without /, it breaks. Should be called out in CHANGELOG or be behind a | ||
| deprecation warning. |
There was a problem hiding this comment.
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
echo "=== Checking HttpSyncServer.js for path validation logic ==="
if [ -f "src/domain/services/HttpSyncServer.js" ]; then
wc -l src/domain/services/HttpSyncServer.js
echo ""
sed -n '30,60p' src/domain/services/HttpSyncServer.js
else
echo "File not found, searching for HttpSyncServer..."
find . -name "*HttpSyncServer*" -type f
fiRepository: git-stunts/git-warp
Length of output: 1200
🏁 Script executed:
#!/bin/bash
echo "=== Checking HttpSyncServer tests ==="
if [ -f "test/unit/domain/services/HttpSyncServer.test.js" ]; then
wc -l test/unit/domain/services/HttpSyncServer.test.js
echo ""
echo "=== Test cases related to path ==="
rg -n "path|leading|slash|normaliz" test/unit/domain/services/HttpSyncServer.test.js -C 2
else
echo "Test file not found, searching..."
find . -name "*HttpSyncServer*test*" -type f
fiRepository: git-stunts/git-warp
Length of output: 1072
🏁 Script executed:
#!/bin/bash
echo "=== Searching for CHANGELOG entries about breaking changes ==="
find . -iname "*changelog*" -o -iname "*changes*" -o -iname "*release*notes*" | head -10Repository: git-stunts/git-warp
Length of output: 140
🏁 Script executed:
#!/bin/bash
echo "=== Checking CHANGELOG.md for breaking changes ==="
cat CHANGELOG.md | head -100Repository: git-stunts/git-warp
Length of output: 7916
🏁 Script executed:
#!/bin/bash
echo "=== Checking full context around path tests ==="
sed -n '95,125p' test/unit/domain/services/HttpSyncServer.test.jsRepository: git-stunts/git-warp
Length of output: 972
🏁 Script executed:
#!/bin/bash
echo "=== Git history to see if normalization existed before ==="
git log --oneline -- src/domain/services/HttpSyncServer.js | head -10Repository: git-stunts/git-warp
Length of output: 217
Breaking behavioral change in HttpSyncServer path handling — not documented in CHANGELOG.
The Zod schema validation now enforces path: z.string().startsWith('/'), which rejects any path without a leading slash. The test confirms this: path: 'custom' now throws a validation error. This is a breaking change for any downstream code passing paths without leading slashes.
The change is documented in CHANGELOG 11.3.1 only as "Constructor now uses Zod schema validation" without noting the behavioral impact. It needs either:
- Restoration of the previous auto-normalization behavior (
.startsWith('/') ? path : '/' + path), or - An explicit entry under a breaking-changes section in CHANGELOG.
🧰 Tools
🪛 LanguageTool
[style] ~157-~157: To form a complete sentence, be sure to include a subject.
Context: ...e passes a path without /, it breaks. Should be called out in CHANGELOG or be behind...
(MISSING_IT_THERE)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@2026-02-17-131249-local-command-caveatcaveat-the-messages-below-w.txt` around
lines 147 - 158, HttpSyncServer's constructor/Zod validation currently rejects
paths without a leading slash (z.string().startsWith('/')), breaking previous
behavior that auto-normalized 'custom' → '/custom'; restore backward
compatibility by normalizing the incoming path before validation (or use a Zod
transform that prepends '/' when missing) in the HttpSyncServer constructor/Zod
schema so existing callers continue to work, update the unit test that was
changed to assert normalization (e.g., for HttpSyncServer constructor and its
path schema), and add a short CHANGELOG note if you prefer to keep stricter
validation instead of normalization.
There was a problem hiding this comment.
Not applicable — the transcript file was already removed in a prior commit (see CHANGELOG 11.3.2 Review Fixes). The issues referenced in this file no longer exist in the repository.
There was a problem hiding this comment.
Rate Limit Exceeded
@flyingrobots have exceeded the limit for the number of chat messages per hour. Please wait 1 minutes and 19 seconds before sending another message.
| 45 +const EMBEDDED_WILDCARD_RE = /(?:@type|@param|@returns|@typedef | ||
| +)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/; | ||
| 46 + | ||
| 47 +// Rule 4: z.any() in Zod schemas | ||
| 48 +const ZOD_ANY_RE = /z\.any\(\)/; |
There was a problem hiding this comment.
EMBEDDED_WILDCARD_RE will produce false positives when JSDoc descriptions mention <*> glob-style patterns.
The .* between the annotation keyword and the <[^>]*\*[^>]*> group is greedy and matches any content on the line — including prose in the description, not just the {...} type expression. For example:
/** `@param` {string} pattern - Match nodes using <*> glob syntax */…would fire Rule 3 even though the * is in the description, not the type. Consider constraining the match to the type-annotation portion only (e.g., between the first { and its closing }):
-const EMBEDDED_WILDCARD_RE = /(?:`@type`|@param|@returns|@typedef)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/;
+const EMBEDDED_WILDCARD_RE = /(?:`@type`|@param|@returns|@typedef)\s+\{[^}]*(?:<[^>]*\*[^>]*>|\[[\w\s:]+\]:\s*\*)[^}]*\}/;Additionally, the pattern does not catch @typedef {function(*): void} style wildcards (no angle-bracket wrapping), which is a gap worth noting if completeness is a goal.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@2026-02-17-131249-local-command-caveatcaveat-the-messages-below-w.txt` around
lines 1061 - 1065, EMBEDDED_WILDCARD_RE is too greedy and yields false positives
because the current ".*" can span description text after the annotation; tighten
the pattern so it only scans the type-expression portion (e.g., require the
match to occur inside the first {...} or inside angle-brackets like <...> rather
than using .*), for example replace the middle ".*" with a construct that
anchors to the opening "{" and its closing "}" (match {[^}]*\*[^}]*}) and also
add an alternative branch to catch function-style wildcard types without angle
brackets (e.g., `@typedef` {function(*): ...} or similar) so the regex only flags
actual wildcard characters inside type expressions; update EMBEDDED_WILDCARD_RE
accordingly and leave ZOD_ANY_RE unchanged.
There was a problem hiding this comment.
Not applicable — the transcript file was already removed in a prior commit (see CHANGELOG 11.3.2 Review Fixes). The issues referenced in this file no longer exist in the repository.
There was a problem hiding this comment.
Rate Limit Exceeded
@flyingrobots have exceeded the limit for the number of chat messages per hour. Please wait 1 minutes and 21 seconds before sending another message.
…rror guard Remove accidentally committed transcript file. Fix BitmapIndexReader return types to include RoaringBitmapSubset on bitmap-format branches. Add instanceof Error guard in _tryHandleShardError. Simplify verbose test casts. Update any-fence date, add Buffer.from comment, bump version to 11.3.2.
There was a problem hiding this comment.
🧹 Nitpick comments (3)
package.json (1)
92-92:typecheck:consumeris not wired intoprepack.The new script is correctly placed alongside the other
typecheck:*siblings and the referencedtest/type-check/tsconfig.jsonis confirmed to exist. CI covers it per the summary.However,
prepack(line 81) only runslint && test:local, so a manualnpm publishthat bypasses CI skips the consumer-type gate entirely. Consider adding it:♻️ Proposed addition to
prepack- "prepack": "npm run lint && npm run test:local", + "prepack": "npm run lint && npm run test:local && npm run typecheck:consumer",🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` at line 92, The prepack script currently only runs lint and test:local, so the new typecheck:consumer script isn't executed during prepack; update the prepack npm script to include running typecheck:consumer (e.g., invoke npm run typecheck:consumer) before publishing steps so that the consumer type-check gate runs for manual npm publish; target the "prepack" script and add a step to run "typecheck:consumer" alongside "lint" and "test:local".test/unit/domain/services/WarpStateIndexBuilder.test.js (1)
218-218:@type {*}wildcards in a PR that eliminates wildcardsLines 218 and 223 cast
fwdBitmap/revBitmapto*so.has()can be called — but*is itself the wildcard the PR set out to eliminate. The commit note "simplified verbose test casts" suggests this was a deliberate shortcut, but a structural type would satisfy the intent and remain self-documenting:♻️ Replace wildcard with a minimal structural type
- expect(/** `@type` {*} */ (fwdBitmap).has(/** `@type` {number} */ (childId))).toBe(true); + expect(/** `@type` {{ has: (id: number) => boolean }} */ (fwdBitmap).has(/** `@type` {number} */ (childId))).toBe(true);- expect(/** `@type` {*} */ (revBitmap).has(/** `@type` {number} */ (parentId))).toBe(true); + expect(/** `@type` {{ has: (id: number) => boolean }} */ (revBitmap).has(/** `@type` {number} */ (parentId))).toBe(true);Alternatively, if the bitmap implementation is the Roaring type already used elsewhere in the project, a named import of that type would be even more precise and align with the broader type-narrowing work in this PR.
Also applies to: 223-223
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/WarpStateIndexBuilder.test.js` at line 218, Replace the use of the JSDoc wildcard cast on fwdBitmap/revBitmap (currently written as /** `@type` {*} */ (fwdBitmap) and /** `@type` {*} */ (revBitmap)) with a narrow structural type or the actual bitmap type so .has(childId) and .has(parentId) are type-safe; for example, declare/annotate a minimal interface with a has(number): boolean signature or import the project's Roaring bitmap type and cast fwdBitmap/revBitmap to that type before calling .has(), updating the test assertions that call .has() (references: fwdBitmap, revBitmap, .has, childId, parentId).src/domain/services/BitmapIndexReader.js (1)
97-98: Remove redundant type annotation on lines 97-98.The field-level
/**@type{IndexStorage} */and the inline cast/**@type{IndexStorage} */ (storage)both declare the same type — one is sufficient. Simplify to:/** `@type` {IndexStorage} */ this.storage = storage;(The cast from the
IndexStoragePortparameter toIndexStorageis safe becauseIndexStorageis composed ofBlobPort & TreePort & RefPort, andIndexStoragePortprovides all the required methods picked from those same ports.)🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BitmapIndexReader.js` around lines 97 - 98, Remove the redundant duplicate JSDoc type on assignment in the BitmapIndexReader constructor: keep the field-level annotation /** `@type` {IndexStorage} */ above this.storage and replace the inline cast /** `@type` {IndexStorage} */ (storage) with just storage; update the assignment in the BitmapIndexReader constructor so it reads this.storage = storage and retain the single field JSDoc for IndexStorage (IndexStoragePort -> IndexStorage cast is not needed).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/domain/trust/TrustRecordService.js`:
- Around line 120-126: In TrustRecordService.js inside the readRecords flow,
guard the use of entries['record.cbor'] before passing it to
this._persistence.readBlob: check the map returned by
this._persistence.readTreeOids for the 'record.cbor' key and if it's missing
handle it the same way _readTip does (skip/continue or return safely) instead of
forwarding undefined to readBlob; update the logic around the current
this._persistence.readTreeOids(...) call and the subsequent
this._persistence.readBlob(...) invocation so you only call readBlob when
records['record.cbor'] is defined and otherwise handle the missing entry with a
controlled path (skip the record or emit an error) to avoid unhandled exceptions
in the loop.
---
Nitpick comments:
In `@package.json`:
- Line 92: The prepack script currently only runs lint and test:local, so the
new typecheck:consumer script isn't executed during prepack; update the prepack
npm script to include running typecheck:consumer (e.g., invoke npm run
typecheck:consumer) before publishing steps so that the consumer type-check gate
runs for manual npm publish; target the "prepack" script and add a step to run
"typecheck:consumer" alongside "lint" and "test:local".
In `@src/domain/services/BitmapIndexReader.js`:
- Around line 97-98: Remove the redundant duplicate JSDoc type on assignment in
the BitmapIndexReader constructor: keep the field-level annotation /** `@type`
{IndexStorage} */ above this.storage and replace the inline cast /** `@type`
{IndexStorage} */ (storage) with just storage; update the assignment in the
BitmapIndexReader constructor so it reads this.storage = storage and retain the
single field JSDoc for IndexStorage (IndexStoragePort -> IndexStorage cast is
not needed).
In `@test/unit/domain/services/WarpStateIndexBuilder.test.js`:
- Line 218: Replace the use of the JSDoc wildcard cast on fwdBitmap/revBitmap
(currently written as /** `@type` {*} */ (fwdBitmap) and /** `@type` {*} */
(revBitmap)) with a narrow structural type or the actual bitmap type so
.has(childId) and .has(parentId) are type-safe; for example, declare/annotate a
minimal interface with a has(number): boolean signature or import the project's
Roaring bitmap type and cast fwdBitmap/revBitmap to that type before calling
.has(), updating the test assertions that call .has() (references: fwdBitmap,
revBitmap, .has, childId, parentId).
Review Feedback — Round 3+4 ResolutionThanks @coderabbitai for the thorough multi-round review. Here's what we addressed and what we're marking as acknowledged/false-positive: Addressed (12 issues across this commit)
Not Actioned (false positives / outside-diff / acknowledged)
Verification
|
|
Tip For best results, initiate chat on the files or code changes.
✅ Critical & Major Fixes Verified
✅ Type Safety RefinementsYour fixes for ✅ Not Actioned — Reasoning Confirmed
🎉 Milestone AchievementIRONCLAD M8+M9 complete — 0 wildcards, all gates green:
Exceptional execution on eliminating 100+ wildcards while maintaining full test coverage. The multi-gate enforcement (lint → policy → consumer types → tests) ensures this stays locked down. 🐰✨ |
…types, guards Critical: pre-commit grep || true so non-JS commits don't abort under set -e. Major: loadFence distinguishes ENOENT from parse errors; z.custom() gets type params for crypto/logger in HttpSyncServer authSchema. Minor: TrustStateBuilder typeof guard for recordId, SyncProtocol patch: Object → DecodedPatch + missing @param for codec, TrustRecordService readBlob guard, CommitDagTraversalService remove = {} default, PatchCommitEvent sha required. Nits: prepack wires typecheck:consumer, structural bitmap type in test, redundant BitmapIndexReader field annotation removed.
|
@coderabbitai review please 🥕 |
|
Given your thorough resolution of all 12 issues from the previous rounds and the comprehensive type-safety improvements across the codebase, I'll conduct a fresh review of the entire PR to ensure everything is ready for merge. ✅ Actions performedReview triggered.
|
There was a problem hiding this comment.
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/domain/services/SyncProtocol.js (1)
101-105:⚠️ Potential issue | 🟡 Minor
writerIdin@returnsdescription does not matchwriterinDecodedPatchtypedef.Line 104 says
- `writerId`: The writer who created this patchbutDecodedPatch(line 50) defines the property aswriter. The return description should referencewriterto stay in sync with the typedef.📝 Proposed fix
- * - `writerId`: The writer who created this patch + * - `writer`: The writer who created this patch🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 101 - 105, Update the JSDoc `@returns` block to use the same property name as the DecodedPatch typedef: replace the `writerId` entry with `writer` and keep the same description ("The writer who created this patch"); ensure the `@returns` list entries (`ops`, `context`, `writer`, `lamport`) match the DecodedPatch typedef so DecodedPatch and the SyncProtocol return documentation stay in sync.
🧹 Nitpick comments (9)
test/unit/domain/services/CommitDagTraversalService.test.js (3)
337-343: Inconsistent indentation onchild: vi.fn()in allmockLoggerobjects.
child: vi.fn()carries 2 extra spaces of indentation relative to its sibling properties in everymockLoggerliteral (lines 342, 472, 812, 831, 1459). Likely a copy-paste artifact.🧹 Example fix (applies to all 5 occurrences)
const mockLogger = { debug: vi.fn(), info: vi.fn(), warn: vi.fn(), error: vi.fn(), - child: vi.fn(), + child: vi.fn(), };🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/CommitDagTraversalService.test.js` around lines 337 - 343, Align the inconsistent indentation of the child property in the mockLogger objects: locate each mockLogger literal (the variable named mockLogger used in CommitDagTraversalService.test.js) and remove the two extra spaces before "child: vi.fn()" so it lines up with its sibling properties (debug, info, warn, error) in all occurrences (the ones currently around lines 342, 472, 812, 831, 1459); ensure formatting matches the surrounding object property indentation.
20-21: Duplicate@typeannotations throughout the test file.
/**@type{Record<string, string[]>} */appears twice in a row beforeforwardEdges(lines 20–21) andreverseEdges(lines 30–31), and the same copy-paste pattern is repeated in many inline reader mocks throughout the file. The second annotation silently overwrites the first; only one is needed per declaration.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/CommitDagTraversalService.test.js` around lines 20 - 21, Remove the duplicated JSDoc `@type` annotations so each variable has only one declaration: keep a single /** `@type` {Record<string, string[]>} */ immediately above forwardEdges and similarly one above reverseEdges, and remove the repeated duplicate annotations in the inline reader mocks throughout the test file (look for occurrences around forwardEdges, reverseEdges and the mock reader definitions). Ensure each variable or mock has exactly one `@type` JSDoc comment to avoid silent overwrites.
1123-1123:_tieBreakServiceis dead code — never used.The variable is constructed but the test immediately creates
trackingService(line 1157) with a wrapping reader and uses that instead._tieBreakServicecan be removed.🧹 Proposed cleanup
- const _tieBreakService = new CommitDagTraversalService(/** `@type` {any} */ ({ indexReader: tieBreakReader })); - // Weight provider: START->A is 1, START->B is 2, A->END is 2, B->END is 1🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/CommitDagTraversalService.test.js` at line 1123, The local variable _tieBreakService is dead code — remove the unused construction "const _tieBreakService = new CommitDagTraversalService(/** `@type` {any} */ ({ indexReader: tieBreakReader }));" from the test; the test should continue to use the existing trackingService (constructed later) with the wrapping reader, so simply delete the _tieBreakService declaration and any related unused references to tieBreakReader in that immediate block to keep the test clean.src/domain/services/SyncProtocol.js (1)
415-422: Simplify the divergence guard: hoistedinstanceofcheck + drop unnecessary?..
err instanceof Erroris evaluated in both disjuncts, andError.prototype.messageis always astringafter that guard, making?.dead code.♻️ Proposed refactor
- if ((err instanceof Error && 'code' in err && /** `@type` {{ code: string }} */ (err).code === 'E_SYNC_DIVERGENCE') || (err instanceof Error && err.message?.includes('Divergence detected'))) { + if (err instanceof Error && ( + ('code' in err && /** `@type` {{ code: string }} */ (err).code === 'E_SYNC_DIVERGENCE') || + err.message.includes('Divergence detected') + )) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 415 - 422, In the catch block inside SyncProtocol.js simplify the divergence guard by first asserting the error type once (hoist the err instanceof Error check) and then check either err.code === 'E_SYNC_DIVERGENCE' or err.message.includes('Divergence detected'); remove the duplicated instanceof check and the unnecessary optional chaining on err.message since message is a string when err instanceof Error, and continue the loop on match otherwise rethrow the error.scripts/ts-policy-check.js (3)
36-41:walkJsswallows non-ENOENT read errors, risking silent missed violations.Any
readdirfailure — includingEACCES/EPERM— silently returns, which would allow violations to go undetected in a directory the script cannot read. This is inconsistent withloadFence(lines 75–80), which correctly distinguishesENOENTfrom other errors.♻️ Proposed fix — align with `loadFence` error handling
- try { - entries = await readdir(dir, { withFileTypes: true }); - } catch { - return; - } + try { + entries = await readdir(dir, { withFileTypes: true }); + } catch (err) { + if (err && /** `@type` {NodeJS.ErrnoException} */ (err).code === 'ENOENT') { + return; // directory absent — skip silently + } + throw err; // permission or I/O error — surface it + }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/ts-policy-check.js` around lines 36 - 41, The walkJs function currently swallows all errors from readdir(dir, { withFileTypes: true }) — change the catch to mirror loadFence's behavior: inspect the caught error (from the readdir call in walkJs) and if error.code === 'ENOENT' return (directory missing), otherwise rethrow the error (or let it bubble) so permission or other read errors (e.g., EACCES/EPERM) are not silently ignored; reference the existing loadFence error-handling pattern to implement this check around the entries = await readdir(...) call.
136-157:stat+readFileTOCTOU and unhandledreadFilerejection.Two issues:
- The
stat-then-read pattern is a TOCTOU:readFileat line 142 has no error handler, so any I/O or permission failure afterstatsucceeds propagates as an unhandled rejection from thePromise.allincheck().- The
catch {}onstat(line 138) also swallows non-ENOENT errors (permissions, I/O), soreadFilewill then throw on the same bad path.Collapse both calls into a single
readFilewith targetedENOENThandling, mirroring theloadFencepattern:♻️ Proposed fix
- try { - await stat(dtsPath); - } catch { - return violations; - } - - const content = await readFile(dtsPath, 'utf8'); + let content; + try { + content = await readFile(dtsPath, 'utf8'); + } catch (err) { + if (err && /** `@type` {NodeJS.ErrnoException} */ (err).code === 'ENOENT') { + return violations; + } + throw err; + }This also allows dropping the
statimport if it's no longer used elsewhere.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/ts-policy-check.js` around lines 136 - 157, Replace the TOCTOU stat+readFile pattern by calling readFile(dtsPath, 'utf8') directly inside a try/catch in the same function (where stat and readFile are currently used), catch errors and if err.code === 'ENOENT' return violations, otherwise rethrow or propagate the error so callers (e.g., check()) can handle it; drop the unused stat import and ensure the logic that scans lines (variables: content, lines, i, trimmed, violations) remains unchanged — follow the same ENOENT-targeted handling used by loadFence.
182-200: Ratchet advisory-only on decrease — fence can silently go stale, allowing drift back up.When
count < ceiling, the script logs a hint but does not fail. If a developer reduces wildcards and doesn't update the fence, the ceiling remains at the old (higher) value. A subsequent author can add wildcards back up to that old ceiling without ever triggering a ratchet breach — silently undoing the improvement.The header comment says the count "can ONLY decrease," but the implementation only enforces "count must not exceed the ceiling."
If strict ratchet semantics are desired, consider failing on
count < ceilingtoo, requiring the fence to be updated atomically with the wildcard reduction:♻️ Proposed enforcement — strict ratchet
if (count > ceiling) { // ... existing RATCHET BREACH block } else if (count < ceiling) { console.log(`IRONCLAD M9 — ratchet: ${count}/${ceiling} wildcards (reduced by ${ceiling - count})`); - console.log(` Update contracts/any-fence.json → "wildcardCount": ${count}\n`); + console.error(` ✗ Fence is stale. Update contracts/any-fence.json → "wildcardCount": ${count} to lock the improvement.\n`); + failed = true; } else {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/ts-policy-check.js` around lines 182 - 200, The current ratchet logic only fails when count > ceiling, letting a decrease go advisory and the fence remain stale; update the branch that handles count < ceiling (where it currently logs "ratchet: ... reduced") to enforce strict ratchet semantics by marking the run as failed (set failed = true) and emitting an explicit error explaining the fence must be updated (reference the variables ceiling, count and the ratcheted list), so that any reduction requires updating contracts/any-fence.json atomically rather than allowing silent drift back up.src/domain/services/BitmapIndexReader.js (2)
362-367:_loadShardBufferstill uses a bare@type {Error}cast oncause— inconsistent with theinstanceof Errorfix applied to_tryHandleShardError.The same pattern was fixed at line 402, but
_loadShardBuffer's catch clause at line 365 still passescauseas-is with an unsafe JSDoc cast. Non-Errorthrown values (e.g., thrown strings or plain objects) would silently be attributed asErrorinstances onShardLoadError.cause.♻️ Proposed fix for consistency
} catch (cause) { throw new ShardLoadError('Failed to load shard from storage', { shardPath: path, oid, - cause: /** `@type` {Error} */ (cause), + cause: cause instanceof Error ? cause : new Error(String(cause)), }); }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BitmapIndexReader.js` around lines 362 - 367, The catch in _loadShardBuffer currently passes the raw caught value as /** `@type` {Error} */ (cause) into the ShardLoadError, which is unsafe; instead mirror the _tryHandleShardError change: detect whether the caught value is an instance of Error and use it as the cause, otherwise create a new Error (or wrap the value into an Error with a descriptive message) and pass that as the ShardLoadError.cause so non-Error throws (strings/objects) are correctly represented; update the catch handling in _loadShardBuffer to perform this instanceof check and use the normalized Error when constructing the ShardLoadError.
93-93: Optional: consider moving the default-parameter cast inside the body for readability.The JSDoc cast on the default
{}to satisfy TypeScript is a known pattern, but it makes the signature line very long. An alternative that achieves the same runtime behavior with a cleaner signature:♻️ Optional refactor
- constructor({ storage, strict = false, logger = nullLogger, maxCachedShards = DEFAULT_MAX_CACHED_SHARDS, crypto } = /** `@type` {{ storage: IndexStoragePort, strict?: boolean, logger?: LoggerPort, maxCachedShards?: number, crypto?: CryptoPort }} */ ({})) { + /** `@param` {{ storage: IndexStoragePort, strict?: boolean, logger?: LoggerPort, maxCachedShards?: number, crypto?: CryptoPort }} [options] */ + constructor(options = {}) { + const { storage, strict = false, logger = nullLogger, maxCachedShards = DEFAULT_MAX_CACHED_SHARDS, crypto } = /** `@type` {{ storage: IndexStoragePort, strict?: boolean, logger?: LoggerPort, maxCachedShards?: number, crypto?: CryptoPort }} */ (options);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BitmapIndexReader.js` at line 93, The constructor signature for BitmapIndexReader is cluttered by a JSDoc cast on the default parameter; remove the cast from the parameter list to simplify the signature (keep constructor({ storage, strict = false, logger = nullLogger, maxCachedShards = DEFAULT_MAX_CACHED_SHARDS, crypto } = {}) ) and instead perform the TypeScript/JSDoc cast inside the constructor body—e.g. before destructuring or when assigning defaults, use a single line like const opts = /** `@type` {{ storage: IndexStoragePort, strict?: boolean, logger?: LoggerPort, maxCachedShards?: number, crypto?: CryptoPort }} */ (arguments[0] ?? {}); then destructure storage, strict, logger, maxCachedShards, crypto from opts and proceed; update any references from the original params to use these local variables.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/domain/trust/TrustRecordService.js`:
- Line 266: The template builds message using record.recordId and
record.recordType without runtime guards; change the message construction in
TrustRecordService (where const message = ...) to coerce/guard these fields at
runtime (e.g., use String(...) or a conditional fallback) so slicing can't throw
if record.recordId is undefined and ensure record.recordType is also safely
stringified; update the code paths around _persistRecord / appendRecord to use
the guarded values when creating the message.
---
Outside diff comments:
In `@src/domain/services/SyncProtocol.js`:
- Around line 101-105: Update the JSDoc `@returns` block to use the same property
name as the DecodedPatch typedef: replace the `writerId` entry with `writer` and
keep the same description ("The writer who created this patch"); ensure the
`@returns` list entries (`ops`, `context`, `writer`, `lamport`) match the
DecodedPatch typedef so DecodedPatch and the SyncProtocol return documentation
stay in sync.
---
Duplicate comments:
In `@CHANGELOG.md`:
- Line 57: The `[11.3.1]` changelog line misattributes "~107 wildcard casts" to
"M8 IRONCLAD Wave 3: cast elimination"; update that entry so the numeric claim
matches the PR summary — either change the phrase "~107 wildcard casts" to
indicate it was removed in Wave 1 (e.g., "Wave 1 removed ~107 wildcard casts;
Wave 3 covered CLI/presenters") or remove the numeric count from the Wave 3 line
and add a correct Wave 1 note; target the `[11.3.1]` entry and the "M8 IRONCLAD
Wave 3: cast elimination" text when applying the change.
In `@src/domain/services/CommitDagTraversalService.js`:
- Around line 42-53: The constructor originally used a permissive default that
made indexReader optional at the type level; ensure the runtime guard remains
and update the signature to require indexReader (remove any "= {}" default) so
TypeScript enforces the argument; inside the constructor keep the runtime check
that throws new Error('CommitDagTraversalService requires an indexReader') when
indexReader is falsy, then instantiate DagTraversal, DagPathFinding, and
DagTopology with the provided indexReader and logger (this._traversal = new
DagTraversal(...), this._pathFinding = new DagPathFinding(...), this._topology =
new DagTopology(...)) and retain the cross-wire
this._traversal._setPathFinder(this._pathFinding).
In `@src/domain/services/HttpSyncServer.js`:
- Around line 225-242: Remove the dead/duplicate JSDoc block above initAuth (the
first block containing crypto?: * and logger?: *) and consolidate documentation
into a single JSDoc immediately before the function; ensure the retained JSDoc
uses the z.infer<typeof authSchema> param type, adds the missing `@returns` {{
auth: SyncAuthService|null, authMode: string|null }} and `@private` tags, and does
not include wildcard types for crypto or logger; reference initAuth and
SyncAuthService to locate the function and constructor usage to validate the
updated docs.
- Around line 33-40: optionsSchema's httpPort and graph are using bare
z.custom() so their inferred types are unknown, which propagates to parsed and
then to this._httpPort and this._graph; update the schema to use the same JSDoc
zod type casts used for crypto/logger (e.g., add /** `@type`
{z.ZodType<ExpectedType>} */ before the z.custom(...) for httpPort and graph) so
z.infer<typeof optionsSchema> yields correct types and removes the need for
ad-hoc casts at parsed, server, and call sites like
this._graph.processSyncRequest and this._httpPort.createServer.
- Around line 46-47: The superRefine check incorrectly treats an empty array as
truthy, causing an unnecessary error when allowedWriters is []: update the
condition inside the schema's superRefine (the function referencing
data.allowedWriters and data.auth) to explicitly check for a non-empty array
(e.g., Array.isArray(data.allowedWriters) && data.allowedWriters.length > 0)
before requiring auth; keep the rest of the validation logic and ctx.addIssue
call the same so only truly non-empty allowedWriters trigger the auth-required
error.
In `@src/domain/services/SyncProtocol.js`:
- Around line 376-378: Add the documented codec param to the function JSDoc for
processSyncRequest by ensuring the `@param` block includes the options.codec
description and type (import('../../ports/CodecPort.js').default) so the JSDoc
matches the function signature; update or add the line "@param
{import('../../ports/CodecPort.js').default} [options.codec] - Codec for
deserialization" under the existing options JSDoc for processSyncRequest.
In `@src/domain/trust/TrustRecordService.js`:
- Around line 120-129: The null-guard for blobOid in the block that calls
this._persistence.readTreeOids(...) and decodes the blob for 'record.cbor'
correctly mirrors the _readTip pattern and requires no change; leave the blobOid
check and the subsequent this._codec.decode(await
this._persistence.readBlob(blobOid)) logic as-is to preserve the same safety and
behavior.
---
Nitpick comments:
In `@scripts/ts-policy-check.js`:
- Around line 36-41: The walkJs function currently swallows all errors from
readdir(dir, { withFileTypes: true }) — change the catch to mirror loadFence's
behavior: inspect the caught error (from the readdir call in walkJs) and if
error.code === 'ENOENT' return (directory missing), otherwise rethrow the error
(or let it bubble) so permission or other read errors (e.g., EACCES/EPERM) are
not silently ignored; reference the existing loadFence error-handling pattern to
implement this check around the entries = await readdir(...) call.
- Around line 136-157: Replace the TOCTOU stat+readFile pattern by calling
readFile(dtsPath, 'utf8') directly inside a try/catch in the same function
(where stat and readFile are currently used), catch errors and if err.code ===
'ENOENT' return violations, otherwise rethrow or propagate the error so callers
(e.g., check()) can handle it; drop the unused stat import and ensure the logic
that scans lines (variables: content, lines, i, trimmed, violations) remains
unchanged — follow the same ENOENT-targeted handling used by loadFence.
- Around line 182-200: The current ratchet logic only fails when count >
ceiling, letting a decrease go advisory and the fence remain stale; update the
branch that handles count < ceiling (where it currently logs "ratchet: ...
reduced") to enforce strict ratchet semantics by marking the run as failed (set
failed = true) and emitting an explicit error explaining the fence must be
updated (reference the variables ceiling, count and the ratcheted list), so that
any reduction requires updating contracts/any-fence.json atomically rather than
allowing silent drift back up.
In `@src/domain/services/BitmapIndexReader.js`:
- Around line 362-367: The catch in _loadShardBuffer currently passes the raw
caught value as /** `@type` {Error} */ (cause) into the ShardLoadError, which is
unsafe; instead mirror the _tryHandleShardError change: detect whether the
caught value is an instance of Error and use it as the cause, otherwise create a
new Error (or wrap the value into an Error with a descriptive message) and pass
that as the ShardLoadError.cause so non-Error throws (strings/objects) are
correctly represented; update the catch handling in _loadShardBuffer to perform
this instanceof check and use the normalized Error when constructing the
ShardLoadError.
- Line 93: The constructor signature for BitmapIndexReader is cluttered by a
JSDoc cast on the default parameter; remove the cast from the parameter list to
simplify the signature (keep constructor({ storage, strict = false, logger =
nullLogger, maxCachedShards = DEFAULT_MAX_CACHED_SHARDS, crypto } = {}) ) and
instead perform the TypeScript/JSDoc cast inside the constructor body—e.g.
before destructuring or when assigning defaults, use a single line like const
opts = /** `@type` {{ storage: IndexStoragePort, strict?: boolean, logger?:
LoggerPort, maxCachedShards?: number, crypto?: CryptoPort }} */ (arguments[0] ??
{}); then destructure storage, strict, logger, maxCachedShards, crypto from opts
and proceed; update any references from the original params to use these local
variables.
In `@src/domain/services/SyncProtocol.js`:
- Around line 415-422: In the catch block inside SyncProtocol.js simplify the
divergence guard by first asserting the error type once (hoist the err
instanceof Error check) and then check either err.code === 'E_SYNC_DIVERGENCE'
or err.message.includes('Divergence detected'); remove the duplicated instanceof
check and the unnecessary optional chaining on err.message since message is a
string when err instanceof Error, and continue the loop on match otherwise
rethrow the error.
In `@test/unit/domain/services/CommitDagTraversalService.test.js`:
- Around line 337-343: Align the inconsistent indentation of the child property
in the mockLogger objects: locate each mockLogger literal (the variable named
mockLogger used in CommitDagTraversalService.test.js) and remove the two extra
spaces before "child: vi.fn()" so it lines up with its sibling properties
(debug, info, warn, error) in all occurrences (the ones currently around lines
342, 472, 812, 831, 1459); ensure formatting matches the surrounding object
property indentation.
- Around line 20-21: Remove the duplicated JSDoc `@type` annotations so each
variable has only one declaration: keep a single /** `@type` {Record<string,
string[]>} */ immediately above forwardEdges and similarly one above
reverseEdges, and remove the repeated duplicate annotations in the inline reader
mocks throughout the test file (look for occurrences around forwardEdges,
reverseEdges and the mock reader definitions). Ensure each variable or mock has
exactly one `@type` JSDoc comment to avoid silent overwrites.
- Line 1123: The local variable _tieBreakService is dead code — remove the
unused construction "const _tieBreakService = new CommitDagTraversalService(/**
`@type` {any} */ ({ indexReader: tieBreakReader }));" from the test; the test
should continue to use the existing trackingService (constructed later) with the
wrapping reader, so simply delete the _tieBreakService declaration and any
related unused references to tieBreakReader in that immediate block to keep the
test clean.
|
|
||
| const parents = parentSha ? [parentSha] : []; | ||
| const message = `trust: ${record.recordType} ${record.recordId.slice(0, 12)}`; | ||
| const message = `trust: ${/** @type {string} */ (record.recordType)} ${/** @type {string} */ (record.recordId).slice(0, 12)}`; |
There was a problem hiding this comment.
record.recordId.slice(0,12) lacks a runtime null-guard despite the JSDoc cast.
/** @type {string} */ (record.recordId) is a compile-time annotation only — at runtime record.recordId is unknown. If record.recordId is undefined (e.g., if _persistRecord is ever called outside the schema-validated appendRecord path), .slice(0, 12) will throw TypeError. The same applies to record.recordType for the template substitution (which is safe via implicit coercion, but only recordId is the actual hazard here).
🛡️ Proposed fix
- const message = `trust: ${/** `@type` {string} */ (record.recordType)} ${/** `@type` {string} */ (record.recordId).slice(0, 12)}`;
+ const message = `trust: ${String(record.recordType ?? '')} ${String(record.recordId ?? '').slice(0, 12)}`;📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| const message = `trust: ${/** @type {string} */ (record.recordType)} ${/** @type {string} */ (record.recordId).slice(0, 12)}`; | |
| const message = `trust: ${String(record.recordType ?? '')} ${String(record.recordId ?? '').slice(0, 12)}`; |
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/trust/TrustRecordService.js` at line 266, The template builds
message using record.recordId and record.recordType without runtime guards;
change the message construction in TrustRecordService (where const message =
...) to coerce/guard these fields at runtime (e.g., use String(...) or a
conditional fallback) so slicing can't throw if record.recordId is undefined and
ensure record.recordType is also safely stringified; update the code paths
around _persistRecord / appendRecord to use the guarded values when creating the
message.
There was a problem hiding this comment.
Seems like the humans are having a chat. I'll hop back into my burrow for now. If you need me again, just tag @coderabbitai in a new comment, and I'll come hopping out!
… items
JSR: Add roaring import map to jsr.json so Deno's rewriter doesn't generate
overlapping TextChange entries from duplicate import('roaring') in JSDoc.
Sync jsr.json version to 11.3.2.
BATS: Fix order-sensitive assertion in test #93 "query returns nodes using
builder" — sort node IDs before comparison.
Backlog: Add B48-B53 to ROADMAP.md Standalone Lane (constructor defaults
audit, checkDeclarations tightening, type-surface alignment, remaining
IRONCLAD outside-diff items, JSR Deno panic tracking).
There was a problem hiding this comment.
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/domain/services/SyncProtocol.js (1)
101-106:⚠️ Potential issue | 🟡 Minor
@returnsbullet list documentswriterIdbutDecodedPatchdefineswriter.Line 104 says
`writerId`but theDecodedPatchtypedef (line 50) declares the field aswriter. ThewriterIdlabel correctly names the wrapper field inSyncResponse.patches({writerId, sha, patch}), not a field ofDecodedPatchitself. This will mislead readers ofloadPatchFromCommitsince its return type isDecodedPatch, not the wrapper.📝 Proposed fix
- * - `writerId`: The writer who created this patch + * - `writer`: The writer ID who created this patch🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 101 - 106, The JSDoc for loadPatchFromCommit incorrectly documents a DecodedPatch field as `writerId`; update the `@returns` bullet to match the DecodedPatch typedef by renaming `writerId` to `writer` (or otherwise document that DecodedPatch contains `writer`), and clarify that `writerId` is only used in the SyncResponse.patches wrapper (`{writerId, sha, patch}`) if needed; update references to DecodedPatch, loadPatchFromCommit, SyncResponse.patches, `writerId`, and `writer` to keep names consistent.
🧹 Nitpick comments (8)
test/unit/domain/services/CommitDagTraversalService.test.js (2)
20-21: Pervasive duplicate@typeannotations throughout the file.Every inline mock in the file (lines 20–21, 30–31, 306–307, 313–314, 365–366, 402–403, 437–438, 498–499, 510–511, 573–574, 585–586, 670–671, 682–683, 773–774, 873–874, 928–929, 1047–1048, 1112–1113, 1229–1230, 1241–1242, 1378–1379, 1388–1389) carries two consecutive identical
/**@type{Record<string, string[]>} */annotations on theconst edgesdeclarations. The duplicate annotation has no effect and adds noise.♻️ Example fix (representative)
- /** `@type` {Record<string, string[]>} */ /** `@type` {Record<string, string[]>} */ const forwardEdges = {Also applies to: 30-31
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/CommitDagTraversalService.test.js` around lines 20 - 21, The test file has many duplicated JSDoc type annotations on inline mocks — remove the redundant consecutive /** `@type` {Record<string, string[]>} */ so each const edges declaration only has a single JSDoc type comment; locate the repeated annotations for the variable named edges in CommitDagTraversalService.test.js (occurrences near each mock) and delete the duplicate line immediately adjacent to the original, leaving one annotation per declaration to reduce noise.
71-75: Consider narrowing.toThrow()to.toThrow(TypeError)for better assertion precision.When
CommitDagTraversalService()is called with no arguments, the native destructuring ofundefinedthrows aTypeErrorbefore the custom validation message runs — which is exactly why the message assertion was dropped. The@ts-expect-errorannotation is correctly placed and documented.However, bare
.toThrow()will pass for any thrown value (including a string or number), masking potential future regressions. Constraining to.toThrow(TypeError)verifies the actual failure mode.♻️ Proposed refinement
it('throws when called with no arguments', () => { // `@ts-expect-error` — intentional: verifying runtime behavior for invalid call expect(() => new CommitDagTraversalService()) - .toThrow(); + .toThrow(TypeError); });🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/CommitDagTraversalService.test.js` around lines 71 - 75, The test "throws when called with no arguments" in CommitDagTraversalService.test.js currently uses a broad expect(...).toThrow(); narrow this to expect(...).toThrow(TypeError) so the assertion specifically verifies the native destructuring failure; update the assertion in the test case that constructs new CommitDagTraversalService() (the no-args call) to use .toThrow(TypeError) instead of a bare .toThrow().scripts/hooks/pre-commit (2)
36-38: Gate 2 runs the full TS policy scan on every commit, regardless of what was staged.As noted in the comment, this is intentional to prevent partial-commit evasion. However, on large codebases this will add latency to every commit — including documentation-only or config-only changes. Consider whether the policy check should be gated on the presence of staged JS/TS files (similar to Gate 1), or whether the current always-on behavior is a deliberate tradeoff accepted by the team.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/hooks/pre-commit` around lines 36 - 38, The TS policy check currently always runs (the lines invoking node scripts/ts-policy-check.js), causing latency on commits that don’t touch JS/TS; change the pre-commit hook to first detect staged JS/TS files (use the same staged-file detection used by Gate 1) and only run the TS policy scan when such files are present—i.e., wrap the node scripts/ts-policy-check.js invocation in a conditional that checks staged filenames for .ts .tsx .js .jsx (and echo the existing messages only when the check runs).
28-28:npx eslintwith long file lists may spawn multiple ESLint processes.
xargssplits arguments across multiple command invocations when the argument list exceedsARG_MAX. Each invocation pays the fullnpxstartup cost. For large commits this can be noticeably slow.Consider invoking ESLint directly by path to avoid
npxoverhead on every chunk:⚡ Proposed optimization
- xargs -0 npx eslint --no-warn-ignored -- < "$TMPFILE" + xargs -0 ./node_modules/.bin/eslint --no-warn-ignored -- < "$TMPFILE"🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/hooks/pre-commit` at line 28, Replace the xargs invocation that runs "npx eslint" to call the ESLint binary directly to avoid repeated npx startup costs: locate the line containing the command "xargs -0 npx eslint --no-warn-ignored -- < \"$TMPFILE\"" in the pre-commit hook and change it to call the installed ESLint executable (for example from ./node_modules/.bin/eslint or $(npm bin)/eslint) so xargs invokes the binary directly with the same flags (--no-warn-ignored --) and input redirection, ensuring the binary path is executable.scripts/ts-policy-check.js (1)
136-142: Consider using the same read-then-catch-ENOENT pattern asloadFenceinstead ofstat+readFile.There's a small TOCTOU gap:
index.d.tscould vanish betweenstat(line 137) andreadFile(line 142), causing an unhandled throw. Using the same try/catch-ENOENT pattern you already have inloadFenceis both simpler and race-free.Proposed fix
async function checkDeclarations() { /** `@type` {string[]} */ const violations = []; const dtsPath = join(ROOT, 'index.d.ts'); - - try { - await stat(dtsPath); - } catch { - return violations; - } - - const content = await readFile(dtsPath, 'utf8'); + let content; + try { + content = await readFile(dtsPath, 'utf8'); + } catch (err) { + if (err && /** `@type` {NodeJS.ErrnoException} */ (err).code === 'ENOENT') { + return violations; + } + throw err; + } const lines = content.split('\n');This also lets you drop the
statimport from line 27 if it's unused elsewhere.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/ts-policy-check.js` around lines 136 - 142, Replace the current stat-then-read flow around dtsPath with the same readFile-in-try/catch-ENOENT pattern used by loadFence: attempt to readFile(dtsPath, 'utf8') inside a try block, catch errors and return violations only when err.code === 'ENOENT' (re-throw other errors), then continue processing the content; this removes the TOCTOU window introduced by stat and lets you drop the stat import if it’s no longer used.src/domain/services/SyncProtocol.js (1)
415-421: Simplify the redundantinstanceof Errorguards in the divergence check.Both OR branches independently test
err instanceof Error, and?.on.messageis unnecessary after that guard sinceError.prototype.messageis always astring.♻️ Proposed refactor
- if ((err instanceof Error && 'code' in err && /** `@type` {{ code: string }} */ (err).code === 'E_SYNC_DIVERGENCE') || (err instanceof Error && err.message?.includes('Divergence detected'))) { + if (err instanceof Error && (('code' in err && /** `@type` {{ code: string }} */ (err).code === 'E_SYNC_DIVERGENCE') || err.message.includes('Divergence detected'))) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 415 - 421, In the catch block inside SyncProtocol (file src/domain/services/SyncProtocol.js) simplify the divergence detection by checking instanceof Error once and then testing either err.code === 'E_SYNC_DIVERGENCE' or err.message.includes('Divergence detected') (remove the duplicated instanceof Error checks and the unnecessary optional chaining on message); update the conditional that currently uses two OR branches to a single instanceof Error guard with the two inner checks, then continue on divergence and rethrow otherwise.src/domain/services/BitmapIndexReader.js (2)
401-407: Optional: add an explicit JSDoc cast wherewrappedErris passed to_handleShardError.
_wrapParseErroris declared to returnError, sowrappedErrcarries the typeErrorat line 406._handleShardErrorexpectsShardCorruptionError | ShardValidationError. TheisHandleableguard makes this correct at runtime, but the type annotation gap could cause a spurious error if the project ever tightens to stricter JSDoc checking, and it obscures the intent for readers.♻️ Proposed fix: explicit cast after the `isHandleable` guard
- return isHandleable ? this._handleShardError(wrappedErr, context) : null; + return isHandleable + ? this._handleShardError(/** `@type` {ShardCorruptionError|ShardValidationError} */ (wrappedErr), context) + : null;🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BitmapIndexReader.js` around lines 401 - 407, In _tryHandleShardError, after calling _wrapParseError and after the isHandleable guard, add an explicit JSDoc type cast for wrappedErr before passing it to _handleShardError so the value is treated as ShardCorruptionError | ShardValidationError (e.g. use /** `@type` {ShardCorruptionError|ShardValidationError} */ wrappedErr or cast inline when calling _handleShardError). This keeps types consistent between _wrapParseError, the isHandleable check, and the _handleShardError call and prevents spurious errors under stricter JSDoc/type checking.
97-97: ConsolidateIndexStorageandIndexStoragePortto a single type definition.Both
IndexStorage(from WarpPersistence.js) andIndexStoragePortare composed from the same three focused ports in the same order:BlobPort,TreePort, andRefPort. The WarpPersistence.js typedef itself comments "Matches the dynamically-composed IndexStoragePort interface," confirming they are intentionally equivalent. The cast at line 97 is redundant — storing asIndexStoragePortdirectly would be clearer without sacrificing correctness.♻️ Suggestion: use IndexStoragePort directly
- this.storage = /** `@type` {IndexStorage} */ (storage); + this.storage = storage;Or, if internal code relies on
IndexStorageelsewhere, remove the typedef from WarpPersistence.js and useIndexStoragePortthroughout for consistency.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BitmapIndexReader.js` at line 97, The cast to IndexStorage in BitmapIndexReader (this.storage = /** `@type` {IndexStorage} */ (storage);) is redundant because IndexStorage and IndexStoragePort are equivalent; change the stored type to use IndexStoragePort directly by replacing the cast/annotation so this.storage is typed as IndexStoragePort (or remove the IndexStorage typedef in WarpPersistence.js and adopt IndexStoragePort project-wide if other code relies on the latter) to consolidate types and keep annotations consistent.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@ROADMAP.md`:
- Line 599: The acceptance criterion's grep command `grep "type {any}" src/` is
wrong and should be replaced: make it recursive (add `-r`/`--recursive`) and
expand the pattern to match both `@type {any}` and `@type {*} (JSDoc wildcard)
using a regex; update the acceptance line so the command searches the src
directory recursively and uses a pattern like `@type[[:space:]]*\{(any|\*)\}`
(or an equivalent extended regex) to ensure both forms are found and the check
truly returns zero matches.
In `@scripts/hooks/pre-commit`:
- Line 1: The pre-commit hook uses GNU-only grep -z which breaks on macOS and
silently skips ESLint; update the script to remove the dependency on grep -z by
converting the NUL-delimited stream to newline-delimited (e.g., pipe through tr
'\0' '\n' or otherwise emit newline-separated filenames) then use plain
grep/read on that stream so TMPFILE is populated reliably and the ESLint gate
runs; search for usages of "grep -z" and references to TMPFILE in the hook and
replace those pipeline segments (the logic around lines where grep -z is invoked
and TMPFILE is written) with a portable tr + grep/read sequence to restore
cross-platform behavior.
In `@scripts/ts-policy-check.js`:
- Around line 178-200: The code uses fence.wildcardCount without validating its
shape so a malformed any-fence.json can make ceiling undefined and silently
bypass the ratchet; update the logic after loadFence (where fence is read) to
explicitly validate fence.wildcardCount (e.g., check fence != null and typeof
fence.wildcardCount === 'number'), set ceiling only when valid, and if invalid
emit a clear error referencing contracts/any-fence.json and exit (so that
ceiling is never left as undefined and the comparisons around ceiling, count,
ratcheted, etc. behave correctly).
---
Outside diff comments:
In `@src/domain/services/SyncProtocol.js`:
- Around line 101-106: The JSDoc for loadPatchFromCommit incorrectly documents a
DecodedPatch field as `writerId`; update the `@returns` bullet to match the
DecodedPatch typedef by renaming `writerId` to `writer` (or otherwise document
that DecodedPatch contains `writer`), and clarify that `writerId` is only used
in the SyncResponse.patches wrapper (`{writerId, sha, patch}`) if needed; update
references to DecodedPatch, loadPatchFromCommit, SyncResponse.patches,
`writerId`, and `writer` to keep names consistent.
---
Duplicate comments:
In `@ROADMAP.md`:
- Around line 811-821: The fenced code block containing the critical-path
diagram (the triple-backtick block shown with the diagram starting "B29 ──→
B30...") is missing a language specifier and triggers markdownlint MD040; update
that fence to include a language (e.g., replace ``` with ```text) so the block
is explicitly marked as plain text, then commit the change in ROADMAP.md to
silence the MD040 warning.
In `@scripts/hooks/pre-commit`:
- Line 23: The grep command in the pre-commit hook can exit with status 1 when
no JS files are staged, so keep the appended "|| true" on the pipeline `git diff
--cached --name-only --diff-filter=ACM -z | grep -zE '\.(js|mjs|cjs)$' >
"$TMPFILE" || true` to avoid failing the hook under `set -e`; ensure the TMPFILE
variable is initialized earlier in the script and do not remove the "|| true"
safeguard in scripts/hooks/pre-commit.
In `@scripts/ts-policy-check.js`:
- Around line 145-154: The current check tests the raw line for "\bany\b" and
only skips lines that start with comment tokens, causing false positives for
inline trailing comments; update the loop that iterates over lines (variables:
lines, line, trimmed) to strip inline comments before testing by removing
trailing single-line comments and inline block comments (e.g. use line =
line.replace(/\/\/.*$/,'').replace(/\/\*[\s\S]*?\*\//g,'') or equivalent), then
recompute trimmed and run the /\bany\b/ test and violations.push(...) only on
the comment-stripped content.
In `@src/domain/services/HttpSyncServer.js`:
- Around line 46-53: The superRefine condition incorrectly treats an empty array
as truthy; update the check in the schema's superRefine callback (the block that
currently tests data.allowedWriters && !data.auth) to only trigger when
allowedWriters is a non-empty array (e.g. check
Array.isArray(data.allowedWriters) && data.allowedWriters.length > 0 or
data.allowedWriters?.length > 0) before asserting auth.keys, so allowedWriters:
[] will not cause the custom Zod issue.
- Around line 225-242: There are two JSDoc blocks above initAuth; remove the
duplicate by consolidating them into a single JSDoc immediately preceding
function initAuth that uses the correct param typing (`@param` {z.infer<typeof
authSchema>} [auth]), drops the wildcard entries (no crypto?: * or logger?: *),
and preserves the `@returns` and `@private` tags and the optional allowedWriters
param description; ensure the consolidated block documents return value shape ({
auth: SyncAuthService|null, authMode: string|null }) and mentions
SyncAuthService and authSchema so TypeScript picks up the right types.
- Around line 33-40: optionsSchema's httpPort and graph are using z.custom()
without a generic/JSDoc cast so their inferred type is unknown; update the two
entries to use the same JSDoc cast pattern as crypto/logger (or provide a
generic to z.custom) so z.infer<typeof optionsSchema> yields concrete types for
httpPort and graph. Specifically, cast the httpPort schema to the actual
HttpPort type used by this class (so this._httpPort is correctly typed for
createServer(...)) and cast the graph schema to the Graph/SyncHandler type (so
this._graph is correctly typed for processSyncRequest(parsed)). Ensure the cast
matches the concrete types used elsewhere in the file and preserve the existing
validation predicate.
In `@src/domain/services/SyncProtocol.js`:
- Around line 376-378: The JSDoc duplication has been resolved by adding both
`@param` {Object} [options] and `@param`
{import('../../ports/CodecPort.js').default} [options.codec] for the
processSyncRequest function; no code change needed—confirm the documentation
sits directly above the processSyncRequest declaration and remove any stale
duplicate review comments referencing the previous missing options.codec param.
In `@src/domain/trust/TrustRecordService.js`:
- Line 266: The message construction in TrustRecordService uses
record.recordId.slice(0, 12) without a runtime guard, which can throw if
record.recordId is undefined; change the message creation (the const message
line) to coerce and guard the id at runtime (e.g., use String(record.recordId ??
'') and then slice the result) so that record.recordId is never accessed
directly for slicing and no TypeError occurs at runtime.
---
Nitpick comments:
In `@scripts/hooks/pre-commit`:
- Around line 36-38: The TS policy check currently always runs (the lines
invoking node scripts/ts-policy-check.js), causing latency on commits that don’t
touch JS/TS; change the pre-commit hook to first detect staged JS/TS files (use
the same staged-file detection used by Gate 1) and only run the TS policy scan
when such files are present—i.e., wrap the node scripts/ts-policy-check.js
invocation in a conditional that checks staged filenames for .ts .tsx .js .jsx
(and echo the existing messages only when the check runs).
- Line 28: Replace the xargs invocation that runs "npx eslint" to call the
ESLint binary directly to avoid repeated npx startup costs: locate the line
containing the command "xargs -0 npx eslint --no-warn-ignored -- < \"$TMPFILE\""
in the pre-commit hook and change it to call the installed ESLint executable
(for example from ./node_modules/.bin/eslint or $(npm bin)/eslint) so xargs
invokes the binary directly with the same flags (--no-warn-ignored --) and input
redirection, ensuring the binary path is executable.
In `@scripts/ts-policy-check.js`:
- Around line 136-142: Replace the current stat-then-read flow around dtsPath
with the same readFile-in-try/catch-ENOENT pattern used by loadFence: attempt to
readFile(dtsPath, 'utf8') inside a try block, catch errors and return violations
only when err.code === 'ENOENT' (re-throw other errors), then continue
processing the content; this removes the TOCTOU window introduced by stat and
lets you drop the stat import if it’s no longer used.
In `@src/domain/services/BitmapIndexReader.js`:
- Around line 401-407: In _tryHandleShardError, after calling _wrapParseError
and after the isHandleable guard, add an explicit JSDoc type cast for wrappedErr
before passing it to _handleShardError so the value is treated as
ShardCorruptionError | ShardValidationError (e.g. use /** `@type`
{ShardCorruptionError|ShardValidationError} */ wrappedErr or cast inline when
calling _handleShardError). This keeps types consistent between _wrapParseError,
the isHandleable check, and the _handleShardError call and prevents spurious
errors under stricter JSDoc/type checking.
- Line 97: The cast to IndexStorage in BitmapIndexReader (this.storage = /**
`@type` {IndexStorage} */ (storage);) is redundant because IndexStorage and
IndexStoragePort are equivalent; change the stored type to use IndexStoragePort
directly by replacing the cast/annotation so this.storage is typed as
IndexStoragePort (or remove the IndexStorage typedef in WarpPersistence.js and
adopt IndexStoragePort project-wide if other code relies on the latter) to
consolidate types and keep annotations consistent.
In `@src/domain/services/SyncProtocol.js`:
- Around line 415-421: In the catch block inside SyncProtocol (file
src/domain/services/SyncProtocol.js) simplify the divergence detection by
checking instanceof Error once and then testing either err.code ===
'E_SYNC_DIVERGENCE' or err.message.includes('Divergence detected') (remove the
duplicated instanceof Error checks and the unnecessary optional chaining on
message); update the conditional that currently uses two OR branches to a single
instanceof Error guard with the two inner checks, then continue on divergence
and rethrow otherwise.
In `@test/unit/domain/services/CommitDagTraversalService.test.js`:
- Around line 20-21: The test file has many duplicated JSDoc type annotations on
inline mocks — remove the redundant consecutive /** `@type` {Record<string,
string[]>} */ so each const edges declaration only has a single JSDoc type
comment; locate the repeated annotations for the variable named edges in
CommitDagTraversalService.test.js (occurrences near each mock) and delete the
duplicate line immediately adjacent to the original, leaving one annotation per
declaration to reduce noise.
- Around line 71-75: The test "throws when called with no arguments" in
CommitDagTraversalService.test.js currently uses a broad expect(...).toThrow();
narrow this to expect(...).toThrow(TypeError) so the assertion specifically
verifies the native destructuring failure; update the assertion in the test case
that constructs new CommitDagTraversalService() (the no-args call) to use
.toThrow(TypeError) instead of a bare .toThrow().
| @@ -1,33 +1,38 @@ | |||
| #!/bin/sh | |||
There was a problem hiding this comment.
🧩 Analysis chain
🌐 Web query:
macOS BSD grep -z flag support NUL-delimited
💡 Result:
macOS’s default /usr/bin/grep (Apple/BSD grep) does not support -z / --null-data (NUL-delimited “lines”); attempting it typically yields grep: invalid option -- z. [1]
-z/--null-data is a GNU grep feature that makes grep treat input as records terminated by \0 instead of \n (i.e., NUL-delimited). [2]
Practical options on macOS:
- Install GNU grep (e.g., Homebrew) and use
ggrep -z/ggrep --null-data. [1] - If you specifically need NUL-delimited processing without GNU grep, use another tool (e.g.,
perl -0,awkwithRS="\0"), since BSD grep on macOS lacks-z. [1][2]
Related note: some BSD grep implementations (e.g., FreeBSD’s “bsdgrep” lineage) have added -z/--null-data over time, but that’s not what ships as macOS /usr/bin/grep. [3]
Sources: [1] [2] [3]
🌐 Web query:
ESLint --no-warn-ignored flag documentation
💡 Result:
--no-warn-ignored is an ESLint CLI option that suppresses warnings shown when you explicitly pass an ignored file (e.g., File ignored by default / File ignored because of a matching ignore pattern). It’s commonly used with --max-warnings 0 to avoid ESLint exiting with code 1 just because an ignored file was included in the file list. [1]
Docs (CLI reference): [1]
Related “ignored file warnings” context: [2]
Introduced in ESLint v8.51.0 (flat config mode): [3]
Sources:
[1] (eslint.org)
[2] (eslint.org)
[3] (eslint.org)
Citations:
- 1: https://eslint.org/docs/latest/use/command-line-interface?utm_source=openai
- 2: https://eslint.org/docs/latest/use/configure/ignore?utm_source=openai
- 3: https://eslint.org/blog/2023/10/eslint-v8.51.0-released/?utm_source=openai
🏁 Script executed:
# Check the actual file to see the current implementation
cat -n scripts/hooks/pre-commit | head -30Repository: git-stunts/git-warp
Length of output: 1280
grep -z on macOS silently breaks ESLint gate — use portable alternative or require GNU grep.
The shebang #!/bin/sh runs system grep, which on macOS is BSD grep. BSD grep does not support the -z (NUL-delimited) flag; it will fail with grep: invalid option -- z. The || true suppresses this error, leaving $TMPFILE empty and silently bypassing ESLint linting on macOS.
Fix: Replace the NUL-delimited pipeline with newline-delimited (changes at lines 23 and 28):
-git diff --cached --name-only --diff-filter=ACM -z | grep -zE '\.(js|mjs|cjs)$' > "$TMPFILE" || true
+git diff --cached --name-only --diff-filter=ACM | grep -E '\.(js|mjs|cjs)$' > "$TMPFILE" || true- xargs -0 npx eslint --no-warn-ignored -- < "$TMPFILE"
+ xargs npx eslint --no-warn-ignored -- < "$TMPFILE"This trades NUL-safety for portability; filenames with embedded newlines are extremely rare in practice.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@scripts/hooks/pre-commit` at line 1, The pre-commit hook uses GNU-only grep
-z which breaks on macOS and silently skips ESLint; update the script to remove
the dependency on grep -z by converting the NUL-delimited stream to
newline-delimited (e.g., pipe through tr '\0' '\n' or otherwise emit
newline-separated filenames) then use plain grep/read on that stream so TMPFILE
is populated reliably and the ESLint gate runs; search for usages of "grep -z"
and references to TMPFILE in the hook and replace those pipeline segments (the
logic around lines where grep -z is invoked and TMPFILE is written) with a
portable tr + grep/read sequence to restore cross-platform behavior.
| // ── Ratchet ─────────────────────────────────────────────────────────── | ||
| const count = ratcheted.length; | ||
| const ceiling = fence ? fence.wildcardCount : null; | ||
|
|
||
| if (ceiling !== null) { | ||
| if (count > ceiling) { | ||
| console.error(`IRONCLAD M9 — RATCHET BREACH: wildcard count INCREASED\n`); | ||
| console.error(` Fence ceiling : ${ceiling}`); | ||
| console.error(` Current count : ${count} (+${count - ceiling} new)\n`); | ||
| for (const v of ratcheted) { | ||
| console.error(` ${v}`); | ||
| } | ||
| console.error('\n New wildcards detected. Remove them or the push is blocked.\n'); | ||
| failed = true; | ||
| } else if (count < ceiling) { | ||
| console.log(`IRONCLAD M9 — ratchet: ${count}/${ceiling} wildcards (reduced by ${ceiling - count})`); | ||
| console.log(` Update contracts/any-fence.json → "wildcardCount": ${count}\n`); | ||
| } else { | ||
| console.log(`IRONCLAD M9 — ratchet: ${count}/${ceiling} wildcards (holding)`); | ||
| } | ||
| } else { | ||
| console.log(`IRONCLAD M9 — wildcard count: ${count} (no fence — create contracts/any-fence.json to lock)`); | ||
| } |
There was a problem hiding this comment.
fence.wildcardCount is used without shape/type validation — a malformed fence silently disables the ratchet.
If any-fence.json parses successfully but has a typo in the key (e.g. "wildcardcount") or a non-numeric value, ceiling becomes undefined. Since undefined !== null is true, the ratchet logic enters the comparison branch where both count > undefined and count < undefined evaluate to false, silently printing the "holding" message — even if new wildcards were introduced.
A lightweight guard after loadFence resolves would catch this:
Proposed fix
const count = ratcheted.length;
- const ceiling = fence ? fence.wildcardCount : null;
+ const ceiling = fence != null
+ ? (typeof fence.wildcardCount === 'number' ? fence.wildcardCount : (() => {
+ throw new Error('contracts/any-fence.json: "wildcardCount" must be a number');
+ })())
+ : null;Or, more readably, as a separate validation step:
let ceiling = null;
if (fence != null) {
if (typeof fence.wildcardCount !== 'number') {
console.error('contracts/any-fence.json: "wildcardCount" must be a number');
process.exit(1);
}
ceiling = fence.wildcardCount;
}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@scripts/ts-policy-check.js` around lines 178 - 200, The code uses
fence.wildcardCount without validating its shape so a malformed any-fence.json
can make ceiling undefined and silently bypass the ratchet; update the logic
after loadFence (where fence is read) to explicitly validate fence.wildcardCount
(e.g., check fence != null and typeof fence.wildcardCount === 'number'), set
ceiling only when valid, and if invalid emit a clear error referencing
contracts/any-fence.json and exit (so that ceiling is never left as undefined
and the comparisons around ceiling, count, ratcheted, etc. behave correctly).
…ailures
JSR: RoaringModule typedef used typeof import('roaring').RoaringBitmap32 in
two adjacent @Property lines, causing Deno's deno_ast rewriter to generate
duplicate overlapping TextChange entries and panic. Replaced with Function
in the typedef; single import('roaring') reference kept in @returns tag +
inline cast. Import map retained for resolution.
BATS #42: cli-path.bats expected exit code 2 for no-match but
EXIT_CODES.NO_MATCH is 1. Fixed assertion.
BATS #50: cli-query.bats expected --select id to return only 'id' key,
but buildQueryPayload always enriches nodes with edges. Relaxed assertion
to check 'id' presence rather than exclusivity.
BATS #93: Already fixed in prior commit (sorted node IDs).
All three BATS failures were pre-existing on main (run 22095607877).
Release Preflight
If you tag this commit as |
…, docs - type-surface manifest: add setSeekCache method, syncWith state return - HttpSyncServer: z.ZodType<> on httpPort/graph, allowedWriters length check, merge duplicate initAuth JSDoc, remove stale wildcards - WormholeService: typeof guards for fromSha/toSha/writerId - TrustRecordService: typeof guards for recordId/recordType in _persistRecord - StreamingBitmapIndexBuilder: frontier Map<string,string> (was number) - IndexRebuildService: remove double cast on frontier - BunHttpAdapter/globals.d.ts: stop() returns Promise<void> - ROADMAP: fenced-code-language, grep acceptance criterion
- parseBody return type narrowed to SyncRequest after isValidSyncRequest - _handleRequest accepts Object headers from port, narrows to Record<string,string> - Callback wrapper matches HttpServerPort.createServer signature
|
@coderabbitai review please 🥕 |
Rate Limit Exceeded
|
Release Preflight
If you tag this commit as |
B54 — typedCustom() Zod helper to reduce JSDoc cast boilerplate B55 — tighten HttpServerPort request/response types B56 — investigate observedFrontier/frontier semantic mismatch B57 — CI gate to auto-validate type-surface manifest vs index.d.ts
Release Preflight
If you tag this commit as |
Summary
05fcd4e): Eliminated ~107 wildcard casts acrosssrc/domain/andsrc/infrastructure/— role-specific persistence types, error narrowing helpers, typed aliases631ef96): Eliminated ~20 wildcard casts in CLI commands, setup-hooks, roaring, WarpGraph, text presenter0a4d13e): Eliminated all remaining wildcard casts in CLI types, presenters, domain services, utils, and test filesFinal state
TODO(ts-cleanup)tags in src/bin/test@type {*}/@type {any}in src/binNotable type improvements
MinHeapis now generic (@template T)QueryBuilderLikeduck-type defined inbin/cli/types.jsWarpGraphInstancefields narrowed from wildcards to structural typesPersistence.plumbingnarrowed from*tounknownJsonValuerecursive type usesunknown(JSDoc limitation)Test plan
npm run test:local— 193 files, 3,799 tests, 0 failuresnpx tsc --noEmit— 0 errorsnpx eslint .— 0 errorsnode scripts/ts-policy-check.js— passesSummary by CodeRabbit
Documentation
New Features
Bug Fixes
Chores
Tests