
Conversation

@jdreetz (Collaborator) commented Jan 14, 2026


Description

This PR completes Issue 783 by exposing two new properties on the Frame Consumer that calculate the earliest and latest frame timestamps in the buffer, and by presenting them as the buffer amount in the Player UI. The latency slider has been moved into the buffer UI, since the latency control affects how much buffer is available for playback.
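
For illustration, here is a minimal sketch of how a UI could turn the two new properties into a displayable buffer duration. The BufferTimeGetter interface and bufferDurationMs helper are hypothetical names, only peek() from the Getter API is assumed, and timestamps are treated as microseconds per the review notes further down.

// Hypothetical helper, not part of this PR.
interface BufferTimeGetter {
	peek(): number | undefined;
}

function bufferDurationMs(
	earliestBufferTime: BufferTimeGetter,
	latestBufferTime: BufferTimeGetter,
): number | undefined {
	const earliest = earliestBufferTime.peek();
	const latest = latestBufferTime.peek();
	if (earliest === undefined || latest === undefined) return undefined;
	// Timestamps assumed to be microseconds; convert to milliseconds for display.
	return (latest - earliest) / 1000;
}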

How to Test

Since it's hard to force buffering without real latency, I'm leaving in a utility that simulates latency in the Frame Consumer whenever a frame is read. To enable it, set window.simulateLatency to true and increase the latency slider. This should cause a larger buffer to build up and the UI to display its size.
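
For reference, enabling the utility from the browser console looks like this (the flag name comes from this PR; the surrounding comments are illustrative):

// In a dev build, from the devtools console:
window.simulateLatency = true;  // each frame read now awaits a simulated delay
// Increase the latency slider in the buffer UI and the buffer amount should grow.
window.simulateLatency = false; // disable again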

- Convert earliestBufferTime and latestBufferTime to Signal<number | undefined>
- Expose as readonly Getter properties following existing pattern
- Update #updateBufferRange() to use signal setters
- Remove debug console.log from #updateBufferRange()

Milestone 1 of buffer-overlay-feature
- Add bufferEarliest and bufferLatest signals to Video.Source
- Add bufferEarliest and bufferLatest signals to Audio.Source
- Sync buffer signals from Frame.Consumer via effects
- Reset signals to undefined on cleanup

Milestones 2 & 3 of buffer-overlay-feature
- Add BufferOverlay component showing video/audio buffer durations
- Add BufferButton to toggle overlay visibility
- Add buffer state and duration signals to WatchUIContextProvider
- Implement mutual exclusivity with stats panel
- Move latency slider into buffer overlay
- Add buffer overlay CSS styles

Milestones 4-6 of buffer-overlay-feature
- Change VIDEO/AUDIO labels to Video/Audio
- Add title tooltips explaining 'No data' state
- Remove text-transform: uppercase from CSS
@jdreetz mentioned this pull request Jan 14, 2026
coderabbitai bot (Contributor) commented Jan 14, 2026

Walkthrough

Adds a buffer overlay UI: new BufferButton and BufferOverlay components, CSS, and integration into playback controls. WatchUIContextProvider adds isBufferOverlayVisible/setIsBufferOverlayVisible, videoBufferDuration/audioBufferDuration, and enforces mutual exclusivity between the stats panel and the buffer overlay. Core frame/watch API changes: Frame.group → groupSequenceNumber, decode → getFrameData. Consumers expose earliestBufferTime/latestBufferTime (and a buffered-range utility), and audio/video Source classes now expose bufferEarliest/bufferLatest signals synced from their consumers.
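
A rough sketch of the consumer-to-Source sync described above; the signal API shape and the syncBufferSignals helper are assumptions for illustration, not the PR's exact code.

interface Getter<T> {
	peek(): T;
	subscribe(fn: (value: T) => void): () => void;
}
interface Setter<T> {
	set(value: T): void;
}

function syncBufferSignals(
	consumer: {
		earliestBufferTime: Getter<number | undefined>;
		latestBufferTime: Getter<number | undefined>;
	},
	source: {
		bufferEarliest: Setter<number | undefined>;
		bufferLatest: Setter<number | undefined>;
	},
): () => void {
	const unsubEarliest = consumer.earliestBufferTime.subscribe((t) => source.bufferEarliest.set(t));
	const unsubLatest = consumer.latestBufferTime.subscribe((t) => source.bufferLatest.set(t));

	// Cleanup: stop syncing and reset the Source signals to undefined,
	// matching the "reset on cleanup" behavior listed in the PR description.
	return () => {
		unsubEarliest();
		unsubLatest();
		source.bufferEarliest.set(undefined);
		source.bufferLatest.set(undefined);
	};
}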

🚥 Pre-merge checks | ✅ 2 passed | ❌ 1 warning

❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 33.33%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)
  • Title check (✅ Passed): The title accurately summarizes the main feature: exposing buffer information in the Player UI, which aligns with the primary objective of the changeset.
  • Description check (✅ Passed): The description is directly related to the changeset, explaining the purpose (completing Issue 783), what is exposed (earliest/latest buffer frames), the UI changes, and how to test the feature.


🧹 Recent nitpick comments
js/hang/src/frame.ts (2)

154-158: Consider gating latency simulation behind a build-time check.

The testing utility remains in the production code path. While the conditional check prevents async overhead when disabled, this could be further improved by gating it behind import.meta.env?.DEV:

♻️ Suggested improvement
-			if (window.simulateLatency === true) {
-				await applySimulatedLatency(2500);
+			if (import.meta.env?.DEV && window.simulateLatency === true) {
+				await applySimulatedLatency(2500);
 			}

This ensures the code is completely eliminated in production builds via dead-code elimination.


301-313: The fallback ?? group.latest is unreachable code.

Since line 305 already skips groups where group.frames.length === 0, the group.frames.at(0)?.timestamp expression will always return a defined value. The fallback to group.latest is dead code.

♻️ Optional cleanup
 	for (const group of groups) {
 		if (
 			!group.latest ||
 			// skip fully consumed groups
 			group.frames.length === 0
 		)
 			continue;

 		// Use the earliest unconsumed frame in the group.
-		const frame = group.frames.at(0)?.timestamp ?? group.latest;
+		const frame = group.frames[0].timestamp;
 		if (earliestTime === undefined || frame < earliestTime) {
 			earliestTime = frame;
 		}

📜 Recent review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c11202b and 7dcfdf6.

📒 Files selected for processing (1)
  • js/hang/src/frame.ts
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}

📄 CodeRabbit inference engine (CLAUDE.md)

Use bun for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)

Files:

  • js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (3)
js/hang/src/catalog/container.ts (1)
  • Container (12-12)
js/lite/src/time.ts (2)
  • Micro (14-14)
  • Micro (16-25)
js/signals/src/index.ts (1)
  • Signal (28-156)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Check
🔇 Additional comments (7)
js/hang/src/frame.ts (7)

12-17: LGTM!

The rename from group to groupSequenceNumber improves clarity by explicitly indicating this is a sequence number rather than a group object reference.


41-48: LGTM!

The rename from decode to getFrameData eliminates naming collision with Consumer.decode() and better describes the function's purpose.


94-96: LGTM!

The public earliestBufferTime and latestBufferTime signals enable reactive UI binding. Using Signal<number | undefined> correctly handles the initial state before any frames are buffered.


160-174: LGTM!

The frame reading and construction logic correctly handles keyframe detection (first frame in group) and properly uses the renamed getFrameData function.


323-331: Testing infrastructure intentionally retained.

Per the PR description, this latency simulation utility is intentionally kept for testing purposes. The global Window.simulateLatency declaration and applySimulatedLatency helper support the feature described in the PR objectives.

Consider adding a brief code comment explaining the testing purpose for future maintainers.


126-135: LGTM!

The initialization of #activeSequenceNumber on first group and the early rejection of old groups is correctly implemented.


191-199: LGTM!

The finally block correctly updates the buffer range and advances to the next group sequence number when the current group completes.




coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 154-171: The call to simulateLatency(2500) is a testing artifact
embedded in the core frame consumption loop (inside the routine that reads
frames via group.consumer.readFrame and populates group.frames) and should be
removed or made injectable; replace the unconditional call with an optional
injected delay hook (e.g., a constructor/option like simulateLatencyFn or
latencySimulator) or move it to test-only subclass/wrapper so production code
does not await a hardcoded 2500ms or depend on window.simulateLatency; ensure
any injected hook accepts configurable delay and is called only when provided,
leaving the loop (readFrame -> getFrameData -> push to group.frames) free of
testing overhead.
🧹 Nitpick comments (2)
js/hang/src/frame.ts (2)

215-229: Potential logic inconsistency: conditional removal but unconditional sequence advancement.

When latency is exceeded:

  • Group removal (lines 218-225) only happens if isSlowGroup is true
  • But #activeSequenceNumber is updated unconditionally (line 229)

If isSlowGroup is false (meaning firstGroup.consumer.sequence > this.#activeSequenceNumber), the group is kept in #groups but #activeSequenceNumber still advances to it. This seems correct, but the current structure makes it harder to follow.

Consider restructuring for clarity:

Suggested refactor
-		if (isSlowGroup) {
+		if (isSlowGroup && this.#groups.length > 1) {
 			this.#groups.shift();
 
 			console.warn(`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`);
 
 			firstGroup.consumer.close();
 			firstGroup.frames.length = 0;
 		}
 
-		// Advance to the next known group.
-		// NOTE: Can't be undefined, because we checked above.
-		this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+		// Advance to the next known group if we removed the slow one.
+		if (this.#groups.length > 0) {
+			this.#activeSequenceNumber = this.#groups[0].consumer.sequence;
+		}

291-310: Consider semantics for fully-consumed groups.

When a group's frames array is empty (all frames consumed), line 299 falls back to group.latest. This means fully-consumed groups still contribute to earliestTime, which may not accurately represent the actual "buffered" (unconsumed) data range.

If the intent is to show only unconsumed buffer, consider:

 		// Use the earliest unconsumed frame in the group.
-		const frame = group.frames.at(0)?.timestamp ?? group.latest;
+		const firstFrame = group.frames.at(0);
+		if (!firstFrame) continue; // Skip fully consumed groups
+		const frame = firstFrame.timestamp;

Alternatively, if the current behavior is intentional (showing the full received range), a clarifying comment would help.

📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 596b5b7 and 9c17ea7.

📒 Files selected for processing (2)
  • js/hang-ui/src/Components/watch/BufferOverlay.tsx
  • js/hang/src/frame.ts
🚧 Files skipped from review as they are similar to previous changes (1)
  • js/hang-ui/src/Components/watch/BufferOverlay.tsx
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}

📄 CodeRabbit inference engine (CLAUDE.md)

Use bun for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)

Files:

  • js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (3)
js/hang/src/catalog/container.ts (1)
  • Container (12-12)
js/lite/src/time.ts (2)
  • Micro (14-14)
  • Micro (16-25)
js/signals/src/index.ts (2)
  • Signal (28-156)
  • Getter (13-22)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Check
🔇 Additional comments (5)
js/hang/src/frame.ts (5)

12-17: LGTM!

The rename from group to groupSequenceNumber improves clarity by explicitly indicating the property holds a sequence number rather than a group object.


41-45: LGTM!

Renaming decode to getFrameData avoids confusion with the Consumer.decode() method and better describes the function's purpose of extracting frame data and timestamp from a buffer.


91-96: LGTM!

Good pattern for exposing read-only signal access. The private Signal allows internal mutation while the public Getter provides consumers with peek(), changed(), and subscribe() without allowing writes.
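
A generic TypeScript sketch of that pattern (simplified stand-ins, not the signals package itself; the real Getter also exposes changed() and subscribe()):

// Minimal stand-in for the signals package, for illustration only.
class Signal<T> {
	#value: T;
	constructor(value: T) {
		this.#value = value;
	}
	peek(): T {
		return this.#value;
	}
	set(value: T): void {
		this.#value = value;
	}
}

// Read-only view: structurally, any Signal<T> satisfies Getter<T>.
type Getter<T> = Pick<Signal<T>, "peek">;

class ExampleConsumer {
	// The private field is writable inside the class...
	#earliestBufferTime = new Signal<number | undefined>(undefined);
	// ...while the public alias only advertises the read side.
	readonly earliestBufferTime: Getter<number | undefined> = this.#earliestBufferTime;

	advance(timestamp: number): void {
		this.#earliestBufferTime.set(timestamp);
	}
}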


236-240: LGTM!

Clean delegation to getBufferedRangeForGroup with proper signal updates.


264-269: LGTM!

Good addition of #updateBufferRange() call in the notify callback to ensure buffer state is updated when new frames arrive.


coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 303-307: The code incorrectly uses group.latest as a fallback when
group.frames.at(0) is undefined, which biases earliestTime; update the logic in
the block that computes earliestTime so that fully-consumed groups are skipped
(i.e., if group.frames.at(0) is undefined, do not consider that group) or, if
you need to include consumed groups, introduce and use a dedicated property like
group.earliest (and ensure it is maintained when frames are consumed) instead of
falling back to group.latest; adjust the calculation that sets earliestTime
accordingly (refer to group.frames, group.latest, group.earliest, and
earliestTime).
♻️ Duplicate comments (2)
js/hang/src/frame.ts (2)

159-159: Testing artifact in production code path.

The simulateLatency(2500) call remains in the production code path. This was flagged in a previous review.


317-329: Testing utility should be gated or removed.

This testing infrastructure was flagged in a previous review. Consider removing from production code or adding build-time gating.

🧹 Nitpick comments (2)
js/hang/src/frame.ts (2)

94-99: Consider using Time.Micro type for buffer time signals.

The buffer time signals use number | undefined while other timestamps in this file use Time.Micro. For type consistency and self-documentation, consider:

-	#earliestBufferTime = new Signal<number | undefined>(undefined);
-	readonly earliestBufferTime: Getter<number | undefined> = this.#earliestBufferTime;
+	#earliestBufferTime = new Signal<Time.Micro | undefined>(undefined);
+	readonly earliestBufferTime: Getter<Time.Micro | undefined> = this.#earliestBufferTime;

-	#latestBufferTime = new Signal<number | undefined>(undefined);
-	readonly latestBufferTime: Getter<number | undefined> = this.#latestBufferTime;
+	#latestBufferTime = new Signal<Time.Micro | undefined>(undefined);
+	readonly latestBufferTime: Getter<Time.Micro | undefined> = this.#latestBufferTime;

This would require updating getBufferedRangeForGroup return type as well.


180-186: Buffer range updates only on notify, not on every frame.

#updateBufferRange() is called in the finally block (line 191) and inside #notify (line 271), but not after pushing each frame to the buffer. This means the UI won't see incremental buffer growth during active frame consumption until a notify event occurs. If real-time buffer visualization is desired, consider calling #updateBufferRange() after group.frames.push(frameInfo).

📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9c17ea7 and 4939707.

📒 Files selected for processing (1)
  • js/hang/src/frame.ts
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}

📄 CodeRabbit inference engine (CLAUDE.md)

Use bun for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)

Files:

  • js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (2)
js/lite/src/time.ts (2)
  • Micro (14-14)
  • Micro (16-25)
js/signals/src/index.ts (2)
  • Signal (28-156)
  • Getter (13-22)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Check
🔇 Additional comments (6)
js/hang/src/frame.ts (6)

3-3: LGTM!

The Getter type import is correctly added and used for exposing read-only signal accessors for buffer times.


129-138: LGTM!

The group sequence tracking logic is correct and the naming is consistent with the updated private field.


241-245: LGTM!

Clean encapsulation of buffer range signal updates.


269-274: LGTM!

Good integration of buffer range updates within the notify callback, ensuring the UI receives updates when the decoder awaits new frames.


12-17: Improved naming clarity.

Renaming group to groupSequenceNumber better conveys the property's purpose. All consumers of the Frame interface have been updated to use the new property name.


41-48: Function renamed for clarity.

getFrameData better describes the function's purpose than the previous decode name. All callers have been updated—internal usage at line 164 uses the new name, and external consumers in the hang module only call Frame.encode.


Comment on lines +217 to +234
		const firstGroup = this.#groups[0];
		const isSlowGroup =
			this.#activeSequenceNumber !== undefined && firstGroup.consumer.sequence <= this.#activeSequenceNumber;

		if (isSlowGroup) {
			this.#groups.shift();

-			console.warn(`skipping slow group: ${first.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`);
+			console.warn(
+				`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`,
+			);

-			first.consumer.close();
-			first.frames.length = 0;
+			firstGroup.consumer.close();
+			firstGroup.frames.length = 0;
		}

		// Advance to the next known group.
		// NOTE: Can't be undefined, because we checked above.
-		this.#active = this.#groups[0]?.consumer.sequence;
+		this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
coderabbitai bot (Contributor) commented:

⚠️ Potential issue | 🟠 Major

Potential logic error: #activeSequenceNumber updated unconditionally.

Line 234 updates #activeSequenceNumber regardless of whether isSlowGroup is true or false. If the first group is not a slow group (i.e., it's ahead of the active sequence), the code will:

  1. Skip the shift() and close() (correct)
  2. Still reassign #activeSequenceNumber to this.#groups[0]?.consumer.sequence (potentially incorrect)

This could cause the active sequence to jump forward unexpectedly, skipping frames from the current group.

🐛 Suggested fix
 		if (isSlowGroup) {
 			this.#groups.shift();

 			console.warn(
 				`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`,
 			);

 			firstGroup.consumer.close();
 			firstGroup.frames.length = 0;
-		}

-		// Advance to the next known group.
-		// NOTE: Can't be undefined, because we checked above.
-		this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+			// Advance to the next known group.
+			// NOTE: Can't be undefined, because we checked above.
+			this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+		}

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
js/hang/src/frame.ts (1)

193-199: Sequence gap can cause deadlock: advance to the next group's sequence instead of incrementing by 1.

When groups are discarded via Track.writeGroup() filtering (old sequences), subsequent groups can have non-contiguous sequences. Incrementing activeSequenceNumber by 1 leaves decode() waiting on a condition that will never be true (when the next group's sequence > activeSequenceNumber + 1), because decode() gates on groups[0].sequence <= activeSequenceNumber. Advance to the next available group's sequence to match the logic already used in #checkLatency() (line 234).

Suggested fix
 if (group.consumer.sequence === this.#activeSequenceNumber) {
   // Advance to the next group.
-  this.#activeSequenceNumber += 1;
+  this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence ?? (this.#activeSequenceNumber + 1);

   this.#notify?.();
   this.#notify = undefined;
 }
🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 241-245: The buffer-range signals aren’t refreshed when frames are
consumed; call the existing private method `#updateBufferRange`() from the hot
path(s) that remove frames so the UI reflects decreased buffered time:
specifically, invoke this.#updateBufferRange() at the end of decode() after any
shift/pop of frames and also after any code path that drops or completes a group
(e.g., the group drop/complete handler you have) so `#earliestBufferTime` and
`#latestBufferTime` are updated immediately when frames are consumed or groups are
removed.
- Around line 296-320: getBufferedRangeForGroup currently exposes the
non-exported internal type Group in its signature which breaks
d.ts/project-reference builds; change the parameter to a public structural type
(either export a small interface e.g. ExportedGroup with the minimal properties
used: frames array with timestamp and optional latest, or use an inline
structural type like ReadonlyArray<{ frames: { timestamp: number }[]; latest?:
number }>) and update all call sites to satisfy that shape; do not expose
Consumer internals or the original Group type in the exported API.
♻️ Duplicate comments (2)
js/hang/src/frame.ts (2)

217-235: #activeSequenceNumber reassignment semantics need a quick re-check (previous review concern still applies).
#activeSequenceNumber = this.#groups[0]?.consumer.sequence; happens even when isSlowGroup is false, which can jump the active pointer forward unexpectedly. This was already flagged in earlier reviews; worth confirming intended behavior.


155-160: Remove or hard-gate simulateLatency from the production read loop.
await simulateLatency(2500); is still an unconditional await in #runGroup, and the global Window.simulateLatency hook still ships from this module. This was already flagged in earlier reviews.

Proposed direction (inject + dev-only gating)
 export interface ConsumerProps {
   // Target latency in milliseconds (default: 0)
   latency?: Signal<Time.Milli> | Time.Milli;
   container?: Catalog.Container;
+  // Dev/test-only hook; keep core loop free of window/global coupling.
+  simulateLatencyMs?: number;
+  simulateLatencyFn?: (amountMs: number) => Promise<void>;
 }

 async #runGroup(group: Group) {
   try {
     let isKeyframe = true;

-    await simulateLatency(2500);
+    if (props?.simulateLatencyFn && props?.simulateLatencyMs) {
+      await props.simulateLatencyFn(props.simulateLatencyMs);
+    }
     for (;;) {
       const nextFrame = await group.consumer.readFrame();

Also applies to: 322-334

🧹 Nitpick comments (2)
js/hang/src/frame.ts (2)

3-3: Confirm Signal is a valid Getter and runtime encapsulation is acceptable.
Typing readonly earliestBufferTime: Getter<...> = this.#earliestBufferTime; relies on Signal implementing Getter (and exposes the actual Signal instance at runtime, so it’s still “settable” via as any). If the intent is hard encapsulation, return a wrapper object exposing only peek/subscribe/changed.

Also applies to: 95-100
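
If hard encapsulation were the goal, a wrapper along these lines would do it (sketch only; readonlyView is a hypothetical helper and the signal shapes are simplified):

interface ReadonlySignal<T> {
	peek(): T;
	subscribe(fn: (value: T) => void): () => void;
	// (the real Getter also exposes changed(); omitted here for brevity)
}

interface WritableSignal<T> extends ReadonlySignal<T> {
	set(value: T): void;
}

// Forward only the read methods, so callers cannot reach the underlying
// writable signal from the returned object, even with a cast.
function readonlyView<T>(signal: WritableSignal<T>): ReadonlySignal<T> {
	return {
		peek: () => signal.peek(),
		subscribe: (fn) => signal.subscribe(fn),
	};
}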


300-316: Tiny simplification: the ?? group.latest fallback is now redundant.
You already skip group.frames.length === 0, so group.frames.at(0) shouldn’t be undefined. This can be tightened to avoid implying “latest-as-earliest” is valid.

📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 4939707 and 75a06ee.

📒 Files selected for processing (1)
  • js/hang/src/frame.ts
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}

📄 CodeRabbit inference engine (CLAUDE.md)

Use bun for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)

Files:

  • js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (2)
js/hang/src/catalog/container.ts (1)
  • Container (12-12)
js/signals/src/index.ts (1)
  • Getter (13-22)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Check
🔇 Additional comments (1)
js/hang/src/frame.ts (1)

41-48: The API rename is consistent throughout the codebase. getFrameData and groupSequenceNumber are properly exported and all internal usages (lines 164-170) correctly reference the new names. No references to the old API names remain.


Comment on lines 241 to 245
	#updateBufferRange() {
		const { earliestTime, latestTime } = getBufferedRangeForGroup(this.#groups);
		this.#earliestBufferTime.set(earliestTime);
		this.#latestBufferTime.set(latestTime);
	}
coderabbitai bot (Contributor) commented Jan 14, 2026

⚠️ Potential issue | 🟠 Major

Buffer range signals won’t update as frames are consumed (UI will go stale).
#updateBufferRange() runs on group completion and when a waiting decode is notified, but not on the hot path where decode() actually shifts frames out of the buffer (Line 254). If the UI displays “buffer amount”, it should decrease as playback consumes frames.

Minimal fix: update range when consuming / dropping groups
 async decode(): Promise<Frame | undefined> {
   for (;;) {
     if (
       this.#groups.length > 0 &&
       this.#activeSequenceNumber !== undefined &&
       this.#groups[0].consumer.sequence <= this.#activeSequenceNumber
     ) {
       const frame = this.#groups[0].frames.shift();
-      if (frame) return frame;
+      if (frame) {
+        this.#updateBufferRange();
+        return frame;
+      }

       const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;

       if (isGroupDone) {
         this.#groups.shift();
+        this.#updateBufferRange();
         continue;
       }
     }

Also applies to: 247-256, 269-274

🤖 Prompt for AI Agents
In `@js/hang/src/frame.ts` around lines 241 - 245, The buffer-range signals aren’t
refreshed when frames are consumed; call the existing private method
`#updateBufferRange`() from the hot path(s) that remove frames so the UI reflects
decreased buffered time: specifically, invoke this.#updateBufferRange() at the
end of decode() after any shift/pop of frames and also after any code path that
drops or completes a group (e.g., the group drop/complete handler you have) so
`#earliestBufferTime` and `#latestBufferTime` are updated immediately when frames
are consumed or groups are removed.

jdreetz (Collaborator, Author) commented:

> but not on the hot path where decode() actually shifts frames out of the buffer (Line 254). If the UI displays “buffer amount”, it should decrease as playback consumes frames.

I implemented this change locally, but it seemed to break any buffer tracking.

coderabbitai bot (Contributor) replied:

🧩 Analysis chain

@jdreetz Thank you for trying the change! To help diagnose why buffer tracking broke, I need more details about what you observed:

What specific symptoms did you see?

  • Did the buffer values become undefined in the UI?
  • Did they show incorrect/stale numbers?
  • Did the UI stop updating entirely?
  • Were there any console errors?

Hypothesis: Calling #updateBufferRange() immediately after frames.shift() might hit an edge case where group.frames.length === 0 temporarily, causing getBufferedRangeForGroup() to skip that group and return undefined values if no other groups have frames ready.
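
To make the hypothesis concrete, here is a toy reproduction with simplified types (the real getBufferedRangeForGroup lives in js/hang/src/frame.ts):

interface ToyGroup {
	latest?: number;
	frames: { timestamp: number }[];
}

function toyBufferedRange(groups: ToyGroup[]): { earliestTime?: number; latestTime?: number } {
	let earliestTime: number | undefined;
	let latestTime: number | undefined;
	for (const group of groups) {
		if (!group.latest || group.frames.length === 0) continue; // skips empty groups
		const first = group.frames[0].timestamp;
		if (earliestTime === undefined || first < earliestTime) earliestTime = first;
		if (latestTime === undefined || group.latest > latestTime) latestTime = group.latest;
	}
	return { earliestTime, latestTime };
}

// The active group just had its last frame shifted out, and no other group
// has frames yet: both values come back undefined and the UI goes blank.
console.log(toyBufferedRange([{ latest: 2_000_000, frames: [] }]));
// => { earliestTime: undefined, latestTime: undefined }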

Potential approaches to investigate:

  1. Call it more selectively - Only update after removing groups, not after every frame shift:

    const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;
    if (isGroupDone) {
      this.#groups.shift();
      this.#updateBufferRange(); // Only here
      continue;
    }
  2. Modify getBufferedRangeForGroup() logic - Consider groups with 0 frames but known latest timestamp (since they might have just been consumed).

  3. Debounce/batch updates - If called too frequently, it might cause timing issues with the reactive system (a minimal batching sketch follows below).
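
For option 3, one way to batch is to coalesce updates into a single microtask (illustrative only; scheduleBufferRangeUpdate is a hypothetical name, not code from this PR):

let updateScheduled = false;

// Many calls within one tick collapse into a single recomputation, so the
// reactive signals see at most one update per microtask.
function scheduleBufferRangeUpdate(recompute: () => void): void {
	if (updateScheduled) return;
	updateScheduled = true;
	queueMicrotask(() => {
		updateScheduled = false;
		recompute();
	});
}

// Usage sketch: call this wherever frames are pushed or shifted, e.g.
// scheduleBufferRangeUpdate(() => this.#updateBufferRange());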

Could you share what specific behavior changed when you added the call? That will help me suggest a more targeted fix.


🏁 Script executed:

#!/bin/bash
# Let's check how the buffer signals are consumed in the UI to understand the impact
rg -n --type=ts --type=tsx -C3 'bufferEarliest|bufferLatest|BufferDuration' js/hang-ui/

Length of output: 27


🏁 Script executed:

#!/bin/bash
# Search for buffer signal usage in the UI with proper file extensions
rg -n -g '*.ts' -g '*.tsx' -C3 'bufferEarliest|bufferLatest|BufferDuration' js/hang-ui/

echo "---"
echo "Also checking the context provider implementation:"
rg -n -g '*.tsx' -g '*.ts' -A5 'videoBufferDuration|audioBufferDuration' js/hang-ui/src/Components/watch/

Length of output: 11289


🏁 Script executed:

#!/bin/bash
# Check how the video and audio sources reference the consumer's buffer signals
echo "=== Video source buffer signal sync ==="
rg -n -g '*.ts' -A3 -B3 'earliestBufferTime|latestBufferTime' js/hang/src/watch/video/source.ts

echo ""
echo "=== Audio source buffer signal sync ==="
rg -n -g '*.ts' -A3 -B3 'earliestBufferTime|latestBufferTime' js/hang/src/watch/audio/source.ts

echo ""
echo "=== Consumer signal declarations ==="
rg -n -g '*.ts' -B2 -A2 'earliestBufferTime.*Signal|latestBufferTime.*Signal' js/hang/src/frame.ts

Length of output: 1034


@jdreetz Ah, I see the issue now! The problem is timing: when you call #updateBufferRange() immediately after shifting a frame, the active group might be temporarily empty (frames.length === 0), and if other groups aren't ready yet, getBufferedRangeForGroup() skips all groups and returns undefined for both values, breaking the UI.

The root cause:

// In getBufferedRangeForGroup():
if (!group.latest || group.frames.length === 0) continue; // Skips empty groups!

When the last frame is shifted from a group, it becomes empty but hasn't been removed yet, causing the function to skip it and potentially return all undefined.

Working solution: Only update when removing groups, not when shifting individual frames:

 async decode(): Promise<Frame | undefined> {
   for (;;) {
     if (
       this.#groups.length > 0 &&
       this.#activeSequenceNumber !== undefined &&
       this.#groups[0].consumer.sequence <= this.#activeSequenceNumber
     ) {
       const frame = this.#groups[0].frames.shift();
-      if (frame) return frame;
+      if (frame) {
+        // Don't update here - the group might be temporarily empty
+        return frame;
+      }

       const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;

       if (isGroupDone) {
         this.#groups.shift();
+        this.#updateBufferRange(); // Update after removing the group
         continue;
       }
     }

This avoids the race condition while still updating the buffer as groups are consumed. It's less granular (updates per-group instead of per-frame) but maintains correctness.

Alternative fix (more complex): Modify getBufferedRangeForGroup() to tolerate the active group being empty and look at group.latest even when frames.length === 0 for groups currently being consumed. Would you like me to explore that approach?
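
A sketch of that alternative, where a group currently being consumed still reports its last-known timestamp even when its frames array is empty (hypothetical shape; the real Group type has no such "active" flag, so one would need to be threaded through):

interface GroupLike {
	latest?: number;
	frames: { timestamp: number }[];
	// Hypothetical flag: true for the group decode() is currently draining.
	active?: boolean;
}

function bufferedRangeTolerant(groups: GroupLike[]): { earliestTime?: number; latestTime?: number } {
	let earliestTime: number | undefined;
	let latestTime: number | undefined;
	for (const group of groups) {
		if (!group.latest) continue;
		// Skip fully consumed groups unless they are the active one, in which
		// case fall back to the last-known timestamp.
		const first = group.frames.at(0)?.timestamp ?? (group.active ? group.latest : undefined);
		if (first === undefined) continue;
		if (earliestTime === undefined || first < earliestTime) earliestTime = first;
		if (latestTime === undefined || group.latest > latestTime) latestTime = group.latest;
	}
	return { earliestTime, latestTime };
}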

…ot exported), update simulated latency function. dont have separate props for buffer start/end readonly
coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
js/hang/src/frame.ts (1)

247-282: Buffer signals not updated on hot path frame consumption.

#updateBufferRange() is only called in the notify callback (line 272), which fires when the consumer waits for new frames. When frames are consumed from an already-buffered queue (lines 254, 261), the signals are not updated, causing the UI to display stale buffer information.

🐛 Suggested fix: update buffer range when consuming frames
 	async decode(): Promise<Frame | undefined> {
 		for (;;) {
 			if (
 				this.#groups.length > 0 &&
 				this.#activeSequenceNumber !== undefined &&
 				this.#groups[0].consumer.sequence <= this.#activeSequenceNumber
 			) {
 				const frame = this.#groups[0].frames.shift();

-				if (frame) return frame;
+				if (frame) {
+					this.#updateBufferRange();
+					return frame;
+				}

 				const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;

 				if (isGroupDone) {
 					this.#groups.shift();
+					this.#updateBufferRange();
 					continue;
 				}
 			}
🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 329-334: TypeScript errors come from missing type declarations for
import.meta.env used in applySimulatedLatency (it checks import.meta.env.DEV);
add a new vite-env.d.ts in the package's src that declares an interface
ImportMetaEnv with at least DEV?: boolean (and any other env keys you expect)
and then declare interface ImportMeta { readonly env: ImportMetaEnv }, so the
compiler knows the shape of import.meta.env when bundling with Vite.
♻️ Duplicate comments (1)
js/hang/src/frame.ts (1)

217-234: Logic error: #activeSequenceNumber updated unconditionally outside isSlowGroup block.

Line 234 executes regardless of the isSlowGroup condition. When isSlowGroup is false (meaning the first group is ahead of the active sequence), the code:

  1. Does NOT shift or close the first group (correct)
  2. Still reassigns #activeSequenceNumber to this.#groups[0]?.consumer.sequence (incorrect)

This causes the active sequence to jump forward unexpectedly, potentially skipping frames from the current in-progress group.

🐛 Suggested fix: move the assignment inside the conditional
 		if (isSlowGroup) {
 			this.#groups.shift();

 			console.warn(
 				`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`,
 			);

 			firstGroup.consumer.close();
 			firstGroup.frames.length = 0;
-		}

-		// Advance to the next known group.
-		// NOTE: Can't be undefined, because we checked above.
-		this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+			// Advance to the next known group.
+			// NOTE: Can't be undefined, because we checked above.
+			this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+		}
🧹 Nitpick comments (1)
js/hang/src/frame.ts (1)

297-321: LGTM with minor observation.

The function correctly filters out fully-consumed groups (lines 302-307). The fallback ?? group.latest on line 310 is now unreachable since group.frames.length === 0 groups are skipped, but it's harmless.

♻️ Optional: remove redundant fallback
-		const frame = group.frames.at(0)?.timestamp ?? group.latest;
+		const frame = group.frames.at(0)!.timestamp;
📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c4b1cc6 and c11202b.

📒 Files selected for processing (1)
  • js/hang/src/frame.ts
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}

📄 CodeRabbit inference engine (CLAUDE.md)

Use bun for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)

Files:

  • js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (3)
js/hang/src/catalog/container.ts (1)
  • Container (12-12)
js/lite/src/time.ts (2)
  • Micro (14-14)
  • Micro (16-25)
js/signals/src/index.ts (1)
  • Signal (28-156)
🪛 GitHub Actions: Check
js/hang/src/frame.ts

[error] 331-331: Property 'env' does not exist on type 'ImportMeta'.

🔇 Additional comments (5)
js/hang/src/frame.ts (5)

12-17: LGTM!

The rename from group to groupSequenceNumber improves clarity by making the field's purpose explicit.


41-48: LGTM!

Renaming decode to getFrameData disambiguates this utility from the Consumer.decode() method.


94-96: LGTM!

Public signals for buffer time exposure enable the UI to reactively display buffer information as intended.


154-158: LGTM!

The latency simulation is now properly gated behind the window.simulateLatency check before the async call, addressing the previous concern about unnecessary overhead.


323-327: Acceptable for development tooling.

The global Window extension for simulateLatency is reasonable given it's gated behind the DEV environment check in applySimulatedLatency.


Comment on lines 329 to 334
async function applySimulatedLatency(amount: number): Promise<void> {
	// not sure if we should allow this in production too?
	if (!import.meta?.env?.DEV) return Promise.resolve();

	return new Promise((resolve) => setTimeout(resolve, amount));
}
coderabbitai bot (Contributor) commented:

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

# Check for Vite and TypeScript configuration
fd -type f \( -name "tsconfig.json" -o -name "vite.config.*" \) js/hang --exec cat {}

Repository: moq-dev/moq

Length of output: 225


🏁 Script executed:

# Check the current frame.ts file to see the actual code and context
wc -l js/hang/src/frame.ts

Repository: moq-dev/moq

Length of output: 79


🏁 Script executed:

# Read the relevant section of frame.ts around lines 329-334
sed -n '320,340p' js/hang/src/frame.ts

Repository: moq-dev/moq

Length of output: 414


🏁 Script executed:

# Find tsconfig and vite config files
find js/hang -name "tsconfig.json" -o -name "vite.config.*"

Repository: moq-dev/moq

Length of output: 76


🏁 Script executed:

# Read tsconfig.json
cat js/hang/tsconfig.json

Repository: moq-dev/moq

Length of output: 177


🏁 Script executed:

# Read entire frame.ts to check for any ImportMeta declarations
cat js/hang/src/frame.ts

Repository: moq-dev/moq

Length of output: 8814


🏁 Script executed:

# Check package.json to see Vite dependency
cat js/hang/package.json

Repository: moq-dev/moq

Length of output: 1428


🏁 Script executed:

# Read the parent tsconfig
cat js/tsconfig.json

Repository: moq-dev/moq

Length of output: 1387


🏁 Script executed:

# Check if there's a Vite config in the root or js directory
find . -maxdepth 2 -name "vite.config.*" 2>/dev/null

Repository: moq-dev/moq

Length of output: 37


🏁 Script executed:

# Check the root package.json for Vite
cat package.json | head -50

Repository: moq-dev/moq

Length of output: 443


🏁 Script executed:

# Search for ImportMeta type declarations anywhere in the codebase
rg "interface ImportMeta" js/

Repository: moq-dev/moq

Length of output: 397


🏁 Script executed:

# Check what a vite-env.d.ts looks like
cat js/hang-demo/src/vite-env.d.ts

Repository: moq-dev/moq

Length of output: 211


🏁 Script executed:

# Search for other uses of import.meta in js/hang
rg "import\.meta" js/hang/src

Repository: moq-dev/moq

Length of output: 130


🏁 Script executed:

# Check if js/hang is actually built/tested
grep -A5 "js/hang" package.json

Repository: moq-dev/moq

Length of output: 211


🏁 Script executed:

# Check the exports and see if this is a library
cat js/hang/package.json | grep -A20 '"exports"'

Repository: moq-dev/moq

Length of output: 727


🏁 Script executed:

# Check if there are any Vite-related build tools in any package script
rg "vite" js/hang-demo/package.json

Repository: moq-dev/moq

Length of output: 191


🏁 Script executed:

# See if js/hang is used/imported by other packages
rg "@moq/hang" js/ --type json

Repository: moq-dev/moq

Length of output: 372


Add type declarations for import.meta.env by creating a vite-env.d.ts file.

The TypeScript error occurs because js/hang is missing type declarations for import.meta. Although js/hang itself doesn't use Vite, it's consumed by Vite applications (js/hang-demo and js/hang-ui), and the code needs proper type definitions when bundled.

Create js/hang/src/vite-env.d.ts following the same pattern used in other packages:

Create vite-env.d.ts
+/// <reference types="vite/client" />
+
+interface ImportMetaEnv {
+	readonly DEV?: boolean;
+}
+
+interface ImportMeta {
+	readonly env?: ImportMetaEnv;
+}
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-async function applySimulatedLatency(amount: number): Promise<void> {
-	// not sure if we should allow this in production too?
-	if (!import.meta?.env?.DEV) return Promise.resolve();
-	return new Promise((resolve) => setTimeout(resolve, amount));
-}
+/// <reference types="vite/client" />
+
+interface ImportMetaEnv {
+	readonly DEV?: boolean;
+}
+
+interface ImportMeta {
+	readonly env?: ImportMetaEnv;
+}
🧰 Tools
🪛 GitHub Actions: Check

[error] 331-331: Property 'env' does not exist on type 'ImportMeta'.

🤖 Prompt for AI Agents
In `@js/hang/src/frame.ts` around lines 329 - 334, TypeScript errors come from
missing type declarations for import.meta.env used in applySimulatedLatency (it
checks import.meta.env.DEV); add a new vite-env.d.ts in the package's src that
declares an interface ImportMetaEnv with at least DEV?: boolean (and any other
env keys you expect) and then declare interface ImportMeta { readonly env:
ImportMetaEnv }, so the compiler knows the shape of import.meta.env when
bundling with Vite.
