feat: Expose buffer information in Player UI #845
base: main
Conversation
- Convert `earliestBufferTime` and `latestBufferTime` to `Signal<number | undefined>`
- Expose as readonly `Getter` properties following the existing pattern
- Update `#updateBufferRange()` to use signal setters
- Remove debug `console.log` from `#updateBufferRange()`

Milestone 1 of buffer-overlay-feature

- Add `bufferEarliest` and `bufferLatest` signals to Video.Source
- Add `bufferEarliest` and `bufferLatest` signals to Audio.Source
- Sync buffer signals from Frame.Consumer via effects
- Reset signals to `undefined` on cleanup

Milestones 2 & 3 of buffer-overlay-feature

- Add BufferOverlay component showing video/audio buffer durations
- Add BufferButton to toggle overlay visibility
- Add buffer state and duration signals to WatchUIContextProvider
- Implement mutual exclusivity with stats panel
- Move latency slider into buffer overlay
- Add buffer overlay CSS styles

Milestones 4-6 of buffer-overlay-feature

- Change VIDEO/AUDIO labels to Video/Audio
- Add title tooltips explaining 'No data' state
- Remove `text-transform: uppercase` from CSS
Walkthrough: Adds a buffer overlay UI: new BufferButton and BufferOverlay components, CSS, and integration into playback controls. WatchUIContextProvider adds …

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 154-171: The call to simulateLatency(2500) is a testing artifact
embedded in the core frame consumption loop (inside the routine that reads
frames via group.consumer.readFrame and populates group.frames) and should be
removed or made injectable; replace the unconditional call with an optional
injected delay hook (e.g., a constructor/option like simulateLatencyFn or
latencySimulator) or move it to test-only subclass/wrapper so production code
does not await a hardcoded 2500ms or depend on window.simulateLatency; ensure
any injected hook accepts configurable delay and is called only when provided,
leaving the loop (readFrame -> getFrameData -> push to group.frames) free of
testing overhead.
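The injected-hook refactor described above can be sketched as follows. `FrameLoop`, `ConsumerOptions`, `delayHook`, and `delayMs` are illustrative names, not the project's actual API; the real consumer reads frames via `group.consumer.readFrame()`:

```typescript
// Minimal sketch: the latency simulation becomes an optional injected hook,
// so the production read loop never awaits a hardcoded delay.
interface ConsumerOptions {
  // Test-only delay hook; production callers simply omit both fields.
  delayHook?: (ms: number) => Promise<void>;
  delayMs?: number;
}

class FrameLoop {
  #opts: ConsumerOptions;

  constructor(opts: ConsumerOptions = {}) {
    this.#opts = opts;
  }

  // Stand-in for #runGroup: readFrame yields buffers until it returns undefined.
  async runGroup(readFrame: () => Promise<Uint8Array | undefined>): Promise<Uint8Array[]> {
    const frames: Uint8Array[] = [];
    // The delay is awaited only when a hook was injected.
    if (this.#opts.delayHook && this.#opts.delayMs) {
      await this.#opts.delayHook(this.#opts.delayMs);
    }
    for (;;) {
      const frame = await readFrame();
      if (!frame) break;
      frames.push(frame);
    }
    return frames;
  }
}
```

A test can inject `delayHook: (ms) => new Promise((r) => setTimeout(r, ms))` to reproduce the old `simulateLatency(2500)` behavior without shipping it in production code.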
🧹 Nitpick comments (2)
js/hang/src/frame.ts (2)
215-229: Potential logic inconsistency: conditional removal but unconditional sequence advancement.

When latency is exceeded:

- Group removal (lines 218-225) only happens if `isSlowGroup` is true
- But `#activeSequenceNumber` is updated unconditionally (line 229)

If `isSlowGroup` is false (meaning `firstGroup.consumer.sequence > this.#activeSequenceNumber`), the group is kept in `#groups` but `#activeSequenceNumber` still advances to it. This seems correct, but the current structure makes it harder to follow. Consider restructuring for clarity:

Suggested refactor

```diff
-if (isSlowGroup) {
+if (isSlowGroup && this.#groups.length > 1) {
 	this.#groups.shift();
 	console.warn(`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`);
 	firstGroup.consumer.close();
 	firstGroup.frames.length = 0;
 }
-// Advance to the next known group.
-// NOTE: Can't be undefined, because we checked above.
-this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+// Advance to the next known group if we removed the slow one.
+if (this.#groups.length > 0) {
+	this.#activeSequenceNumber = this.#groups[0].consumer.sequence;
+}
```
291-310: Consider semantics for fully-consumed groups.

When a group's `frames` array is empty (all frames consumed), line 299 falls back to `group.latest`. This means fully-consumed groups still contribute to `earliestTime`, which may not accurately represent the actual "buffered" (unconsumed) data range. If the intent is to show only unconsumed buffer, consider:

```diff
 // Use the earliest unconsumed frame in the group.
-const frame = group.frames.at(0)?.timestamp ?? group.latest;
+const firstFrame = group.frames.at(0);
+if (!firstFrame) continue; // Skip fully consumed groups
+const frame = firstFrame.timestamp;
```

Alternatively, if the current behavior is intentional (showing the full received range), a clarifying comment would help.
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- js/hang-ui/src/Components/watch/BufferOverlay.tsx
- js/hang/src/frame.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- js/hang-ui/src/Components/watch/BufferOverlay.tsx
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}
📄 CodeRabbit inference engine (CLAUDE.md)
Use `bun` for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)
Files:
js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (3)
js/hang/src/catalog/container.ts (1)
- Container (12-12)

js/lite/src/time.ts (2)
- Micro (14-14)
- Micro (16-25)

js/signals/src/index.ts (2)
- Signal (28-156)
- Getter (13-22)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Check
🔇 Additional comments (5)
js/hang/src/frame.ts (5)
12-17: LGTM! The rename from `group` to `groupSequenceNumber` improves clarity by explicitly indicating the property holds a sequence number rather than a group object.

41-45: LGTM! Renaming `decode` to `getFrameData` avoids confusion with the `Consumer.decode()` method and better describes the function's purpose of extracting frame data and timestamp from a buffer.

91-96: LGTM! Good pattern for exposing read-only signal access. The private `Signal` allows internal mutation while the public `Getter` provides consumers with `peek()`, `changed()`, and `subscribe()` without allowing writes.
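The pattern praised here can be illustrated with a self-contained sketch. The `Signal`/`Getter` below are simplified stand-ins for the project's `js/signals` types (which also expose `changed()`), not the real implementation:

```typescript
type Listener<T> = (value: T) => void;

// Read-only view: consumers can peek and subscribe, but not set.
interface Getter<T> {
  peek(): T;
  subscribe(fn: Listener<T>): () => void;
}

class Signal<T> implements Getter<T> {
  #value: T;
  #listeners = new Set<Listener<T>>();

  constructor(initial: T) {
    this.#value = initial;
  }

  peek(): T {
    return this.#value;
  }

  set(value: T): void {
    this.#value = value;
    for (const fn of this.#listeners) fn(value);
  }

  subscribe(fn: Listener<T>): () => void {
    this.#listeners.add(fn);
    return () => this.#listeners.delete(fn);
  }
}

class Consumer {
  // Private signal: only this class can call set().
  #earliestBufferTime = new Signal<number | undefined>(undefined);
  // Public alias typed as Getter: readable and subscribable, not writable.
  readonly earliestBufferTime: Getter<number | undefined> = this.#earliestBufferTime;

  updateBufferRange(earliest: number): void {
    this.#earliestBufferTime.set(earliest);
  }
}
```

Note the restriction is type-level only: at runtime the exposed object is still the underlying `Signal`, so a caller could cast and write to it.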
236-240: LGTM! Clean delegation to `getBufferedRangeForGroup` with proper signal updates.

264-269: LGTM! Good addition of the `#updateBufferRange()` call in the notify callback to ensure buffer state is updated when new frames arrive.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 303-307: The code incorrectly uses group.latest as a fallback when
group.frames.at(0) is undefined, which biases earliestTime; update the logic in
the block that computes earliestTime so that fully-consumed groups are skipped
(i.e., if group.frames.at(0) is undefined, do not consider that group) or, if
you need to include consumed groups, introduce and use a dedicated property like
group.earliest (and ensure it is maintained when frames are consumed) instead of
falling back to group.latest; adjust the calculation that sets earliestTime
accordingly (refer to group.frames, group.latest, group.earliest, and
earliestTime).
♻️ Duplicate comments (2)
js/hang/src/frame.ts (2)
159-159: Testing artifact in production code path. The `simulateLatency(2500)` call remains in the production code path. This was flagged in a previous review.

317-329: Testing utility should be gated or removed. This testing infrastructure was flagged in a previous review. Consider removing from production code or adding build-time gating.
🧹 Nitpick comments (2)
js/hang/src/frame.ts (2)
94-99: Consider using the `Time.Micro` type for buffer time signals.

The buffer time signals use `number | undefined` while other timestamps in this file use `Time.Micro`. For type consistency and self-documentation, consider:

```diff
-#earliestBufferTime = new Signal<number | undefined>(undefined);
-readonly earliestBufferTime: Getter<number | undefined> = this.#earliestBufferTime;
+#earliestBufferTime = new Signal<Time.Micro | undefined>(undefined);
+readonly earliestBufferTime: Getter<Time.Micro | undefined> = this.#earliestBufferTime;

-#latestBufferTime = new Signal<number | undefined>(undefined);
-readonly latestBufferTime: Getter<number | undefined> = this.#latestBufferTime;
+#latestBufferTime = new Signal<Time.Micro | undefined>(undefined);
+readonly latestBufferTime: Getter<Time.Micro | undefined> = this.#latestBufferTime;
```

This would require updating the `getBufferedRangeForGroup` return type as well.
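If `Time.Micro` is a branded number (a common TypeScript pattern), the suggestion above buys unit safety at the type level. A generic sketch of branding, with illustrative names (the real definition lives in `js/lite/src/time.ts` and may differ):

```typescript
// A branded number: structurally a number, but distinct at the type level,
// so plain numbers can't be assigned without an explicit conversion.
type Micro = number & { readonly __brand: "Micro" };

function micro(n: number): Micro {
  return n as Micro;
}

// A signal typed as Signal<Micro | undefined> then documents its unit:
const bufferStart: Micro = micro(1_500_000); // 1.5 s expressed in microseconds
```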
180-186: Buffer range updates only on notify, not on every frame.

`#updateBufferRange()` is called in the `finally` block (line 191) and inside `#notify` (line 271), but not after pushing each frame to the buffer. This means the UI won't see incremental buffer growth during active frame consumption until a notify event occurs. If real-time buffer visualization is desired, consider calling `#updateBufferRange()` after `group.frames.push(frameInfo)`.
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
js/hang/src/frame.ts
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}
📄 CodeRabbit inference engine (CLAUDE.md)
Use `bun` for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)
Files:
js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (2)
js/lite/src/time.ts (2)
- Micro (14-14)
- Micro (16-25)

js/signals/src/index.ts (2)
- Signal (28-156)
- Getter (13-22)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Check
🔇 Additional comments (6)
js/hang/src/frame.ts (6)
3-3: LGTM! The `Getter` type import is correctly added and used for exposing read-only signal accessors for buffer times.

129-138: LGTM! The group sequence tracking logic is correct and the naming is consistent with the updated private field.

241-245: LGTM! Clean encapsulation of buffer range signal updates.

269-274: LGTM! Good integration of buffer range updates within the notify callback, ensuring the UI receives updates when the decoder awaits new frames.

12-17: Improved naming clarity. Renaming `group` to `groupSequenceNumber` better conveys the property's purpose. All consumers of the `Frame` interface have been updated to use the new property name.

41-48: Function renamed for clarity. `getFrameData` better describes the function's purpose than the previous `decode` name. All callers have been updated: internal usage at line 164 uses the new name, and external consumers in the hang module only call `Frame.encode`.
```diff
 const firstGroup = this.#groups[0];
 const isSlowGroup =
 	this.#activeSequenceNumber !== undefined && firstGroup.consumer.sequence <= this.#activeSequenceNumber;

 if (isSlowGroup) {
 	this.#groups.shift();

-	console.warn(`skipping slow group: ${first.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`);
+	console.warn(
+		`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`,
+	);

-	first.consumer.close();
-	first.frames.length = 0;
+	firstGroup.consumer.close();
+	firstGroup.frames.length = 0;
 }

 // Advance to the next known group.
 // NOTE: Can't be undefined, because we checked above.
-this.#active = this.#groups[0]?.consumer.sequence;
+this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
```
Potential logic error: #activeSequenceNumber updated unconditionally.
Line 234 updates #activeSequenceNumber regardless of whether isSlowGroup is true or false. If the first group is not a slow group (i.e., it's ahead of the active sequence), the code will:
- Skip the `shift()` and `close()` (correct)
- Still reassign `#activeSequenceNumber` to `this.#groups[0]?.consumer.sequence` (potentially incorrect)
This could cause the active sequence to jump forward unexpectedly, skipping frames from the current group.
🐛 Suggested fix
```diff
 if (isSlowGroup) {
 	this.#groups.shift();
 	console.warn(
 		`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`,
 	);
 	firstGroup.consumer.close();
 	firstGroup.frames.length = 0;
-}
-// Advance to the next known group.
-// NOTE: Can't be undefined, because we checked above.
-this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+	// Advance to the next known group.
+	// NOTE: Can't be undefined, because we checked above.
+	this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+}
```
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
js/hang/src/frame.ts (1)
193-199: Sequence gap can cause deadlock: advance to the next group's sequence instead of incrementing by 1.

When groups are discarded via `Track.writeGroup()` filtering (old sequences), subsequent groups can have non-contiguous sequences. Incrementing `activeSequenceNumber` by 1 leaves `decode()` waiting on a condition that will never be true (when the next group's sequence > `activeSequenceNumber + 1`), because `decode()` gates on `groups[0].sequence <= activeSequenceNumber`. Advance to the next available group's sequence to match the logic already used in `#checkLatency()` (line 234).

Suggested fix

```diff
 if (group.consumer.sequence === this.#activeSequenceNumber) {
 	// Advance to the next group.
-	this.#activeSequenceNumber += 1;
+	this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence ?? (this.#activeSequenceNumber + 1);
 	this.#notify?.();
 	this.#notify = undefined;
 }
```
🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 241-245: The buffer-range signals aren’t refreshed when frames are
consumed; call the existing private method `#updateBufferRange`() from the hot
path(s) that remove frames so the UI reflects decreased buffered time:
specifically, invoke this.#updateBufferRange() at the end of decode() after any
shift/pop of frames and also after any code path that drops or completes a group
(e.g., the group drop/complete handler you have) so `#earliestBufferTime` and
`#latestBufferTime` are updated immediately when frames are consumed or groups are
removed.
- Around line 296-320: getBufferedRangeForGroup currently exposes the
non-exported internal type Group in its signature which breaks
d.ts/project-reference builds; change the parameter to a public structural type
(either export a small interface e.g. ExportedGroup with the minimal properties
used: frames array with timestamp and optional latest, or use an inline
structural type like ReadonlyArray<{ frames: { timestamp: number }[]; latest?:
number }>) and update all call sites to satisfy that shape; do not expose
Consumer internals or the original Group type in the exported API.
♻️ Duplicate comments (2)
js/hang/src/frame.ts (2)
217-235: `#activeSequenceNumber` reassignment semantics need a quick re-check (previous review concern still applies).

`#activeSequenceNumber = this.#groups[0]?.consumer.sequence;` happens even when `isSlowGroup` is false, which can jump the active pointer forward unexpectedly. This was already flagged in earlier reviews; worth confirming intended behavior.

155-160: Remove or hard-gate `simulateLatency` from the production read loop.

`await simulateLatency(2500);` is still an unconditional await in `#runGroup`, and the global `Window.simulateLatency` hook still ships from this module. This was already flagged in earlier reviews.

Proposed direction (inject + dev-only gating)

```diff
 export interface ConsumerProps {
 	// Target latency in milliseconds (default: 0)
 	latency?: Signal<Time.Milli> | Time.Milli;
 	container?: Catalog.Container;
+	// Dev/test-only hook; keep core loop free of window/global coupling.
+	simulateLatencyMs?: number;
+	simulateLatencyFn?: (amountMs: number) => Promise<void>;
 }

 async #runGroup(group: Group) {
 	try {
 		let isKeyframe = true;
-		await simulateLatency(2500);
+		if (props?.simulateLatencyFn && props?.simulateLatencyMs) {
+			await props.simulateLatencyFn(props.simulateLatencyMs);
+		}
 		for (;;) {
 			const nextFrame = await group.consumer.readFrame();
```

Also applies to: 322-334
🧹 Nitpick comments (2)
js/hang/src/frame.ts (2)
3-3: Confirm `Signal` is a valid `Getter` and runtime encapsulation is acceptable.

Typing `readonly earliestBufferTime: Getter<...> = this.#earliestBufferTime;` relies on `Signal` implementing `Getter` (and exposes the actual `Signal` instance at runtime, so it's still "settable" via `as any`). If the intent is hard encapsulation, return a wrapper object exposing only `peek`/`subscribe`/`changed`.

Also applies to: 95-100

300-316: Tiny simplification: the `?? group.latest` fallback is now redundant.

You already skip `group.frames.length === 0`, so `group.frames.at(0)` shouldn't be undefined. This can be tightened to avoid implying "latest-as-earliest" is valid.
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
js/hang/src/frame.ts
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}
📄 CodeRabbit inference engine (CLAUDE.md)
Use `bun` for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)
Files:
js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (2)
js/hang/src/catalog/container.ts (1)
- Container (12-12)

js/signals/src/index.ts (1)
- Getter (13-22)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Check
🔇 Additional comments (1)
js/hang/src/frame.ts (1)
41-48: The API rename is consistent throughout the codebase. `getFrameData` and `groupSequenceNumber` are properly exported and all internal usages (lines 164-170) correctly reference the new names. No references to the old API names remain.
```ts
#updateBufferRange() {
	const { earliestTime, latestTime } = getBufferedRangeForGroup(this.#groups);
	this.#earliestBufferTime.set(earliestTime);
	this.#latestBufferTime.set(latestTime);
}
```
Buffer range signals won’t update as frames are consumed (UI will go stale).
#updateBufferRange() runs on group completion and when a waiting decode is notified, but not on the hot path where decode() actually shifts frames out of the buffer (Line 254). If the UI displays “buffer amount”, it should decrease as playback consumes frames.
Minimal fix: update range when consuming / dropping groups
async decode(): Promise<Frame | undefined> {
for (;;) {
if (
this.#groups.length > 0 &&
this.#activeSequenceNumber !== undefined &&
this.#groups[0].consumer.sequence <= this.#activeSequenceNumber
) {
const frame = this.#groups[0].frames.shift();
- if (frame) return frame;
+ if (frame) {
+ this.#updateBufferRange();
+ return frame;
+ }
const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;
if (isGroupDone) {
this.#groups.shift();
+ this.#updateBufferRange();
continue;
}
}Also applies to: 247-256, 269-274
🤖 Prompt for AI Agents
In `@js/hang/src/frame.ts` around lines 241 - 245, The buffer-range signals aren’t
refreshed when frames are consumed; call the existing private method
`#updateBufferRange`() from the hot path(s) that remove frames so the UI reflects
decreased buffered time: specifically, invoke this.#updateBufferRange() at the
end of decode() after any shift/pop of frames and also after any code path that
drops or completes a group (e.g., the group drop/complete handler you have) so
`#earliestBufferTime` and `#latestBufferTime` are updated immediately when frames
are consumed or groups are removed.
but not on the hot path where decode() actually shifts frames out of the buffer (Line 254). If the UI displays “buffer amount”, it should decrease as playback consumes frames.
I implemented this change locally, but it seemed to break any buffer tracking.
🧩 Analysis chain
@jdreetz Thank you for trying the change! To help diagnose why buffer tracking broke, I need more details about what you observed:
What specific symptoms did you see?
- Did the buffer values become `undefined` in the UI?
- Did they show incorrect/stale numbers?
- Did the UI stop updating entirely?
- Were there any console errors?
Hypothesis: Calling #updateBufferRange() immediately after frames.shift() might hit an edge case where group.frames.length === 0 temporarily, causing getBufferedRangeForGroup() to skip that group and return undefined values if no other groups have frames ready.
Potential approaches to investigate:
- Call it more selectively - Only update after removing groups, not after every frame shift:

```ts
const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;
if (isGroupDone) {
	this.#groups.shift();
	this.#updateBufferRange(); // Only here
	continue;
}
```

- Modify `getBufferedRangeForGroup()` logic - Consider groups with 0 frames but a known `latest` timestamp (since they might have just been consumed).

- Debounce/batch updates - If called too frequently, it might cause timing issues with the reactive system.
Could you share what specific behavior changed when you added the call? That will help me suggest a more targeted fix.
🏁 Script executed:
#!/bin/bash
# Let's check how the buffer signals are consumed in the UI to understand the impact
rg -n --type=ts --type=tsx -C3 'bufferEarliest|bufferLatest|BufferDuration' js/hang-ui/
Length of output: 27
🏁 Script executed:
#!/bin/bash
# Search for buffer signal usage in the UI with proper file extensions
rg -n -g '*.ts' -g '*.tsx' -C3 'bufferEarliest|bufferLatest|BufferDuration' js/hang-ui/
echo "---"
echo "Also checking the context provider implementation:"
rg -n -g '*.tsx' -g '*.ts' -A5 'videoBufferDuration|audioBufferDuration' js/hang-ui/src/Components/watch/
Length of output: 11289
🏁 Script executed:
#!/bin/bash
# Check how the video and audio sources reference the consumer's buffer signals
echo "=== Video source buffer signal sync ==="
rg -n -g '*.ts' -A3 -B3 'earliestBufferTime|latestBufferTime' js/hang/src/watch/video/source.ts
echo ""
echo "=== Audio source buffer signal sync ==="
rg -n -g '*.ts' -A3 -B3 'earliestBufferTime|latestBufferTime' js/hang/src/watch/audio/source.ts
echo ""
echo "=== Consumer signal declarations ==="
rg -n -g '*.ts' -B2 -A2 'earliestBufferTime.*Signal|latestBufferTime.*Signal' js/hang/src/frame.ts
Length of output: 1034
@jdreetz Ah, I see the issue now! The problem is timing: when you call #updateBufferRange() immediately after shifting a frame, the active group might be temporarily empty (frames.length === 0), and if other groups aren't ready yet, getBufferedRangeForGroup() skips all groups and returns undefined for both values, breaking the UI.
The root cause:
```ts
// In getBufferedRangeForGroup():
if (!group.latest || group.frames.length === 0) continue; // Skips empty groups!
```

When the last frame is shifted from a group, it becomes empty but hasn't been removed yet, causing the function to skip it and potentially return all `undefined`.
Working solution: Only update when removing groups, not when shifting individual frames:
async decode(): Promise<Frame | undefined> {
for (;;) {
if (
this.#groups.length > 0 &&
this.#activeSequenceNumber !== undefined &&
this.#groups[0].consumer.sequence <= this.#activeSequenceNumber
) {
const frame = this.#groups[0].frames.shift();
- if (frame) return frame;
+ if (frame) {
+ // Don't update here - the group might be temporarily empty
+ return frame;
+ }
const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;
if (isGroupDone) {
this.#groups.shift();
+ this.#updateBufferRange(); // Update after removing the group
continue;
}
            }

This avoids the race condition while still updating the buffer as groups are consumed. It's less granular (updates per-group instead of per-frame) but maintains correctness.
Alternative fix (more complex): Modify getBufferedRangeForGroup() to tolerate the active group being empty and look at group.latest even when frames.length === 0 for groups currently being consumed. Would you like me to explore that approach?
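The behavior under discussion can be reproduced in isolation. The sketch below uses simplified shapes (`BufferedGroup`, `getBufferedRange`) standing in for the project's internal `Group` type and `getBufferedRangeForGroup`; it implements the "skip fully-consumed groups" rule described above:

```typescript
interface BufferedGroup {
  // Unconsumed frames, in timestamp order.
  frames: { timestamp: number }[];
  // Timestamp of the newest frame ever received for this group.
  latest?: number;
}

function getBufferedRange(groups: BufferedGroup[]): {
  earliestTime: number | undefined;
  latestTime: number | undefined;
} {
  let earliestTime: number | undefined;
  let latestTime: number | undefined;
  for (const group of groups) {
    // Skip groups with no unconsumed frames, so a just-drained group
    // cannot bias the earliest edge of the reported range.
    if (group.latest === undefined || group.frames.length === 0) continue;
    const first = group.frames[0].timestamp;
    if (earliestTime === undefined || first < earliestTime) earliestTime = first;
    if (latestTime === undefined || group.latest > latestTime) latestTime = group.latest;
  }
  return { earliestTime, latestTime };
}
```

The race described in the comment falls out directly: if every group is momentarily empty, both values come back `undefined`, which is why updating only after a group is removed avoids stale or blank readings.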
…ot exported), update simulated latency function. dont have separate props for buffer start/end readonly
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
js/hang/src/frame.ts (1)
247-282: Buffer signals not updated on hot path frame consumption.

`#updateBufferRange()` is only called in the notify callback (line 272), which fires when the consumer waits for new frames. When frames are consumed from an already-buffered queue (lines 254, 261), the signals are not updated, causing the UI to display stale buffer information.

🐛 Suggested fix: update buffer range when consuming frames

```diff
 async decode(): Promise<Frame | undefined> {
 	for (;;) {
 		if (
 			this.#groups.length > 0 &&
 			this.#activeSequenceNumber !== undefined &&
 			this.#groups[0].consumer.sequence <= this.#activeSequenceNumber
 		) {
 			const frame = this.#groups[0].frames.shift();
-			if (frame) return frame;
+			if (frame) {
+				this.#updateBufferRange();
+				return frame;
+			}
 			const isGroupDone = this.#activeSequenceNumber > this.#groups[0].consumer.sequence;
 			if (isGroupDone) {
 				this.#groups.shift();
+				this.#updateBufferRange();
 				continue;
 			}
 		}
```
🤖 Fix all issues with AI agents
In `@js/hang/src/frame.ts`:
- Around line 329-334: TypeScript errors come from missing type declarations for
import.meta.env used in applySimulatedLatency (it checks import.meta.env.DEV);
add a new vite-env.d.ts in the package's src that declares an interface
ImportMetaEnv with at least DEV?: boolean (and any other env keys you expect)
and then declare interface ImportMeta { readonly env: ImportMetaEnv }, so the
compiler knows the shape of import.meta.env when bundling with Vite.
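A minimal sketch of such a declaration file (key names beyond `DEV` are illustrative; Vite projects often instead add `/// <reference types="vite/client" />`):

```typescript
// vite-env.d.ts (sketch): teaches the compiler the shape of import.meta.env.
interface ImportMetaEnv {
  readonly DEV?: boolean;
  // add further expected env keys here
}

interface ImportMeta {
  readonly env: ImportMetaEnv;
}
```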
♻️ Duplicate comments (1)
js/hang/src/frame.ts (1)
217-234: Logic error: `#activeSequenceNumber` updated unconditionally outside the `isSlowGroup` block.

Line 234 executes regardless of the `isSlowGroup` condition. When `isSlowGroup` is `false` (meaning the first group is ahead of the active sequence), the code:

- Does NOT shift or close the first group (correct)
- Still reassigns `#activeSequenceNumber` to `this.#groups[0]?.consumer.sequence` (incorrect)

This causes the active sequence to jump forward unexpectedly, potentially skipping frames from the current in-progress group.

🐛 Suggested fix: move the assignment inside the conditional

```diff
 if (isSlowGroup) {
 	this.#groups.shift();
 	console.warn(
 		`skipping slow group: ${firstGroup.consumer.sequence} < ${this.#groups[0]?.consumer.sequence}`,
 	);
 	firstGroup.consumer.close();
 	firstGroup.frames.length = 0;
-}
-// Advance to the next known group.
-// NOTE: Can't be undefined, because we checked above.
-this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+	// Advance to the next known group.
+	// NOTE: Can't be undefined, because we checked above.
+	this.#activeSequenceNumber = this.#groups[0]?.consumer.sequence;
+}
```
🧹 Nitpick comments (1)
js/hang/src/frame.ts (1)
297-321: LGTM with minor observation. The function correctly filters out fully-consumed groups (lines 302-307). The fallback `?? group.latest` on line 310 is now unreachable since `group.frames.length === 0` groups are skipped, but it's harmless.

♻️ Optional: remove redundant fallback

```diff
-const frame = group.frames.at(0)?.timestamp ?? group.latest;
+const frame = group.frames.at(0)!.timestamp;
```
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
js/hang/src/frame.ts
🧰 Additional context used
📓 Path-based instructions (1)
js/**/*.{ts,tsx,js,jsx,json}
📄 CodeRabbit inference engine (CLAUDE.md)
Use `bun` for all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)
Files:
js/hang/src/frame.ts
🧬 Code graph analysis (1)
js/hang/src/frame.ts (3)
js/hang/src/catalog/container.ts (1)
- Container (12-12)

js/lite/src/time.ts (2)
- Micro (14-14)
- Micro (16-25)

js/signals/src/index.ts (1)
- Signal (28-156)
🪛 GitHub Actions: Check
js/hang/src/frame.ts
[error] 331-331: Property 'env' does not exist on type 'ImportMeta'.
🔇 Additional comments (5)
js/hang/src/frame.ts (5)
12-17: LGTM! The rename from `group` to `groupSequenceNumber` improves clarity by making the field's purpose explicit.

41-48: LGTM! Renaming `decode` to `getFrameData` disambiguates this utility from the `Consumer.decode()` method.

94-96: LGTM! Public signals for buffer time exposure enable the UI to reactively display buffer information as intended.

154-158: LGTM! The latency simulation is now properly gated behind the `window.simulateLatency` check before the async call, addressing the previous concern about unnecessary overhead.

323-327: Acceptable for development tooling. The global `Window` extension for `simulateLatency` is reasonable given it's gated behind the DEV environment check in `applySimulatedLatency`.
```ts
async function applySimulatedLatency(amount: number): Promise<void> {
	// not sure if we should allow this in production too?
	if (!import.meta?.env?.DEV) return Promise.resolve();

	return new Promise((resolve) => setTimeout(resolve, amount));
}
```
🧩 Analysis chain
🏁 Script executed:
# Check for Vite and TypeScript configuration
fd -type f \( -name "tsconfig.json" -o -name "vite.config.*" \) js/hang --exec cat {}

Repository: moq-dev/moq
Length of output: 225
🏁 Script executed:
# Check the current frame.ts file to see the actual code and context
wc -l js/hang/src/frame.ts

Repository: moq-dev/moq
Length of output: 79
🏁 Script executed:
# Read the relevant section of frame.ts around lines 329-334
sed -n '320,340p' js/hang/src/frame.tsRepository: moq-dev/moq
Length of output: 414
🏁 Script executed:
# Find tsconfig and vite config files
find js/hang -name "tsconfig.json" -o -name "vite.config.*"Repository: moq-dev/moq
Length of output: 76
🏁 Script executed:
# Read tsconfig.json
cat js/hang/tsconfig.jsonRepository: moq-dev/moq
Length of output: 177
🏁 Script executed:
# Read entire frame.ts to check for any ImportMeta declarations
cat js/hang/src/frame.tsRepository: moq-dev/moq
Length of output: 8814
🏁 Script executed:
# Check package.json to see Vite dependency
cat js/hang/package.jsonRepository: moq-dev/moq
Length of output: 1428
🏁 Script executed:
# Read the parent tsconfig
cat js/tsconfig.jsonRepository: moq-dev/moq
Length of output: 1387
🏁 Script executed:
# Check if there's a Vite config in the root or js directory
find . -maxdepth 2 -name "vite.config.*" 2>/dev/nullRepository: moq-dev/moq
Length of output: 37
🏁 Script executed:
# Check the root package.json for Vite
cat package.json | head -50Repository: moq-dev/moq
Length of output: 443
🏁 Script executed:
# Search for ImportMeta type declarations anywhere in the codebase
rg "interface ImportMeta" js/Repository: moq-dev/moq
Length of output: 397
🏁 Script executed:
# Check what a vite-env.d.ts looks like
cat js/hang-demo/src/vite-env.d.tsRepository: moq-dev/moq
Length of output: 211
🏁 Script executed:
# Search for other uses of import.meta in js/hang
rg "import\.meta" js/hang/srcRepository: moq-dev/moq
Length of output: 130
🏁 Script executed:
# Check if js/hang is actually built/tested
grep -A5 "js/hang" package.jsonRepository: moq-dev/moq
Length of output: 211
🏁 Script executed:
# Check the exports and see if this is a library
cat js/hang/package.json | grep -A20 '"exports"'Repository: moq-dev/moq
Length of output: 727
🏁 Script executed:
# Check if there are any Vite-related build tools in any package script
rg "vite" js/hang-demo/package.jsonRepository: moq-dev/moq
Length of output: 191
🏁 Script executed:
# See if js/hang is used/imported by other packages
rg "@moq/hang" js/ --type jsonRepository: moq-dev/moq
Length of output: 372
Add type declarations for `import.meta.env` by creating a `vite-env.d.ts` file.
The TypeScript error occurs because `js/hang` is missing type declarations for `import.meta`. Although `js/hang` itself doesn't use Vite, it's consumed by Vite applications (`js/hang-demo` and `js/hang-ui`), and the code needs proper type definitions when bundled.
Create `js/hang/src/vite-env.d.ts` following the same pattern used in other packages:
Create `vite-env.d.ts`:

```typescript
/// <reference types="vite/client" />

interface ImportMetaEnv {
	readonly DEV?: boolean;
}

interface ImportMeta {
	readonly env?: ImportMetaEnv;
}
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
async function applySimulatedLatency(amount: number): Promise<void> {
	// not sure if we should allow this in production too?
	if (!import.meta?.env?.DEV) return Promise.resolve();
	return new Promise((resolve) => setTimeout(resolve, amount));
}
```

```typescript
/// <reference types="vite/client" />

interface ImportMetaEnv {
	readonly DEV?: boolean;
}

interface ImportMeta {
	readonly env?: ImportMetaEnv;
}
```
🤖 Prompt for AI Agents
In `@js/hang/src/frame.ts` around lines 329-334, TypeScript errors come from missing type declarations for `import.meta.env` used in `applySimulatedLatency` (it checks `import.meta.env.DEV`); add a new `vite-env.d.ts` in the package's src that declares an interface `ImportMetaEnv` with at least `DEV?: boolean` (and any other env keys you expect) and then declare `interface ImportMeta { readonly env?: ImportMetaEnv }`, so the compiler knows the shape of `import.meta.env` when bundling with Vite.
Description
This PR completes Issue 783 by exposing two new properties on the Frame Consumer that report the earliest and latest frame times in the buffer, and by presenting them as the buffered amount in the Player UI. The latency slider has been moved into the buffer UI, since the latency control determines how much buffer is available for playback.
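For reviewers, a hedged sketch of the kind of computation involved. This is illustrative only; `BufferedFrame` and the `timestamp` field are assumed names, not the PR's actual types.

```typescript
// Illustrative only: derive the earliest/latest presentation timestamps
// across buffered frames. `timestamp` is an assumed field name (microseconds).
interface BufferedFrame {
	timestamp: number;
}

function bufferRange(frames: BufferedFrame[]): { earliest?: number; latest?: number } {
	// An empty buffer reports no range, matching the UI's "No data" state.
	if (frames.length === 0) return {};
	let earliest = frames[0].timestamp;
	let latest = frames[0].timestamp;
	for (const frame of frames) {
		earliest = Math.min(earliest, frame.timestamp);
		latest = Math.max(latest, frame.timestamp);
	}
	return { earliest, latest };
}

// The buffered duration shown in the UI is the span between the two.
const { earliest, latest } = bufferRange([{ timestamp: 1_000 }, { timestamp: 5_000 }, { timestamp: 3_000 }]);
console.log((latest! - earliest!) / 1_000); // 4
```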
How to Test
Since it's hard to force buffering without actual latency, I'm leaving in a utility that simulates latency in the Frame consumer when a frame is read. To enable it, set `window.simulateLatency` to true and increase the latency slider. This should cause a larger buffer to build up and the UI to display the amount.
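To illustrate the flag's effect, here is a minimal standalone sketch. Only the `simulateLatency` flag name comes from this PR; the gating below uses `globalThis` instead of `window` so it also runs outside a browser, and it omits the `import.meta.env.DEV` check for portability.

```typescript
// Illustrative sketch: the delay only applies while the global flag is set.
(globalThis as { simulateLatency?: boolean }).simulateLatency = true;

async function applySimulatedLatency(amount: number): Promise<void> {
	// Skip entirely when the flag is off, so there is no extra timer overhead.
	if (!(globalThis as { simulateLatency?: boolean }).simulateLatency) return;
	await new Promise((resolve) => setTimeout(resolve, amount));
}

async function main(): Promise<void> {
	const start = Date.now();
	await applySimulatedLatency(50);
	console.log(`waited ~${Date.now() - start}ms`);
}
void main();
```

Flipping the flag back to `false` makes the function return immediately, which is why enabling it while raising the latency slider lets the buffer grow enough to be visible in the overlay.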