feat(chat): frontend bridge for streaming A2UI envelopes#261
Merged
Accepts the four argument shapes observed in the streaming-envelope-tool
spike: the canonical `{envelopes: [...]}`, the singular `{envelope: [...]}`,
positional keys `{0, 1, ...}`, and a flat single envelope. Returns a
canonical envelope list or null. The shape is shared with the Python
normalizer in PR 2.
Adds a live-stream feed alongside apply(). Records the tool_call_id so downstream consumers (content classifier) can short-circuit duplicate envelope dispatch when the final wrapped AIMessage arrives carrying the same content the live stream already applied.
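The dedup contract can be illustrated with a minimal registry. All names here (`LiveStreamRegistry`, `markApplied`, `shouldDispatchFinal`) are invented for illustration; the real store slice presumably carries more state:

```typescript
// Minimal sketch: remember which tool_call_ids the live stream already
// applied, so the final-message classifier can short-circuit duplicates.
class LiveStreamRegistry {
  private applied = new Set<string>();

  // Called as envelopes from the live stream are applied.
  markApplied(toolCallId: string): void {
    this.applied.add(toolCallId);
  }

  // Called when the wrapped final AIMessage arrives: dispatch its envelopes
  // only if the live stream did not already apply this tool call.
  shouldDispatchFinal(toolCallId: string): boolean {
    return !this.applied.has(toolCallId);
  }
}
```
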
Subscribes to LangGraph custom events ('a2ui-partial') and feeds the
A2UI surface store envelope-by-envelope as the parent LLM streams its
tool_call.arguments JSON. Uses @cacheplane/partial-json to extract
structurally-complete envelopes from the growing args string.
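The extraction step can be approximated without the library. The sketch below assumes the args string grows as `{"envelopes":[{...},{...` and finds top-level objects whose braces have closed; it is a stand-in for `@cacheplane/partial-json`, and its brace counting ignores braces inside JSON strings:

```typescript
// Illustrative stand-in for partial-JSON envelope extraction: return the
// envelopes in the growing args string that are structurally complete.
function extractCompleteEnvelopes(partialArgs: string): unknown[] {
  const start = partialArgs.indexOf("[");
  if (start < 0) return [];
  const complete: unknown[] = [];
  let depth = 0;
  let objStart = -1;
  for (let i = start + 1; i < partialArgs.length; i++) {
    const ch = partialArgs[i];
    if (ch === "{") {
      if (depth === 0) objStart = i; // top-level envelope begins
      depth++;
    } else if (ch === "}") {
      depth--;
      if (depth === 0 && objStart >= 0) {
        // Envelope closed: it is now safe to JSON.parse this slice.
        complete.push(JSON.parse(partialArgs.slice(objStart, i + 1)));
        objStart = -1;
      }
    }
  }
  return complete;
}
```
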
Safety net: synthesises beginRendering immediately after the first
surfaceUpdate (targeting component id='root' or the first component),
so the surface mounts and the per-component fallback gate (PR #252)
fires while dataModelUpdates stream in — without waiting for the LLM
to emit beginRendering at the end of its envelope list. Repeated
beginRendering on the same surface is idempotent in the store.
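The targeting rule above (id='root' if present, else the first component) can be sketched as follows. The `SurfaceUpdate` shape and the envelope layout are assumptions for illustration, not the real A2UI wire format:

```typescript
interface Component { id: string; [k: string]: unknown; }
interface SurfaceUpdate { surfaceId: string; components: Component[]; }

// After the first structurally-complete surfaceUpdate, synthesise a
// beginRendering targeting 'root' if present, else the first component.
// Repeating beginRendering for the same surface is assumed idempotent
// in the store, so a later LLM-emitted beginRendering is harmless.
function synthesizeBeginRendering(update: SurfaceUpdate) {
  const target =
    update.components.find((c) => c.id === "root") ?? update.components[0];
  if (!target) return null; // nothing to mount yet
  return { beginRendering: { surfaceId: update.surfaceId, root: target.id } };
}
```
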
Poison detection: a JSON-prefix validator runs ahead of the partial
parser to detect structurally invalid streams (the partial-json parser
silently halts on bad input rather than throwing), so malformed
arg streams are dropped and subsequent valid pushes are ignored.
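A prefix validator of this kind can be sketched with a bracket stack. This is not the PR's validator; it deliberately checks only structural plausibility (balanced, correctly-nested brackets outside strings), which is the class of error a silently-halting partial parser would otherwise hide:

```typescript
// Hedged sketch of a JSON-prefix poison check: reject streams whose
// brackets are mismatched outside of string literals. A stream that fails
// once should be marked poisoned so subsequent pushes are ignored.
function isPlausibleJsonPrefix(s: string): boolean {
  const stack: string[] = [];
  let inString = false;
  let escaped = false;
  for (const ch of s) {
    if (inString) {
      if (escaped) escaped = false;
      else if (ch === "\\") escaped = true;
      else if (ch === '"') inString = false;
      continue; // brackets inside strings are data, not structure
    }
    if (ch === '"') inString = true;
    else if (ch === "{" || ch === "[") stack.push(ch);
    else if (ch === "}") {
      if (stack.pop() !== "{") return false;
    } else if (ch === "]") {
      if (stack.pop() !== "[") return false;
    }
  }
  return true; // an unterminated prefix is fine; only mismatches poison
}
```
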
The chat composition subscribes to LangGraph custom events named 'a2ui-partial' and forwards the cumulative args strings to the bridge. An effect tracks the last-processed index so re-renders don't re-dispatch. Adapters that don't expose a customEvents signal (the runtime-neutral Agent contract makes it optional) are tolerated via feature detection; those continue to rely on the wrapped final-message classifier path.
blove added a commit that referenced this pull request on May 12, 2026
…rk (#271)

Brings the canonical smoke checklist current with 29 PRs that landed between Phase 7 (#239) and today without checklist updates. Specifically:

Updated sections:
- chat-debug devtools — replaced bottom-drawer model with floating launcher + status pill + switch (PRs #249, #251)
- Control palette — palette v2 (status pill, shadcn-styled panel, PR #244)
- Generative UI / A2UI surfaces — single-bubble invariant (PR #255), parent-emits-envelopes architecture (PR #259), wrapped-content + tool_calls coexistence (PR #255), envelope reorder
- Server-side wire format — tool_calls preserved on the final AI
- Replaced 'Multi-thread' section with 'Sidenav (thread management)', reflecting the permanent semantic <nav> + Active/Archived sections (PR #253) and removing the old palette-toggled drawer model

Added sections:
- Cmd+K history search — palette open/search/select/close, archived result subtitle, keyboard navigation (PR #253)
- Per-row thread actions — kebab menu order per state (active, pinned, archived), rename + pin/unpin + archive/unarchive + delete flows (PRs #258, #260, #267)
- Thread titles — first-user-message derivation, idempotent writes, manual rename precedence (PR #242)
- Progressive A2UI streaming — per-component fallback transition observable during streaming window (PRs #252, #261, #262, #268, #269)
- Inline checkpoint markers — render between messages during multi-step runs (PR #243)
- Responsive sidenav — viewport breakpoints, auto-collapse behavior (PR #240)

Total: ~58 new check items across 6 new sections, plus rewrites to 5 existing sections. Original 333-line checklist → 391 lines / 237 check items.
Summary
Adds the frontend half of progressive A2UI streaming. The chat composition subscribes to LangGraph custom events named `a2ui-partial` and feeds them through a new `PartialArgsBridge` that uses `@cacheplane/partial-json` to extract structurally-complete envelopes from the cumulative `tool_call.arguments` string the parent LLM streams. Each envelope flows into the A2UI surface store via the new `applyPartialArgs(toolCallId, envelopes)` entry point.

The bridge synthesises a `beginRendering` envelope immediately after the first complete `surfaceUpdate` (targeting component `id='root'` or the first component as a safety net), so the surface mounts and the per-component fallback gate (#252) actually fires while dataModelUpdates stream in.

This PR ships dormant code: no backend yet emits `a2ui-partial` events. The PR for the backend lands separately (envelope-tool refactor + callback handler).

Spec: `docs/superpowers/specs/2026-05-12-genui-streaming-sub-llm-design.md`.

Test plan

- `nx test chat` green (envelope-normalizer 6 tests; surface-store 3 new tests; partial-args-bridge 8 tests; chat composition 2 wiring tests)
- `nx build chat` + `nx lint chat` green