feat(examples-chat): coalesce GenUI emit into tool-call AI + reorder envelopes #255
Merged
Conversation
Two changes to the GenUI emit node:

1. In-place AIMessage replacement. The node returns an AIMessage with the SAME id as the upstream tool-call AI; LangGraph's `add_messages` reducer matches by id and replaces in place. The thread now carries one AI message per GenUI turn (with both `tool_calls` AND the wrapped surface content) instead of two — the user sees a single bubble that transforms from skeleton to surface, not a skeleton bubble followed by a separate surface bubble.

2. Envelope reorder. `surfaceUpdate -> beginRendering -> dataModelUpdate x N`. The surface store gates surface materialization on `beginRendering`; emitting it early lets the frontend mount the (initially empty) surface and reveal per-component fallbacks as `dataModelUpdate` envelopes flow in. Progressive UX without changing the protocol's semantic intent.

Preserved on the replacement AI: `tool_calls` (for frontend `isGenuiTurn` detection + time-travel linkage), `additional_kwargs` (token counts, reasoning, citations), `response_metadata` (model + finish reason).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
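The in-place replacement in point 1 hinges on the reducer's id-matching rule. A minimal sketch of that behavior, using a stand-in `AIMessage` dataclass and a simplified `add_messages` (the real reducer ships with LangGraph; everything here is illustrative, not the actual implementation):

```python
from dataclasses import dataclass, field

@dataclass
class AIMessage:
    id: str
    content: str
    tool_calls: list = field(default_factory=list)

def add_messages(existing, new):
    """Minimal stand-in for LangGraph's add_messages reducer: a new
    message whose id matches an existing one replaces it in place;
    otherwise it is appended."""
    by_id = {m.id: i for i, m in enumerate(existing)}
    merged = list(existing)
    for msg in new:
        if msg.id in by_id:
            merged[by_id[msg.id]] = msg
        else:
            merged.append(msg)
    return merged

# Upstream tool-call AI (the skeleton bubble)...
upstream = AIMessage(id="ai-1", content="",
                     tool_calls=[{"name": "render_a2ui_surface"}])
thread = add_messages([], [upstream])

# ...is replaced in place by the emit node's AIMessage with the SAME id,
# now carrying both the preserved tool_calls and the wrapped surface content.
replacement = AIMessage(id="ai-1", content="<a2ui>...surface...</a2ui>",
                        tool_calls=upstream.tool_calls)
thread = add_messages(thread, [replacement])

assert len(thread) == 1                       # one bubble per GenUI turn
assert thread[0].content.startswith("<a2ui>")
assert thread[0].tool_calls == upstream.tool_calls
```

Returning a message with a fresh id instead would append a second AI message, which is exactly the two-bubble behavior this PR removes.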
blove added a commit that referenced this pull request on May 12, 2026
…opes (#259)

* feat(examples-chat): Pydantic A2UI envelope tool

  Parent-LLM-bound tool `render_a2ui_surface` accepts envelopes as typed structured arguments. OpenAI strict-mode `bind_tools` will validate against the Pydantic schema. The body serializes via `model_dump(exclude_none=True)` so optional fields don't leak as `null` into the wire JSON.

* feat(examples-chat): Python envelope-args normalizer

  Parity with libs/chat `envelope-normalizer.ts`. Accepts the envelopes/envelope/positional/flat shapes the parent LLM may emit under non-strict tool binding. Returns a canonical envelope list or None.

* feat(examples-chat): parent LLM emits A2UI envelopes directly

  Replaces the two-LLM hop (parent → tool body's sub-LLM) with a single parent LLM bound to `render_a2ui_surface(envelopes: list[A2uiEnvelope])` under OpenAI strict mode. The A2UI v1 schema prompt is appended to the parent's system prompt with explicit envelope-emission ordering so the surface mounts as soon as `surfaceUpdate` parses. `emit_generated_surface` keeps its PR #255 job: read the validated envelope list from the ToolMessage, apply the static reorder for defence-in-depth, wrap with `A2UI_PREFIX`, and replace the upstream AI message in place (single-bubble invariant). The new envelope-emission flow already streams natively via the parent LLM's `tool_call_chunks`; PR 3 attaches a callback handler that sidebands the chunks as `a2ui-partial` custom events for the live UX.
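The normalizer's contract can be sketched in pure Python. This is a hypothetical reconstruction from the commit message above, not the actual module; in particular, detecting the "flat" shape by probing for envelope-type keys (`surfaceUpdate`, `beginRendering`, `dataModelUpdate`) is an assumption about the envelope schema:

```python
def normalize_envelope_args(args):
    """Hypothetical sketch of the envelope-args normalizer: accept the
    shapes a parent LLM may emit under non-strict tool binding and return
    a canonical list of envelope dicts, or None if unrecognizable."""
    # Keys assumed to mark an A2UI envelope (illustrative, not exhaustive).
    envelope_keys = ("surfaceUpdate", "beginRendering", "dataModelUpdate")

    if isinstance(args, dict):
        # Canonical shape: {"envelopes": [...]}
        if isinstance(args.get("envelopes"), list):
            return args["envelopes"]
        # Singular shape: {"envelope": {...}}
        if isinstance(args.get("envelope"), dict):
            return [args["envelope"]]
        # Flat shape: the args dict itself is a single envelope.
        if any(k in args for k in envelope_keys):
            return [args]
    # Positional shape: a bare list of envelopes.
    if isinstance(args, list):
        envelopes = [e for e in args if isinstance(e, dict)]
        return envelopes or None
    return None
```

Returning None rather than raising lets the caller fall back gracefully when the LLM's tool arguments are beyond repair.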
This was referenced May 12, 2026
blove added a commit that referenced this pull request on May 12, 2026
…rk (#271)

Brings the canonical smoke checklist current with 29 PRs that landed between Phase 7 (#239) and today without checklist updates. Specifically:

Updated sections:
- chat-debug devtools — replaced bottom-drawer model with floating launcher + status pill + switch (PRs #249, #251)
- Control palette — palette v2 (status pill, shadcn-styled panel, PR #244)
- Generative UI / A2UI surfaces — single-bubble invariant (PR #255), parent-emits-envelopes architecture (PR #259), wrapped-content + `tool_calls` coexistence (PR #255), envelope reorder
- Server-side wire format — `tool_calls` preserved on the final AI
- Replaced the 'Multi-thread' section with 'Sidenav (thread management)', reflecting the permanent semantic `<nav>` + Active/Archived sections (PR #253) and removing the old palette-toggled drawer model

Added sections:
- Cmd+K history search — palette open/search/select/close, archived result subtitle, keyboard navigation (PR #253)
- Per-row thread actions — kebab menu order per state (active, pinned, archived), rename + pin/unpin + archive/unarchive + delete flows (PRs #258, #260, #267)
- Thread titles — first-user-message derivation, idempotent writes, manual rename precedence (PR #242)
- Progressive A2UI streaming — per-component fallback transition observable during the streaming window (PRs #252, #261, #262, #268, #269)
- Inline checkpoint markers — render between messages during multi-step runs (PR #243)
- Responsive sidenav — viewport breakpoints, auto-collapse behavior (PR #240)

Total: ~58 new check items across 6 new sections, plus rewrites to 5 existing sections. Original 333-line checklist → 391 lines / 237 check items.
Summary

Two changes to `emit_generated_surface` in the Python graph:

1. In-place AIMessage replacement. Returns an AIMessage with the SAME id as the upstream tool-call AI. LangGraph's `add_messages` reducer matches by id and replaces — the thread carries ONE AI message per GenUI turn (with both `tool_calls` AND the wrapped surface content).

2. Envelope reorder. `surfaceUpdate -> beginRendering -> dataModelUpdate x N`. The surface store gates surface materialization on `beginRendering`; emitting it early lets the frontend mount the (initially empty) surface and reveal per-component fallbacks as data envelopes flow in.

Pairs with the render fallback PR + the chat composition cleanup PR to deliver: one bubble per GenUI turn; the surface mounts on the first envelope with per-component skeletons in the shape of the eventual UI; components swap from skeleton to real in place as their data bindings populate.
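The static reorder amounts to a stable sort keyed on envelope type. A hedged sketch, assuming each envelope is a dict keyed by its A2UI message type (the actual envelope schema and helper name are illustrative):

```python
# Assumed rank per envelope type; unknown types sort after dataModelUpdate.
_ORDER = {"surfaceUpdate": 0, "beginRendering": 1, "dataModelUpdate": 2}

def reorder_envelopes(envelopes):
    """Stable reorder: surfaceUpdate first so the frontend can mount the
    (initially empty) surface, beginRendering next to gate materialization,
    then dataModelUpdate envelopes in their original relative order."""
    def rank(envelope):
        return min((_ORDER[k] for k in envelope if k in _ORDER), default=3)
    return sorted(envelopes, key=rank)  # sorted() is stable

shuffled = [
    {"dataModelUpdate": {"path": "/items/0"}},
    {"beginRendering": {"root": "main"}},
    {"surfaceUpdate": {"components": []}},
    {"dataModelUpdate": {"path": "/items/1"}},
]
ordered = reorder_envelopes(shuffled)
assert next(iter(ordered[0])) == "surfaceUpdate"
assert next(iter(ordered[1])) == "beginRendering"
```

Because the sort is stable, the N `dataModelUpdate` envelopes keep their emission order, which is what lets components populate progressively in the order the model produced them.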
Spec: `docs/superpowers/specs/2026-05-11-progressive-genui-bubble-coalescing-design.md`

Test plan

- `pytest tests/test_graph_smoke.py::TestEmitGeneratedSurfaceCoalescing` — 2 tests green
- `pytest tests/` — all existing tests still pass