fix(chat): replace LLM-based a2ui graph with hardcoded JSONL #120
Merged
Conversation
The a2ui graph used an LLM + system prompt to generate A2UI JSONL,
but LLMs cannot reliably produce exact protocol-level output: they
prepend conversational text before the ---a2ui_JSON--- prefix, causing
the content classifier to treat the output as markdown instead of a2ui.
Replace with hardcoded JSONL using the v0.9 envelope format
({"createSurface": {...}} instead of {"type": "createSurface", ...}).
The LLM is only used for conversational responses to form submissions,
where exact format is not required.
Removed _bindings from component definitions per v0.9 spec — the
renderer generates bindings automatically from path references.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
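The fix described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the surface ID, component names, and field labels are assumptions, since the real payload is not shown here.

```python
import json

# Prefix named in the PR description; the classifier keys on it.
A2UI_PREFIX = "---a2ui_JSON---"

# Hardcoded v0.9 envelope messages ({"createSurface": {...}} instead of
# {"type": "createSurface", ...}). IDs and labels are illustrative.
MESSAGES = [
    {"createSurface": {"surfaceId": "contact_form", "sendDataModel": True}},
    {"updateComponents": {"surfaceId": "contact_form", "components": [
        {"id": "name", "component": {"TextField": {"label": "Name"}}},
    ]}},
]

def a2ui_payload() -> str:
    # Hardcoding guarantees the prefix is the very first line, so the
    # content classifier always sees a2ui output, never markdown chatter.
    return A2UI_PREFIX + "\n" + "\n".join(json.dumps(m) for m in MESSAGES)

print(a2ui_payload().splitlines()[0])
```

Because the payload is assembled deterministically, no system-prompt engineering is needed to keep the prefix at byte zero.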
blove added a commit that referenced this pull request on Apr 12, 2026:

All chat/* standalone LangGraph Cloud deployments are unreachable (TCP timeout). The streaming deployment is healthy and already has all 12 graphs consolidated (PR #113).

Changes:
- examples-middleware.ts: route all chat/* paths to 'streaming' key
- a2ui environment: use 'a2ui_form' (streaming deployment's name)
- a2ui_graph.py in streaming deployment: update to hardcoded v0.9 JSONL matching PR #120's fix

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary
The a2ui graph used an LLM + system prompt to generate A2UI JSONL, but LLMs prepend conversational text before the ---a2ui_JSON--- prefix, causing the content classifier to classify the output as markdown instead of a2ui. As a result, the contact form renders as raw JSON text instead of interactive UI components.

Fix: replace LLM-based generation with hardcoded JSONL using the v0.9 envelope format. The LLM is only used for conversational responses to form submission events.
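The classifier failure mode can be illustrated with a sketch. The function below is hypothetical (the actual classifier code is not shown in this PR); it assumes the classifier only recognizes a2ui when the prefix is at the very start of the message:

```python
A2UI_PREFIX = "---a2ui_JSON---"  # prefix named in the PR description

def classify(content: str) -> str:
    # Hypothetical sketch: any LLM chatter before the prefix means the
    # startswith check fails and the message falls through to markdown.
    if content.startswith(A2UI_PREFIX):
        return "a2ui"
    return "markdown"

print(classify("---a2ui_JSON---\n{}"))                        # a2ui
print(classify("Sure! Here's the form:\n---a2ui_JSON---"))    # markdown
```

This is why hardcoding the output fixes the bug: the prefix is guaranteed to be the first bytes of the payload.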
Changes
- createSurface, updateDataModel, updateComponents use the v0.9 envelope: {"createSurface": {...}}
- Removed _bindings from components (renderer auto-generates bindings from path refs)
- Added sendDataModel: true to createSurface for form submission support

Test plan
- nx test agent: passes
- nx test chat: passes

🤖 Generated with Claude Code
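The v0.9 envelope rule from the Changes list can be checked mechanically. The validator below is a hypothetical sketch, not part of the PR: it assumes each JSONL line must be a single-key object whose key is one of the three message kinds listed above.

```python
import json

# Message kinds named in the Changes list above.
V09_KINDS = {"createSurface", "updateDataModel", "updateComponents"}

def is_v09_envelope(line: str) -> bool:
    # v0.9 envelope: exactly one top-level key, and that key is the
    # message kind ({"createSurface": {...}}), rather than a "type" field.
    msg = json.loads(line)
    return len(msg) == 1 and next(iter(msg)) in V09_KINDS

print(is_v09_envelope('{"createSurface": {"surfaceId": "s1", "sendDataModel": true}}'))  # True
print(is_v09_envelope('{"type": "createSurface", "surfaceId": "s1"}'))                   # False
```

The second line fails because the old format spreads "type" and payload fields across the top level, giving the object more than one key.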