> [!TIP]
> If Gen-UI-Lang helps you prototype UIs faster, please star ⭐ the repo!
Gen-UI-Lang is a compact, LLM-friendly language for describing UIs in a few readable lines. Write a concise UI expression once and render it to HTML, React JSX, Gradio, or other targets—perfect for rapid prototyping, demos, and LLM-driven UI generation.
- LLM-first design: Intentionally concise and predictable, allowing LLMs to generate UI code reliably with fewer tokens.
- Multi-target output: Author once, render everywhere — quick HTML previews, React components, Gradio demos, and more.
- Consistent cross-target language: Use the same snippet to target multiple frameworks, reducing duplication and cognitive load.
- Faster iteration: Compact syntax makes prototyping and experimentation quicker than hand-authoring verbose HTML or JSX.
Gen-UI-Lang sits at the sweet spot between human-readable code and a machine-actionable representation. It's small enough to iterate with and structured enough to transform programmatically, which makes it ideal for prototypes, demos, and LLM-driven Generative UI workflows.
Elevator pitch

Describe a UI in a single expression and render it to multiple targets. Example:
```python
genui(
    row(
        text("Sales Overview"),
        btn("Load", on_load=lambda: get_graph(2001, 2002)),
    ),
    chart(type="line", data="sales_q4"),
)
```
This expression builds an AST from `Node` factories (`genui`, `row`, `text`, `btn`, `chart`) that the included renderers can convert to HTML or other formats.
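
To make this concrete, here is a minimal, self-contained sketch of the pattern described above: factory functions build a tree of `Node` objects, and a renderer walks that tree to emit a string. The names and structure below are illustrative only; the actual implementation lives in `gen_ui_lang/core.py` and may differ.

```python
# Illustrative sketch only -- not the actual gen_ui_lang implementation.
from dataclasses import dataclass, field


@dataclass
class Node:
    """A UI tree node: a tag name, child nodes/values, and keyword props."""
    tag: str
    children: tuple = ()
    props: dict = field(default_factory=dict)


# Factory functions keep call sites short: text("Hi") instead of Node("text", ...).
def text(value, **props):
    return Node("text", (value,), props)


def row(*children, **props):
    return Node("row", children, props)


def genui(*children, **props):
    return Node("genui", children, props)


def to_html(node):
    """Recursively render a Node tree into a rough HTML string."""
    if not isinstance(node, Node):
        return str(node)  # plain values, e.g. the string inside text(...)
    inner = "".join(to_html(child) for child in node.children)
    return f'<div class="{node.tag}">{inner}</div>'


print(to_html(genui(row(text("Sales Overview")))))
```

Because every node is plain data, other renderers (JSX, Gradio, and so on) can walk the same tree without changing the authoring code.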
Core strengths
- Minimal syntax: Factory functions map directly to component concepts (`genui`, `row`, `col`, `card`, `text`, `btn`, `input`, `table`, `graph`/`chart`), making UIs expressive and compact.
- Extensible core: The AST is implemented with `Node` objects (see `gen_ui_lang/core.py`), so adding nodes or renderers is straightforward.
- LLM integration: Helpers in `gen_ui_lang/utils/llm_utils.py` make it simple to ask an LLM to return `ui(...)` snippets when available.
Key features:
- Node factories: `genui`, `row`, `col`, `card`, `text`, `btn`, `input`, `table`, `graph`/`chart`.
- Simple renderer: `to_html(node)` (see `gen_ui_lang/core.py`).
- LLM helper: `get_response(messages, use_genui=True)` to request Gen-UI-Lang formatted replies (see `gen_ui_lang/utils/llm_utils.py`); a usage sketch follows this list.
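
A possible way to call the LLM helper, assuming OpenAI-style message dicts, that the import path follows the file location, and that `get_response` returns the reply text as a string (check `gen_ui_lang/utils/llm_utils.py` for the exact contract):

```python
# Assumes OPENAI_API_KEY / OPENAI_MODEL are set (see the notes below) and that
# get_response returns the model's reply as a string.
from gen_ui_lang.utils.llm_utils import get_response

messages = [
    {"role": "user", "content": "Build a dashboard with a title and a line chart."},
]

# use_genui=True asks the model to answer with a Gen-UI-Lang snippet
# instead of free-form prose.
reply = get_response(messages, use_genui=True)
print(reply)  # e.g. something like genui(row(text(...)), chart(...))
```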
Install locally:
```bash
pip install -e .
```

Create and render a UI (Python):
```python
from gen_ui_lang import genui, row, text, btn, chart, to_html

n = genui(
    row(text("Sales Overview"), btn("Load")),
    chart(type="line", data="sales_q4"),
)
print(to_html(n))  # quick HTML preview
```

Run a demo server:
```bash
python examples/demo_server.py
```

Run the chatbot UI (calls an LLM if configured):
```bash
python examples/chat_server.py
```

Notes:
- The servers in `examples/` require additional packages like `fastapi` and `uvicorn` (and `tiktoken` for the chat UI).
- For LLM calls via `get_response(...)`, set `OPENAI_API_KEY` and `OPENAI_MODEL` in your environment (a `.env` file is supported).
- The UI tree is implemented as `Node` objects in `gen_ui_lang/core.py`.
- Renderers convert nodes into output strings; `to_html` is intentionally small and easy to extend.
- Add new nodes by creating a factory function and supporting it in your target renderer, as sketched below.
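
For example, extending the illustrative sketch from earlier (not the real `core.py` API) with a hypothetical `image` node could look like this:

```python
# Hypothetical extension, built on the illustrative Node sketch above
# (not the real gen_ui_lang API).
from dataclasses import dataclass, field


@dataclass
class Node:
    tag: str
    children: tuple = ()
    props: dict = field(default_factory=dict)


def image(src, **props):
    # 1) New factory function: an image node that stores its source in props.
    return Node("image", (), {"src": src, **props})


def to_html(node):
    # 2) Renderer support: give the new tag its own branch and keep a
    #    generic fallback for every other tag.
    if not isinstance(node, Node):
        return str(node)
    if node.tag == "image":
        return f'<img src="{node.props["src"]}" alt="">'
    inner = "".join(to_html(child) for child in node.children)
    return f'<div class="{node.tag}">{inner}</div>'


print(to_html(Node("card", (image("https://example.com/chart.png"),))))
```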
Please star ⭐ the repository to encourage further development!

