karol-pieciukiewicz/langchain-sql-agent-with-memory

SQL AI App as MCP + Open WebUI Frontend

This project provides:

  • LangGraph/LangChain SQL AI agent
  • FastAPI backend API (/chat, /health) for diagnostics
  • MCP server facade (MCP over Streamable HTTP) exposing chat_agent
  • Open WebUI as primary frontend (using Azure AI Foundry / Azure OpenAI model provider)
  • Postgres-backed LangGraph short-term + long-term memory

MCP Tool Contract

Tool name: chat_agent

Inputs:

  • user_id (required)
  • message (required)
  • thread_id (optional)
  • db_path (optional, SQLite file path)
  • db_uri (optional, SQLAlchemy URI e.g. SQL Server)

Outputs:

  • answer
  • user_id
  • thread_id
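The contract above can be sketched as a small payload validator. This is an illustrative helper, not code from the repo; only the field names come from the contract:

```python
# Field names taken from the chat_agent tool contract above.
REQUIRED = {"user_id", "message"}
OPTIONAL = {"thread_id", "db_path", "db_uri"}

def validate_chat_agent_args(args: dict) -> dict:
    """Raise ValueError on missing or unknown fields; return args unchanged."""
    missing = REQUIRED - args.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    unknown = args.keys() - REQUIRED - OPTIONAL
    if unknown:
        raise ValueError(f"unknown fields: {sorted(unknown)}")
    return args

# Example tool-call arguments (values are hypothetical):
payload = validate_chat_agent_args({
    "user_id": "demo-user",
    "message": "Which product sold best last month?",
    "thread_id": "thread-123",  # optional: reuse the returned thread_id on follow-ups
})
```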

Quickstart

  1. Configure env:
cp .env.example .env
  2. Download the sample SQLite DB:
bash scripts/download_adventureworks.sh
  3. Ensure .env has the Azure model connection values:
  • AZURE_OPENAI_API_KEY
  • OPENWEBUI_OPENAI_BASE_URL (OpenAI-compatible base URL, e.g. https://<resource>.openai.azure.com/openai/v1)
  • OPENWEBUI_MODEL_ID set to your Foundry deployment (for example gpt-5.2)
  4. Start the stack:
docker compose up -d --build
  5. If you previously ran Open WebUI with other providers, reset the persisted UI config:
docker compose down -v
docker compose up -d --build
  6. Open Open WebUI:
  • http://localhost:3000
  7. Register the MCP server in Open WebUI:
  • URL: http://mcp-server:8001/mcp (from inside the Docker network)
  • If your Open WebUI MCP form expects a URL reachable from the browser network, use http://localhost:8001/mcp instead.
  8. In Open WebUI, select your Azure Foundry model/deployment and invoke MCP tools from chat.

Open WebUI Tool-Routing Policy (Smart Auto-Call)

Set your model/system prompt in Open WebUI to include this policy:

You have an MCP tool named chat_agent. Use it whenever user memory or SQL may be involved.

Always call chat_agent for:
- user profile statements: "my name is ...", "I prefer ...", "remember that ..."
- memory recall requests: "what is my name?", "what do you remember about me?"
- Company Store SQL/database questions.

Do not answer from your own context if memory may be relevant. Ask chat_agent first.
Pass through user_id and thread_id exactly as provided by tool context.
Reuse returned thread_id for follow-up turns.

Why: with smart auto-call enabled, Open WebUI may skip the tool call when the tool looks SQL-only. The MCP metadata plus this policy ensure that memory-capture and memory-recall turns are also routed to chat_agent.

Database Options (SQLite + SQL Server)

You can run the agent against either database type:

  • SQLite (default): use bundled APP_DEFAULT_SQLITE_DB_PATH or pass db_path per request/tool call.
  • SQL Server: set APP_DEFAULT_SQL_URI (SQLAlchemy URI) or pass db_uri per request/tool call.

Selection precedence:

  1. db_uri from request/tool call
  2. APP_DEFAULT_SQL_URI (if configured)
  3. db_path from request/tool call or APP_DEFAULT_SQLITE_DB_PATH

Restrict to GPT-5.2

  • Set OPENWEBUI_MODEL_ID=gpt-5.2 in .env (or your exact Foundry deployment name).
  • Restart with clean config:
docker compose down -v
docker compose up -d --build
  • Optional hard filter in UI (recommended to hide all others): Admin Settings -> Connections -> OpenAI -> Model IDs (Filter) -> gpt-5.2

Run MCP server locally (without Docker Open WebUI)

mcp-server

Optional backend API (diagnostics)

uvicorn backend.main:app --reload --port 8000

Troubleshooting Azure OpenAI 404

If tool calls fail with 404 Resource not found, check:

  • AZURE_OPENAI_ENDPOINT should be https://<resource>.openai.azure.com
  • AZURE_OPENAI_DEPLOYMENT must be exact deployment name
  • AZURE_OPENAI_API_VERSION must match deployment support
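The three checks above can be run as a preflight before starting the stack. This helper is illustrative, not part of the repo; only the variable names come from the list above:

```python
import re

def check_azure_env(endpoint: str, deployment: str, api_version: str) -> list[str]:
    """Return a list of problems with the Azure OpenAI settings (empty if OK)."""
    problems = []
    if not re.fullmatch(r"https://[\w-]+\.openai\.azure\.com/?", endpoint or ""):
        problems.append("AZURE_OPENAI_ENDPOINT should look like "
                        "https://<resource>.openai.azure.com")
    if not deployment:
        problems.append("AZURE_OPENAI_DEPLOYMENT must be the exact deployment name")
    if not re.fullmatch(r"\d{4}-\d{2}-\d{2}(-preview)?", api_version or ""):
        problems.append("AZURE_OPENAI_API_VERSION looks malformed (e.g. 2024-06-01)")
    return problems
```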
