
Conversation

@dsfaccini (Contributor) commented Nov 17, 2025

web-based chat interface for Pydantic AI agents

  1. new module pydantic_ai.ui.web
  2. new method Agent.to_web()

fastapi

  • app = create_chat_app(agent)

  • the following endpoints come preconfigured:

    • GET / and /:id - serve the chat UI
    • POST /api/chat - main chat endpoint using VercelAIAdapter
    • GET /api/configure - returns available models and builtin tools
    • GET /api/health - health check
    • NOTE: I'm counting on FastAPI to complain if the user tries to add conflicting routes; otherwise we could add a warning in the respective docs.

options and example

NOTE: the module for options is currently pydantic_ai.ui.web.

  • pre-configured model options:

    • anthropic:claude-sonnet-4-5
    • openai-responses:gpt-5
    • google-gla:gemini-2.5-pro
  • supported builtin tools:

    • web_search
    • code_execution
    • image_generation
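Taken together, the pre-configured options above suggest a response shape for GET /api/configure along these lines (a sketch only; the field names and exact schema are assumptions, not taken from this PR):

```python
# Sketch of a plausible GET /api/configure payload built from the
# pre-configured model options and builtin tools listed above.
# Field names ('models', 'builtin_tools') are assumptions.
CONFIGURE_RESPONSE = {
    'models': [
        'anthropic:claude-sonnet-4-5',
        'openai-responses:gpt-5',
        'google-gla:gemini-2.5-pro',
    ],
    'builtin_tools': ['web_search', 'code_execution', 'image_generation'],
}

print(sorted(CONFIGURE_RESPONSE))  # ['builtin_tools', 'models']
```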
# app.py
import logfire
from pydantic_ai import Agent

logfire.configure(send_to_logfire='if-token-present')
logfire.instrument_pydantic_ai()

agent = Agent('openai:gpt-5')

@agent.tool
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny"

app = agent.to_web()

logfire.instrument_fastapi(app, capture_headers=True)

# Run with: uvicorn app:app

testing

  • 7 tests in tests/test_ui_web.py

notes

  • UI is served from CDN: @pydantic/[email protected]
  • Uses Vercel AI protocol for chat streaming
  • TODO: add clai web command to launch from the CLI (as in uvx pydantic-work without the whole URL magic)
  • TODO: should I add a new doc at docs/ui/to_web.md? I'd also reference this in docs/ui/overview.md and docs/agents.md

EDIT: if you try it out, note that the current hosted UI doesn't handle ErrorChunks, so you'll get no spinner and no response when there's a model-level error, and FastAPI will return a 200 anyway. This happens, for instance, when you use a model for which you don't have a valid API key in your environment.
I opened a PR for the error chunks here: pydantic/ai-chat-ui#4.
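For illustration, a model-level error could be surfaced to the UI as an error chunk in the stream rather than a bare 200. This is only a sketch: the exact wire format of error parts is defined by the Vercel AI protocol and the UI, not by this snippet.

```python
import json

def error_chunk(message: str) -> str:
    # Hypothetical error part serialized as a server-sent event; the real
    # shape must match what the Vercel AI protocol / the hosted UI expects.
    payload = {'type': 'error', 'errorText': message}
    return f'data: {json.dumps(payload)}\n\n'

print(error_chunk('Invalid API key'))
```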

Closes #3295

@@ -0,0 +1,186 @@
"""Agent discovery using AST parsing to find pydantic_ai.Agent objects."""
Collaborator:

This is cool but I think it's too much magic, at least for the first version of this feature (I wouldn't throw the code away; we could consider it as a separate PR later). I think the existing -a AGENT flag is sufficient for now.

args = parser.parse_args(args_list)

# Handle web subcommand
if args.command == 'web':
Collaborator:

Maybe it should be --web so it doesn't conflict with the prompt arg?
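To illustrate the conflict: assuming clai takes the prompt as a positional argument (the premise of this comment), a --web flag parses cleanly alongside it, whereas a web subcommand would swallow a literal prompt of "web". A minimal argparse sketch:

```python
import argparse

# Minimal sketch: a --web flag coexists with a positional prompt argument,
# so `clai --web` and `clai "hello"` stay unambiguous.
parser = argparse.ArgumentParser(prog='clai')
parser.add_argument('prompt', nargs='?', help='prompt to send to the agent')
parser.add_argument('--web', action='store_true', help='launch the web chat UI')

args = parser.parse_args(['--web'])
print(args.web, args.prompt)  # True None
```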


self._get_toolset().apply(_set_sampling_model)

def to_web(self) -> Any:
Collaborator:

We're gonna need some args here -- have a look at the to_a2a and to_ag_ui methods. Not saying we need all of those args, but some may be useful

This method returns a pre-configured FastAPI application that provides a web-based
chat interface for interacting with the agent. The UI is served from a CDN and
includes support for model selection and builtin tool configuration.
Collaborator:

The idea is that the UI would be downloaded and cached on first use, so it's not meant to always be "served from a CDN"

@@ -0,0 +1,75 @@
"""Model and builtin tool configurations for the web chat UI."""
Collaborator:

The goal is for the developer to be able to pass a list of models and builtin tools to to_web, instead of anything hard-coded they may not have keys for. We'd need a new field like supported_builtin_tools on ModelProfile to store for each model/provider which builtin tools it supports. Then we can automatically generate the structures below based on the user-provided data and which tools work with which models.
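A rough sketch of that design, with illustrative names (supported_builtin_tools is the proposed field; everything else here is hypothetical):

```python
from dataclasses import dataclass, field

# Sketch: each model profile declares which builtin tools it supports, so the
# /api/configure structures can be generated from developer-provided models
# instead of being hard-coded. All names here are illustrative.

@dataclass
class ModelProfile:
    name: str
    supported_builtin_tools: list[str] = field(default_factory=list)

def build_configure(profiles: list[ModelProfile]) -> dict:
    # Union of tools across the given models, so the UI only offers
    # tools that at least one selected model supports.
    tools = sorted({t for p in profiles for t in p.supported_builtin_tools})
    return {'models': [p.name for p in profiles], 'builtin_tools': tools}

profiles = [
    ModelProfile('openai-responses:gpt-5', ['web_search', 'code_execution', 'image_generation']),
    ModelProfile('anthropic:claude-sonnet-4-5', ['web_search', 'code_execution']),
]
print(build_configure(profiles)['builtin_tools'])
# ['code_execution', 'image_generation', 'web_search']
```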

name: str


BUILTIN_TOOL_DEFS: list[BuiltinTool] = [
Collaborator:

Friendly names for builtin tools can go in builtin_tools.py

from .agent_options import AIModel, BuiltinTool
from .api import create_api_router

CDN_URL = 'https://cdn.jsdelivr.net/npm/@pydantic/ai-chat-ui/dist/index.html'
Collaborator:

I think we should pin a specific version, in case we make backward-incompatible changes at some point

async def index(request: Request):  # pyright: ignore[reportUnusedFunction]
    """Serve the chat UI from CDN."""
    async with httpx.AsyncClient() as client:
        response = await client.get(CDN_URL)
Collaborator:

We should cache this somewhere

response = client.get('/api/configure')
assert response.status_code == 200
data = response.json()
assert 'models' in data
Collaborator:

Use a snapshot please :)


Development

Successfully merging this pull request may close these issues.

Web chat interface for any agent
