AG-UI: The Event Protocol That Connects AI Agents to User Interfaces
A deep dive into AG-UI (Agent-User Interaction Protocol), the open event-based protocol from CopilotKit that standardizes real-time streaming, state synchronization, and human-in-the-loop patterns between AI agents and frontend applications.
Building a user interface that works with an AI agent—not just after it—is fundamentally different from building a traditional request-response UI. The agent is thinking, calling tools, updating state, and generating output all at the same time. If your UI blocks until the agent finishes, you've recreated the spinner. If you try to stream tokens directly, you lose tool call visibility, shared state, and any hope of letting the user interact mid-flight.
AG-UI (Agent-User Interaction Protocol), announced by CopilotKit on May 12, 2025, solves this with a single clean abstraction: a unified event stream. Your client makes one HTTP POST to the agent endpoint, then listens to a sequence of typed JSON events. Each event carries the minimum payload needed for the UI to update in real time—partial text, tool call signals, state patches, lifecycle markers. The UI reacts to events as they arrive, not when the agent finishes.
With 13,000+ GitHub stars and adapters for LangChain, LangGraph, Mastra, CrewAI, Pydantic AI, AG2, and more, AG-UI has become the de facto standard for streaming AI agent interactions to web frontends.
The Core Problem: Why Existing Approaches Fall Short
Before AG-UI, teams building agent-integrated UIs faced a painful menu of options:
Raw WebSockets with custom JSON: Works, but every team invents their own message schema, thread IDs, error handling, and cancellation protocol. Migrating to a new agent framework means rewriting the entire transport layer.
SSE token streaming (like OpenAI's streaming API): Great for text, useless for tool calls, state management, multi-agent orchestration, or anything that isn't a flat text stream.
Polling the agent backend: Simple to implement, terrible UX. Users wait seconds between updates.
Custom WebSocket protocols per framework: LangChain has one protocol, CrewAI has another, LangGraph has a third. Building a UI that works with multiple agents means maintaining multiple adapters.
AG-UI provides a single protocol that any agent framework can emit and any frontend can consume. The analogy is HTTP: you don't need to know whether a server is running Express, FastAPI, or Rails—you just speak HTTP. AG-UI is HTTP for agent-UI communication.
The Event Model
AG-UI defines approximately 16 standard event types, organized into logical groups:
Lifecycle Events
```ts
// Agent starts processing
{ type: 'RUN_STARTED', threadId: 'thread_abc', runId: 'run_xyz' }

// Agent finishes (success)
{ type: 'RUN_FINISHED', threadId: 'thread_abc', runId: 'run_xyz' }

// Agent errored
{ type: 'RUN_ERROR', threadId: 'thread_abc', message: 'Context limit exceeded', code: 'CONTEXT_OVERFLOW' }
```

These are the bookends of every agent run. `threadId` tracks the conversation; `runId` tracks the specific inference call within that thread. If a user fires multiple queries, each gets its own `runId` under the same `threadId`.
Text Streaming Events
```ts
// Start of a new assistant message
{ type: 'TEXT_MESSAGE_START', messageId: 'msg_1', role: 'assistant' }

// Incremental token chunk
{ type: 'TEXT_MESSAGE_CONTENT', messageId: 'msg_1', delta: 'Here is a summary of' }

// Message complete
{ type: 'TEXT_MESSAGE_END', messageId: 'msg_1' }
```

This three-event pattern (start, delta, end) enables reliable UI buffering. The `messageId` allows multiple concurrent messages to interleave without the UI losing track of which delta belongs to which bubble.
Tool Call Events
```ts
// Agent begins a tool call
{
  type: 'TOOL_CALL_START',
  toolCallId: 'tc_1',
  toolName: 'search_web',
  parentMessageId: 'msg_1'
}

// Tool arguments stream in (LLMs generate JSON arguments incrementally)
{
  type: 'TOOL_CALL_ARGS',
  toolCallId: 'tc_1',
  delta: '{"query": "A2UI pr'
}

// Tool arguments complete
{
  type: 'TOOL_CALL_END',
  toolCallId: 'tc_1'
}
```

The UI can react to `TOOL_CALL_START` immediately—showing a "Searching the web..." spinner before the arguments have even finished streaming. When `TOOL_CALL_END` fires, the UI knows the arguments are complete and can show a structured "tool: args" display.
State Events
```ts
// Replace the entire shared state snapshot
{
  type: 'STATE_SNAPSHOT',
  snapshot: {
    plan: ['Analyze data', 'Generate report', 'Format output'],
    currentStep: 0,
    artifacts: []
  }
}

// Incremental JSON Patch update to shared state
{
  type: 'STATE_DELTA',
  delta: [
    { op: 'replace', path: '/currentStep', value: 1 },
    { op: 'add', path: '/artifacts/-', value: { id: 'report_1', type: 'markdown' } }
  ]
}
```

This is where AG-UI goes far beyond simple streaming. The agent and frontend share a mutable state object that both can read and write. The agent emits `STATE_SNAPSHOT` at the start of a run and `STATE_DELTA` (JSON Patch format) as state evolves. The frontend applies patches and re-renders reactively. This enables live progress tracking, multi-step plan visualization, artifact management, and any pattern where the UI needs to reflect the agent's internal state.
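For illustration, here is a minimal applier for the two JSON Patch (RFC 6902) operations shown above. A production client would use a full RFC 6902 implementation rather than this sketch:

```typescript
// Illustrative: apply 'replace' and 'add' JSON Patch ops to a state snapshot.
// Covers only the op shapes used in the STATE_DELTA example above.
type PatchOp = { op: 'replace' | 'add'; path: string; value: unknown };

function applyDelta(state: any, delta: PatchOp[]): any {
  const next = structuredClone(state); // never mutate the previous snapshot
  for (const { op, path, value } of delta) {
    const parts = path.split('/').slice(1); // '/currentStep' -> ['currentStep']
    const last = parts.pop()!;
    const parent = parts.reduce((obj, key) => obj[key], next);
    if (op === 'add' && last === '-' && Array.isArray(parent)) {
      parent.push(value); // '/artifacts/-' appends to the array
    } else {
      parent[last] = value;
    }
  }
  return next;
}
```

Returning a fresh object rather than mutating in place is what lets reactive frameworks detect the change and re-render.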
Human-in-the-Loop Events
```ts
// Agent pauses, waiting for user input
{
  type: 'INTERRUPT',
  interruptId: 'int_1',
  reason: 'confirmation_required',
  prompt: 'I found 47 matching records. Should I delete all of them?',
  options: ['yes', 'no', 'show_me_first']
}
```

INTERRUPT is the event that makes human-in-the-loop workflows practical. The agent pauses, the UI shows a confirmation prompt or an approval dialog, the user responds, and the run resumes. No polling required; no special backend logic beyond emitting the event and awaiting the response.
Transport Layer
AG-UI is transport-agnostic by design. The reference implementation uses standard HTTP with Server-Sent Events (SSE):
- Client POSTs to `<agent_endpoint>` with the message payload
- Server responds with `Content-Type: text/event-stream`
- Events stream as `data: <JSON>\n\n` pairs until `RUN_FINISHED` or `RUN_ERROR`
For performance-critical applications, AG-UI also supports a binary serialization format (MessagePack) over WebSocket, which reduces payload size by 30–60% for high-frequency state delta workloads.
The single-POST design is important: it works through any reverse proxy, load balancer, or CDN that supports streaming responses. No upgrade handshake, no special WebSocket routing rules. If your infrastructure can serve a streaming HTTP response, it can serve AG-UI.
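The client side of this transport can be sketched as a small parser over the SSE wire format. A real client would read chunks from a `fetch()` `ReadableStream` and keep a carry-over buffer for events split across chunk boundaries; this simplified version assumes each chunk contains whole `data: <JSON>\n\n` events:

```typescript
// Illustrative: parse an SSE chunk into AG-UI events. Assumes complete
// events per chunk; a real client buffers partial events across chunks.
type AgUiEvent = { type: string; [key: string]: unknown };

function parseSseChunk(chunk: string): AgUiEvent[] {
  return chunk
    .split('\n\n')                                 // events are separated by blank lines
    .map((block) => block.trim())
    .filter((block) => block.startsWith('data: ')) // ignore comments/empty blocks
    .map((block) => JSON.parse(block.slice('data: '.length)) as AgUiEvent);
}
```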
Thread and Run Management
Each conversation is a thread (threadId). Each turn within the conversation is a run (runId). The client tracks both:
```ts
const currentThread = useRef<string>(crypto.randomUUID());

async function sendMessage(content: string) {
  const runId = crypto.randomUUID();
  await postToAgent({
    threadId: currentThread.current,
    runId,
    messages: [...history, { role: 'user', content }]
  });
}
```