Frontend Integration Overview

Frontend integration connects your Helix Agents backend to user interfaces. The @helix-agents/ai-sdk package transforms Helix's streaming protocol into the Vercel AI SDK UI Data Stream format, enabling real-time chat UIs with useChat and other AI SDK hooks.

Why Frontend Integration Matters

AI agents produce:

  • Streaming text - Token-by-token responses
  • Tool invocations - Tools being called and their results
  • Thinking/reasoning - Internal reasoning traces
  • Custom events - Application-specific data
  • Final outputs - Structured results

Frontend integration delivers these events to your UI in real-time, enabling:

  • Progressive text rendering
  • Tool call visualization
  • Thinking/reasoning displays
  • Status indicators
  • Error handling

Architecture

mermaid
graph TB
    subgraph Backend ["Backend"]
        Executor["<b>Agent Executor</b><br/>Helix Chunks"]
        StreamMgr["<b>Stream Manager</b><br/>Helix Chunks"]
        Handler["<b>Frontend Handler</b><br/>Transform ↓<br/>AI SDK Events"]

        Executor --> StreamMgr --> Handler
    end

    Handler -->|"SSE (Server-Sent Events)"| UseChat

    subgraph Frontend ["Frontend"]
        UseChat["<b>useChat Hook (Vercel AI SDK)</b><br/>messages, input, handleSubmit, ..."]

        subgraph Components ["React Components"]
            C1["Message list"]
            C2["Tool call displays"]
            C3["Input form"]
        end

        UseChat --> Components
    end

Deployment Modes

The FrontendHandler works in two deployment modes:

| Mode      | When to Use                                  | Backend                             |
| --------- | -------------------------------------------- | ----------------------------------- |
| Direct    | API routes run in the same process as agent  | Redis, Memory, or direct DO binding |
| DO Client | Frontend separate from Durable Objects       | HTTP to AgentServer DO              |

Direct Mode is simpler - your API route has direct access to stores:

typescript
// Same process: direct store access
const handler = createFrontendHandler({ executor, streamManager, stateStore, agent });

DO Mode uses HTTP client wrappers for remote access:

typescript
// Separate frontend: use DO client wrappers
const executor = new DOAgentExecutor({ client: doClient, agentName: 'chat' });
const handler = createFrontendHandler({ executor, streamManager, stateStore, agent });

See FrontendHandler Deployment Modes for detailed setup.

Server-Sent Events (SSE)

SSE provides real-time, server-to-client streaming:

mermaid
sequenceDiagram
    participant Server
    participant Client

    Server->>Client: text-delta: "Hello "
    Server->>Client: text-delta: "world"
    Server->>Client: tool-input-available
    Server->>Client: tool-output-available
    Server->>Client: finish

Benefits over WebSockets:

  • Simpler (one-way communication)
  • Automatic reconnection
  • Works through proxies/CDNs
  • Native browser support
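
On the wire, each SSE event is a plain-text frame: optional id: and event: fields, one or more data: lines, and a terminating blank line. The sketch below (illustrative only; EventSource and the AI SDK client parse this for you) shows how those frames decompose:

```typescript
// Minimal SSE frame parser -- illustration only; EventSource and the
// AI SDK client handle this for you. Splits a raw SSE stream into
// { id, event, data } records. Frames are separated by a blank line.
interface SSEEvent {
  id?: string;
  event?: string;
  data: string;
}

function parseSSE(raw: string): SSEEvent[] {
  const events: SSEEvent[] = [];
  for (const frame of raw.split('\n\n')) {
    if (!frame.trim()) continue;
    const evt: SSEEvent = { data: '' };
    const dataLines: string[] = [];
    for (const line of frame.split('\n')) {
      if (line.startsWith('id:')) evt.id = line.slice(3).trim();
      else if (line.startsWith('event:')) evt.event = line.slice(6).trim();
      else if (line.startsWith('data:')) dataLines.push(line.slice(5).trim());
    }
    evt.data = dataLines.join('\n');
    events.push(evt);
  }
  return events;
}
```

The id: field is what makes resumability work: on reconnect the browser replays the last seen id in the Last-Event-ID request header.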

Stream Protocol Transformation

Helix internal chunks are transformed to AI SDK UI events:

| Helix Chunk | AI SDK Event          | Notes                   |
| ----------- | --------------------- | ----------------------- |
| text_delta  | text-delta            | Token-by-token text     |
| thinking    | reasoning-delta       | Reasoning traces        |
| tool_start  | tool-input-available  | Tool call complete      |
| tool_end    | tool-output-available | Tool result             |
| custom      | data-{eventName}      | Custom events           |
| error       | error                 | Error events            |
| output      | data-output           | Final structured output |
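
At its core the mapping above is a lookup. The sketch below is illustrative only; the shipped StreamTransformer also assigns part IDs, frames start/end events, and buffers tool input:

```typescript
// Toy chunk-type -> AI SDK event-type mapping, mirroring the table above.
// Illustration only: the real StreamTransformer does much more than this
// (part IDs, start/end framing, buffering).
function mapChunkType(chunkType: string, eventName?: string): string {
  switch (chunkType) {
    case 'text_delta':
      return 'text-delta';
    case 'thinking':
      return 'reasoning-delta';
    case 'tool_start':
      return 'tool-input-available';
    case 'tool_end':
      return 'tool-output-available';
    case 'custom':
      return `data-${eventName}`; // custom events carry their own name
    case 'error':
      return 'error';
    case 'output':
      return 'data-output';
    default:
      throw new Error(`Unknown chunk type: ${chunkType}`);
  }
}
```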

Quick Start

Backend Setup

typescript
import { createFrontendHandler } from '@helix-agents/ai-sdk';
import { JSAgentExecutor } from '@helix-agents/runtime-js';
import { InMemoryStateStore, InMemoryStreamManager } from '@helix-agents/store-memory';
import { VercelAIAdapter } from '@helix-agents/llm-vercel';

// Create stores and executor
const stateStore = new InMemoryStateStore();
const streamManager = new InMemoryStreamManager();
const executor = new JSAgentExecutor(stateStore, streamManager, new VercelAIAdapter());

// Create frontend handler
const handler = createFrontendHandler({
  streamManager,
  executor,
  agent: MyAgent,
  stateStore, // Optional: for getMessages()
});

// Use with your framework
// See Framework Examples for Express, Hono, etc.

Frontend Setup

typescript
import { useState } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

function ChatComponent() {
  const [input, setInput] = useState('');
  const { messages, sendMessage, status } = useChat({
    transport: new DefaultChatTransport({ api: '/api/chat' }),
  });

  return (
    <div>
      {messages.map((msg) => (
        <Message key={msg.id} message={msg} />
      ))}

      <form
        onSubmit={(e) => {
          e.preventDefault();
          sendMessage({ text: input });
          setInput('');
        }}
      >
        <input value={input} onChange={(e) => setInput(e.target.value)} />
        <button type="submit" disabled={status !== 'ready'}>
          Send
        </button>
      </form>
    </div>
  );
}

Package Contents

The @helix-agents/ai-sdk package provides:

FrontendHandler

Main class that handles HTTP requests:

typescript
const handler = createFrontendHandler({
  streamManager,
  executor,
  agent: MyAgent,
});

// Two modes:
// POST - Execute new agent and stream response
// GET  - Stream existing execution (resume)
const response = await handler.handleRequest({
  method: 'POST',
  body: { message: 'Hello' },
});

StreamTransformer

Converts Helix chunks to AI SDK events:

typescript
import { StreamTransformer } from '@helix-agents/ai-sdk';

const transformer = new StreamTransformer();

for await (const chunk of helixStream) {
  const { events } = transformer.transform(chunk);
  for (const event of events) {
    yield event;
  }
}

Message Converter

Converts Helix messages to AI SDK v6 UIMessage format:

typescript
import { convertToUIMessages } from '@helix-agents/ai-sdk';

const helixMessages = await stateStore.getMessages(sessionId);
const uiMessages = convertToUIMessages(helixMessages.messages);

// Pass as initial messages to useChat
const { messages } = useChat({
  messages: uiMessages,
});

Typed Errors

Structured error handling:

typescript
import {
  FrontendHandlerError,
  ValidationError,
  StreamNotFoundError,
  StreamFailedError,
} from '@helix-agents/ai-sdk';

try {
  await handler.handleRequest(req);
} catch (error) {
  if (error instanceof FrontendHandlerError) {
    return Response.json({ error: error.message, code: error.code }, { status: error.statusCode });
  }
  throw error;
}

Framework Compatibility

The handler produces framework-agnostic responses:

typescript
interface FrontendResponse {
  status: number;
  headers: Record<string, string>;
  body: ReadableStream<Uint8Array> | string;
}

This works with any HTTP framework:

  • Express
  • Hono
  • Fastify
  • Cloudflare Workers
  • Vercel Functions
  • AWS Lambda
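
For frameworks that speak the Web-standard Request/Response API (Hono, Cloudflare Workers, Vercel Functions), the adaptation is a few lines. The helper below is illustrative, not part of the package:

```typescript
// Adapt a framework-agnostic FrontendResponse to a Web-standard Response.
// FrontendResponse matches the interface above; toWebResponse is an
// illustrative helper, not a package export.
interface FrontendResponse {
  status: number;
  headers: Record<string, string>;
  body: ReadableStream<Uint8Array> | string;
}

function toWebResponse(res: FrontendResponse): Response {
  // new Response() accepts both a string and a ReadableStream body,
  // so streaming SSE bodies pass through untouched.
  return new Response(res.body, {
    status: res.status,
    headers: res.headers,
  });
}
```

Node-callback frameworks like Express instead write status and headers to their response object and pipe the body stream.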

Stream Resumability

SSE supports automatic reconnection. The handler uses event IDs for resumability:

typescript
// Client reconnects with Last-Event-ID header
// Handler resumes from that position
const response = await handler.handleRequest({
  method: 'GET',
  streamId: 'run-123',
  resumeAt: lastEventId, // From Last-Event-ID
});

This enables:

  • Network resilience
  • Client reconnection
  • Page refresh handling
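
A resume endpoint might wire the header through like this. This is a sketch: the ResumableHandler interface and route shape are assumptions standing in for the real FrontendHandler types:

```typescript
// Sketch of a GET route that resumes an SSE stream from Last-Event-ID.
// ResumableHandler is a hypothetical minimal interface standing in for
// the real FrontendHandler.
interface ResumableHandler {
  handleRequest(req: {
    method: 'GET';
    streamId: string;
    resumeAt?: string;
  }): Promise<{ status: number; headers: Record<string, string>; body: unknown }>;
}

async function resumeRoute(
  handler: ResumableHandler,
  streamId: string,
  requestHeaders: Record<string, string>
) {
  // EventSource sends Last-Event-ID automatically when it reconnects
  const lastEventId = requestHeaders['last-event-id'];
  return handler.handleRequest({
    method: 'GET',
    streamId,
    resumeAt: lastEventId, // undefined -> stream from the beginning
  });
}
```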

Released under the MIT License.