# @helix-agents/ai-sdk

Vercel AI SDK UI binding layer for Helix Agents. Transforms the Helix internal streaming protocol into the AI SDK UI Data Stream protocol for use with `useChat` and other AI SDK React hooks.
## Installation

```bash
npm install @helix-agents/ai-sdk
```

## FrontendHandler

Main handler for frontend requests.

### createFrontendHandler

Factory function to create a handler.
```typescript
import { createFrontendHandler } from '@helix-agents/ai-sdk';

const handler = createFrontendHandler({
  streamManager,      // StreamManager instance
  executor,           // AgentExecutor instance
  agent,              // Agent configuration
  stateStore,         // Optional: for getMessages()
  transformerOptions, // Optional
  logger,             // Optional
});
```

### handleRequest

Handle incoming HTTP requests.
```typescript
// POST - Execute a new agent run
const response = await handler.handleRequest({
  method: 'POST',
  body: {
    message: 'Hello, agent!',
    state: { userId: 'user-123' }, // Optional initial state
  },
});

// GET - Stream an existing execution
const response = await handler.handleRequest({
  method: 'GET',
  streamId: 'run-123',
  resumeAt: 100, // Optional: resume from this sequence number
});
```

Returns:

```typescript
interface FrontendResponse {
  status: number;
  headers: Record<string, string>;
  body: ReadableStream<Uint8Array> | string;
}
```

### getMessages

Load conversation history for `useChat` `initialMessages`.
```typescript
const { messages, hasMore } = await handler.getMessages(sessionId, {
  offset: 0,
  limit: 50,
  includeReasoning: true,
  includeToolResults: true,
  generateId: (index, msg) => `msg-${index}`,
});
```

Returns `GetUIMessagesResult`:

```typescript
interface GetUIMessagesResult {
  messages: UIMessage[];
  hasMore: boolean;
}
```

## StreamTransformer

Transforms Helix stream chunks to AI SDK UI events.
```typescript
import { StreamTransformer } from '@helix-agents/ai-sdk';

const transformer = new StreamTransformer({
  generateMessageId: (agentId) => `msg-${agentId}`,
  includeStepEvents: false,
  chunkFilter: (chunk) => chunk.type !== 'state_patch',
  logger: console,
  // Streaming metadata options
  startMetadata: { requestId: 'req-123', source: 'web-ui' },
  finishMetadata: { model: 'claude-3', totalTokens: 150 },
});

// Transform chunks
for await (const chunk of helixStream) {
  const { events, sequence } = transformer.transform(chunk);
  for (const event of events) {
    yield event;
  }
}

// Finalize (closes any open blocks)
const { events } = transformer.finalize();
```

### StreamTransformerOptions
| Option | Type | Description |
|---|---|---|
| `generateMessageId` | `(agentId: string) => string` | Generate unique message IDs |
| `includeStepEvents` | `boolean` | Include step-start/finish events (default: `false`) |
| `chunkFilter` | `(chunk: StreamChunk) => boolean` | Filter chunks before transformation |
| `startMetadata` | `Record<string, unknown> \| (agentId: string) => Record<string, unknown>` | Metadata for the start event |
| `finishMetadata` | `Record<string, unknown> \| (agentId: string) => Record<string, unknown>` | Metadata for the finish event |
| `logger` | `Logger` | Optional logger instance |
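Because `chunkFilter` is a plain predicate, it can encode arbitrary policies. A minimal sketch, assuming a simplified chunk shape (the real `StreamChunk` type carries more fields):

```typescript
// Simplified stand-in for the StreamChunk type (an assumption for illustration).
type StreamChunk = { type: string; data?: unknown };

// Drop chunks the UI does not need to render.
const chunkFilter = (chunk: StreamChunk): boolean =>
  chunk.type !== 'state_patch' && chunk.type !== 'debug';

const chunks: StreamChunk[] = [
  { type: 'text_delta', data: 'Hello' },
  { type: 'state_patch', data: { path: '/count', value: 1 } },
  { type: 'debug' },
  { type: 'tool_start' },
];

const visible = chunks.filter(chunkFilter).map((c) => c.type);
console.log(visible); // ['text_delta', 'tool_start']
```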
### Streaming Metadata

Add custom metadata to the start and finish events:
```typescript
// Static metadata
const transformer = new StreamTransformer({
  startMetadata: { requestId: 'req-123', environment: 'prod' },
  finishMetadata: { model: 'claude-3', totalTokens: 150 },
});

// Dynamic metadata
const transformer = new StreamTransformer({
  startMetadata: (agentId) => ({
    agentId,
    startedAt: Date.now(),
  }),
  finishMetadata: (agentId) => ({
    agentId,
    finishedAt: Date.now(),
  }),
});
```

Metadata appears in the `messageMetadata` field of the events:

```typescript
// Start event
{
  type: 'start',
  messageId: 'msg-123',
  messageMetadata: { requestId: 'req-123', environment: 'prod' }
}

// Finish event
{
  type: 'finish',
  messageId: 'msg-123',
  finishReason: 'stop',
  messageMetadata: { model: 'claude-3', totalTokens: 150 }
}
```

### Event Mapping
| Helix Chunk | AI SDK Events |
|---|---|
| `text_delta` | `text-start`, `text-delta` |
| `thinking` | `reasoning-start`, `reasoning-delta`, `reasoning-end` |
| `tool_start` | `text-end` (if needed), `tool-input-available` |
| `tool_end` | `tool-output-available` |
| `subagent_start` | `data-subagent-start` |
| `subagent_end` | `data-subagent-end` |
| `custom` | `data-{eventName}` |
| `state_patch` | `data-state-patch` |
| `error` | `error` |
| `output` | `data-output` |
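The table above can be sketched as a lookup from chunk type to event names. This is an illustration of the mapping only, not the transformer's implementation, which also manages block lifecycles, conditional `text-end` insertion, and sequence numbers:

```typescript
// Illustrative chunk-type → AI SDK event-name mapping (simplified; the real
// transformer also opens/closes blocks and assigns sequence numbers).
const EVENT_MAP: Record<string, string[]> = {
  text_delta: ['text-start', 'text-delta'],
  thinking: ['reasoning-start', 'reasoning-delta', 'reasoning-end'],
  tool_start: ['tool-input-available'],
  tool_end: ['tool-output-available'],
  subagent_start: ['data-subagent-start'],
  subagent_end: ['data-subagent-end'],
  state_patch: ['data-state-patch'],
  error: ['error'],
  output: ['data-output'],
};

// Custom chunks map to a dynamic event name.
function eventsFor(chunkType: string, eventName?: string): string[] {
  if (chunkType === 'custom' && eventName) return [`data-${eventName}`];
  return EVENT_MAP[chunkType] ?? [];
}

console.log(eventsFor('thinking'));           // ['reasoning-start', 'reasoning-delta', 'reasoning-end']
console.log(eventsFor('custom', 'progress')); // ['data-progress']
```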
## Message Converter

Convert Helix messages to AI SDK v6 `UIMessage` format.
```typescript
import { convertToAISDKMessages } from '@helix-agents/ai-sdk';

const uiMessages = convertToAISDKMessages(helixMessages, {
  generateId: (index, msg) => `msg-${index}`,
  includeReasoning: true,
  mergeToolResults: true,
});
```

### UIMessage Format (AI SDK v6)
```typescript
interface UIMessage {
  id: string;
  role: 'user' | 'assistant' | 'system';
  parts: UIMessagePart[];
  metadata?: Record<string, unknown>; // Passed through from source messages
}

type UIMessagePart = UIMessageTextPart | UIMessageReasoningPart | UIMessageToolInvocationPart;

interface UIMessageTextPart {
  type: 'text';
  text: string;
}

interface UIMessageReasoningPart {
  type: 'reasoning';
  text: string;
}

interface UIMessageToolInvocationPart {
  type: `tool-${string}`; // e.g., 'tool-search'
  toolCallId: string;
  input: Record<string, unknown>;
  state: ToolInvocationState;
  output?: unknown;
  errorText?: string;
}

type ToolInvocationState =
  | 'input-streaming'
  | 'input-available'
  | 'output-available'
  | 'output-error';
```

## Store Utilities
Load messages directly from a StateStore, with automatic conversion to UI format.

### loadUIMessages

Load messages with pagination, returning AI SDK v6 format.
```typescript
import { loadUIMessages } from '@helix-agents/ai-sdk';

const { messages, hasMore } = await loadUIMessages(stateStore, sessionId, {
  offset: 0,
  limit: 50,
  includeReasoning: true,
  includeToolResults: true,
  generateId: (index, msg) => `msg-${index}`,
});
```

Note: Paginated loading may not merge tool results correctly when a tool call and its result span different pages. Use `loadAllUIMessages` for guaranteed tool result merging.
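The auto-pagination that `loadAllUIMessages` performs can be sketched generically; the loader signature and page size here are assumptions for illustration, not the package's internals:

```typescript
interface Page<T> {
  messages: T[];
  hasMore: boolean;
}

// Generic auto-paginating loop (illustrative; loadAllUIMessages does the
// equivalent internally and additionally merges tool results across pages).
async function loadAll<T>(
  loadPage: (offset: number, limit: number) => Promise<Page<T>>,
  pageSize = 50
): Promise<T[]> {
  const all: T[] = [];
  let offset = 0;
  for (;;) {
    const { messages, hasMore } = await loadPage(offset, pageSize);
    all.push(...messages);
    if (!hasMore) return all;
    offset += messages.length;
  }
}

// Usage with a mock paged loader over 120 items:
const data = Array.from({ length: 120 }, (_, i) => i);
const mockLoader = async (offset: number, limit: number) => ({
  messages: data.slice(offset, offset + limit),
  hasMore: offset + limit < data.length,
});

loadAll(mockLoader).then((all) => console.log(all.length)); // 120
```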
### loadAllUIMessages

Load all messages (auto-paginates), returning AI SDK v6 format. Guarantees correct tool result merging.

```typescript
import { loadAllUIMessages } from '@helix-agents/ai-sdk';

const allMessages = await loadAllUIMessages(stateStore, sessionId, {
  includeReasoning: true,
  includeToolResults: true,
});

// Use with useChat
const { messages } = useChat({
  initialMessages: allMessages,
});
```

### LoadUIMessagesOptions
```typescript
interface LoadUIMessagesOptions {
  // Pagination
  offset?: number; // Starting position (default: 0)
  limit?: number;  // Max messages to return (default: 50)

  // Conversion
  includeReasoning?: boolean;   // Include thinking content (default: true)
  includeToolResults?: boolean; // Merge tool results (default: true)
  generateId?: (index: number, message: Message) => string;
}
```

### LoadUIMessagesResult

```typescript
interface LoadUIMessagesResult<T> {
  messages: T[];
  hasMore: boolean;
}
```

## UIMessageStore
Wrapper around a StateStore for repeated UI message access.

### createUIMessageStore

```typescript
import { createUIMessageStore } from '@helix-agents/ai-sdk';

const uiStore = createUIMessageStore(stateStore);
const { messages, hasMore } = await uiStore.getUIMessages(sessionId, options);
const all = await uiStore.getAllUIMessages(sessionId, options);
```

### UIMessageStore Interface

```typescript
interface UIMessageStore {
  /** Access the underlying state store */
  stateStore: SessionStateStore;

  /** Load messages with pagination */
  getUIMessages(sessionId: string, options?: LoadUIMessagesOptions): Promise<GetUIMessagesResult>;

  /** Load all messages (auto-paginates) */
  getAllUIMessages(sessionId: string, options?: Omit<LoadUIMessagesOptions, 'offset' | 'limit'>): Promise<UIMessage[]>;
}
```

## State Mapping
Functions for converting between the core UIMessage format and the AI SDK v6 format.

### mapToolStateToAISDK

Map a core `UIToolState` to an AI SDK `ToolInvocationState`.

```typescript
import { mapToolStateToAISDK } from '@helix-agents/ai-sdk';

mapToolStateToAISDK('pending');   // 'input-available'
mapToolStateToAISDK('executing'); // 'input-available'
mapToolStateToAISDK('completed'); // 'output-available'
mapToolStateToAISDK('error');     // 'output-error'
```

| Core State | AI SDK State |
|---|---|
| `pending` | `input-available` |
| `executing` | `input-available` |
| `completed` | `output-available` |
| `error` | `output-error` |
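In a chat UI, the two input-side states typically render identically (e.g. a spinner), which is why `pending` and `executing` collapse to the same AI SDK state. A small sketch of status-driven rendering logic; the helper name and labels are illustrative, not part of the package:

```typescript
type ToolInvocationState =
  | 'input-streaming'
  | 'input-available'
  | 'output-available'
  | 'output-error';

// Illustrative helper: what a chat UI might display for each tool state.
function toolStatusLabel(state: ToolInvocationState): string {
  switch (state) {
    case 'input-streaming':
    case 'input-available':
      return 'Running…';
    case 'output-available':
      return 'Done';
    case 'output-error':
      return 'Failed';
  }
}

console.log(toolStatusLabel('input-available')); // 'Running…'
console.log(toolStatusLabel('output-error'));    // 'Failed'
```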
### convertCoreToAISDKMessage

Convert a single core UIMessage to AI SDK v6 format.

```typescript
import { convertCoreToAISDKMessage } from '@helix-agents/ai-sdk';

const aiSdkMessage = convertCoreToAISDKMessage(coreMessage);
```

### convertCoreToAISDKMessages

Convert an array of core UIMessages to AI SDK v6 format.

```typescript
import { convertCoreToAISDKMessages } from '@helix-agents/ai-sdk';

const aiSdkMessages = convertCoreToAISDKMessages(coreMessages);
```

## Multi-turn Conversations
### Continue from Previous Run

```typescript
const response = await handler.handleRequest({
  method: 'POST',
  body: {
    message: 'Follow up question',
    sessionId: 'session-123', // Continue the conversation in this session
  },
});
```

### Provide Message History Directly

```typescript
const response = await handler.handleRequest({
  method: 'POST',
  body: {
    message: 'Continue our conversation',
    messages: [
      { role: 'user', content: 'Previous question' },
      { role: 'assistant', content: 'Previous answer' },
    ],
  },
});
```

### Message Metadata
```typescript
const response = await handler.handleRequest({
  method: 'POST',
  body: {
    message: 'Hello',
    metadata: { source: 'web-ui', userId: 'user-123' },
  },
});
```

## Re-exported Utilities

The following are re-exported from `@helix-agents/core` for convenience:

### Type Guards
```typescript
import {
  isUITextPart,
  isUIReasoningPart,
  isUIToolInvocationPart,
  isUIAssistantMessage,
  isUIUserMessage,
  isUISystemMessage,
} from '@helix-agents/ai-sdk';
```

### Helper Functions

```typescript
import {
  getToolParts,
  getTextContent,
  hasPendingTools,
  hasErroredTools,
} from '@helix-agents/ai-sdk';
```

### Converter Functions

```typescript
import {
  convertToCoreUIMessages, // Alias for core's convertToUIMessages
  buildToolResultMap,
  getAllToolInvocations,
  hasActiveTools,
} from '@helix-agents/ai-sdk';
```

## SSE Response Builder
Build Server-Sent Events responses.

```typescript
import { createSSEStream, createSSEHeaders, buildSSEResponse } from '@helix-agents/ai-sdk';

// Full response
const response = buildSSEResponse(eventsGenerator, {
  headers: { 'X-Custom': 'value' },
});

// Manual construction
const headers = createSSEHeaders({ 'X-Custom': 'value' });
const stream = createSSEStream(eventsGenerator);
```

### SSE Format

```
id: 1
data: {"type":"text-delta","id":"block-1","delta":"Hello"}

id: 2
data: {"type":"text-delta","id":"block-1","delta":" world"}

data: {"type":"finish"}
```

### Header Utilities
```typescript
import {
  AI_SDK_UI_HEADER,       // 'X-AI-SDK-UI'
  AI_SDK_UI_HEADER_VALUE, // 'vercel-ai-sdk-ui'
  extractResumePosition,
} from '@helix-agents/ai-sdk';

// Extract the resume position from the Last-Event-ID header
const lastEventId = request.headers.get('Last-Event-ID');
const resumeAt = extractResumePosition(lastEventId);

// Check whether the request comes from AI SDK UI
const isAISDK = request.headers.get(AI_SDK_UI_HEADER) === AI_SDK_UI_HEADER_VALUE;
```

## Errors

All errors extend `FrontendHandlerError`:
```typescript
import {
  FrontendHandlerError,
  ValidationError,     // 400: Missing/invalid params
  StreamNotFoundError, // 404: Stream doesn't exist
  StreamReaderError,   // 500: Reader creation failed
  StreamFailedError,   // 410: Stream has failed
  ConfigurationError,  // 501: Missing configuration
  ExecutionError,      // 500: Agent execution failed
  StreamCreationError, // 500: Stream creation failed
} from '@helix-agents/ai-sdk';

try {
  await handler.handleRequest(req);
} catch (error) {
  if (error instanceof FrontendHandlerError) {
    return Response.json({ error: error.message, code: error.code }, { status: error.statusCode });
  }
  throw error;
}
```

### Error Properties

```typescript
interface FrontendHandlerError extends Error {
  code: string;       // e.g., 'VALIDATION_ERROR'
  statusCode: number; // HTTP status code
}
```

## Types
### Event Types

```typescript
import type {
  AISDKUIEvent,
  AISDKStartEvent,
  AISDKFinishEvent,
  AISDKTextStartEvent,
  AISDKTextDeltaEvent,
  AISDKTextEndEvent,
  AISDKReasoningStartEvent,
  AISDKReasoningDeltaEvent,
  AISDKReasoningEndEvent,
  AISDKToolInputAvailableEvent,
  AISDKToolOutputAvailableEvent,
  AISDKStartStepEvent,
  AISDKFinishStepEvent,
  AISDKDataEvent,
  AISDKErrorEvent,
} from '@helix-agents/ai-sdk';
```

### Configuration Types

```typescript
import type {
  StreamTransformerOptions,
  FrontendHandlerOptions,
  FrontendRequest,
  FrontendResponse,
  TransformResult,
  SequencedEvent,
  MessageConvertOptions,
} from '@helix-agents/ai-sdk';
```

## Complete Example
### Backend (Hono)

```typescript
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { createFrontendHandler, FrontendHandlerError } from '@helix-agents/ai-sdk';
import { JSAgentExecutor } from '@helix-agents/runtime-js';
import { InMemoryStateStore, InMemoryStreamManager } from '@helix-agents/store-memory';
import { VercelAIAdapter } from '@helix-agents/llm-vercel';
import { MyAgent } from './agent.js';

const stateStore = new InMemoryStateStore();
const streamManager = new InMemoryStreamManager();
const executor = new JSAgentExecutor(stateStore, streamManager, new VercelAIAdapter());

const handler = createFrontendHandler({
  streamManager,
  executor,
  agent: MyAgent,
  stateStore,
});

const app = new Hono();

app.use(
  '/api/*',
  cors({
    origin: ['http://localhost:3000'],
    allowHeaders: ['Content-Type', 'Last-Event-ID'],
    exposeHeaders: ['X-Session-Id'],
  })
);

app.post('/api/chat', async (c) => {
  try {
    const body = await c.req.json();
    const response = await handler.handleRequest({
      method: 'POST',
      body: { message: body.message, state: body.state },
    });
    return new Response(response.body, {
      status: response.status,
      headers: response.headers,
    });
  } catch (error) {
    if (error instanceof FrontendHandlerError) {
      return c.json({ error: error.message, code: error.code }, error.statusCode);
    }
    throw error;
  }
});

app.get('/api/messages/:sessionId', async (c) => {
  const sessionId = c.req.param('sessionId');
  const { messages, hasMore } = await handler.getMessages(sessionId);
  return c.json({ messages, hasMore });
});

export default app;
```

### Frontend (React)
```tsx
import { useChat } from 'ai/react';

function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat({
    api: '/api/chat',
  });

  return (
    <div>
      {messages.map((msg) => (
        <div key={msg.id}>
          {msg.parts.map((part, i) => (
            <MessagePart key={i} part={part} />
          ))}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} disabled={isLoading} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
```
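The `MessagePart` component used above is left undefined. A minimal sketch of the per-part dispatch it would implement, written here as a plain function that renders each part type to a string so the logic stands alone; the loose `Part` shape and output format are illustrative simplifications of the `UIMessagePart` types documented earlier:

```typescript
// Loosely-typed stand-in for UIMessagePart (the real union is stricter).
interface Part {
  type: string; // 'text', 'reasoning', or a dynamic 'tool-*' type
  text?: string;
  toolCallId?: string;
  state?: string;
  output?: unknown;
  errorText?: string;
}

// Illustrative dispatch logic for a MessagePart component, rendered to strings.
function renderPart(part: Part): string {
  if (part.type === 'text') return part.text ?? '';
  if (part.type === 'reasoning') return `(thinking) ${part.text ?? ''}`;
  // Remaining case: a tool invocation part with a dynamic `tool-*` type.
  if (part.state === 'output-error') return `[${part.type} failed: ${part.errorText}]`;
  if (part.state === 'output-available') return `[${part.type}: ${JSON.stringify(part.output)}]`;
  return `[${part.type} running…]`;
}

console.log(renderPart({ type: 'text', text: 'Hello' })); // 'Hello'
console.log(renderPart({ type: 'tool-search', toolCallId: 't1', state: 'output-available', output: 3 })); // '[tool-search: 3]'
```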