
v7 to v8 Migration Guide — FrontendHandler Removal

This guide covers the v8 release of @helix-agents/ai-sdk. v8 is a focused breaking release that removes the deprecated FrontendHandler class + createFrontendHandler factory and their Cloudflare convenience wrapper. All replacement APIs (handleChatStream, buildSnapshot, getUIMessages, createCloudflareChatHandler) shipped in v7 and have been the recommended path since then; v8 simply deletes the old surface.

If you migrated to the new APIs as part of the v6 → v7 upgrade, v8 is a no-op for you — bump the version, run npm install, and check your build. If you are still using createFrontendHandler directly or the Cloudflare convenience factory, this guide is for you.


Table of contents

  1. Overview / motivation
  2. What was removed
  3. Migration cookbook
  4. Observable behavior changes
  5. Validation checklist

Overview / motivation

The v7 stateless suspension redesign introduced a single orchestrator function — handleChatStream — that handles every dispatch path the old FrontendHandler class was juggling internally (seven paths in total: fresh session, continuing with a new user message, resume after submit, abandonment recovery, active-stream re-attach, completed retry, stale-runId rejection). Once that orchestrator landed, FrontendHandler had nothing left to do except shuffle deps + parameters into the orchestrator.

v7 kept the class as a deprecated convenience wrapper to soften the migration. v8 removes the wrapper.

The replacement surface is intentionally functional + dependency-injected:

  • handleChatStream(deps, params) returns a Response (web standard).
  • buildSnapshot(deps, params) reads durable state + stream metadata and returns a frontend snapshot.
  • getUIMessages(deps, params) reads conversation history and returns AI SDK UI messages.

There is no dispatch class. deps is a plain object literal ({ executor, stateStore, streamManager, agent, ... }) so consumers can swap individual deps in tests without subclassing.
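To illustrate the deps-as-plain-object pattern, here is a simplified sketch — `Deps` and `buildSnapshotLike` below are invented stand-ins, not the library's real types — showing how a test swaps a single dep without subclassing:

```typescript
// Simplified stand-in for the deps-as-plain-object pattern described above.
// These types are illustrative only, not the library's real interfaces.
type Deps = {
  stateStore: { load(sessionId: string): string | undefined };
};

function buildSnapshotLike(deps: Deps, params: { sessionId: string }): string {
  return deps.stateStore.load(params.sessionId) ?? 'empty';
}

// In a test, replace a single dep with a stub — no subclassing required.
const fakeStore = { load: (_: string) => 'stubbed state' };
console.log(buildSnapshotLike({ stateStore: fakeStore }, { sessionId: 's1' }));
// → "stubbed state"
```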


What was removed

Classes / factories

  • FrontendHandler — the dispatch class
  • createFrontendHandler({...}) — its factory
  • createCloudflareFrontendHandler({...}) — Cloudflare DO convenience

Types

  • FrontendHandlerOptions
  • FrontendRequest
  • CloudflareFrontendHandlerOptions
  • AgentConfigOrMinimal

Tests deleted (covered by replacement suites)

  • packages/ai-sdk/src/__tests__/handler-factory.test.ts (77 cases) — replaced by packages/ai-sdk/src/__tests__/handler/snapshot.test.ts, get-messages.test.ts, and the cross-runtime e2e suites under packages/e2e/src/__tests__/* which exercise handleChatStream end-to-end across the JS / Temporal / Cloudflare DO / DBOS runtimes.
  • packages/ai-sdk/src/__tests__/handler/cloudflare-handler.test.ts — replaced by packages/ai-sdk/src/__tests__/cloudflare/cloudflare-chat-handler.test.ts.
  • Six *.integ.test.ts files in packages/ai-sdk/src/__tests__/integration/ that exclusively tested FrontendHandler behavior (block-id-collision, content-replay, handler-pipeline, message-loading, snapshot-partial-content, snapshot-stability). The equivalent code paths are covered by the e2e suites and the new smoke tests for buildSnapshot / getUIMessages.

What survived

  • FrontendHandlerError (base error class) — still re-exported and used by route handlers + the express adapter's catch blocks.
  • FrontendResponse — still used by buildSSEResponse and the express adapters. The name is kept for backwards-compat with existing consumers.
  • MinimalAgentConfig — used by BuildSnapshotDeps.
  • MinimalExecutor, MinimalStreamManager, MinimalStateStore — the minimal interface types still describe the dependency shape for adapter implementations.

Migration cookbook

A. Direct createFrontendHandler → handleChatStream + helpers

Before (v7):

ts
import { createFrontendHandler } from '@helix-agents/ai-sdk';

const handler = createFrontendHandler({
  executor,
  stateStore,
  streamManager,
  agent: myAgent,
  contentReplay: { enabled: true },
});

// POST /chat
app.post('/chat', async (req, res) => {
  const response = await handler.handleRequest({
    method: 'POST',
    body: await readJson(req),
  });
  pipeToExpress(response, res);
});

// GET /chat?streamId=...
app.get('/chat', async (req, res) => {
  const response = await handler.handleRequest({
    method: 'GET',
    streamId: req.query.streamId,
    resumeAt: Number(req.query.resumeAt) || undefined,
  });
  pipeToExpress(response, res);
});

// Snapshot endpoint
app.get('/chat/:id/snapshot', async (req, res) => {
  const snapshot = await handler.getSnapshot(req.params.id);
  res.json(snapshot);
});

// Messages endpoint
app.get('/chat/:id/messages', async (req, res) => {
  const result = await handler.getMessages(req.params.id);
  res.json(result);
});

After (v8):

ts
import {
  handleChatStream,
  buildSnapshot,
  getUIMessages,
} from '@helix-agents/ai-sdk';
import { createWebResponseExpressMiddleware } from '@helix-agents/ai-sdk/adapters/express';

const deps = {
  executor,
  stateStore,
  streamManager,
  agent: myAgent,
  contentReplay: { enabled: true },
};

// POST or GET on /chat — handleChatStream returns a web Response.
app.use(
  '/chat',
  createWebResponseExpressMiddleware(async (req) => {
    const body = req.method === 'POST' ? await readJson(req) : undefined;
    return handleChatStream(deps, {
      sessionId: body?.sessionId ?? req.query.sessionId,
      messages: body?.messages,
      // ... other params per the v7 docs
    });
  }),
);

// Snapshot endpoint
app.get('/chat/:id/snapshot', async (req, res) => {
  const snapshot = await buildSnapshot(deps, { sessionId: req.params.id });
  res.json(snapshot);
});

// Messages endpoint
app.get('/chat/:id/messages', async (req, res) => {
  const result = await getUIMessages(deps, { sessionId: req.params.id });
  res.json(result);
});

Key differences:

  1. handleChatStream returns a web Response, not a FrontendResponse object — use the new createWebResponseExpressMiddleware / pipeWebResponseToExpress adapters to plug into Express.
  2. buildSnapshot / getUIMessages take deps as the first arg and params as the second; both are pure functions over the deps.
  3. There is no handler.handleRequest({...}) dispatch — POST vs GET vs submit-tool-result is dispatched inside handleChatStream based on the inbound messages / resume protocol.
  4. deps.agent is typed as AnyAgentConfig | MinimalAgentConfig. Pass the full agent definition in-process; pass MinimalAgentConfig ({ name, type }) when the agent lives in a remote runtime and only routing info is available (e.g. behind a DO).
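For point 4, a minimal sketch of how that union narrows — `MinimalAgentConfig` matches the `{ name, type }` shape described above, while `FullAgentConfigLike` and its `instructions` field are invented for illustration (the real `AnyAgentConfig` differs):

```typescript
// Illustrative stand-ins, not the library's real agent types.
type MinimalAgentConfig = { name: string; type: string };
type FullAgentConfigLike = MinimalAgentConfig & { instructions: string };

function describeAgent(agent: FullAgentConfigLike | MinimalAgentConfig): string {
  // Only routing info is guaranteed on the minimal shape; narrow before using more.
  return 'instructions' in agent
    ? `${agent.name}: in-process agent`
    : `${agent.name}: remote agent (type ${agent.type})`;
}

console.log(describeAgent({ name: 'planner', type: 'chat' }));
// → "planner: remote agent (type chat)"
```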

B. Cloudflare DO → createCloudflareChatHandler

Before (v7):

ts
import { createCloudflareFrontendHandler } from '@helix-agents/ai-sdk/cloudflare';

const handler = createCloudflareFrontendHandler({
  namespace: env.AGENT_DO,
  agentName: 'planner',
  contentReplay: { enabled: true },
});

const response = await handler.handleRequest({
  method: 'POST',
  body: await c.req.json(),
});

After (v8):

ts
import { createCloudflareChatHandler } from '@helix-agents/ai-sdk/cloudflare';

const handler = createCloudflareChatHandler({
  namespace: env.AGENT_DO,
  agentName: 'planner',
  contentReplay: { enabled: true },
});

const body = await c.req.json();
const response = await handler.handleChat({
  sessionId: body.sessionId,
  messages: body.messages,
});

// Snapshot + messages helpers also live on the handler:
const snapshot = await handler.getSnapshot({ sessionId });
const messages = await handler.getMessages({ sessionId });

The Cloudflare factory wires the same DO client trio (DOFrontendExecutor, DOStateStoreClient, DOStreamManagerClient) into deps. The DO clients themselves are unchanged from v7.

C. Express adapter

If your code uses pipeToExpress (which took a FrontendResponse), keep using it — FrontendResponse is still exported and buildSSEResponse still produces it. The only new option is the web Response-based pair, both exported from @helix-agents/ai-sdk/adapters/express:

ts
export function pipeWebResponseToExpress(
  response: Response,
  res: import('express').Response,
): Promise<void>;

export function createWebResponseExpressMiddleware(
  handler: (req: import('express').Request) => Promise<Response>,
): import('express').RequestHandler;

  • pipeWebResponseToExpress(response, res) forwards a web Response (status, headers, and SSE body) into the Express ServerResponse.
  • createWebResponseExpressMiddleware(handler) is a middleware factory: it calls handler(req), pipes the resulting Response via pipeWebResponseToExpress, and forwards errors to next(err) — so FrontendHandlerError is surfaced through the standard Express error-handler chain.

Use the web Response variants when consuming handleChatStream directly (which returns a web Response).
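Under the hood, piping a web Response into Express amounts to copying status, headers, and body chunks onto the server response. A rough sketch (not the adapter's actual source — `ResLike` is a minimal stand-in for Express's `res`):

```typescript
// Sketch of what an adapter like pipeWebResponseToExpress must do.
type ResLike = {
  statusCode: number;
  setHeader(name: string, value: string): void;
  write(chunk: Uint8Array): void;
  end(): void;
};

async function pipeWebResponseLike(response: Response, res: ResLike): Promise<void> {
  res.statusCode = response.status;
  response.headers.forEach((value, name) => res.setHeader(name, value));
  if (response.body) {
    // Stream the body through chunk by chunk so SSE events flush as they arrive.
    const reader = response.body.getReader();
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      res.write(value);
    }
  }
  res.end();
}
```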


Observable behavior changes

Three behaviors changed between the legacy FrontendHandler and the new handleChatStream. All three already existed in v7 (where the orchestrator shipped), but consumers who stayed on FrontendHandler won't encounter them until they migrate. Audit your tests for these differences before upgrading to v8:

1. Missing-stream response: 204 → 200 + empty SSE

When a GET request hits a streamId that doesn't exist (or has already ended and been GC'd), the legacy FrontendHandler returned HTTP 204 (No Content). handleChatStream returns HTTP 200 with an empty SSE body that closes the stream without emitting any agent events.

Both signal "no content" to clients; the change is purely transport-level. AI SDK v6 useChat handles both the same way.
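The new response shape can be pictured as follows — a sketch only, with a typical SSE content-type header rather than necessarily the library's exact header set:

```typescript
// Sketch of a 200-with-empty-SSE response: the body closes without emitting
// any events. Header values are typical for SSE, not necessarily the library's.
function emptySSEResponse(): Response {
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      controller.close(); // end the stream immediately — no agent events
    },
  });
  return new Response(body, {
    status: 200,
    headers: { 'content-type': 'text/event-stream' },
  });
}
```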

Test impact: assertions like expect(response.status).toBe(204) must be relaxed to expect([200, 204]).toContain(response.status) or updated to expect 200. The e2e suites use the relaxed form.

2. No ValidationError class on bad-request rejection

The legacy FrontendHandler threw a typed ValidationError (code: 'VALIDATION_ERROR') when required fields were missing on a request (e.g., GET without streamId). handleChatStream throws a plain Error from the underlying input validation — the error class is now an implementation detail and may change between minor versions.

Test impact: assertions like expect(...).rejects.toThrow(ValidationError) must drop the class check and use expect(...).rejects.toThrow() (assert rejection only). Route-handler catch blocks should still use FrontendHandlerError for the typed-status mapping — that base class is unchanged.

3. generateMessageId is now derived for multi-turn de-duplication

The legacy FrontendHandler exposed a custom generateMessageId option on StreamTransformerOptions that overrode the message-id stamping in the transformer. handleChatStream derives a deterministic id from currentRun.startUIMessageCount (e.g. msg-1, msg-3) so multi-turn sessions don't render duplicate assistant bubbles.

The custom callback still runs when no current run exists (the fallback path). For path 6 (active-stream re-attach with content replay), a client-supplied existingMessageId header overrides both — but only when content replay is enabled.

Test impact: apps that depended on a custom id generator for multi-turn sessions will see the derived ids instead. If you need a custom generator, fork the transformer (StreamTransformerOptions.generateMessageId); the orchestrator only overrides the id when it can derive a stable one.
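The precedence described above can be sketched like this — a simplification, since the real selection happens inside handleChatStream (and also honors the existingMessageId header on the path-6 replay case); the msg-${n} scheme mirrors the msg-1 / msg-3 examples, and the fallback id here is invented:

```typescript
// Simplified sketch of derived-vs-custom message-id precedence.
function pickMessageId(opts: {
  startUIMessageCount?: number;       // set when a current run exists
  generateMessageId?: () => string;   // the custom StreamTransformerOptions callback
}): string {
  if (opts.startUIMessageCount !== undefined) {
    // Derived, deterministic id — stable across multi-turn re-renders.
    return `msg-${opts.startUIMessageCount}`;
  }
  // Fallback path: no current run, so the custom callback (if any) still runs.
  return opts.generateMessageId?.() ?? 'msg-fallback';
}

console.log(pickMessageId({ startUIMessageCount: 1 }));               // → "msg-1"
console.log(pickMessageId({ generateMessageId: () => 'custom-id' })); // → "custom-id"
```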


Validation checklist

After upgrading to v8:

  • [ ] npm install succeeds and tsc --noEmit passes — no lingering FrontendHandler / createFrontendHandler imports.
  • [ ] grep -r "createFrontendHandler\|FrontendHandler\b" src/ returns either zero matches or only FrontendHandlerError references in catch blocks (the error class is kept).
  • [ ] Snapshot + messages endpoints return the same shapes (use buildSnapshot + getUIMessages; signatures match the deleted FrontendHandler methods).
  • [ ] GET-with-missing-streamId returns 200 (empty SSE) and clients handle it. If clients hard-coded a 204 check, relax to 200 | 204 or update to expect 200.
  • [ ] If you relied on a custom generateMessageId for multi-turn sessions, confirm the new derived ids don't break your store / UI.

If anything is unclear, check the changeset (.changeset/ai-sdk-frontend-handler-removal.md) or open an issue.

Released under the MIT License.