OpenNext + Cloudflare DO

This example demonstrates a production-ready Next.js chat application deployed to Cloudflare Workers via OpenNext, with Helix Agents running in Durable Objects for resumable streaming.

Runnable Example: The complete source code is available in the monorepo at examples/opennext-cloudflare-do/.

Deployment Mode: Direct (same-worker DO binding). The Next.js app and Durable Object are deployed together, with direct access via env.AGENTS.

What This Demonstrates

  1. Single-Worker Deployment - Next.js and Durable Objects in one Cloudflare Worker
  2. Resumable Streams - Page refresh mid-stream resumes seamlessly
  3. SSR Hydration - Messages are server-rendered for instant display
  4. Direct DO Access - API routes access DOs via binding, not HTTP
  5. Content Replay - Partial content preserved across refreshes
  6. Multi-Turn Filtering - Follow-up messages stream only new content

Architecture

┌─────────────────────────────────────────────────────────────────────┐
│                    Single Cloudflare Worker                          │
│                                                                      │
│  ┌──────────────────────────────────────────────────────────────┐   │
│  │ OpenNext (Next.js)                                            │   │
│  │  - UI routes: /, /chat/[sessionId]                            │   │
│  │  - API routes: /api/chat/[sessionId]                          │   │
│  └──────────────────────────────────────────────────────────────┘   │
│                              │                                       │
│                              │ env.AGENTS (direct DO binding)        │
│                              ▼                                       │
│  ┌──────────────────────────────────────────────────────────────┐   │
│  │ ChatAgentServer (Durable Object)                              │   │
│  │  - Executes agents with unlimited streaming                   │   │
│  │  - SQLite storage for state & streams                         │   │
│  │  - SSE endpoints for real-time updates                        │   │
│  └──────────────────────────────────────────────────────────────┘   │
│                                                                      │
└─────────────────────────────────────────────────────────────────────┘

vs. Resumable Streams (Next.js)

| Feature         | Resumable Streams     | OpenNext + DO   |
| --------------- | --------------------- | --------------- |
| Runtime         | JS (Node.js)          | Cloudflare DO   |
| Persistence     | Redis                 | SQLite (DO)     |
| Deployment      | Traditional hosting   | Cloudflare edge |
| Streaming Limit | Request timeout       | Unlimited       |
| Cold Starts     | Depends on host       | Fast (edge)     |
| Use Case        | Standard Next.js apps | Edge-first apps |

Choose Resumable Streams (Next.js) if you're deploying to Vercel, Railway, or traditional hosting with Redis.

Choose OpenNext + Cloudflare DO if you want edge deployment with unlimited streaming duration.

Key Implementation Details

Direct DO Access

API routes access the Durable Object directly without HTTP:

```typescript
// src/lib/agent-client.ts
import { getCloudflareContext } from '@opennextjs/cloudflare';

export function getDOStub(sessionId: string) {
  const { env } = getCloudflareContext();
  const doId = env.AGENTS.idFromName(`session:${sessionId}`);
  return env.AGENTS.get(doId);
}

// In an API route
const stub = getDOStub(sessionId);
const response = await stub.fetch('https://do/start', { ... });
```
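For env.AGENTS to exist at runtime, the worker configuration must declare the Durable Object binding. A hedged wrangler.jsonc sketch — the binding name AGENTS and class name ChatAgentServer come from this example, while the migration tag and surrounding fields are illustrative assumptions:

```jsonc
// Illustrative wrangler.jsonc fragment (not the example's actual config)
{
  "durable_objects": {
    "bindings": [{ "name": "AGENTS", "class_name": "ChatAgentServer" }]
  },
  // SQLite-backed DOs are declared via a migration with new_sqlite_classes
  "migrations": [{ "tag": "v1", "new_sqlite_classes": ["ChatAgentServer"] }]
}
```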

DO Export Injection

OpenNext generates a worker from Next.js. A post-build script injects the DO export:

```javascript
// scripts/inject-durable-objects.mjs
// Appends to .open-next/worker.js:
export { ChatAgentServer } from "../src/agent/index.ts";
```
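The real script appends that line to the generated bundle on disk; as a minimal sketch, the injection step can be modeled as a pure, idempotent string transform (the function name and structure are illustrative, not the example's actual script):

```typescript
// Hypothetical sketch of the injection step: append the DO re-export to the
// generated worker source, skipping it if the export is already present so
// re-running the build stays idempotent.
export function withDOExport(workerSource: string): string {
  const exportLine =
    'export { ChatAgentServer } from "../src/agent/index.ts";';
  if (workerSource.includes(exportLine)) return workerSource; // already injected
  return workerSource + '\n' + exportLine + '\n';
}
```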

"Last Message Wins" Pattern

The agent server implements a hook that interrupts existing execution when a new message arrives:

```typescript
hooks: {
  beforeStart: async ({ executionState }) => {
    if (executionState.isExecuting) {
      await executionState.interrupt('superseded');
    }
  },
}
```

API Routes

POST /api/chat/[sessionId]

Send a message and stream the response.

```typescript
// Request
{ message: "Hello!" }
// or AI SDK v6 format
{ messages: [{ role: 'user', parts: [{ type: 'text', text: 'Hello!' }] }] }

// Response: SSE stream with AI SDK events
```
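Because the route accepts both body shapes, the handler must normalize them before forwarding to the DO. A hedged sketch of that step — the helper name and type aliases are illustrative, not part of the example's API:

```typescript
// Hypothetical helper: pull the user's text out of either accepted body shape.
type TextPart = { type: 'text'; text: string };
type IncomingMessage = { role: string; parts: TextPart[] };
type ChatBody = { message?: string; messages?: IncomingMessage[] };

export function extractUserText(body: ChatBody): string {
  // Simple format: { message: "..." }
  if (typeof body.message === 'string') return body.message;
  // AI SDK v6 format: join the text parts of the last message in the history.
  const last = body.messages?.[body.messages.length - 1];
  return (last?.parts ?? [])
    .filter((p) => p.type === 'text')
    .map((p) => p.text)
    .join('');
}
```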

GET /api/chat/[sessionId]

Resume an existing stream.

```typescript
// Headers
X-Resume-From-Sequence: 42
X-Existing-Message-Id: msg-123

// Response: SSE stream from sequence 42
```
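On the client, resuming is just a GET with those two headers replayed from locally tracked state. A minimal sketch — the URL and function name are illustrative assumptions:

```typescript
// Hypothetical client helper: rebuild the resume request from the last
// sequence number and assistant message id seen before the page refresh.
export function buildResumeRequest(
  sessionId: string,
  lastSequence: number,
  existingMessageId: string,
): Request {
  return new Request(`https://app.example/api/chat/${sessionId}`, {
    method: 'GET',
    headers: {
      'X-Resume-From-Sequence': String(lastSequence),
      'X-Existing-Message-Id': existingMessageId,
    },
  });
}
```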

GET /api/chat/[sessionId]/snapshot

Get current state snapshot.

```typescript
// Response
{
  state: { ... },
  messages: UIMessage[],
  streamSequence: number,
  startSequence: number,
  status: 'active' | 'paused' | 'ended' | 'failed'
}
```
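This snapshot is what powers SSR hydration: the page renders messages immediately and only reopens the SSE stream when there is something to resume. A sketch of that decision under assumed semantics (that an active status with events at or past startSequence means an in-flight stream — the rule is an assumption, not taken from the example):

```typescript
// Hypothetical hydration helper; the Snapshot shape mirrors the response
// above, but the resume rule itself is an assumption for illustration.
type SessionStatus = 'active' | 'paused' | 'ended' | 'failed';

interface Snapshot {
  streamSequence: number;
  startSequence: number;
  status: SessionStatus;
}

export function shouldResume(snap: Snapshot): boolean {
  // Ended or failed sessions render from messages alone; only a live
  // stream with events to replay needs the SSE connection reopened.
  return snap.status === 'active' && snap.streamSequence >= snap.startSequence;
}
```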

Running Locally

```bash
# Install dependencies
npm install

# Set up environment (symlinks to root .env)
echo "OPENAI_API_KEY=sk-..." >> ../../.env

# Run with Miniflare (full DO access)
npm run preview
```

Note: npm run dev (Next.js dev mode) has no access to Durable Objects; use npm run preview instead.

Deploying

```bash
# Login to Cloudflare
npx wrangler login

# Set secrets
npx wrangler secret put OPENAI_API_KEY

# Deploy
npm run deploy
```

Testing

The example includes comprehensive Playwright E2E tests:

```bash
npm run test:e2e
```

Tests cover:

  • Basic chat functionality
  • Mid-stream page refresh
  • Follow-up message streaming
  • Tool call rendering
  • Content replay

Released under the MIT License.