This example shows a full-stack chat application: a Vibes streaming agent on the backend, connected to a React frontend through the Vercel AI SDK's useChat hook. Messages persist across turns using Vibes' built-in message history.

What you’ll learn

  • agent.stream() for streaming responses
  • toDataStreamResponse() to convert streams to the Vercel AI data stream protocol
  • useChat React hook for real-time streaming UIs
  • Multi-turn conversation with messageHistory

Prerequisites

  • ANTHROPIC_API_KEY set in your environment
  • For the Deno server: Vibes installed (deno add npm:@vibesjs/sdk)
  • For the Next.js route: npm install @vibesjs/sdk @ai-sdk/anthropic ai

Complete example

// agent.ts — the shared agent definition
import { Agent } from "npm:@vibesjs/sdk";
import { anthropic } from "npm:@ai-sdk/anthropic";

// A single agent instance, reused across requests. The system prompt
// asks the model to rely on the conversation context we pass each turn.
export const chatAgent = new Agent({
  model: anthropic("claude-sonnet-4-6"),
  systemPrompt:
    "You are a helpful assistant. Be concise and friendly. " +
    "Remember context from earlier in the conversation.",
});
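A matching Next.js route handler might look like the sketch below. Treat it as an outline, not a verified implementation: the exact stream() options, the toDataStreamResponse() signature, and the app/api/chat/route.ts and @/lib/agent paths are assumptions based on the descriptions in this guide.

```typescript
// app/api/chat/route.ts (assumed path)
// Sketch only: stream() options and toDataStreamResponse() are written
// as this guide describes them, not verified against the Vibes SDK.
import { toDataStreamResponse } from "ai"; // helper described under "How it works"
import { chatAgent } from "@/lib/agent"; // the agent defined above (assumed path)

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Pass earlier turns as messageHistory so the agent keeps context,
  // and stream a reply to the latest user message.
  const result = await chatAgent.stream({
    prompt: messages[messages.length - 1].content,
    messageHistory: messages.slice(0, -1),
  });

  // Convert the text stream to the Vercel AI data stream protocol,
  // the format useChat consumes on the client.
  return toDataStreamResponse(result.textStream);
}
```

useChat POSTs the full message list on every turn, which is why the handler splits it into the latest prompt plus history rather than keeping server-side state.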

Run it

# Deno server
deno run --allow-net --allow-env server.ts

# Or: Next.js dev server
npm run dev

How it works

  • agent.stream(): returns a result with textStream (an async iterable of string chunks), partialOutput, and the accumulated messages.
  • toDataStreamResponse(): converts the text stream to the Vercel AI data stream protocol, the format useChat expects. Imported from the "ai" package (already a dependency in your project).
  • messageHistory: pass previous messages to maintain conversation context across turns. Vibes accumulates all messages in result.messages; use result.newMessages for only the messages added in that run.
  • useChat: manages message state, sends POST requests to your API route, and streams tokens into the UI automatically. No manual state management needed.
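To make the textStream contract concrete, here is a small self-contained sketch of consuming an async iterable of string chunks the way a manual caller of agent.stream() would. The generator is a stand-in for illustration, not the Vibes implementation.

```typescript
// fakeTextStream stands in for result.textStream: any async iterable
// of string chunks has the same shape.
async function* fakeTextStream(): AsyncGenerator<string> {
  yield "Hello, ";
  yield "world!";
}

// Accumulate chunks into the full response text, exactly as a manual
// consumer of textStream would.
async function collect(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk;
  }
  return text;
}

collect(fakeTextStream()).then((text) => console.log(text)); // prints "Hello, world!"
```

On the client you rarely do this by hand: useChat consumes the data stream protocol and updates message state for you.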

Next steps