2026 · 04 · 7 min

Vercel AI SDK v6 for production agents: the patterns that hold up

AI SDK v6 plus the Vercel AI Gateway is the production-ready combination. Use provider-string model IDs through the gateway, lean on the SDK's tool-calling primitives, and treat streaming as the default UX.

A year into shipping AI features on Vercel, the stack I keep landing on is the same: AI SDK v6 for the client and server primitives, AI Gateway for routing and observability, and a small number of typed tools per surface. The pieces are stable enough to recommend without caveats.

The gateway is the underrated piece. Plain provider strings ('anthropic/claude-opus-4-7', 'openai/gpt-5') route through one API key, get logged in one dashboard, and fail over automatically when a provider misbehaves. The cost is near-zero and it pays for itself the first time a provider has a regional outage at 4pm on a Friday.

For tool calling: define every tool with a Zod schema, stream tool results back through the UI, and never trust the model's tool selection without server-side validation. Production agents fail on edge cases that look fine in chat, like the model calling a tool with a stringified number when the schema wants an int. Zod catches it; the model doesn't.

What I avoid: importing provider-specific packages directly unless the feature requires it. The temptation is real, especially when you want fine-grained control over Anthropic's tool-use chains or OpenAI's Responses API. Save it for the cases that actually need it. The gateway-routed default keeps your code portable across providers, which matters more in 2026 than it did in 2024.

WRITTEN BY
Ibrahim Aly
SENIOR FS ENGINEER · BERLIN ↔ CAIRO