Implement A/B testing in Next.js by using middleware to assign users to variants based on cookies, deterministic hashing, or random assignment, then using NextResponse.rewrite to serve different page versions while keeping the original URL unchanged
A/B testing in Next.js using middleware and routing allows you to serve different page variants to users at the edge, without rebuilding your application or causing visible URL changes. The core pattern involves intercepting requests in middleware, assigning users to experiment variants (either randomly or deterministically), storing that assignment in a cookie for consistency, and using NextResponse.rewrite() to serve content from variant-specific routes while preserving the original URL in the browser. This approach ensures users have a consistent experience across sessions and allows you to measure conversion differences between variants.
To support multiple variants, organize your pages in variant-specific folders. For a homepage test, you might have a control page at app/page.tsx and a test variant at app/test/page.tsx. For more complex experiments involving multiple pages, you can mirror entire parallel page structures under variant folders such as app/control/ and app/test/. Note that route groups like app/(control)/ and app/(test)/ are not suitable rewrite targets: group names are stripped from the URL, so two groups defining the same routes would conflict and a rewrite could not distinguish between them. The middleware then rewrites to the appropriate folder based on the assignment.
For experiments where you want consistent assignment without relying on cookies (e.g., for SEO testing or first visits), you can use deterministic hashing based on user identifiers. This approach hashes the user ID or IP address combined with the experiment ID to produce a consistent variant assignment. This ensures users always see the same variant regardless of cookie acceptance, while maintaining a predictable split percentage.
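One way to sketch this bucketing: hash the experiment ID together with the user identifier, reduce the digest to a number in 0–99, and split on a threshold. The function and threshold below are illustrative; any stable hash works, since only determinism and a roughly uniform distribution matter.

```typescript
// Deterministic bucketing sketch: hash (experimentId + userId) to a stable 0–99 bucket
import { createHash } from "node:crypto";

export function bucket(userId: string, experimentId: string): number {
  const digest = createHash("sha256")
    .update(`${experimentId}:${userId}`)
    .digest();
  // Interpret the first 4 bytes as an unsigned integer, then reduce to 0–99
  return digest.readUInt32BE(0) % 100;
}

// 50/50 split: buckets 0–49 see control, 50–99 see the test variant
export function assign(userId: string, experimentId: string): "control" | "test" {
  return bucket(userId, experimentId) < 50 ? "control" : "test";
}
```

Including the experiment ID in the hash input ensures that the same user can land in different buckets across different experiments, which avoids correlated assignments.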
A complete A/B testing solution requires tracking variant assignments. When middleware assigns a variant, you should make that information available to your analytics tools. Common approaches include: setting a cookie that client-side analytics can read, adding the variant to response headers for server-side tracking, or making a non-blocking fetch to your analytics API from middleware. The variant information should also be passed to the client via props or context so that conversion events can be attributed correctly.
Use rewrites, not redirects: Redirects change the URL and break the user's expectation; rewrites keep the URL consistent while serving different content.
Store assignments in HttpOnly cookies: This prevents client-side tampering while maintaining consistency across sessions.
Handle bots and crawlers: Consider excluding known bots from experiments or always showing the control variant to avoid skewed data.
Implement a kill switch: Have a way to immediately show control to all users if the test variant has issues.
Test with force parameters: Add support for ?force_variant=test query parameters to allow QA and internal testing.
Monitor performance: Middleware must be fast; avoid blocking calls to external experiment services.
For production A/B testing, the Vercel Flags SDK provides a robust foundation. It handles variant assignment, cookie management, and integrates with popular providers. The SDK also supports precomputation for static sites, where multiple versions of pages are built at deploy time and served based on flags. This approach gives you the performance of static generation with the flexibility of runtime experiments.