The rari SSR Breakthrough: 12x Faster, 10x Higher Throughput Than Next.js
- Open Source
- React
- Rust
- Architecture
- Performance
Three months ago, I shipped rari with solid numbers: 4x faster component rendering than Next.js, 3.74x higher throughput, and 5.8x faster builds. But the architecture had gaps. The app router wasn't there. True server-side rendering wasn't there. The 'use server' and 'use client' directive semantics weren't quite right.
So I went back and built the missing pieces properly. With app router support, true SSR, and correct RSC semantics in place, the performance profile changed significantly:
- Response times dropped to 0.69ms (3.8x faster than Next.js)
- Throughput jumped to 20,226 req/sec (10.5x higher than Next.js)
- P99 latency under load: 4ms (12x faster than Next.js)
- Bundle sizes: 68% smaller (27.6 KB vs 85.9 KB)
These aren't incremental improvements. This is what happens when your architecture matches React's design intentions.
The Three Missing Pieces
rari could render React Server Components before, but it wasn't doing it the right way. Here's what I added.
1. The App Router
The app router is more than file-based routing—it's how the framework understands your application. During build, rari now analyzes every route and component, determining what needs to run on the server and what needs to hydrate on the client.
This enables:
- Automatic code splitting based on route boundaries
- Intelligent prefetching of dependencies
- Zero-config RSC rendering directly from routes
- Proper dependency graph analysis across the entire app
```tsx
// app/layout.tsx - Server component by default
export default function Layout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>{children}</body>
    </html>
  )
}
```

```tsx
// app/dashboard/page.tsx - Server component by default
export default async function Dashboard() {
  const data = await fetchUserData()
  return (
    <div>
      <h1>{data.name}</h1>
      <DashboardClient initialData={data} />
    </div>
  )
}
```
Once this was in place, bundle sizes dropped. The framework finally knew what belonged where.
2. Correct Use Directive Semantics
I had the directives wrong—not broken, but wrong enough to cause confusion.
'use server' doesn't mark server components. It marks server functions. Server components are the default. No directive needed.
```tsx
// Server function - handles mutations, database access
'use server'

export async function updateUser(userId: string, data: FormData) {
  await db.users.update(userId, Object.fromEntries(data))
  revalidatePath('/dashboard')
}
```

```tsx
// Server component - fetches data, renders markup (no directive needed)
export async function UserCard({ userId }: { userId: string }) {
  const user = await fetchUser(userId)
  return (
    <div>
      <h2>{user.name}</h2>
      <UpdateForm userId={userId} onSubmit={updateUser} />
    </div>
  )
}
```

```tsx
// Client component - handles interactivity
'use client'

import { useState } from 'react'

export function UpdateForm({
  userId,
  onSubmit
}: {
  userId: string
  onSubmit: (userId: string, data: FormData) => Promise<void>
}) {
  const [isPending, setIsPending] = useState(false)

  async function handleSubmit(e: React.FormEvent<HTMLFormElement>) {
    e.preventDefault()
    setIsPending(true)
    await onSubmit(userId, new FormData(e.currentTarget))
    setIsPending(false)
  }

  return (
    <form onSubmit={handleSubmit}>
      <input name="name" required />
      <button disabled={isPending}>
        {isPending ? 'Saving...' : 'Save'}
      </button>
    </form>
  )
}
```
Aligning with React's actual design simplified the mental model and made the code clearer.
3. True Server-Side Rendering
Before, I was doing what I'd call "RSC-flavored rendering." Components ran on the server, but the hydration story wasn't complete.
Now everything renders on the Rust runtime by default. The client receives pre-rendered HTML and RSC payloads, then hydrates only the interactive bits. This eliminates three performance bottlenecks:
- Client-side hydration overhead — The tree doesn't re-render to become interactive
- JavaScript parsing delay — Interactive components are already wired up
- Cascading waterfall requests — Data is fetched server-side during render
The Rust runtime handles RSC rendering with minimal overhead, delivering HTML directly to the client.
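To make the waterfall point concrete, here's a small standalone TypeScript sketch, plain Node with no rari or React APIs. `fetchUser` and `fetchPosts` are stand-in stubs, not real data loaders; the point is only the ordering: a client-side waterfall serializes the requests, while a server-side render starts both in the same pass.

```typescript
// Illustrative only: fakeFetch stands in for network I/O; fetchUser and
// fetchPosts are hypothetical data loaders, not rari APIs.
type Step = string

const log: Step[] = []

function fakeFetch(name: string) {
  return async (): Promise<string> => {
    log.push(`start:${name}`)
    await Promise.resolve() // simulate async I/O
    log.push(`done:${name}`)
    return name
  }
}

const fetchUser = fakeFetch('user')
const fetchPosts = fakeFetch('posts')

// Client-side waterfall: the second request cannot start until the first
// resolves (fetch-in-effect after hydration behaves like this).
async function clientWaterfall(): Promise<void> {
  await fetchUser()
  await fetchPosts()
}

// Server-side render: both fetches begin during the same render pass,
// so the client receives HTML with the data already resolved.
async function serverRender(): Promise<[string, string]> {
  return Promise.all([fetchUser(), fetchPosts()])
}

const demo = (async () => {
  await clientWaterfall()
  const serial = [...log] // start:user, done:user, start:posts, done:posts
  log.length = 0
  await serverRender()
  const parallel = [...log] // start:user, start:posts, done:user, done:posts
  console.log({ serial, parallel })
  return { serial, parallel }
})()
```

Under real network latency the serial version pays two full round trips before the page is useful; the server-side version pays one.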
The New Numbers
With these three pieces in place, here's what the benchmarks show.
Response Time: 3.8x Faster Than Next.js
| Metric | rari | Next.js | Improvement |
|---|---|---|---|
| Avg Response | 0.69ms | 2.58ms | 3.8x faster |
| P95 | 1.15ms | 3.37ms | 2.9x faster |
| Bundle Size | 27.6 KB | 85.9 KB | 68% smaller |
Throughput Under Load: 10.5x Better
Stress-tested with 50 concurrent connections for 30 seconds:
| Metric | rari | Next.js | Improvement |
|---|---|---|---|
| Throughput | 20,226 req/sec | 1,934 req/sec | 10.5x higher |
| Avg Latency | 2.04ms | 25.25ms | 12.4x faster |
| P99 Latency | 4ms | 48ms | 12x faster |
| Errors | 0 | 0 | Stable |
Build Performance: Stayed Consistent
| Metric | rari | Next.js |
|---|---|---|
| Build Time | 1.64s | 9.11s |
| Bundle Size | 273 KB | 742 KB |
Build times barely changed because I was already leveraging Rolldown and tsgo optimally. The real wins were in runtime performance and bundle size.
Why Performance Improved
Better architecture plus correct semantics equals better performance.
Smaller Bundles (68% reduction): Only interactive components ship to the client. Server components stay on the server. The app router eliminates dead code. The framework knows what the client actually needs.
Lower Latency (3.8x faster): SSR means no client-side hydration wait. The Rust runtime renders faster than JavaScript. RSC rendering starts immediately with no layers between components and the client.
Higher Throughput (10.5x): The persistent Rust runtime handles connection pooling efficiently. Concurrent request handling is optimized at the runtime level with no per-request overhead from spinning up Node.js contexts.
Predictable Performance: No garbage collection pauses. No event loop blocking. Linear scaling with more instances.
What This Means in Practice
For users: HTML arrives immediately from the server. Interactive components hydrate instantly. The client only runs code for interactive features, and the server handles the heavy lifting under load.
For developers: 'use server' and 'use client' directives work as expected. HMR works for both server and client components. Full TypeScript support across the client/server boundary. Build, bundle, and RSC setup are automatic.
For infrastructure: Smaller client JavaScript means fewer resources needed. The Rust runtime handles 10.5x more concurrent requests. No garbage collection pauses under load.
Getting Started
The API hasn't changed. If you've used rari before, upgrading is a single version bump:
```bash
npm install rari@latest
npm run dev
```
New to rari:
```bash
npm create rari-app@latest my-app
cd my-app
npm run dev
```
The framework handles hot module replacement for both server and client components, TypeScript inference across the client/server boundary, error boundaries, Suspense, form actions, and revalidation.
Benchmarks and Reproducibility
All benchmarks are reproducible. The test applications and scripts are at github.com/rari-build/benchmarks.
Test Environment:
- Hardware: MacBook Pro M2, 8GB RAM
- Test Duration: 30 seconds with warmup
- Concurrent Users: 50 simultaneous connections
- Real-world patterns: data fetching, nested components, client interactivity
I encourage you to run them yourself and share your results.
What's Next
With SSR and app router in place, I'm focusing on:
- Streaming RSC for progressive rendering (framework support added, testing in progress)
- Advanced RSC patterns for suspense and edge-case support
- Ecosystem integrations with popular libraries and services
- Production case studies from teams using rari in real applications
Get Involved
- Try rari at rari.build
- Star us at github.com/rari-build/rari
- Contribute to the framework or ecosystem
- Join our Discord for real-time discussions and support