ofetch vs undici in 2026: Server-Side HTTP Compared
PkgPulse Team
TL;DR
ofetch for universal Nuxt/Nitro apps; undici for maximum Node.js HTTP performance. ofetch (~5M weekly downloads) is the unified HTTP client from the Nuxt team — works in browsers, Node.js, and edge runtimes with consistent behavior. undici (~18M downloads) is Node.js's official HTTP/1.1 client built for performance — faster than Node's built-in http module and the basis for native fetch in Node.js 18+.
Key Takeaways
- undici: ~18M weekly downloads — ofetch: ~5M (npm, March 2026)
- undici is Node.js official — the basis for Node's built-in fetch
- ofetch works universally — browser, Node.js, and edge runtimes
- undici is faster — designed for high-throughput Node.js servers
- ofetch has better DX — automatic JSON, retries, error handling
What Each Is For
undici:
- Node.js official HTTP client (from Node.js core team)
- Maximum HTTP performance for Node.js servers
- Low-level control over connection pools, pipelining
- Powers Node.js 18+ built-in fetch
- Best for: high-throughput server-to-server requests
ofetch:
- Universal fetch wrapper from Nuxt team
- Consistent behavior across environments
- Used internally by Nuxt, Nitro, and H3
- Smart defaults (JSON, error handling, retries)
- Best for: universal apps, Nuxt projects, readable code
API Comparison
// undici — low-level but fast (also exports fetch, pipeline, and stream)
import { request } from 'undici';
// Using undici's high-performance request
const { statusCode, body } = await request('https://api.example.com/users', {
  method: 'GET',
  headers: { Authorization: `Bearer ${token}` },
});
const data = await body.json();
// Connection pool for high-throughput (key undici advantage)
import { Pool } from 'undici';
const pool = new Pool('https://api.example.com', {
  connections: 10, // up to 10 concurrent connections
  pipelining: 3,   // HTTP/1.1 pipelining depth per connection
});
// 100 requests reusing the pool:
const results = await Promise.all(
  ids.map(id =>
    pool.request({ path: `/users/${id}`, method: 'GET' })
      .then(({ body }) => body.json())
  )
);
// ofetch — developer-friendly universal client
import { ofetch } from 'ofetch';
// GET — automatic JSON, type-safe
const users = await ofetch<User[]>('/api/users', {
  baseURL: 'https://api.example.com',
  headers: { Authorization: `Bearer ${token}` },
});
// users is User[] — typed
// POST — automatic JSON serialization
const user = await ofetch<User>('/api/users', {
  method: 'POST',
  body: { name: 'Alice', email: 'alice@example.com' },
  // body is automatically stringified to JSON
});
// Error handling — throws FetchError on 4xx/5xx
try {
  await ofetch('/api/missing');
} catch (err) {
  if (err.response) {
    console.log(err.status); // 404
    console.log(err.data);   // parsed error response body
  }
}
Nuxt / Nitro Integration
// ofetch is used internally by Nuxt and $fetch is the global
// No import needed in Nuxt components:
const data = await $fetch('/api/users');
// In Nuxt server routes (auto-imported):
export default defineEventHandler(async (event) => {
  // $fetch works on both server and client
  const user = await $fetch<User>(`/api/users/${event.context.params.id}`);
  return user;
});
// useFetch composable (wraps ofetch + reactivity):
const { data, pending, error } = await useFetch('/api/users');
Performance
Benchmark: 10,000 POST requests to localhost (Node.js)
Client       | Req/sec | Notes
-------------|---------|---------------------------------------
undici Pool  | 45,000  | Connection pooling
undici fetch | 38,000  | Standard fetch API
node-fetch   | 22,000  | Legacy fetch polyfill
ofetch       | 20,000  | Universal, some overhead
Axios        | 15,000  | XHR in browsers; node:http on servers
undici is 2-3x faster than alternatives for server-to-server.
For typical web app use, the difference is negligible.
When to Choose
Choose undici when:
- High-throughput Node.js microservice (10K+ requests/second)
- You need connection pooling for repeated requests to same host
- Maximum raw HTTP performance is the priority
- You're tuning HTTP/1.1 pipelining or connection reuse (undici's HTTP/2 support is opt-in via the allowH2 option)
Choose ofetch when:
- Building with Nuxt or Nitro framework
- Universal app that runs on browser and server
- Developer ergonomics over raw performance
- You want consistent behavior across environments
- You need automatic retry, JSON parsing, and error handling
Compare ofetch and undici package health on PkgPulse.
See the live comparison: ofetch vs. undici on PkgPulse →