ioredis vs node-redis vs Upstash Redis: Redis Clients for Node.js (2026)
TL;DR
For most Node.js applications on a traditional server: ioredis is the most reliable, feature-complete Redis client — excellent TypeScript support, built-in cluster and sentinel support, and battle-tested in production at Alibaba and other large companies. node-redis (the official Redis client) is a solid alternative with a clean async/await API. Upstash Redis is the right choice for serverless and edge environments — it uses HTTP instead of persistent TCP connections, making it a natural fit for Vercel Functions, Cloudflare Workers, and Next.js.
Key Takeaways
- ioredis: ~4.5M weekly downloads — most feature-complete, built-in cluster support, TypeScript-first
- redis (node-redis): ~5.8M weekly downloads — official Redis client, clean v4 API
- @upstash/redis: ~600K weekly downloads — HTTP-based, serverless-first, no persistent connections
- For serverless (Vercel, Cloudflare, AWS Lambda): use Upstash Redis — TCP connections are expensive
- For traditional Node.js servers: ioredis or node-redis — both excellent
- Upstash's 500K requests/day free tier works for most small projects
Download Trends
| Package | Weekly Downloads | Connection Type | Serverless-Ready |
|---|---|---|---|
| redis (node-redis) | ~5.8M | TCP | ❌ |
| ioredis | ~4.5M | TCP | ❌ |
| @upstash/redis | ~600K | HTTP/REST | ✅ |
Why Connection Type Matters for Serverless
Redis was designed for persistent TCP connections. In serverless environments:
Traditional Server:
App starts → create Redis connection pool → serve many requests → pool stays alive
Connection overhead: once per server startup
Serverless Function (Lambda, Vercel, CF Workers):
Request arrives → function "starts" → create connection → handle request → function sleeps/dies
If you use TCP Redis: connection overhead on EVERY request
Cold start + TCP handshake + Redis AUTH + your logic = slow first requests
Upstash Redis (HTTP):
Request arrives → HTTP call to Redis (connection handled by Upstash infrastructure)
No persistent connection needed — works like any other API call
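For TCP clients, the standard mitigation in serverless runtimes is to cache the client in module scope so warm invocations reuse the connection and only cold starts pay the handshake. A minimal sketch of that memoization — the ioredis wiring in the trailing comment is illustrative, not required:

```typescript
// Generic memoized factory: the first call creates the client,
// every later call (a warm invocation) returns the same instance.
type Factory<T> = () => T

function memoizeClient<T>(makeClient: Factory<T>): Factory<T> {
  let client: T | undefined
  return () => {
    // Cold invocation: create the connection. Warm invocation: reuse it.
    if (client === undefined) client = makeClient()
    return client
  }
}

// In a real handler module this would look like (hypothetical wiring):
//   import Redis from "ioredis"
//   const getRedis = memoizeClient(() => new Redis(process.env.REDIS_URL!))
//   export async function handler() { return getRedis().get("key") }
```

This helps only while the function instance stays warm; a fleet of concurrent cold instances still opens one connection each, which is exactly the pressure Upstash's HTTP model avoids.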
ioredis
ioredis is the community favorite for production Node.js Redis usage:
import Redis from "ioredis"
// Single instance:
const redis = new Redis({
host: process.env.REDIS_HOST,
port: 6379,
password: process.env.REDIS_PASSWORD,
tls: process.env.NODE_ENV === "production" ? {} : undefined,
lazyConnect: true, // Don't connect immediately
retryStrategy: (times) => Math.min(times * 100, 3000), // Retry with backoff
maxRetriesPerRequest: 3,
enableReadyCheck: true,
keepAlive: 30000, // TCP keepalive initial delay in ms (passed to socket.setKeepAlive), not a send interval
})
// Basic operations:
await redis.set("key", "value")
await redis.set("key-with-ttl", "value", "EX", 3600) // Expire in 1 hour
await redis.set("key-nx", "value", "NX") // Only set if not exists
const value = await redis.get("key")
const exists = await redis.exists("key")
await redis.del("key1", "key2")
// Hashes:
await redis.hset("package:react", {
name: "react",
version: "18.2.0",
downloads: "25000000",
updatedAt: new Date().toISOString(),
})
const pkg = await redis.hgetall("package:react")
// { name: "react", version: "18.2.0", ... }
// Lists:
await redis.rpush("recent-searches", "react", "vue", "solid")
const searches = await redis.lrange("recent-searches", 0, -1)
// Sorted sets (leaderboard):
await redis.zadd("downloads-leaderboard", 25000000, "react", 7000000, "vue", 5000000, "angular")
const topPackages = await redis.zrevrange("downloads-leaderboard", 0, 9, "WITHSCORES")
// Sets:
await redis.sadd("featured-tags", "react", "typescript", "testing")
const tags = await redis.smembers("featured-tags")
ioredis pipelining and transactions:
// Pipeline — batch commands, single roundtrip:
const pipeline = redis.pipeline()
pipeline.set("pkg:react:views", 0)
pipeline.set("pkg:vue:views", 0)
pipeline.set("pkg:solid:views", 0)
const results = await pipeline.exec()
// Each result: [error, value]
// Multi/exec — atomic transactions (results are also [error, value] pairs):
const [incrResult] = await redis
  .multi()
  .incr("pkg:react:views")
  .expire("pkg:react:views", 86400)
  .exec() ?? []
const viewCount = incrResult?.[1] // the INCR return value
// Lua scripting — atomic operations without network roundtrips:
const rateLimitScript = `
local current = redis.call('incr', KEYS[1])
if current == 1 then
redis.call('expire', KEYS[1], ARGV[1])
end
return current
`
const count = await redis.eval(rateLimitScript, 1, `rate_limit:${userId}`, "3600")
ioredis Redis Cluster:
import { Cluster } from "ioredis"
const cluster = new Cluster([
{ host: "node1.redis.internal", port: 6379 },
{ host: "node2.redis.internal", port: 6379 },
{ host: "node3.redis.internal", port: 6379 },
], {
redisOptions: {
password: process.env.REDIS_PASSWORD,
tls: {},
},
scaleReads: "slave", // Read from replicas
})
// Cluster usage is identical to single-instance:
await cluster.set("key", "value")
const value = await cluster.get("key")
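One cluster-specific caveat behind that "identical" usage: multi-key commands only succeed when every key hashes to the same slot. Redis hash tags — the substring inside `{braces}` is the only part that gets slotted — let you co-locate related keys. A small sketch with hypothetical key names:

```typescript
// Both keys hash on the tag "user:<id>", so they land on the same
// cluster slot and can be used together in MGET, MULTI, or Lua scripts.
function sessionKeys(userId: string): { profile: string; sessions: string } {
  return {
    profile: `{user:${userId}}:profile`,
    sessions: `{user:${userId}}:sessions`,
  }
}

// const keys = sessionKeys("42")
// await cluster.mget(keys.profile, keys.sessions) // same slot → no CROSSSLOT error
```

Without the tags, `user:42:profile` and `user:42:sessions` would usually hash to different slots and a multi-key command would fail with a CROSSSLOT error.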
node-redis (redis v4+)
The official Redis client was rewritten for v4 with an async/await-first API:
import { createClient, createCluster } from "redis"
const client = createClient({
url: process.env.REDIS_URL, // redis://user:password@host:port
socket: {
tls: process.env.NODE_ENV === "production",
reconnectStrategy: (retries) => Math.min(retries * 100, 3000),
},
})
// Must connect explicitly (unlike ioredis):
client.on("error", (err) => console.error("Redis error:", err))
await client.connect()
// Basic operations — similar to ioredis:
await client.set("key", "value")
await client.set("key-with-ttl", "value", { EX: 3600 }) // Different option syntax
const value = await client.get("key")
await client.del("key")
// Type helpers (node-redis v4):
await client.hSet("package:react", {
name: "react",
version: "18.2.0",
downloads: 25000000, // Numbers auto-serialized
})
// Note: hGetAll returns Record<string, string> — no auto-parsing
// Commands use camelCase:
await client.rPush("list", "item1", "item2")
await client.lRange("list", 0, -1)
await client.zAdd("leaderboard", [
{ score: 25000000, value: "react" },
{ score: 7000000, value: "vue" },
])
// Cleanup:
await client.quit()
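Because `hGetAll` returns `Record<string, string>`, numeric fields come back as strings and need explicit conversion. A hand-rolled parser (not part of node-redis) can restore the types:

```typescript
// Shape we want back out of the hash (matches the hSet example above).
interface PackageHash {
  name: string
  version: string
  downloads: number
}

// Redis stores every hash field as a string; convert numerics by hand.
function parsePackageHash(raw: Record<string, string>): PackageHash {
  return {
    name: raw.name,
    version: raw.version,
    downloads: Number(raw.downloads),
  }
}

// const raw = await client.hGetAll("package:react")
// const pkg = parsePackageHash(raw) // pkg.downloads is a number again
```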
node-redis in Next.js (singleton pattern):
// lib/redis.ts — prevent multiple connections in development:
import { createClient } from "redis"
declare global {
var redisClient: ReturnType<typeof createClient> | undefined
}
const client = global.redisClient ?? createClient({ url: process.env.REDIS_URL })
if (process.env.NODE_ENV !== "production") {
global.redisClient = client
}
if (!client.isOpen) {
await client.connect()
}
export { client as redis }
Upstash Redis
Upstash Redis uses HTTP for all operations — ideal for serverless:
import { Redis } from "@upstash/redis"
// Initialize with Upstash REST API credentials:
const redis = new Redis({
url: process.env.UPSTASH_REDIS_REST_URL!,
token: process.env.UPSTASH_REDIS_REST_TOKEN!,
})
// Same Redis commands — HTTP under the hood:
await redis.set("key", "value")
await redis.set("key-with-ttl", "value", { ex: 3600 })
const value = await redis.get<string>("key") // Type parameter for auto-parsing!
// Type-safe JSON values (unique to Upstash SDK):
interface PackageData {
name: string
downloads: number
version: string
}
await redis.set<PackageData>("pkg:react", {
name: "react",
downloads: 25000000,
version: "18.2.0",
})
const pkg = await redis.get<PackageData>("pkg:react")
// pkg is typed as PackageData | null — no JSON.parse needed
// Pipelines (batched HTTP request):
const pipeline = redis.pipeline()
pipeline.set("key1", "value1")
pipeline.set("key2", "value2")
pipeline.get("key1")
const results = await pipeline.exec()
Upstash in Next.js API Route or Server Action:
// app/api/package/[name]/route.ts
import { Redis } from "@upstash/redis"
import { NextRequest, NextResponse } from "next/server"
const redis = new Redis({
url: process.env.UPSTASH_REDIS_REST_URL!,
token: process.env.UPSTASH_REDIS_REST_TOKEN!,
})
export async function GET(req: NextRequest, { params }: { params: { name: string } }) {
const cacheKey = `pkg:${params.name}`
// Check cache first:
const cached = await redis.get<PackageData>(cacheKey)
if (cached) {
return NextResponse.json(cached, { headers: { "X-Cache": "HIT" } })
}
// Fetch from npm:
const data = await fetchFromNpm(params.name)
await redis.set(cacheKey, data, { ex: 300 }) // Cache 5 minutes
return NextResponse.json(data, { headers: { "X-Cache": "MISS" } })
}
Upstash in Cloudflare Workers:
// Works in edge runtime — no TCP connections needed:
import { Redis } from "@upstash/redis/cloudflare"
export default {
async fetch(request: Request, env: Env): Promise<Response> {
const redis = new Redis({
url: env.UPSTASH_REDIS_REST_URL,
token: env.UPSTASH_REDIS_REST_TOKEN,
})
const cached = await redis.get("counter")
return new Response(`Counter: ${cached}`)
}
}
Feature Comparison
| Feature | ioredis | node-redis | @upstash/redis |
|---|---|---|---|
| Protocol | TCP | TCP | HTTP/REST |
| Serverless support | ❌ (TCP) | ❌ (TCP) | ✅ Native |
| Edge runtime support | ❌ | ❌ | ✅ |
| Cluster support | ✅ Built-in | ✅ Built-in | ✅ (managed) |
| Sentinel support | ✅ | ❌ (not in v4) | N/A |
| TypeScript | ✅ Excellent | ✅ Good | ✅ Excellent |
| Auto JSON parsing | ❌ | ❌ | ✅ |
| Pipelining | ✅ | ✅ | ✅ (batched HTTP) |
| Pub/Sub | ✅ | ✅ | ✅ |
| Lua scripting | ✅ | ✅ | ✅ |
| Streams | ✅ | ✅ | ✅ |
| Free tier | N/A | N/A | 500K req/day |
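The table lists pub/sub for all three clients, but Redis delivers messages as plain strings, so most applications add a small envelope around their payloads. A sketch of that serialization, with the ioredis wiring in comments — note that a subscribed ioredis connection can only run subscribe-family commands, so publisher and subscriber need separate clients:

```typescript
// Hypothetical message envelope: keeps pub/sub payloads structured
// even though the wire format is a plain string.
interface Envelope {
  event: string
  data: unknown
}

function encode(event: string, data: unknown): string {
  const env: Envelope = { event, data }
  return JSON.stringify(env)
}

function decode(raw: string): Envelope {
  return JSON.parse(raw) as Envelope
}

// ioredis wiring (sketch):
//   const pub = new Redis(process.env.REDIS_URL)
//   const sub = new Redis(process.env.REDIS_URL) // dedicated subscriber connection
//   await sub.subscribe("package-events")
//   sub.on("message", (_channel, raw) => handle(decode(raw)))
//   await pub.publish("package-events", encode("downloaded", { name: "react" }))
```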
Rate Limiting Patterns
A common Redis use case — works with all three:
// ioredis rate limiting:
async function rateLimit(redis: Redis, userId: string, limit = 100, window = 60) {
const key = `rate_limit:${userId}:${Math.floor(Date.now() / 1000 / window)}`
const [count] = await redis
.multi()
.incr(key)
.expire(key, window)
.exec() ?? []
const currentCount = (count as [null, number] | null)?.[1] ?? 0
return {
allowed: currentCount <= limit,
remaining: Math.max(0, limit - currentCount),
reset: (Math.floor(Date.now() / 1000 / window) + 1) * window,
}
}
// Upstash has @upstash/ratelimit — built for this:
import { Ratelimit } from "@upstash/ratelimit"
import { Redis } from "@upstash/redis"
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.slidingWindow(100, "60 s"),
prefix: "api",
})
const { success, limit, reset, remaining } = await ratelimit.limit(userId)
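To wire either limiter into an HTTP handler, the result fields map naturally onto rate-limit response headers. A sketch — the `X-RateLimit-*` header names are a widespread convention rather than anything mandated by `@upstash/ratelimit`, and the field shape below mirrors the destructuring above:

```typescript
// Result shape matching the { success, limit, reset, remaining }
// destructured from ratelimit.limit() above.
interface LimitResult {
  success: boolean
  limit: number
  remaining: number
  reset: number // epoch timestamp of the next window
}

function toHttpResponse(result: LimitResult): { status: number; headers: Record<string, string> } {
  return {
    status: result.success ? 200 : 429, // 429 Too Many Requests when over the limit
    headers: {
      "X-RateLimit-Limit": String(result.limit),
      "X-RateLimit-Remaining": String(result.remaining),
      "X-RateLimit-Reset": String(result.reset),
    },
  }
}

// In a Next.js route handler (sketch):
//   const result = await ratelimit.limit(userId)
//   const { status, headers } = toHttpResponse(result)
//   if (status === 429) return new Response("Too many requests", { status, headers })
```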
When to Use Each
Choose ioredis if:
- Long-running Node.js server (Express, Fastify, NestJS)
- Need Redis Cluster or Sentinel for high availability
- Your team values a battle-tested library with extensive enterprise usage
- Advanced features: Lua scripts, streams, pub/sub at scale
Choose node-redis if:
- You prefer the official client maintained by Redis (formerly Redis Labs)
- You're starting a new project with no existing preference
- The clean v4 async/await API appeals to you
Choose @upstash/redis if:
- Serverless deployment (Vercel, Netlify, AWS Lambda)
- Edge runtime (Cloudflare Workers, Vercel Edge)
- You want managed Redis with a generous free tier
- HTTP is acceptable (slight latency overhead vs TCP)
Methodology
Download data from npm registry (weekly average, February 2026). Feature comparison based on ioredis 5.x, node-redis 4.x, and @upstash/redis 1.x documentation. Performance characteristics reflect community benchmarks.