
connect-redis vs rate-limit-redis vs ioredis: Redis Session & Rate Limiting in Node.js (2026)

PkgPulse Team

TL;DR

connect-redis stores Express sessions in Redis instead of memory — sessions survive server restarts and work across multiple server instances. rate-limit-redis adds a Redis store to express-rate-limit — enforces rate limits across a cluster of servers, not just per-process. ioredis is the Redis client that powers both — high-performance, Cluster support, Lua scripting, and pipeline batching. In 2026: use ioredis as your Redis client, connect-redis for session storage, and rate-limit-redis for distributed rate limiting.

Key Takeaways

  • connect-redis: ~300K weekly downloads — Redis-backed session store for express-session
  • rate-limit-redis: ~100K weekly downloads — Redis store for express-rate-limit
  • ioredis: ~8M weekly downloads — the Redis client that powers both
  • In-memory sessions break when you scale to multiple servers — Redis fixes this
  • In-memory rate limits only protect individual server instances — Redis coordinates across all
  • ioredis supports Redis Cluster, Sentinel, pipelining, and Lua scripts

The Problem: Why Redis?

Without Redis (in-memory):
  Server A: sessions stored in memory → User hits Server B → session gone (logged out!)
  Server A: rate limit counter = 50 → User hits Server B → counter = 0 (bypassed!)

With Redis (shared state):
  Server A ─┐
  Server B ─┤─→ Redis ─→ Shared sessions, shared rate limits
  Server C ─┘

  User hits any server → same session, same rate limit counter
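The rate-limit bypass is easy to demonstrate. Here is a toy sketch, with plain in-memory maps standing in for two app servers and one shared Redis:

```typescript
// Two servers with private counters vs. one shared store.
// A client with a budget of 100 can double it by alternating servers.
const LIMIT = 100

const serverA = new Map<string, number>()
const serverB = new Map<string, number>()
const shared = new Map<string, number>()   // stands in for Redis

function hit(store: Map<string, number>, ip: string): boolean {
  const count = (store.get(ip) ?? 0) + 1
  store.set(ip, count)
  return count <= LIMIT  // true = request allowed
}

// 200 requests, alternating between the two servers:
let allowedPrivate = 0
let allowedShared = 0
for (let i = 0; i < 200; i++) {
  const server = i % 2 === 0 ? serverA : serverB
  if (hit(server, "1.2.3.4")) allowedPrivate++
  if (hit(shared, "1.2.3.4")) allowedShared++
}

console.log(allowedPrivate)  // 200 (limit bypassed)
console.log(allowedShared)   // 100 (limit enforced)
```

Each private counter only ever reaches 100, so all 200 requests pass; the shared counter sees every request and cuts the client off at 100.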

ioredis (The Redis Client)

ioredis — full-featured Redis client:

Setup

import Redis from "ioredis"

// Single instance:
const redis = new Redis({
  host: "127.0.0.1",
  port: 6379,
  password: process.env.REDIS_PASSWORD,
  db: 0,
  maxRetriesPerRequest: 3,
  retryStrategy(times) {
    const delay = Math.min(times * 200, 2000)
    return delay  // Return null to stop retrying
  },
})

// Or from a Redis URL (common in cloud):
// "redis://:password@host:6379/0"
const redisFromUrl = new Redis(process.env.REDIS_URL!)

// Or with TLS (Upstash, ElastiCache):
const redisTls = new Redis(process.env.REDIS_URL!, {
  // rejectUnauthorized: false skips certificate verification;
  // only use it if your provider requires it
  tls: { rejectUnauthorized: false },
})

Common operations

// Strings:
await redis.set("package:react:score", "92.5", "EX", 3600)  // TTL 1 hour
const score = await redis.get("package:react:score")          // "92.5"

// JSON (store objects):
await redis.set("package:react", JSON.stringify({
  name: "react",
  score: 92.5,
  downloads: 5_000_000,
}), "EX", 3600)
const pkg = JSON.parse(await redis.get("package:react") ?? "{}")

// Hash (fields):
await redis.hset("package:react", {
  name: "react",
  score: "92.5",
  downloads: "5000000",
})
const name = await redis.hget("package:react", "name")

// Sets (unique values):
await redis.sadd("user:123:tracked", "react", "vue", "svelte")
const isTracked = await redis.sismember("user:123:tracked", "react")  // 1

// Sorted sets (leaderboard):
await redis.zadd("package:scores", 92.5, "react", 89.2, "vue", 87.1, "svelte")
const top10 = await redis.zrevrange("package:scores", 0, 9, "WITHSCORES")

Pipeline (batch operations)

// Pipeline sends multiple commands in one round-trip:
const pipeline = redis.pipeline()
pipeline.get("package:react:score")
pipeline.get("package:vue:score")
pipeline.get("package:svelte:score")
const results = await pipeline.exec()
// results = [[null, "92.5"], [null, "89.2"], [null, "87.1"]]
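When fetching hundreds of keys, it can help to cap how many commands go into one pipeline so a single batch doesn't monopolize the connection. A small chunking helper (the batch size of 100 is an arbitrary assumption):

```typescript
// Split items into fixed-size batches; each batch becomes one pipeline.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size))
  }
  return batches
}

// Usage sketch (assumes the `redis` client from the setup above):
// for (const batch of chunk(allKeys, 100)) {
//   const pipeline = redis.pipeline()
//   batch.forEach((key) => pipeline.get(key))
//   await pipeline.exec()
// }
```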

connect-redis (Session Storage)

connect-redis — Redis session store:

Setup

npm install express-session connect-redis ioredis
import express from "express"
import session from "express-session"
import { RedisStore } from "connect-redis"  // named export since v8; v7 used a default export
import Redis from "ioredis"

const redis = new Redis(process.env.REDIS_URL)

const app = express()

app.use(session({
  store: new RedisStore({
    client: redis,
    prefix: "sess:",         // Redis key prefix
    ttl: 86400,              // Session TTL in seconds (24 hours)
    disableTouch: false,     // Update TTL on every request
  }),
  secret: process.env.SESSION_SECRET!,
  resave: false,
  saveUninitialized: false,
  cookie: {
    secure: process.env.NODE_ENV === "production",
    httpOnly: true,
    maxAge: 86400 * 1000,    // 24 hours in milliseconds
    sameSite: "lax",
  },
}))

Using sessions

// Store user data in session:
app.post("/auth/login", async (req, res) => {
  const user = await AuthService.authenticate(req.body.email, req.body.password)

  // Session stored in Redis automatically:
  req.session.userId = user.id
  req.session.role = user.role

  res.json({ success: true })
})

// Read session data:
app.get("/api/profile", (req, res) => {
  if (!req.session.userId) {
    return res.status(401).json({ error: "Not authenticated" })
  }

  // Session was loaded from Redis:
  res.json({ userId: req.session.userId, role: req.session.role })
})

// Destroy session (logout):
app.post("/auth/logout", (req, res) => {
  req.session.destroy((err) => {
    if (err) return res.status(500).json({ error: "Logout failed" })
    res.clearCookie("connect.sid")
    res.json({ success: true })
  })
})

What's stored in Redis

# Redis CLI — inspect session:
redis-cli
> KEYS sess:*
1) "sess:abc123def456"

> GET sess:abc123def456
"{\"cookie\":{\"originalMaxAge\":86400000,\"expires\":\"2026-03-10T...\",
  \"secure\":true,\"httpOnly\":true,\"sameSite\":\"lax\"},
  \"userId\":42,\"role\":\"admin\"}"

> TTL sess:abc123def456
(integer) 82341   # Seconds remaining

rate-limit-redis (Distributed Rate Limiting)

rate-limit-redis — Redis store for express-rate-limit:

Setup

npm install express-rate-limit rate-limit-redis ioredis
import rateLimit from "express-rate-limit"
import RedisStore from "rate-limit-redis"
import Redis from "ioredis"

const redis = new Redis(process.env.REDIS_URL)

// Global rate limit:
const globalLimiter = rateLimit({
  store: new RedisStore({
    sendCommand: (command: string, ...args: string[]) => redis.call(command, ...args),
    prefix: "rl:global:",
  }),
  windowMs: 60 * 1000,     // 1 minute window
  max: 100,                 // 100 requests per window
  standardHeaders: "draft-7",
  legacyHeaders: false,
  message: { error: "Too many requests, please try again later" },
})

app.use(globalLimiter)

Per-route rate limits

// Strict limit for auth endpoints:
const authLimiter = rateLimit({
  store: new RedisStore({
    sendCommand: (command: string, ...args: string[]) => redis.call(command, ...args),
    prefix: "rl:auth:",
  }),
  windowMs: 15 * 60 * 1000,  // 15 minutes
  max: 5,                     // 5 attempts per 15 minutes
  keyGenerator: (req) => req.ip ?? "unknown",
  handler: (req, res) => {
    res.status(429).json({
      error: "Too many login attempts",
      retryAfter: res.getHeader("Retry-After"),
    })
  },
})

app.post("/auth/login", authLimiter, loginHandler)

// Generous limit for public API:
const apiLimiter = rateLimit({
  store: new RedisStore({
    sendCommand: (command: string, ...args: string[]) => redis.call(command, ...args),
    prefix: "rl:api:",
  }),
  windowMs: 60 * 1000,
  max: 200,
  keyGenerator: (req) => {
    // Rate limit by API key if present, otherwise by IP:
    return req.headers["x-api-key"]?.toString() ?? req.ip ?? "unknown"
  },
})

app.use("/api", apiLimiter)
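The key-selection logic inside `keyGenerator` is worth unit-testing on its own. A plain-function version (the `ReqLike` shape is a simplification of Express's `Request`, introduced here for illustration):

```typescript
interface ReqLike {
  headers: Record<string, string | string[] | undefined>
  ip?: string
}

// Prefer the API key; fall back to the client IP, then a sentinel.
function rateLimitKey(req: ReqLike): string {
  const apiKey = req.headers["x-api-key"]
  if (apiKey) return Array.isArray(apiKey) ? apiKey[0] : apiKey
  return req.ip ?? "unknown"
}

rateLimitKey({ headers: { "x-api-key": "k_123" }, ip: "1.2.3.4" })  // → "k_123"
rateLimitKey({ headers: {}, ip: "1.2.3.4" })                        // → "1.2.3.4"
rateLimitKey({ headers: {} })                                       // → "unknown"
```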

Sliding window (fairer limiting)

// Fixed window: the counter resets at window boundaries
// → User can send 100 requests at :59 + 100 more at :00 = 200 in ~1 second

Note: express-rate-limit with rate-limit-redis implements a fixed window: the counter simply resets when windowMs elapses, so the boundary burst above is possible. If that matters for your traffic, you need a true sliding-window algorithm, either a custom Lua script over a Redis sorted set (run via ioredis) or a library such as @upstash/ratelimit that implements sliding windows natively.
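A sliding window keeps per-key request timestamps and counts only those still inside the window. A minimal in-memory sketch of the algorithm (a Redis version would store the timestamps in a sorted set and run the trim-and-count atomically in a Lua script):

```typescript
const WINDOW_MS = 60_000
const MAX = 100

const log = new Map<string, number[]>()  // key → request timestamps

function allow(key: string, now: number): boolean {
  // Drop timestamps that have slid out of the window:
  const recent = (log.get(key) ?? []).filter((t) => now - t < WINDOW_MS)
  if (recent.length >= MAX) {
    log.set(key, recent)
    return false
  }
  recent.push(now)
  log.set(key, recent)
  return true
}

// 100 requests at t=0 fill the budget; request 101 at t=1s is rejected,
// but once the early timestamps expire the budget frees up again:
for (let i = 0; i < 100; i++) allow("1.2.3.4", 0)
allow("1.2.3.4", 1_000)    // → false (window still full)
allow("1.2.3.4", 61_000)   // → true  (the t=0 entries expired)
```

Unlike a fixed window, there is no boundary at which the whole budget resets at once: each request only frees up when its own timestamp ages out.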

Full Production Setup

import express from "express"
import session from "express-session"
import { RedisStore } from "connect-redis"
import rateLimit from "express-rate-limit"
import RateLimitRedisStore from "rate-limit-redis"
import Redis from "ioredis"

// Single Redis connection for everything:
const redis = new Redis(process.env.REDIS_URL)

const app = express()

// 1. Rate limiting (first — reject abusers early):
app.use(rateLimit({
  store: new RateLimitRedisStore({
    sendCommand: (command: string, ...args: string[]) => redis.call(command, ...args),
    prefix: "rl:",
  }),
  windowMs: 60_000,
  max: 100,
}))

// 2. Session management:
app.use(session({
  store: new RedisStore({ client: redis, prefix: "sess:" }),
  secret: process.env.SESSION_SECRET!,
  resave: false,
  saveUninitialized: false,
}))

// 3. Application cache:
async function getCachedPackage(name: string) {
  const cached = await redis.get(`cache:pkg:${name}`)
  if (cached) return JSON.parse(cached)

  const data = await PackageService.fetch(name)
  await redis.set(`cache:pkg:${name}`, JSON.stringify(data), "EX", 300)
  return data
}
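One caveat with a fixed cache TTL: entries written at the same moment all expire together and hit the backend at once. Some teams add jitter to the TTL to spread expirations out; a tiny helper (the ±10% spread is an arbitrary choice):

```typescript
// Randomize the TTL within ±10% of the base value so that cache
// entries written together do not all expire in the same instant.
function jitteredTtl(baseSeconds: number, spread = 0.1): number {
  const delta = baseSeconds * spread
  return Math.round(baseSeconds - delta + Math.random() * 2 * delta)
}

// e.g. await redis.set(key, value, "EX", jitteredTtl(300))
// jitteredTtl(300) returns an integer somewhere in [270, 330]
```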

// Single Redis instance handles: rate limits + sessions + cache

Feature Comparison

Feature           | connect-redis    | rate-limit-redis   | ioredis
Purpose           | Session storage  | Rate limiting      | Redis client
Works with        | express-session  | express-rate-limit | Everything
Multi-server      | ✅               | ✅                 | ✅
TTL management    | ✅               | ✅                 | ✅
Sliding window    | N/A              | ❌ (fixed window)  | Manual (via Lua)
Redis Cluster     | ✅ (via ioredis) | ✅ (via ioredis)   | ✅ Native
Weekly downloads  | ~300K            | ~100K              | ~8M

When to Use Each

Use connect-redis when:

  • Running Express with sessions across multiple server instances
  • Sessions must survive server restarts
  • Need centralized session management (logout from all devices)

Use rate-limit-redis when:

  • Running express-rate-limit behind a load balancer
  • Need rate limits that work across all server instances
  • Per-API-key rate limiting in distributed systems

Use ioredis as the foundation:

  • Powers both connect-redis and rate-limit-redis
  • Also use directly for caching, pub/sub, queues, leaderboards
  • Supports Redis Cluster, Sentinel, and TLS

Alternatives to consider:

  • Upstash Redis — serverless Redis with HTTP API (great for edge/serverless)
  • Hono + Upstash — if not using Express, Upstash has native rate limiting SDKs

Methodology

Download data from npm registry (weekly average, February 2026). Feature comparison based on connect-redis v8.x, rate-limit-redis v4.x, and ioredis v5.x.

Compare Redis, session, and rate limiting packages on PkgPulse →
