
Guide

HTTP Proxy Libraries for Node.js 2026: Compared

Compare http-proxy-middleware, node-http-proxy, and fastify-http-proxy for building reverse proxies and API gateways in Node.js: request forwarding, path rewriting, and WebSocket support.

PkgPulse Team

TL;DR

http-proxy-middleware is the Express/Connect proxy middleware — used by Create React App and webpack-dev-server for dev server proxying, and it supports path rewriting and WebSocket forwarding. node-http-proxy (http-proxy) is the foundational library underneath — low-level, handles raw HTTP/HTTPS/WS proxying, maximum control. @fastify/http-proxy is Fastify's built-in proxy plugin — uses undici for high-performance proxying with Fastify's hook system. In 2026: http-proxy-middleware for Express dev-server proxying, node-http-proxy for custom proxy servers, @fastify/http-proxy for Fastify API gateways.

Key Takeaways

  • http-proxy-middleware: ~8M weekly downloads — Express middleware, powers CRA and webpack-dev-server dev proxying
  • node-http-proxy (http-proxy): ~5M weekly downloads — foundational, low-level proxy engine
  • @fastify/http-proxy: ~200K weekly downloads — undici-based, Fastify hooks, high performance
  • http-proxy-middleware wraps node-http-proxy — adds middleware API, path rewriting, logging
  • @fastify/http-proxy uses undici (not http-proxy) — different foundation, faster for Fastify
  • Common use cases: dev server API proxy, microservice gateway, BFF (backend for frontend)

Why Proxy in Node.js?

Use case 1: Development proxy (avoid CORS):
  Frontend (localhost:3000) → Proxy → Backend API (localhost:8080)
  React/Vite dev server proxies /api/* to backend

Use case 2: API gateway:
  Client → Gateway (Node.js) → Service A (/api/packages)
                              → Service B (/api/users)
                              → Service C (/api/billing)

Use case 3: BFF (Backend for Frontend):
  Mobile app → BFF → combines data from multiple services
  Web app → BFF → aggregates + transforms responses

Use case 4: Load balancing / A-B testing:
  Client → Proxy → 90% → Production service
                 → 10% → Canary service
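The 90/10 split in use case 4 can be sketched as a pure routing function. The hostnames are illustrative, and the random source is injected so the split is deterministic in tests:

```typescript
// Hypothetical upstream hosts for a canary rollout:
const PRODUCTION = "http://production:3000"
const CANARY = "http://canary:3000"

// Pick an upstream based on a canary weight (0..1).
// `rand` is injectable so the routing decision is testable.
export function pickTarget(
  canaryWeight: number,
  rand: () => number = Math.random,
): string {
  return rand() < canaryWeight ? CANARY : PRODUCTION
}

// With node-http-proxy, each request would then be forwarded with:
// proxy.web(req, res, { target: pickTarget(0.1) })
```

Keeping the decision in a pure function also makes it easy to swap in sticky routing (e.g. hashing a user ID) without touching the proxy wiring.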

http-proxy-middleware

http-proxy-middleware — Express/Connect proxy:

Basic setup

import express from "express"
import { createProxyMiddleware } from "http-proxy-middleware"

const app = express()

// Proxy /api/* requests to backend:
app.use("/api", createProxyMiddleware({
  target: "http://localhost:8080",
  changeOrigin: true,
}))

// Request: GET http://localhost:3000/api/packages/react
// Proxied: GET http://localhost:8080/api/packages/react

app.listen(3000)

Path rewriting

app.use("/api", createProxyMiddleware({
  target: "http://localhost:8080",
  changeOrigin: true,
  pathRewrite: {
    "^/api/v2": "/v2",          // /api/v2/packages → /v2/packages
    "^/api": "",                 // /api/packages → /packages
  },
}))

Multiple targets

// Route different paths to different services:
app.use("/api/packages", createProxyMiddleware({
  target: "http://package-service:3001",
  changeOrigin: true,
  pathRewrite: { "^/api/packages": "/packages" },
}))

app.use("/api/users", createProxyMiddleware({
  target: "http://user-service:3002",
  changeOrigin: true,
  pathRewrite: { "^/api/users": "/users" },
}))

app.use("/api/billing", createProxyMiddleware({
  target: "http://billing-service:3003",
  changeOrigin: true,
}))

WebSocket proxying

app.use("/ws", createProxyMiddleware({
  target: "http://localhost:8080",
  ws: true,          // Enable WebSocket proxying
  changeOrigin: true,
}))

Request/response modification

app.use("/api", createProxyMiddleware({
  target: "http://localhost:8080",
  changeOrigin: true,

  on: {
    // Modify outgoing request:
    proxyReq(proxyReq, req, res) {
      proxyReq.setHeader("X-Forwarded-For", req.ip)
      proxyReq.setHeader("X-Request-ID", crypto.randomUUID())

      // Add auth header:
      proxyReq.setHeader("Authorization", `Bearer ${getServiceToken()}`)
    },

    // Modify incoming response:
    proxyRes(proxyRes, req, res) {
      proxyRes.headers["x-proxy"] = "pkgpulse-gateway"
    },

    // Handle proxy errors:
    error(err, req, res) {
      console.error("Proxy error:", err.message)
      res.writeHead(502, { "Content-Type": "application/json" })
      res.end(JSON.stringify({ error: "Bad gateway", upstream: err.message }))
    },
  },
}))

Vite dev server proxy (built-in)

// vite.config.ts — Vite's built-in proxy (based on node-http-proxy, with a similar config shape):
import { defineConfig } from "vite"

export default defineConfig({
  server: {
    proxy: {
      "/api": {
        target: "http://localhost:8080",
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/api/, ""),
      },
      "/ws": {
        target: "ws://localhost:8080",
        ws: true,
      },
    },
  },
})

node-http-proxy

node-http-proxy — low-level proxy:

Basic proxy server

import httpProxy from "http-proxy"
import http from "node:http"

// Create a proxy instance:
const proxy = httpProxy.createProxyServer({
  target: "http://localhost:8080",
  changeOrigin: true,
  ws: true,
})

// Handle proxy errors:
proxy.on("error", (err, req, res) => {
  console.error("Proxy error:", err.message)
  if (res.writeHead) {
    res.writeHead(502, { "Content-Type": "application/json" })
    res.end(JSON.stringify({ error: "Bad gateway" }))
  }
})

// Create HTTP server:
const server = http.createServer((req, res) => {
  proxy.web(req, res)
})

// Handle WebSocket upgrade:
server.on("upgrade", (req, socket, head) => {
  proxy.ws(req, socket, head)
})

server.listen(3000)

Dynamic routing

import httpProxy from "http-proxy"
import http from "node:http"

const proxy = httpProxy.createProxyServer()

const routes: Record<string, string> = {
  "/api/packages": "http://package-service:3001",
  "/api/users": "http://user-service:3002",
  "/api/billing": "http://billing-service:3003",
}

const server = http.createServer((req, res) => {
  // Find matching route:
  const route = Object.keys(routes).find(prefix => req.url?.startsWith(prefix))

  if (route) {
    proxy.web(req, res, { target: routes[route] })
  } else {
    res.writeHead(404)
    res.end("Not found")
  }
})

server.listen(3000)

Load balancing

import httpProxy from "http-proxy"
import http from "node:http"

const targets = [
  "http://app-1:3000",
  "http://app-2:3000",
  "http://app-3:3000",
]

let current = 0

const proxy = httpProxy.createProxyServer()

const server = http.createServer((req, res) => {
  // Round-robin:
  const target = targets[current % targets.length]
  current++

  proxy.web(req, res, { target })
})

server.listen(80)

SSL termination

import httpProxy from "http-proxy"
import https from "node:https"
import fs from "node:fs"

const proxy = httpProxy.createProxyServer({
  target: "http://localhost:8080",  // Backend is HTTP
  changeOrigin: true,
})

// HTTPS frontend → HTTP backend:
const server = https.createServer({
  key: fs.readFileSync("certs/key.pem"),
  cert: fs.readFileSync("certs/cert.pem"),
}, (req, res) => {
  proxy.web(req, res)
})

server.listen(443)

@fastify/http-proxy

@fastify/http-proxy — Fastify proxy plugin:

Basic setup

import Fastify from "fastify"
import proxy from "@fastify/http-proxy"

const fastify = Fastify({ logger: true })

// Proxy /api/* to backend:
await fastify.register(proxy, {
  upstream: "http://localhost:8080",
  prefix: "/api",
  rewritePrefix: "/",     // /api/packages → /packages
})

await fastify.listen({ port: 3000 })

Multiple upstreams (API gateway)

import Fastify from "fastify"
import proxy from "@fastify/http-proxy"

const fastify = Fastify({ logger: true })

// Package service:
await fastify.register(proxy, {
  upstream: "http://package-service:3001",
  prefix: "/api/packages",
  rewritePrefix: "/packages",
})

// User service:
await fastify.register(proxy, {
  upstream: "http://user-service:3002",
  prefix: "/api/users",
  rewritePrefix: "/users",
})

// Billing service:
await fastify.register(proxy, {
  upstream: "http://billing-service:3003",
  prefix: "/api/billing",
  rewritePrefix: "/billing",
})

await fastify.listen({ port: 3000 })

With Fastify hooks (auth, logging)

await fastify.register(proxy, {
  upstream: "http://package-service:3001",
  prefix: "/api/packages",

  // Pre-handler — runs before proxying:
  preHandler: async (request, reply) => {
    // Authenticate:
    const token = request.headers.authorization
    if (!token) {
      return reply.code(401).send({ error: "Unauthorized" })
    }

    // Add request ID:
    request.headers["x-request-id"] = crypto.randomUUID()
  },

  // Modify reply headers:
  replyOptions: {
    onResponse(request, reply, res) {
      reply.header("x-proxy", "pkgpulse-gateway")
      reply.send(res)
    },
  },
})

WebSocket proxying

await fastify.register(proxy, {
  upstream: "http://localhost:8080",
  prefix: "/ws",
  websocket: true,   // Enable WebSocket proxying
})

Feature Comparison

Feature              | http-proxy-middleware  | node-http-proxy       | @fastify/http-proxy
---------------------|------------------------|-----------------------|---------------------------------
Framework            | Express/Connect        | Any                   | Fastify only
Underlying engine    | http-proxy             | Self                  | undici
Path rewriting       | ✅ (pathRewrite)        | Manual                | ✅ (rewritePrefix)
WebSocket proxy      | ✅                      | ✅                     | ✅
Request modification | ✅ (on.proxyReq)        | ✅ (proxyReq event)    | ✅ (preHandler)
Load balancing       | ❌                      | Manual                | ❌ (use with fastify-reply-from)
SSL termination      | ✅ (via https server)   | ✅ (via https server)  | ✅ (via https server)
Performance          | Good                   | Good                  | Best (undici)
Weekly downloads     | ~8M                    | ~5M                   | ~200K

When to Use Each

Choose http-proxy-middleware if:

  • Express application — industry standard proxy middleware
  • Dev server proxying (avoiding CORS during development)
  • Simple API gateway with path rewriting
  • Using CRA or webpack-dev-server dev proxy (both use this under the hood)

Choose node-http-proxy if:

  • Building a standalone proxy server (not inside an app framework)
  • Need SSL termination, load balancing, or custom routing logic
  • Want maximum control over the proxy behavior
  • Implementing A/B testing or canary deployments at the proxy level

Choose @fastify/http-proxy if:

  • Building an API gateway with Fastify
  • Need Fastify's hook system for auth, logging, rate limiting
  • Want undici's performance for high-throughput proxying
  • Microservice architecture with Fastify

Methodology

Download data from npm registry (weekly average, February 2026). Feature comparison based on http-proxy-middleware v3.x, http-proxy v1.x, and @fastify/http-proxy v10.x.

Performance Characteristics and When They Matter

The performance difference between these proxy libraries is relevant for production API gateway use cases but largely irrelevant for dev server proxying. For a development proxy where you're forwarding a few requests per second during local development, the difference between http-proxy-middleware and @fastify/http-proxy is noise. For a production API gateway handling thousands of requests per second, the underlying HTTP client used matters.

@fastify/http-proxy uses undici — the modern HTTP client built into Node.js's internals — as its proxy backend via @fastify/reply-from. undici uses connection pooling, HTTP/1.1 keep-alive, and pipelining by default, which means the connection between your gateway and the upstream service is reused efficiently. For an API gateway forwarding to microservices that support persistent connections, this translates to meaningfully lower latency and CPU overhead compared to the naive per-request connection approach.

node-http-proxy (the library underlying http-proxy-middleware) uses Node.js's built-in http.request() for each proxied request. It manages connection persistence through Node.js's default agent pooling, which is functional but less aggressive than undici's connection management. For high-throughput production gateways, this is the argument for either moving to @fastify/http-proxy or passing node-http-proxy a keep-alive-tuned http.Agent via its agent option.

http-proxy-middleware's performance profile is primarily determined by the underlying node-http-proxy engine, plus the overhead of Express middleware processing. The middleware chaining model means every request passes through all registered middleware, which adds overhead relative to Fastify's schema-based request routing. For dev server proxying this doesn't matter, but for production use, benchmarks from 2025 show @fastify/http-proxy handling 2-3x more requests/second on equivalent hardware for simple forwarding workloads.

Headers, CORS, and Security Considerations in Production Proxies

Production proxy deployments require careful attention to header handling that the quick-start examples don't cover. The changeOrigin: true option in http-proxy-middleware (and its equivalent in node-http-proxy) rewrites the Host header in proxied requests to match the target's hostname. This is necessary for most backends that validate the Host header, but it means the upstream service loses visibility into the original request's hostname. Use the X-Forwarded-Host header convention (set in the proxyReq event) to preserve it.

The X-Forwarded-For header chain is important for accurate IP logging and rate limiting. When your Node.js proxy sits behind a CDN or load balancer, the req.ip value is the CDN's IP, not the client's. The correct approach is to read the existing X-Forwarded-For header and append your proxy's address rather than replacing it, trusting the upstream chain up to the first trusted hop. http-proxy-middleware's on.proxyReq callback is where this header manipulation belongs.
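Both conventions — appending to an existing X-Forwarded-For chain rather than replacing it, and preserving the original Host — can be isolated in a pure helper called from on.proxyReq. A sketch (the function name is illustrative):

```typescript
// Compute forwarding headers without clobbering the chain set by
// trusted upstream hops (CDN, load balancer).
export function forwardingHeaders(
  existingXff: string | undefined,  // incoming X-Forwarded-For, if any
  clientAddr: string,               // socket address as seen by this hop
  originalHost: string,             // incoming Host header
): Record<string, string> {
  return {
    // Append, don't replace:
    "x-forwarded-for": existingXff
      ? `${existingXff}, ${clientAddr}`
      : clientAddr,
    // Preserve the hostname the client actually requested, since
    // changeOrigin rewrites Host to the target's hostname:
    "x-forwarded-host": originalHost,
  }
}

// In http-proxy-middleware's on.proxyReq:
// proxyReq(proxyReq, req) {
//   const h = forwardingHeaders(
//     req.headers["x-forwarded-for"] as string | undefined,
//     req.socket.remoteAddress ?? "",
//     req.headers.host ?? "",
//   )
//   for (const [k, v] of Object.entries(h)) proxyReq.setHeader(k, v)
// }
```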

For API gateways that add authentication headers (bearer tokens, HMAC signatures) to outbound requests to internal services, ensure those credentials are never echoed back in error responses and never logged in access logs. node-http-proxy's proxyReq event fires before the request leaves your gateway, making it the appropriate place to inject credentials. A pattern to avoid: setting credentials as default headers on the proxy instance level rather than per-request, which can cause credential leakage if you proxy to multiple upstreams with different credentials using the same proxy instance.
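One way to avoid the instance-level credential leak described above is to resolve credentials per request from the upstream actually being targeted. The token map and service hosts below are hypothetical; in practice the tokens would come from a secret store:

```typescript
// Hypothetical per-service tokens (never hard-code these in real code):
const serviceTokens: Record<string, string> = {
  "package-service:3001": "token-pkg",
  "billing-service:3003": "token-billing",
}

// Resolve the Authorization header for a given upstream URL.
// Returns undefined for unknown upstreams so nothing is injected.
export function authHeaderFor(upstream: string): string | undefined {
  const host = new URL(upstream).host
  const token = serviceTokens[host]
  return token ? `Bearer ${token}` : undefined
}

// In the proxyReq event — per request, never as an instance default:
// const auth = authHeaderFor(target)
// if (auth) proxyReq.setHeader("authorization", auth)
```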

CORS configuration belongs at the proxy layer, not the upstream services, in a gateway architecture. Configure CORS in your Express or Fastify app before the proxy middleware, and have upstream services not emit CORS headers at all — this avoids the "double CORS header" problem where both the gateway and the upstream service set Access-Control-Allow-Origin, causing browsers to reject the response.
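One way to enforce a single CORS source is to strip any Access-Control-* headers the upstream emits before the response reaches the client, leaving the gateway's own CORS middleware in charge. A sketch, callable from http-proxy-middleware's on.proxyRes:

```typescript
// Remove upstream CORS headers so only the gateway's CORS middleware
// (registered before the proxy) decides what the browser sees.
export function stripUpstreamCors(headers: Record<string, unknown>): void {
  for (const name of Object.keys(headers)) {
    if (name.toLowerCase().startsWith("access-control-")) {
      delete headers[name]
    }
  }
}

// With http-proxy-middleware:
// on: {
//   proxyRes(proxyRes) {
//     stripUpstreamCors(proxyRes.headers)
//   },
// }
```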


Request Buffering, Streaming, and Large Payloads

A subtle but important behavior difference between these proxy libraries emerges when handling large request or response bodies: buffering vs. streaming. In a streaming proxy, request and response data flows through the proxy without being accumulated in memory — the proxy pipes bytes from client to upstream and back without materializing the full body. In a buffering proxy, the full request or response is held in memory before being forwarded or returned, enabling inspection and modification but at the cost of memory and latency.

node-http-proxy operates as a streaming proxy by default. Request data is piped directly from the incoming connection to the upstream without buffering. This means node-http-proxy can handle arbitrarily large file uploads or streaming responses without memory pressure. The tradeoff is that you cannot inspect or modify the request body in proxyReq events — the body has already started streaming before you can read it. If you need to modify a JSON request body before forwarding (a common API gateway requirement for request transformation), you must buffer the body explicitly before passing control to the proxy.

http-proxy-middleware inherits this behavior from node-http-proxy. When using Express with express.json() middleware, the body-parser middleware buffers and parses the request body into req.body before the proxy middleware runs. This parsed body is separate from the raw stream — http-proxy-middleware will re-stream the original request body, not the parsed req.body. To modify the body before proxying, you need to re-serialize and set the Content-Length header correctly in the proxyReq event, which is a common source of proxy middleware bugs.
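http-proxy-middleware ships a fixRequestBody helper for this situation; the sketch below shows the underlying mechanics — re-serialize the parsed body and compute Content-Length in bytes (not characters, which differ for multibyte payloads):

```typescript
// Re-serialize a body-parser-consumed JSON body so it can be written
// onto the proxied request with a correct Content-Length.
export function serializeBody(body: unknown): { buffer: Buffer; length: number } {
  const json = JSON.stringify(body)
  const buffer = Buffer.from(json, "utf8")
  return { buffer, length: buffer.byteLength }  // bytes, not characters
}

// In the proxyReq event, after express.json() has run:
// const { buffer, length } = serializeBody(req.body)
// proxyReq.setHeader("content-type", "application/json")
// proxyReq.setHeader("content-length", length)
// proxyReq.write(buffer)
// proxyReq.end()
```

Getting the byte length wrong is exactly the bug class the paragraph above describes: a stale or character-based Content-Length makes the upstream truncate or hang on the request body.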

@fastify/http-proxy uses @fastify/reply-from under the hood, which gives you hooks at the reply level to modify responses before they're sent to the client. For request modification, Fastify's preHandler hook fires before proxying and has access to the full parsed request body via Fastify's built-in body parsing. This integration between Fastify's request lifecycle and the proxy behavior is cleaner than the body-buffering workarounds required in Express-based setups.

Decision Checklist

Use http-proxy-middleware if:

  • You are using Express, webpack-dev-server, or Connect
  • You need to proxy API requests during frontend development (CORS bypass)
  • You want a declarative path-matching proxy config with minimal setup
  • Your team is familiar with the http-proxy configuration model

Use node-http-proxy (http-proxy) directly if:

  • You need to build a custom proxy server with fine-grained control
  • You are writing a load balancer or reverse proxy from scratch
  • You need WebSocket proxying with custom upgrade handling
  • You want low-level access to proxy events (proxyReq, proxyRes, error)

Use @fastify/http-proxy if:

  • Your backend is already built on Fastify
  • You want a proxy plugin that integrates with Fastify's lifecycle hooks
  • You need high-throughput proxying with Fastify's performance characteristics
  • You want to combine proxy routes with authenticated Fastify routes

In 2026, http-proxy-middleware is almost universally used in the frontend development tooling layer (webpack-dev-server, Create React App); Vite's proxy config offers a similar API but is built directly on node-http-proxy. For production reverse proxy needs in Node.js, the pattern has shifted toward putting a dedicated reverse proxy (Caddy, nginx, Cloudflare) in front of Node.js rather than implementing the proxy layer in application code.

See also: Fastify vs Koa and Fastify vs Hono, better-sqlite3 vs libsql vs sql.js.
