

fflate vs pako vs Node.js zlib 2026

Compare fflate, pako, and Node.js built-in zlib for compression in JavaScript. Gzip, deflate, zip file creation, browser support, performance, bundle size.

PkgPulse Team

TL;DR

fflate is the fastest pure-JavaScript compression library — it handles gzip, deflate, zlib, and zip files, runs in browsers and Node.js, and weighs only ~30KB. pako is the classic choice — a faithful port of zlib with a stable, well-known API — but fflate is now faster and smaller. Node.js built-in zlib is the best option for server-side compression in Node.js: zero bundle cost, native C++ bindings, streaming support, and callbacks that promisify cleanly. For browser-side compression or zip file creation: fflate. For Node.js server streaming: node:zlib. For compatibility with existing zlib-based code: pako.

Key Takeaways

  • fflate: ~6M weekly downloads — fastest pure-JS, 30KB, browser + Node.js, zip support
  • pako: ~30M weekly downloads — stable zlib port, wider legacy compatibility
  • node:zlib: built-in — native C++, streaming, best for Node.js server-side
  • fflate is 2-4x faster than pako for most workloads
  • Node.js 22.4+ adds built-in CompressionStream API (browser-compatible interface)
  • For HTTP response compression in Node.js, use framework middleware — don't compress manually

| Package | Weekly Downloads | Bundle Size | Browser | Streaming | Zip Files |
| --- | --- | --- | --- | --- | --- |
| fflate | ~6M | ~30KB | ✅ | ✅ | ✅ |
| pako | ~30M | ~50KB | ✅ | ✅ | ❌ |
| node:zlib | built-in | 0KB | ❌ | ✅ | ❌ |

fflate

fflate — fastest pure-JavaScript compression:

Basic compression/decompression

import { gzip, gunzip, deflate, inflate } from "fflate"
import { promisify } from "util"

// fflate's async functions take Node-style (err, data) callbacks — promisify them:
const gzipAsync = promisify(gzip)
const gunzipAsync = promisify(gunzip)

// Compress:
const data = new TextEncoder().encode("Hello, PkgPulse! ".repeat(1000))
const compressed = await gzipAsync(data, { level: 6 })
console.log(`Original: ${data.length} bytes`)
console.log(`Compressed: ${compressed.length} bytes`)
console.log(`Ratio: ${((1 - compressed.length / data.length) * 100).toFixed(1)}%`)

// Decompress:
const decompressed = await gunzipAsync(compressed)
const text = new TextDecoder().decode(decompressed)

Synchronous API

import { gzipSync, gunzipSync, deflateSync, inflateSync } from "fflate"

// Synchronous — blocks but faster for small data:
const data = new TextEncoder().encode("Compress me")
const compressed = gzipSync(data, { level: 9 })  // Max compression
const decompressed = gunzipSync(compressed)

// Levels: 0 (no compression) to 9 (max), default 6
// For speed over compression ratio:
const fast = gzipSync(data, { level: 1 })

Zip file creation

import { zip, unzip, strToU8, strFromU8 } from "fflate"
import { promisify } from "util"

const zipAsync = promisify(zip)
const unzipAsync = promisify(unzip)

// Create a zip archive:
const files = {
  "README.md": strToU8("# PkgPulse Export\n\nPackage health data."),
  "data/react.json": strToU8(JSON.stringify({ name: "react", score: 95 })),
  "data/vue.json": strToU8(JSON.stringify({ name: "vue", score: 91 })),
  "report.csv": strToU8("package,score\nreact,95\nvue,91"),
}

const zipped = await zipAsync(files)
// zipped is Uint8Array — serve as download or save to disk

// Read a zip archive:
const entries = await unzipAsync(zipped)
for (const [path, data] of Object.entries(entries)) {
  console.log(path, strFromU8(data))
}

Streaming compression (browser)

import { Gzip } from "fflate"

// Stream API — useful for large files:
const gzip = new Gzip({ level: 6 })

const chunks: Uint8Array[] = []

gzip.ondata = (chunk, final) => {
  chunks.push(chunk)
  if (final) {
    const compressed = new Uint8Array(chunks.reduce((a, b) => a + b.length, 0))
    let offset = 0
    for (const chunk of chunks) {
      compressed.set(chunk, offset)
      offset += chunk.length
    }
    console.log("Compression complete:", compressed.length, "bytes")
  }
}

// Push data in chunks:
gzip.push(new TextEncoder().encode("First chunk "))
gzip.push(new TextEncoder().encode("Second chunk "), true)  // true = last chunk

CompressionStream API (modern browsers + Node.js 22.4+)

// Native Web API — no library needed in modern environments:
async function compressWithStream(data: Uint8Array): Promise<Uint8Array> {
  const stream = new CompressionStream("gzip")
  const writer = stream.writable.getWriter()
  const reader = stream.readable.getReader()

  writer.write(data)
  writer.close()

  const chunks: Uint8Array[] = []
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    chunks.push(value)
  }

  // Concatenate chunks into one Uint8Array (avoid spreading bytes into a number[]):
  const total = chunks.reduce((sum, chunk) => sum + chunk.length, 0)
  const result = new Uint8Array(total)
  let offset = 0
  for (const chunk of chunks) {
    result.set(chunk, offset)
    offset += chunk.length
  }
  return result
}

pako

pako — the classic zlib port:

Basic usage

import pako from "pako"

// Deflate (compress):
const input = "Hello, World!".repeat(100)
const compressed = pako.deflate(input)  // Uint8Array
const compressedGzip = pako.gzip(input)  // With gzip wrapper

// Inflate (decompress):
const decompressed = pako.inflate(compressed)
const text = new TextDecoder().decode(decompressed)

// String convenience — pako encodes string input as UTF-8, and inflate
// can decode back to a string (in pako v2, deflate always returns a Uint8Array):
const strCompressed = pako.deflate("Hello!")
const strDecompressed = pako.inflate(strCompressed, { to: "string" })  // "Hello!"

Streaming

import pako from "pako"

// Streaming deflate:
const deflator = new pako.Deflate({ level: 6 })

deflator.push("First chunk", false)
deflator.push(" Second chunk", false)
deflator.push(" Last chunk", true)  // true = flush/finish

if (deflator.err) {
  throw new Error(`Compression error: ${deflator.msg}`)
}

const compressed = deflator.result  // Uint8Array

// Streaming inflate:
const inflator = new pako.Inflate()
inflator.push(compressed, true)
const decompressed = new TextDecoder().decode(inflator.result)

Why fflate has replaced pako

// Benchmarks (rough, 1MB typical payload):
// fflate gzip: ~8ms
// pako gzip:   ~25ms (3x slower)
// fflate bundle: 30KB
// pako bundle:   50KB

// For new projects, fflate is almost always the better choice:
// - Faster
// - Smaller bundle
// - Zip support (pako doesn't do zip)
// - Better TypeScript types

// Only choose pako if:
// - Legacy code already uses it
// - You have very specific zlib compatibility needs
// - You prefer its simpler streaming interface

Node.js Built-in zlib

For server-side Node.js, the built-in zlib module is the best choice:

Promise API (Node.js 16+)

import { gzip, gunzip, deflate, inflate, brotliCompress, brotliDecompress } from "zlib"
import { promisify } from "util"

const gzipAsync = promisify(gzip)
const gunzipAsync = promisify(gunzip)
const brotliCompressAsync = promisify(brotliCompress)

// Gzip:
const data = Buffer.from("Hello, World!".repeat(1000))
const compressed = await gzipAsync(data)
const decompressed = await gunzipAsync(compressed)

// Brotli (better compression than gzip, slower):
const brotli = await brotliCompressAsync(data)
console.log(`Gzip: ${compressed.length} bytes`)
console.log(`Brotli: ${brotli.length} bytes`)  // ~20% smaller than gzip

// Note: unlike fs or stream, zlib has no promises submodule —
// util.promisify is the idiomatic way to get a promise API.

Streaming in HTTP (most common use case)

import { createGzip, createBrotliCompress, createGunzip } from "zlib"
import { createReadStream, createWriteStream } from "fs"
import { pipeline } from "stream/promises"

// Compress a file:
async function compressFile(input: string, output: string) {
  const readStream = createReadStream(input)
  const gzip = createGzip({ level: 6 })
  const writeStream = createWriteStream(output)

  await pipeline(readStream, gzip, writeStream)
  console.log(`Compressed ${input} → ${output}`)
}

// HTTP response compression (Express):
import express from "express"
import compression from "compression"

const app = express()
app.use(compression({ level: 6 }))  // Uses zlib internally

Decompress incoming request

import { createGunzip } from "zlib"
import { pipeline } from "stream/promises"

// Decompress gzip-encoded request body:
async function readGzipBody(req: NodeJS.ReadableStream): Promise<string> {
  const chunks: Buffer[] = []
  const gunzip = createGunzip()

  // A plain async function as the last pipeline stage consumes the stream:
  await pipeline(req, gunzip, async function (source) {
    for await (const chunk of source) {
      chunks.push(chunk as Buffer)
    }
  })

  return Buffer.concat(chunks).toString("utf8")
}

Brotli (smaller, for static assets)

import { brotliCompressSync, brotliDecompressSync, constants } from "zlib"

// Best for compressing static files (HTML, CSS, JS):
const html = Buffer.from("<html>...</html>")

const brotli = brotliCompressSync(html, {
  params: {
    [constants.BROTLI_PARAM_QUALITY]: 11,     // Max quality (0-11)
    [constants.BROTLI_PARAM_LGWIN]: 22,        // Window size
    [constants.BROTLI_PARAM_MODE]: constants.BROTLI_MODE_TEXT,  // Optimize for text
  },
})

// Brotli-compressed files are ~15-20% smaller than gzip
// But brotli compression is ~10x slower than gzip — only for static assets

Feature Comparison

| Feature | fflate | pako | node:zlib |
| --- | --- | --- | --- |
| Bundle size | ~30KB | ~50KB | 0 (built-in) |
| Browser | ✅ | ✅ | ❌ |
| Node.js | ✅ | ✅ | ✅ Native C++ |
| Performance | ⚡ Fastest pure-JS | Moderate | ⚡ Fastest (native) |
| Gzip | ✅ | ✅ | ✅ |
| Deflate | ✅ | ✅ | ✅ |
| Brotli | ❌ | ❌ | ✅ |
| Zip files | ✅ | ❌ | ❌ |
| Streaming | ✅ | ✅ | ✅ |
| Promise API | ✅ (promisify) | ❌ (sync results) | ✅ (promisify) |
| TypeScript | ✅ built-in | ✅ @types | ✅ built-in |

When to Use Each

Choose fflate if:

  • Browser-side compression (upload compression, client-side zip creation)
  • Creating zip archives in Node.js (fast-glob → zip for exports)
  • Worker thread or edge runtime where native modules aren't available
  • You need the fastest pure-JS option for both browser and Node.js

Choose pako if:

  • Existing codebase already using it — upgrade to fflate when convenient
  • You need the specific zlib compatibility quirks pako provides
  • Simple inflate/deflate without zip file needs

Use Node.js zlib if:

  • Server-side Node.js — it's native, faster than any pure-JS library
  • HTTP response compression (gzip/brotli middleware)
  • File compression pipelines with streams
  • Brotli compression for static assets (only Node.js built-in has brotli)

Use CompressionStream if:

  • Modern browser (Chrome 80+, Firefox 113+) and no zip files needed
  • Node.js 22.4+ and you want browser-compatible APIs
  • Simple gzip/deflate without needing zip or brotli

Compression Levels and the Speed/Size Tradeoff

The compression level setting (0-9 for gzip/deflate, 0-11 for brotli) controls the tradeoff between CPU time and output size. Level 0 is no compression — data passes through as-is with just the format header. Level 6 is the default and a good general-purpose balance: it achieves 60-80% of maximum compression at 20-30% of maximum compression time. Level 9 (maximum) can take 5-10x longer than level 6 for a marginal 5-15% improvement in compressed size on typical data. For real-time use cases — compressing responses on the fly, compressing WebSocket messages, or compressing user uploads as they stream in — level 1 is often the right choice because the CPU time saved allows the server to handle more concurrent requests. For pre-compressed static assets (HTML, CSS, JavaScript bundles served with the Content-Encoding: br header), level 9 or 11 is justified since compression happens once at build time and the savings are served to every visitor. The brotli format is worth understanding here: brotli level 4 achieves comparable compression to gzip level 9 at similar speed, while brotli level 11 achieves 15-20% better compression than gzip level 9 at the cost of being significantly slower. For static asset pre-compression, brotli level 11 is the clear winner.
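The tradeoff is easy to measure directly with Node's built-in zlib. A rough sketch — exact sizes and timings will vary by machine and payload:

```typescript
import { gzipSync } from "zlib"

// Repetitive JSON — the kind of payload where level differences show up:
const payload = Buffer.from(
  JSON.stringify({ rows: Array.from({ length: 2000 }, () => ({ pkg: "react", score: 95 })) })
)

for (const level of [1, 6, 9]) {
  const start = process.hrtime.bigint()
  const out = gzipSync(payload, { level })
  const ms = Number(process.hrtime.bigint() - start) / 1e6
  console.log(`level ${level}: ${out.length} bytes, ${ms.toFixed(2)}ms`)
}
```

Run this against your own payloads before picking a level — the crossover point where higher levels stop paying off is workload-specific.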

HTTP Compression Middleware and Server Configuration

In Node.js web servers, you typically do not call the compression libraries directly — you use HTTP middleware that handles content negotiation, streaming compression, and the Content-Encoding and Accept-Encoding headers automatically. The compression npm package wraps Node.js's built-in zlib and integrates with Express and Koa. Fastify has a @fastify/compress plugin that supports gzip, deflate, and brotli via Node.js's built-in module. Hono (for edge runtimes) compresses responses through its compress middleware. For static file serving, pre-compressing assets at build time and serving the .gz or .br files is more efficient than dynamic compression because the CPU work happens once rather than on every request. Vite's vite-plugin-compression and webpack's CompressionPlugin automate this for frontend builds. The key HTTP detail is that servers should only compress responses when the client includes Accept-Encoding: gzip (or br, deflate) in the request — compressing for clients that cannot decompress is worse than no compression.
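A minimal sketch of that negotiation with node:http and built-in zlib — pickEncoding is a hypothetical helper, and real middleware additionally parses q-values and wildcards:

```typescript
import { createServer } from "http"
import { Readable } from "stream"
import { createGzip, createBrotliCompress } from "zlib"

// Hypothetical helper: naive substring check on Accept-Encoding.
function pickEncoding(acceptEncoding: string | undefined): "br" | "gzip" | null {
  const accepted = (acceptEncoding ?? "").toLowerCase()
  if (accepted.includes("br")) return "br"
  if (accepted.includes("gzip")) return "gzip"
  return null  // client offered no supported encoding — send identity
}

const server = createServer((req, res) => {
  const body = Readable.from([JSON.stringify({ hello: "world" })])
  const encoding = pickEncoding(req.headers["accept-encoding"])

  if (encoding === null) {
    res.writeHead(200, { "Content-Type": "application/json" })
    body.pipe(res)
    return
  }

  res.writeHead(200, {
    "Content-Type": "application/json",
    "Content-Encoding": encoding,
    "Vary": "Accept-Encoding",  // caches must key on the request header
  })
  body.pipe(encoding === "br" ? createBrotliCompress() : createGzip()).pipe(res)
})

// server.listen(3000)  // start listening when deployed
```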

Edge Runtime Compatibility

Edge runtimes (Cloudflare Workers, Vercel Edge, Deno Deploy) have restricted APIs that exclude Node.js built-ins like zlib. For compression in edge runtimes, the options are fflate (which is pure JavaScript and works anywhere) or the CompressionStream Web API (available in all modern edge runtimes). Cloudflare Workers have supported CompressionStream since 2023, Deno Deploy supports it, and the Vercel Edge runtime is based on the same Web API standards. This means that for edge-runtime code, fflate is the most portable library choice — it is a dependency-free pure-JavaScript implementation that runs without modification in any JavaScript environment. pako similarly runs in edge runtimes since it is pure JavaScript, but fflate is faster and smaller. The main scenario where you would reach for fflate over CompressionStream in an edge runtime is zip file creation — CompressionStream only handles raw gzip/deflate streams, not the zip container format with directory entries and metadata.

Decompression and Content-Encoding Handling

The decompression use case — reading compressed data from an external source — is as common as compression. When fetching from APIs that return Content-Encoding: gzip responses, the fetch() API in browsers and modern Node.js (via the Fetch global in Node 18+) automatically decompresses the response body. When using the node:http or node:https modules directly, automatic decompression does not happen, and you need to pipe the response through createGunzip() from zlib based on the Content-Encoding header. fflate and pako are useful for decompressing data in browser environments where the response is already fully received as a Uint8Array — for example, decompressing a downloaded binary file before processing it in a Web Worker. For Node.js backend code that receives compressed request bodies (compressed file uploads, compressed API payloads from IoT devices), the zlib.createGunzip() stream is the most efficient choice since it handles the data as it arrives without buffering the entire payload.
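When using node:http/https directly, a small helper can apply the right decompressor based on the Content-Encoding header. A sketch (readBody is a hypothetical name), simulated here with an in-memory stream instead of a live request:

```typescript
import { Readable } from "stream"
import { createGunzip, createInflate, createBrotliDecompress, gzipSync } from "zlib"

// Decode a response stream to text based on its Content-Encoding header:
async function readBody(stream: Readable, contentEncoding?: string): Promise<string> {
  const source =
    contentEncoding === "gzip" ? stream.pipe(createGunzip()) :
    contentEncoding === "deflate" ? stream.pipe(createInflate()) :
    contentEncoding === "br" ? stream.pipe(createBrotliDecompress()) :
    stream  // identity or unknown — pass through

  const chunks: Buffer[] = []
  for await (const chunk of source) {
    chunks.push(chunk as Buffer)
  }
  return Buffer.concat(chunks).toString("utf8")
}

// Simulated: a gzip-encoded body, as node:http would deliver it raw:
const encoded = gzipSync(Buffer.from('{"ok":true}'))
const text = await readBody(Readable.from([encoded]), "gzip")
console.log(text)  // → {"ok":true}
```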

Integration with Web Storage and Transfer APIs

Compression integrates with several browser and Node.js transfer primitives in ways that affect which library to use. For downloading large datasets from a browser application, creating a compressed zip archive with fflate and triggering a download via URL.createObjectURL(new Blob([compressed], { type: 'application/zip' })) is a zero-server pattern that processes data entirely client-side. For uploading compressed files from a browser to a Node.js server, fflate compresses in the browser, the compressed bytes are sent over HTTP with Content-Encoding: gzip, and Node.js's built-in zlib.createGunzip() decompresses on arrival. This pattern reduces upload bandwidth by 60-80% for text-heavy payloads like JSON exports. For server-to-server file transfer, Node.js's zlib streaming API with pipeline() handles compression without buffering the full file in memory, making it suitable for files larger than available RAM. The key architectural insight is that browser-side and server-side compression are separate concerns: browser code needs fflate or CompressionStream; server code should use Node.js's native zlib unless the code must run in both environments.

Methodology

Download data from npm registry (weekly average, February 2026). Performance benchmarks are approximate for 1MB text payloads. Feature comparison based on fflate v0.8.x, pako v2.x, and Node.js 22.x built-in zlib.

Compare compression and utility packages on PkgPulse →

See also: AVA vs Jest, pm2 vs node:cluster vs tsx watch, and better-sqlite3 vs libsql vs sql.js.
