fflate vs pako vs Node.js zlib: Compression Libraries in JavaScript (2026)
TL;DR
fflate is the fastest pure-JavaScript compression library — it handles gzip, deflate, zlib, and zip files, runs in browsers and Node.js, and weighs in around 30KB. pako is the classic choice — a faithful port of zlib with a stable, well-known API — but fflate is now faster and smaller. Node.js built-in zlib is the best option for server-side compression in Node.js: zero bundle cost, native C++ bindings, streaming support, and callback APIs that promisify cleanly. For browser-side compression or zip file creation: fflate. For Node.js server streaming: node:zlib. For compatibility with existing zlib-based code: pako.
Key Takeaways
- fflate: ~6M weekly downloads — fastest pure-JS, 30KB, browser + Node.js, zip support
- pako: ~30M weekly downloads — stable zlib port, wider legacy compatibility
- node:zlib: built-in — native C++, streaming, best for Node.js server-side
- fflate is 2-4x faster than pako for most workloads
- Node.js 22.4+ includes the CompressionStream API (browser-compatible interface)
- For HTTP response compression in Node.js, use framework middleware — don't compress manually
Download Trends
| Package | Weekly Downloads | Bundle Size | Browser | Streaming | Zip Files |
|---|---|---|---|---|---|
| fflate | ~6M | ~30KB | ✅ | ✅ | ✅ |
| pako | ~30M | ~50KB | ✅ | ✅ | ❌ |
| node:zlib | built-in | 0KB | ❌ | ✅ | ❌ |
fflate
fflate — fastest pure-JavaScript compression:
Basic compression/decompression
import { gzip, gunzip, deflate, inflate } from "fflate"
import { promisify } from "node:util"
// fflate's async functions take Node-style (err, data) callbacks, so promisify works (Node only):
const gzipAsync = promisify(gzip)
const gunzipAsync = promisify(gunzip)
// Compress:
const data = new TextEncoder().encode("Hello, PkgPulse! ".repeat(1000))
const compressed = await gzipAsync(data, { level: 6 })
console.log(`Original: ${data.length} bytes`)
console.log(`Compressed: ${compressed.length} bytes`)
console.log(`Ratio: ${((1 - compressed.length / data.length) * 100).toFixed(1)}%`)
// Decompress:
const decompressed = await gunzipAsync(compressed)
const text = new TextDecoder().decode(decompressed)
Synchronous API
import { gzipSync, gunzipSync, deflateSync, inflateSync } from "fflate"
// Synchronous — blocks but faster for small data:
const data = new TextEncoder().encode("Compress me")
const compressed = gzipSync(data, { level: 9 }) // Max compression
const decompressed = gunzipSync(compressed)
// Levels: 0 (no compression) to 9 (max), default 6
// For speed over compression ratio:
const fast = gzipSync(data, { level: 1 })
Zip file creation
import { zip, unzip, strToU8, strFromU8 } from "fflate"
import { promisify } from "node:util"
const zipAsync = promisify(zip)
const unzipAsync = promisify(unzip)
// Create a zip archive:
const files = {
"README.md": strToU8("# PkgPulse Export\n\nPackage health data."),
"data/react.json": strToU8(JSON.stringify({ name: "react", score: 95 })),
"data/vue.json": strToU8(JSON.stringify({ name: "vue", score: 91 })),
"report.csv": strToU8("package,score\nreact,95\nvue,91"),
}
const zipped = await zipAsync(files)
// zipped is Uint8Array — serve as download or save to disk
// Read a zip archive:
const entries = await unzipAsync(zipped)
for (const [path, data] of Object.entries(entries)) {
console.log(path, strFromU8(data))
}
Streaming compression (browser)
import { Gzip } from "fflate"
// Stream API — useful for large files:
const gzip = new Gzip({ level: 6 })
const chunks: Uint8Array[] = []
gzip.ondata = (chunk, final) => {
chunks.push(chunk)
if (final) {
const compressed = new Uint8Array(chunks.reduce((a, b) => a + b.length, 0))
let offset = 0
for (const chunk of chunks) {
compressed.set(chunk, offset)
offset += chunk.length
}
console.log("Compression complete:", compressed.length, "bytes")
}
}
// Push data in chunks:
gzip.push(new TextEncoder().encode("First chunk "))
gzip.push(new TextEncoder().encode("Second chunk "), true) // true = last chunk
CompressionStream API (modern browsers + Node.js 22.4+)
// Native Web API — no library needed in modern environments:
async function compressWithStream(data: Uint8Array): Promise<Uint8Array> {
const stream = new CompressionStream("gzip")
const writer = stream.writable.getWriter()
const reader = stream.readable.getReader()
writer.write(data) // not awaited: waiting before reading could stall on large inputs
writer.close()
const chunks: Uint8Array[] = []
while (true) {
const { done, value } = await reader.read()
if (done) break
chunks.push(value)
}
// Concatenate chunks without O(n²) array spreads:
const total = chunks.reduce((sum, chunk) => sum + chunk.length, 0)
const result = new Uint8Array(total)
let offset = 0
for (const chunk of chunks) {
result.set(chunk, offset)
offset += chunk.length
}
return result
}
pako
pako — the classic zlib port:
Basic usage
import pako from "pako"
// Deflate (compress):
const input = "Hello, World!".repeat(100)
const compressed = pako.deflate(input) // Uint8Array
const compressedGzip = pako.gzip(input) // With gzip wrapper
// Inflate (decompress):
const decompressed = pako.inflate(compressed)
const text = new TextDecoder().decode(decompressed)
// String output: pako 2.x removed binary strings, so deflate always returns a
// Uint8Array; inflate can still decode UTF-8 directly with { to: "string" }:
const textDecompressed = pako.inflate(compressed, { to: "string" })
Streaming
import pako from "pako"
// Streaming deflate:
const deflator = new pako.Deflate({ level: 6 })
deflator.push("First chunk", false)
deflator.push(" Second chunk", false)
deflator.push(" Last chunk", true) // true = flush/finish
if (deflator.err) {
throw new Error(`Compression error: ${deflator.msg}`)
}
const compressed = deflator.result // Uint8Array
// Streaming inflate:
const inflator = new pako.Inflate()
inflator.push(compressed, true)
const decompressed = new TextDecoder().decode(inflator.result)
Why fflate has replaced pako
// Benchmarks (rough, 1MB typical payload):
// fflate gzip: ~8ms
// pako gzip: ~25ms (3x slower)
// fflate bundle: 30KB
// pako bundle: 50KB
// For new projects, fflate is almost always the better choice:
// - Faster
// - Smaller bundle
// - Zip support (pako doesn't do zip)
// - Better TypeScript types
// Only choose pako for:
// - Legacy code already using it
// - Very specific zlib compatibility needs
// - A preference for its simpler streaming interface
Node.js Built-in zlib
For server-side Node.js, the built-in zlib module is the best choice:
Promisified API (Node.js 16+)
import { gzip, gunzip, deflate, inflate, brotliCompress, brotliDecompress } from "node:zlib"
import { promisify } from "node:util"
const gzipAsync = promisify(gzip)
const gunzipAsync = promisify(gunzip)
const brotliCompressAsync = promisify(brotliCompress)
// Gzip:
const data = Buffer.from("Hello, World!".repeat(1000))
const compressed = await gzipAsync(data)
const decompressed = await gunzipAsync(compressed)
// Brotli (better compression than gzip, slower):
const brotli = await brotliCompressAsync(data)
console.log(`Gzip: ${compressed.length} bytes`)
console.log(`Brotli: ${brotli.length} bytes`) // ~20% smaller than gzip
// Note: unlike fs or stream, zlib has no built-in promise submodule;
// promisify (above) is the standard approach.
Streaming in HTTP (most common use case)
import { createGzip, createBrotliCompress, createGunzip } from "node:zlib"
import { createReadStream, createWriteStream } from "node:fs"
import { pipeline } from "node:stream/promises"
// Compress a file:
async function compressFile(input: string, output: string) {
const readStream = createReadStream(input)
const gzip = createGzip({ level: 6 })
const writeStream = createWriteStream(output)
await pipeline(readStream, gzip, writeStream)
console.log(`Compressed ${input} → ${output}`)
}
// HTTP response compression (Express):
import express from "express"
import compression from "compression"
const app = express()
app.use(compression({ level: 6 })) // Uses zlib internally
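For context, here is roughly what that middleware does under the hood, sketched with plain node:http (simplified: no Vary header, no size threshold, no brotli negotiation). Prefer the middleware in real apps:

```typescript
import { createServer } from "node:http"
import { createGzip } from "node:zlib"

// Minimal manual content negotiation: gzip only when the client accepts it.
const server = createServer((req, res) => {
  const body = "Hello, PkgPulse! ".repeat(1000)
  const acceptsGzip = /\bgzip\b/.test(String(req.headers["accept-encoding"] ?? ""))
  if (acceptsGzip) {
    res.writeHead(200, { "Content-Type": "text/plain", "Content-Encoding": "gzip" })
    const gzip = createGzip({ level: 6 })
    gzip.pipe(res) // stream compressed bytes to the client
    gzip.end(body)
  } else {
    res.writeHead(200, { "Content-Type": "text/plain" })
    res.end(body)
  }
})
```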
Decompress incoming request
import { createGunzip } from "node:zlib"
import { pipeline } from "node:stream/promises"
// Decompress gzip-encoded request body:
async function readGzipBody(req: NodeJS.ReadableStream): Promise<string> {
const chunks: Buffer[] = []
const gunzip = createGunzip()
await pipeline(
req,
gunzip,
async function (source) {
for await (const chunk of source) {
chunks.push(chunk as Buffer)
}
}
)
return Buffer.concat(chunks).toString("utf8")
}
Brotli (smaller, for static assets)
import { brotliCompressSync, brotliDecompressSync, constants } from "node:zlib"
// Best for compressing static files (HTML, CSS, JS):
const html = Buffer.from("<html>...</html>")
const brotli = brotliCompressSync(html, {
params: {
[constants.BROTLI_PARAM_QUALITY]: 11, // Max quality (0-11)
[constants.BROTLI_PARAM_LGWIN]: 22, // Window size
[constants.BROTLI_PARAM_MODE]: constants.BROTLI_MODE_TEXT, // Optimize for text
},
})
// Brotli-compressed files are ~15-20% smaller than gzip
// But brotli compression is ~10x slower than gzip — only for static assets
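At build time this usually takes the form of writing .gz and .br siblings next to each asset, so the server can pick a precompressed file per request. A sketch (the helper name and level choices are mine):

```typescript
import { gzipSync, brotliCompressSync, constants } from "node:zlib"
import { readFileSync, writeFileSync } from "node:fs"

// Write asset.gz and asset.br next to the original file.
// Max settings are fine here: this runs once at build time, not per request.
function precompressAsset(path: string): void {
  const source = readFileSync(path)
  writeFileSync(`${path}.gz`, gzipSync(source, { level: 9 }))
  writeFileSync(`${path}.br`, brotliCompressSync(source, {
    params: {
      [constants.BROTLI_PARAM_QUALITY]: 11,
      [constants.BROTLI_PARAM_MODE]: constants.BROTLI_MODE_TEXT,
    },
  }))
}
```

A server can then honor Accept-Encoding by serving the `.br` or `.gz` sibling when present, with the matching Content-Encoding header.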
Feature Comparison
| Feature | fflate | pako | node:zlib |
|---|---|---|---|
| Bundle size | ~30KB | ~50KB | 0 (built-in) |
| Browser | ✅ | ✅ | ❌ |
| Node.js | ✅ | ✅ | ✅ Native C++ |
| Performance | ⚡ Fastest pure-JS | Moderate | ⚡ Fastest (native) |
| Gzip | ✅ | ✅ | ✅ |
| Deflate | ✅ | ✅ | ✅ |
| Brotli | ❌ | ❌ | ✅ |
| Zip files | ✅ | ❌ | ❌ |
| Streaming | ✅ | ✅ | ✅ |
| Promise API | via promisify | ❌ | via promisify |
| TypeScript | ✅ | ✅ @types | ✅ |
When to Use Each
Choose fflate if:
- Browser-side compression (upload compression, client-side zip creation)
- Creating zip archives in Node.js (fast-glob → zip for exports)
- Worker thread or edge runtime where native modules aren't available
- You need the fastest pure-JS option for both browser and Node.js
Choose pako if:
- Existing codebase already using it — upgrade to fflate when convenient
- You need the specific zlib compatibility quirks pako provides
- Simple inflate/deflate without zip file needs
Use Node.js zlib if:
- Server-side Node.js — it's native, faster than any pure-JS library
- HTTP response compression (gzip/brotli middleware)
- File compression pipelines with streams
- Brotli compression for static assets (only Node.js built-in has brotli)
Use CompressionStream if:
- Modern browser (Chrome 80+, Firefox 113+) and no zip files needed
- Node.js 22.4+ and you want browser-compatible APIs
- Simple gzip/deflate without needing zip or brotli
Methodology
Download data from npm registry (weekly average, February 2026). Performance benchmarks are approximate for 1MB text payloads. Feature comparison based on fflate v0.8.x, pako v2.x, and Node.js 22.x built-in zlib.