nanotar vs tar-stream vs node-tar: Tar File Handling in Node.js (2026)

PkgPulse Team

TL;DR

nanotar is the UnJS minimal tar utility — creates and extracts tar archives in memory, zero dependencies, ~2KB, works in any JavaScript runtime. tar-stream is the streaming tar library — low-level pack/extract streams, memory efficient for large archives, used by many npm tools. node-tar (tar) is npm's tar implementation — full-featured, gzip/brotli support, file permissions, used by npm for package tarballs. In 2026: node-tar for full tar.gz operations, tar-stream for streaming pipelines, nanotar for simple in-memory tar.

Key Takeaways

  • nanotar: ~3M weekly downloads — UnJS, in-memory, ~2KB, zero deps, any runtime
  • tar-stream: ~15M weekly downloads — streaming pack/extract, low-level, memory efficient
  • node-tar: ~20M weekly downloads — npm's tar, gzip/brotli, permissions, full featured
  • node-tar handles .tar.gz natively — most npm packages are .tgz files
  • tar-stream is the most flexible for streaming pipelines
  • nanotar is the simplest for creating/reading tar in memory

nanotar

nanotar — minimal tar utility:

Create tar in memory

import { createTar } from "nanotar"

// Create a tar archive from files:
const tarData = createTar([
  { name: "package.json", data: JSON.stringify({ name: "my-pkg", version: "1.0.0" }) },
  { name: "src/index.ts", data: 'export const hello = "world"' },
  { name: "README.md", data: "# My Package" },
])

// tarData is a Uint8Array
// Write to file:
import { writeFileSync } from "node:fs"
writeFileSync("archive.tar", tarData)

Extract tar in memory

import { parseTar } from "nanotar"

// Parse a tar archive:
const files = parseTar(tarData)

for (const file of files) {
  console.log(file.name)  // "package.json", "src/index.ts", etc.
  console.log(file.data)  // Uint8Array of file contents
  console.log(new TextDecoder().decode(file.data))  // String content
}

With gzip (manual)

import { createTar, parseTar } from "nanotar"
import { gzipSync, gunzipSync } from "node:zlib"
import { readFileSync, writeFileSync } from "node:fs"

// Create .tar.gz:
const tar = createTar([
  { name: "data.json", data: JSON.stringify({ key: "value" }) },
])
const tgz = gzipSync(tar)
writeFileSync("archive.tar.gz", tgz)

// Extract .tar.gz:
const compressed = readFileSync("archive.tar.gz")
const decompressed = gunzipSync(compressed)
const files = parseTar(decompressed)

Why it's useful

nanotar:
  ✅ ~2KB — smallest tar implementation
  ✅ Zero dependencies
  ✅ Works in Node.js, Deno, Bun, browsers
  ✅ Simple API — createTar / parseTar
  ✅ In-memory — no filesystem required

  ❌ No streaming (entire archive in memory)
  ❌ No gzip built-in (use node:zlib)
  ❌ No file permissions / ownership
  ❌ No symlink support

Use for: small archives, config bundles, cross-runtime

tar-stream

tar-stream — streaming tar:

Pack (create tar)

import tar from "tar-stream"
import { createWriteStream } from "node:fs"
import { pipeline } from "node:stream/promises"

const pack = tar.pack()

// Add files:
pack.entry({ name: "package.json" }, JSON.stringify({ name: "my-pkg" }))
pack.entry({ name: "src/index.ts" }, 'export const hello = "world"')

// Add a file with metadata:
pack.entry({
  name: "bin/cli.js",
  mode: 0o755,     // Executable
  mtime: new Date(),
  uid: 1000,
  gid: 1000,
}, "#!/usr/bin/env node\nconsole.log('hello')")

pack.finalize()

// Pipe to file:
await pipeline(pack, createWriteStream("archive.tar"))

Extract (read tar)

import tar from "tar-stream"
import { createReadStream } from "node:fs"
import { pipeline } from "node:stream/promises"

const extract = tar.extract()

extract.on("entry", (header, stream, next) => {
  console.log(header.name)  // File name
  console.log(header.size)  // File size
  console.log(header.type)  // "file", "directory", "symlink"

  // Read file content:
  const chunks: Buffer[] = []
  stream.on("data", (chunk) => chunks.push(chunk))
  stream.on("end", () => {
    const content = Buffer.concat(chunks).toString()
    console.log(`${header.name}: ${content.slice(0, 50)}...`)
    next()  // Process next entry
  })
})

await pipeline(createReadStream("archive.tar"), extract)

With gzip (streaming)

import tar from "tar-stream"
import { createGzip, createGunzip } from "node:zlib"
import { createReadStream, createWriteStream } from "node:fs"
import { pipeline } from "node:stream/promises"

// Create .tar.gz with streaming:
const pack = tar.pack()
pack.entry({ name: "data.json" }, JSON.stringify({ key: "value" }))
pack.finalize()

await pipeline(pack, createGzip(), createWriteStream("archive.tar.gz"))

// Extract .tar.gz with streaming:
const extract = tar.extract()
extract.on("entry", (header, stream, next) => {
  // Process each file, then advance only once the entry is drained:
  stream.on("end", next)
  stream.resume()
})

await pipeline(
  createReadStream("archive.tar.gz"),
  createGunzip(),
  extract,
)

Dynamic tar creation

import tar from "tar-stream"

// Stream entries — useful for large archives:
const pack = tar.pack()

// Add entries from a database or API:
for await (const record of db.stream("SELECT * FROM files")) {
  pack.entry({ name: record.path, size: record.size }, record.content)
}

pack.finalize()

node-tar

node-tar — npm's tar implementation:

Create tar.gz

import * as tar from "tar"

// Create a .tar.gz from files:
await tar.create(
  {
    gzip: true,
    file: "archive.tar.gz",
  },
  ["src/", "package.json", "README.md"],
)

// Create with options:
await tar.create(
  {
    gzip: true,
    file: "dist.tar.gz",
    cwd: "./build",      // Base directory
    prefix: "my-pkg/",   // Add prefix to all paths
    portable: true,       // Portable (no uid/gid)
    filter: (path) => !path.includes("node_modules"),
  },
  ["."],
)

Extract tar.gz

import * as tar from "tar"

// Extract to directory:
await tar.extract({
  file: "archive.tar.gz",
  cwd: "./output",      // Extract to this directory
})

// Extract with options:
await tar.extract({
  file: "archive.tar.gz",
  cwd: "./output",
  strip: 1,            // Remove first path component
  filter: (path) => path.endsWith(".js") || path.endsWith(".ts"),
  newer: true,          // Only extract newer files
})

List contents

import * as tar from "tar"

// List files in a tar.gz:
await tar.list({
  file: "archive.tar.gz",
  onReadEntry: (entry) => {
    console.log(`${entry.path} (${entry.size} bytes)`)
  },
})

Streaming API

import * as tar from "tar"
import { createReadStream, createWriteStream } from "node:fs"

// Stream extract:
createReadStream("archive.tar.gz")
  .pipe(tar.extract({ cwd: "./output", strip: 1 }))

// Stream create:
tar.create({ gzip: true }, ["src/"])
  .pipe(createWriteStream("dist.tar.gz"))

How npm uses node-tar

// npm pack creates .tgz files using node-tar:
// npm publish sends the .tgz to the registry
// npm install extracts .tgz into node_modules

// Simplified npm pack:
await tar.create(
  {
    gzip: true,
    file: `${name}-${version}.tgz`,
    prefix: "package/",
    cwd: projectDir,
    portable: true,
  },
  files,  // Files listed in package.json "files" field
)

Feature Comparison

Feature             nanotar          tar-stream          node-tar
Create tar          ✅ (memory)      ✅ (stream)         ✅ (file/stream)
Extract tar         ✅ (memory)      ✅ (stream)         ✅ (file/stream)
Gzip support        ❌ (manual)      ❌ (manual)         ✅ (built-in)
Brotli support      ❌               ❌                  ✅
Streaming           ❌               ✅                  ✅
File permissions    ❌               ✅                  ✅
Symlinks            ❌               ✅                  ✅
List contents       ✅ (parseTar)    ✅ (entry events)   ✅ (tar.list)
Strip paths         ❌               ❌ (manual)         ✅ (strip)
Filter files        ❌ (manual)      ❌ (manual)         ✅ (filter)
Edge runtime        ✅               ❌                  ❌
Dependencies        0                0                   Few
Size                ~2KB             ~15KB               ~100KB
Weekly downloads    ~3M              ~15M                ~20M

When to Use Each

Use nanotar if:

  • Need simple in-memory tar creation/extraction
  • Want zero dependencies and tiny bundle
  • Building for edge runtimes (Workers, Deno)
  • Creating small config or data bundles

Use tar-stream if:

  • Building streaming tar pipelines
  • Need low-level control over tar entries
  • Processing large archives without loading into memory
  • Building custom archive tools

Use node-tar if:

  • Need full tar.gz/tar.br support out of the box
  • Working with npm package tarballs
  • Need file permissions, symlinks, and path stripping
  • Building build tools or package managers

Methodology

Download data from npm registry (weekly average, February 2026). Feature comparison based on nanotar v0.1.x, tar-stream v3.x, and tar (node-tar) v7.x.

Compare archive tools and developer utilities on PkgPulse →
