
PkgPulse Team

middy vs AWS Lambda Powertools vs serverless-http: Node.js Lambda Middleware in 2026

TL;DR

Node.js Lambda functions grow fast — and without middleware, every handler becomes a copy-paste graveyard of auth checks, error handling, and JSON parsing. In 2026, three packages dominate this space: middy (the most popular Lambda middleware framework, with Express-style middleware for AWS Lambda), @aws-lambda-powertools (AWS's official suite for observability: tracing, metrics, and structured logging), and serverless-http (runs an existing Node.js HTTP app such as Express or Koa on Lambda with zero rewrites). They solve different problems and are often used together: middy for cross-cutting concerns, Powertools for observability, serverless-http if you're migrating an existing HTTP framework.

Key Takeaways

  • middy is the Express-style middleware framework for raw Lambda handlers — wraps your handler with a pipeline of middleware for auth, validation, error handling, CORS, warm-up
  • @aws-lambda-powertools is AWS's official toolkit for production Lambda functions — structured logging (JSON), tracing (X-Ray), metrics (CloudWatch), idempotency, batch processing
  • serverless-http wraps Node.js HTTP frameworks (Express, Koa, Connect, and friends) to run on Lambda — no Lambda-specific rewrites needed; fetch-based frameworks like Hono ship their own Lambda adapters
  • Cold start impact: middy adds ~2-5ms; Powertools adds ~10-15ms; serverless-http adds ~5-10ms plus your framework's overhead
  • They compose: middy(handler).use(injectLambdaContext(logger)) — Powertools middleware integrates natively into middy's pipeline
  • Weekly downloads: middy ~800K; @aws-lambda-powertools/logger ~500K; serverless-http ~900K

The Lambda Middleware Problem

A raw AWS Lambda handler is just a function:

import type { APIGatewayProxyEventV2 } from 'aws-lambda'  // types from @types/aws-lambda

export const handler = async (event: APIGatewayProxyEventV2) => {
  // You're responsible for everything:
  // - Parsing the JSON body
  // - Validating the body schema
  // - Authenticating the request
  // - Handling errors consistently
  // - Returning the right status codes
  // - Logging with correlation IDs
  // - Tracing with X-Ray
  // - Returning CORS headers

  const body = JSON.parse(event.body ?? '{}')
  // ... 50 more lines of boilerplate
}

Without middleware, every handler duplicates this boilerplate. At 10 handlers it's manageable; at 50 it's a maintenance nightmare.
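To see what a middleware layer buys you, here is a hand-rolled sketch of the pattern middy generalizes (all names are illustrative, not part of any library): a wrapper owns JSON parsing and error shaping once, so each handler body is pure business logic.

```typescript
// A minimal hand-rolled wrapper: parse and format once, so each
// handler body contains only business logic.
type LambdaEvent = { body?: string | null }
type LambdaResult = { statusCode: number; body: string }

const withBoilerplate =
  (fn: (body: unknown) => Promise<unknown>) =>
  async (event: LambdaEvent): Promise<LambdaResult> => {
    try {
      const body = JSON.parse(event.body ?? '{}')  // parsing, once
      const result = await fn(body)                // business logic only
      return { statusCode: 200, body: JSON.stringify(result) }
    } catch {
      // consistent error shape, once
      return { statusCode: 500, body: JSON.stringify({ message: 'Internal Server Error' }) }
    }
  }

const handler = withBoilerplate(async (body) => ({ echoed: body }))
```

This works, but every new concern (auth, CORS, validation) means threading another layer through the wrapper by hand — which is exactly what a middleware pipeline formalizes.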


middy: The Lambda Middleware Framework

middy v5 is the most widely used Node.js Lambda middleware framework. It wraps your handler in a composable pipeline — conceptually identical to Express middleware, but purpose-built for Lambda's event/response model.

Core Architecture

import middy from '@middy/core'
import httpJsonBodyParser from '@middy/http-json-body-parser'
import httpErrorHandler from '@middy/http-error-handler'
import httpCors from '@middy/http-cors'
import validator from '@middy/validator'
import { transpileSchema } from '@middy/validator/transpile'

// Your actual business logic — no boilerplate
const baseHandler = async (event) => {
  const { name, email } = event.body  // Already parsed and validated
  const user = await createUser({ name, email })
  return { statusCode: 201, body: JSON.stringify(user) }
}

const schema = {
  type: 'object',
  properties: {
    body: {
      type: 'object',
      properties: {
        name: { type: 'string', minLength: 1 },
        email: { type: 'string', format: 'email' },
      },
      required: ['name', 'email'],
    },
  },
}

export const handler = middy(baseHandler)
  .use(httpJsonBodyParser())         // Parse JSON body
  .use(validator({ eventSchema: transpileSchema(schema) }))  // Validate
  .use(httpCors())                    // CORS headers on every response
  .use(httpErrorHandler())            // Consistent error responses

The Official Middleware Ecosystem

middy ships a rich ecosystem of official middleware packages:

import httpSecurityHeaders from '@middy/http-security-headers'
import inputOutputLogger from '@middy/input-output-logger'
import warmUp from '@middy/warmup'
import ssm from '@middy/ssm'
import secretsManager from '@middy/secrets-manager'
import rdsSigner from '@middy/rds-signer'

export const handler = middy(baseHandler)
  // Fetch SSM params at cold start, cache for 5 minutes
  .use(ssm({
    fetchData: {
      DB_URL: '/prod/myapp/database-url',
    },
    cacheExpiry: 5 * 60 * 1000,  // 5 minutes
    setToContext: true,
  }))
  // Fetch Secrets Manager secrets
  .use(secretsManager({
    fetchData: {
      stripeKey: 'prod/myapp/stripe-secret',
    },
    setToContext: true,
  }))
  // Respond immediately to keep-warm pings (prevents real handler logic on warm-up)
  .use(warmUp())
  // Security headers on all HTTP responses
  .use(httpSecurityHeaders())
  // Log all inputs and outputs (with redaction)
  .use(inputOutputLogger({
    logger: (message) => console.log(JSON.stringify(message)),
    omitPaths: ['event.body.password', 'event.headers.authorization'],
  }))
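The omitPaths redaction above can be understood as a dotted-path delete on a deep copy of the logged object. A simplified sketch of the idea (redact is an illustrative helper, not the actual @middy/input-output-logger implementation):

```typescript
// Simplified sketch of path-based redaction: delete dotted paths
// from a deep copy, so secrets never reach the log output while the
// original event stays untouched.
const redact = (obj: unknown, paths: string[]): unknown => {
  const copy = JSON.parse(JSON.stringify(obj))
  for (const path of paths) {
    const keys = path.split('.')
    let node: any = copy
    // walk to the parent of the final key
    for (const key of keys.slice(0, -1)) {
      if (node == null || typeof node !== 'object') { node = undefined; break }
      node = node[key]
    }
    if (node && typeof node === 'object') delete node[keys[keys.length - 1]]
  }
  return copy
}

const event = { body: { password: 'hunter2', name: 'Ada' }, headers: { authorization: 'Bearer x' } }
const safe = redact(event, ['body.password', 'headers.authorization']) as any
```

Working on a copy matters: the handler downstream still needs the original `authorization` header even though the log line must not contain it.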

Error Handling with createError

middy's error handling pattern uses @middy/util's createError to produce HTTP-compatible errors that httpErrorHandler converts to proper API responses:

import { createError } from '@middy/util'

const baseHandler = async (event) => {
  const userId = event.pathParameters?.id
  const user = await db.users.findById(userId)

  if (!user) {
    throw createError(404, 'User not found', {
      expose: true,  // Safe to show to clients
    })
  }

  if (!user.isActive) {
    throw createError(403, 'Account suspended', { expose: true })
  }

  return {
    statusCode: 200,
    body: JSON.stringify(user),
  }
}

export const handler = middy(baseHandler)
  .use(httpErrorHandler())
// Result: 404 → { statusCode: 404, body: '{"message":"User not found"}' }
// Result: 500 → { statusCode: 500, body: '{"message":"Internal Server Error"}' }
//              (unexposed errors become generic 500s)
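The contract behind httpErrorHandler is small: errors that carry a statusCode and are marked expose become client-visible responses; everything else collapses to a generic 500. A minimal sketch of that mapping (toResponse is an illustrative stand-in, not middy's actual code):

```typescript
// Sketch of the error-to-response contract: exposable HTTP errors keep
// their status and message; anything else becomes a generic 500 so
// internal details never leak to clients.
interface HttpError extends Error {
  statusCode?: number
  expose?: boolean
}

const toResponse = (err: HttpError) => {
  if (err.statusCode && err.expose) {
    return { statusCode: err.statusCode, body: JSON.stringify({ message: err.message }) }
  }
  return { statusCode: 500, body: JSON.stringify({ message: 'Internal Server Error' }) }
}

const notFound: HttpError = Object.assign(new Error('User not found'), {
  statusCode: 404,
  expose: true,
})
const dbCrash: HttpError = new Error('connection refused')  // no statusCode → 500
```

The `expose` flag is the safety valve: a thrown database error with a scary message still surfaces to clients as nothing more than "Internal Server Error".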

@aws-lambda-powertools: Production Observability

@aws-lambda-powertools is AWS's official TypeScript toolkit for production-grade Lambda functions. While middy handles request/response middleware, Powertools handles observability — structured logging, distributed tracing, and custom metrics.

Logger: Structured JSON Logging

Powertools Logger outputs structured JSON with Lambda context automatically injected:

import { Logger } from '@aws-lambda-powertools/logger'
import { injectLambdaContext } from '@aws-lambda-powertools/logger/middleware'
import middy from '@middy/core'

const logger = new Logger({
  serviceName: 'user-service',
  logLevel: 'INFO',
})

const baseHandler = async (event) => {
  logger.info('Processing user request', {
    userId: event.pathParameters.id,
    action: 'get-user',
  })

  const user = await getUser(event.pathParameters.id)

  logger.info('User fetched successfully', {
    userId: user.id,
    plan: user.plan,
  })

  return { statusCode: 200, body: JSON.stringify(user) }
}

export const handler = middy(baseHandler)
  .use(injectLambdaContext(logger, { clearState: true }))

The injectLambdaContext middleware enriches every log line with:

{
  "level": "INFO",
  "message": "Processing user request",
  "service": "user-service",
  "timestamp": "2026-03-09T10:00:00.000Z",
  "xray_trace_id": "1-abc123-def456",
  "cold_start": true,
  "function_name": "user-service-prod",
  "function_memory_size": "512",
  "function_arn": "arn:aws:lambda:...",
  "function_request_id": "uuid-here",
  "userId": "usr_abc123",
  "action": "get-user"
}

This makes CloudWatch Logs Insights queries trivial:

fields @timestamp, level, message, userId, cold_start
| filter level = "ERROR"
| stats count(*) by userId
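Those queries work because every field is a top-level JSON key. A toy sketch of what a structured logger does — merge static service context, per-invocation Lambda context, and call-site attributes into one object per line (makeLogger is illustrative, not the Powertools implementation):

```typescript
// Toy structured logger: three layers of context merged into a single
// JSON object per log line, so log queries can filter on any field.
type Attrs = Record<string, unknown>

const makeLogger = (service: Attrs, invocation: Attrs) => ({
  info: (message: string, attrs: Attrs = {}): string =>
    JSON.stringify({
      level: 'INFO',
      message,
      timestamp: new Date().toISOString(),
      ...service,      // constant for the service (set once)
      ...invocation,   // refreshed by middleware each invocation
      ...attrs,        // call-site specifics
    }),
})

const log = makeLogger(
  { service: 'user-service' },
  { cold_start: true, function_request_id: 'uuid-here' },
)
const line = JSON.parse(log.info('Processing user request', { userId: 'usr_abc123' }))
```

Contrast this with `console.log('processing user', userId)`: the information is the same, but free-text logs force regex parsing where structured logs allow direct field filters.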

Tracer: X-Ray Distributed Tracing

import { Tracer } from '@aws-lambda-powertools/tracer'
import { captureLambdaHandler } from '@aws-lambda-powertools/tracer/middleware'
import { DynamoDBClient } from '@aws-sdk/client-dynamodb'

const tracer = new Tracer({ serviceName: 'user-service' })

// Auto-patch AWS SDK v3 clients so every call appears as an X-Ray subsegment
const dynamodb = tracer.captureAWSv3Client(new DynamoDBClient({}))
// Outgoing HTTP(S) requests are captured automatically while tracing is active

const baseHandler = async (event) => {
  // Custom subsegments for business logic tracing
  const segment = tracer.getSegment()
  const subsegment = segment?.addNewSubsegment('## validateUser')

  try {
    const user = await db.users.findById(event.pathParameters.id)
    tracer.putAnnotation('userId', user.id)  // Searchable in X-Ray
    tracer.putMetadata('user', user)         // Non-searchable detail
    return { statusCode: 200, body: JSON.stringify(user) }
  } catch (err) {
    subsegment?.addError(err as Error)
    throw err
  } finally {
    subsegment?.close()
  }
}

export const handler = middy(baseHandler)
  .use(captureLambdaHandler(tracer))
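The open/record-error/always-close bookkeeping around subsegments is easy to get wrong, and it can be factored into a small helper. A sketch against a stub tracer (both withSubsegment and stubTracer are illustrative, not Powertools APIs):

```typescript
// Factor subsegment bookkeeping into one helper so business code
// can't forget to close a subsegment on either the success or
// the failure path.
interface Subsegment {
  name: string
  error?: Error
  closed: boolean
}

const opened: Subsegment[] = []  // records subsegments for inspection

const stubTracer = {
  addNewSubsegment(name: string): Subsegment {
    const sub: Subsegment = { name, closed: false }
    opened.push(sub)
    return sub
  },
}

const withSubsegment = async <T>(name: string, fn: () => Promise<T>): Promise<T> => {
  const sub = stubTracer.addNewSubsegment(name)
  try {
    return await fn()
  } catch (err) {
    sub.error = err as Error  // record the failure on the subsegment
    throw err                 // then rethrow for normal error handling
  } finally {
    sub.closed = true         // closed on success and failure alike
  }
}
```

With a real tracer you would call this as `withSubsegment('## validateUser', () => validateUser(event))`; the try/finally shape is the whole point.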

Metrics: CloudWatch Custom Metrics

import { Metrics, MetricUnit } from '@aws-lambda-powertools/metrics'
import { logMetrics } from '@aws-lambda-powertools/metrics/middleware'

const metrics = new Metrics({
  namespace: 'MyApp',
  serviceName: 'user-service',
})

const baseHandler = async (event) => {
  const startTime = Date.now()
  const user = await processUserRequest(event)

  metrics.addMetric('UserRequestProcessed', MetricUnit.Count, 1)
  metrics.addMetric('ProcessingTime', MetricUnit.Milliseconds, Date.now() - startTime)
  metrics.addDimension('plan', user.plan)  // Filter metrics by plan in CloudWatch

  return { statusCode: 200, body: JSON.stringify(user) }
}

export const handler = middy(baseHandler)
  .use(logMetrics(metrics, { captureColdStartMetric: true }))
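Under the hood, logMetrics doesn't call the CloudWatch API at all: it prints one JSON line in CloudWatch Embedded Metric Format (EMF), which CloudWatch parses into real metrics asynchronously. A simplified sketch of that line's shape (emfLine is illustrative; the real output carries more fields):

```typescript
// Sketch of a CloudWatch Embedded Metric Format (EMF) line: the _aws
// block declares the metrics, and the metric values and dimensions sit
// alongside it as plain top-level JSON keys.
const emfLine = (
  namespace: string,
  dimensions: Record<string, string>,
  metrics: Record<string, { unit: string; value: number }>,
): string => {
  const values: Record<string, number> = {}
  for (const [name, m] of Object.entries(metrics)) values[name] = m.value
  return JSON.stringify({
    _aws: {
      Timestamp: Date.now(),
      CloudWatchMetrics: [{
        Namespace: namespace,
        Dimensions: [Object.keys(dimensions)],
        Metrics: Object.entries(metrics).map(([Name, m]) => ({ Name, Unit: m.unit })),
      }],
    },
    ...dimensions,  // dimension values as top-level keys
    ...values,      // metric values as top-level keys
  })
}

const line = JSON.parse(emfLine(
  'MyApp',
  { service: 'user-service' },
  { UserRequestProcessed: { unit: 'Count', value: 1 } },
))
```

Because the flush is just a `console.log`, emitting metrics adds no API latency to the invocation — one reason this approach is preferred over calling `PutMetricData` from inside a handler.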

Idempotency: Preventing Duplicate Processing

Powertools' idempotency utility prevents duplicate Lambda executions — critical for payment processing, email sends, and any non-idempotent operation:

import { makeHandlerIdempotent } from '@aws-lambda-powertools/idempotency/middleware'
import { IdempotencyConfig } from '@aws-lambda-powertools/idempotency'
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb'

const persistenceStore = new DynamoDBPersistenceLayer({
  tableName: 'IdempotencyTable',
})

export const handler = middy(baseHandler)
  .use(makeHandlerIdempotent({
    persistenceStore,
    config: new IdempotencyConfig({
      expiresAfterSeconds: 3600,  // results cached for 1 hour
    }),
  }))
// If an event with the same payload arrives twice within 1 hour:
// → First call: runs the handler and caches the result
// → Second call: returns the cached result immediately, no handler execution

serverless-http: Run Your Existing Framework on Lambda

serverless-http wraps an existing Node.js HTTP framework so it can run on Lambda. Instead of rewriting your Express (or Koa, Connect, etc.) app as Lambda handlers, serverless-http translates between API Gateway events and the HTTP requests and responses your framework already understands:

import serverless from 'serverless-http'
import express from 'express'

const app = express()

// Your entire Express app — routes, middleware, error handlers
app.use(express.json())

// authenticate and validateBody(createUserSchema) are your app's own middleware (not shown)
app.get('/users/:id', authenticate, async (req, res) => {
  const user = await db.users.findById(req.params.id)
  if (!user) return res.status(404).json({ error: 'Not found' })
  res.json(user)
})

app.post('/users', authenticate, validateBody(createUserSchema), async (req, res) => {
  const user = await db.users.create(req.body)
  res.status(201).json(user)
})

app.use((err, req, res, next) => {
  console.error(err)
  res.status(err.status ?? 500).json({ error: err.message })
})

// Wrap the Express app — returns a Lambda handler
export const handler = serverless(app)
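serverless-http's whole job is translation. A simplified sketch of one direction — turning an API Gateway HTTP API (v2) event into the request shape a Node framework expects (eventToRequest is illustrative; the real library also handles multi-value headers, cookies, and the reverse response translation):

```typescript
// Sketch: translate an API Gateway HTTP API (v2) event into the
// method/url/headers/body shape a Node HTTP framework routes on.
interface HttpApiEvent {
  rawPath: string
  rawQueryString: string
  headers: Record<string, string>
  body?: string
  isBase64Encoded: boolean
  requestContext: { http: { method: string } }
}

const eventToRequest = (event: HttpApiEvent) => ({
  method: event.requestContext.http.method,
  url: event.rawPath + (event.rawQueryString ? `?${event.rawQueryString}` : ''),
  headers: event.headers,
  // binary payloads arrive base64-encoded and must be decoded first
  body: event.isBase64Encoded
    ? Buffer.from(event.body ?? '', 'base64').toString('utf8')
    : event.body ?? '',
})

const req = eventToRequest({
  rawPath: '/users/42',
  rawQueryString: 'expand=plan',
  headers: { 'content-type': 'application/json' },
  body: '',
  isBase64Encoded: false,
  requestContext: { http: { method: 'GET' } },
})
```

Once the event looks like an ordinary request, your Express routes, middleware, and error handlers run exactly as they do locally.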

Hono on Lambda

Hono is built on the fetch API rather than Node's req/res model, so instead of serverless-http it ships its own first-party Lambda adapter, hono/aws-lambda. Its edge-native design (small bundle, minimal startup work) makes it a strong fit for Lambda cold starts:

import { Hono } from 'hono'
import { handle } from 'hono/aws-lambda'
import { zValidator } from '@hono/zod-validator'
import { z } from 'zod'

const app = new Hono()

app.get('/users/:id', async (c) => {
  const user = await db.users.findById(c.req.param('id'))
  if (!user) return c.json({ error: 'Not found' }, 404)
  return c.json(user)
})

app.post(
  '/users',
  zValidator('json', z.object({
    name: z.string().min(1),
    email: z.string().email(),
  })),
  async (c) => {
    const data = c.req.valid('json')
    const user = await db.users.create(data)
    return c.json(user, 201)
  }
)

export const handler = handle(app)

serverless-http Configuration

export const handler = serverless(app, {
  // Map API Gateway binary media types to buffers
  binary: ['image/*', 'application/pdf'],

  // Custom request/response transformations
  request(request, event, context) {
    request.lambdaEvent = event
    request.lambdaContext = context
  },

  response(response, event, context) {
    response.headers['X-Lambda-Request-Id'] = context.awsRequestId
  },
})
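The binary option exists because API Gateway transports binary payloads as base64 text. A small sketch of the response side of that contract (toBinaryResponse is illustrative):

```typescript
// Sketch: a binary Lambda response must be base64-encoded and flagged
// with isBase64Encoded so API Gateway decodes it before sending bytes
// to the client.
const toBinaryResponse = (buf: Buffer, contentType: string) => ({
  statusCode: 200,
  headers: { 'content-type': contentType },
  body: buf.toString('base64'),
  isBase64Encoded: true,  // tells API Gateway to decode before responding
})

const pdf = toBinaryResponse(Buffer.from('%PDF-1.7'), 'application/pdf')
```

Forgetting this encoding is a classic serverless bug: the client receives base64 text (or corrupted bytes) instead of the file, which is why serverless-http lets you declare binary media types up front.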

Performance Comparison: Cold Start Impact

Cold start performance matters for API latency, especially on new deployments:

| Package | Added cold start time | Notes |
| --- | --- | --- |
| middy (core only) | ~2ms | Negligible |
| middy + 5 middleware | ~8-15ms | Middleware initialization cost |
| @aws-lambda-powertools (all 3) | ~10-20ms | X-Ray patching is the heavy part |
| serverless-http + Express | ~40-80ms | Express initialization |
| Hono (via hono/aws-lambda) | ~8-15ms | Hono's minimal startup |
| serverless-http + Fastify | ~25-40ms | Fastify plugin loading |

Tip for trimming tracing overhead: set the POWERTOOLS_TRACER_CAPTURE_RESPONSE=false and POWERTOOLS_TRACER_CAPTURE_ERROR=false environment variables to stop the Tracer from attaching response and error payloads to traces. This reduces per-invocation overhead and trace storage rather than cold start time, which is dominated by module initialization.


Combining All Three: The Production Pattern

In production, using all three packages together is common and they compose cleanly:

import middy from '@middy/core'
import httpJsonBodyParser from '@middy/http-json-body-parser'
import httpErrorHandler from '@middy/http-error-handler'
import httpCors from '@middy/http-cors'
import ssm from '@middy/ssm'
import { Logger } from '@aws-lambda-powertools/logger'
import { Tracer } from '@aws-lambda-powertools/tracer'
import { Metrics, MetricUnit } from '@aws-lambda-powertools/metrics'
import { injectLambdaContext } from '@aws-lambda-powertools/logger/middleware'
import { captureLambdaHandler } from '@aws-lambda-powertools/tracer/middleware'
import { logMetrics } from '@aws-lambda-powertools/metrics/middleware'

const logger = new Logger({ serviceName: 'order-service' })
const tracer = new Tracer({ serviceName: 'order-service' })
const metrics = new Metrics({ namespace: 'Ecommerce', serviceName: 'order-service' })

const baseHandler = async (event, context) => {
  const { productId, quantity } = event.body
  const userId = context.user.id  // assumes an auth middleware (not shown) attached the user

  logger.info('Creating order', { userId, productId, quantity })

  const order = await orderService.create({ userId, productId, quantity })

  metrics.addMetric('OrderCreated', MetricUnit.Count, 1)
  metrics.addMetric('OrderValue', MetricUnit.None, order.total)

  return {
    statusCode: 201,
    body: JSON.stringify(order),
  }
}

export const handler = middy(baseHandler)
  // Powertools observability (order matters: logger first for context)
  .use(injectLambdaContext(logger, { clearState: true }))
  .use(captureLambdaHandler(tracer))
  .use(logMetrics(metrics, { captureColdStartMetric: true }))
  // Request processing
  .use(httpJsonBodyParser())
  .use(httpCors({ origin: 'https://myapp.com' }))
  // Fetch secrets at cold start
  .use(ssm({
    fetchData: { STRIPE_KEY: '/prod/stripe-secret' },
    cacheExpiry: 15 * 60 * 1000,
    setToContext: true,
  }))
  // Error handling (always last)
  .use(httpErrorHandler({ fallbackMessage: 'An error occurred' }))
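The ordering comments above follow from middy's onion model: before hooks run in the order middleware are attached, after hooks run in reverse, and onError handlers also run in reverse attach order, which is why httpErrorHandler is attached last. A toy sketch of the before/after ordering (not middy's actual implementation):

```typescript
// Toy sketch of onion-shaped middleware ordering: `before` hooks run
// in attach order, `after` hooks run in reverse attach order.
type Hook = () => void
interface Middleware {
  name: string
  before?: Hook
  after?: Hook
}

const calls: string[] = []  // records execution order

const mw = (name: string): Middleware => ({
  name,
  before: () => calls.push(`${name}:before`),
  after: () => calls.push(`${name}:after`),
})

const run = (middlewares: Middleware[], handler: () => void): void => {
  for (const m of middlewares) m.before?.()              // outermost first
  handler()
  for (const m of [...middlewares].reverse()) m.after?.()  // unwind in reverse
}

run([mw('logger'), mw('parser'), mw('errors')], () => calls.push('handler'))
```

So the first middleware attached sees the rawest event on the way in and the final response on the way out, which is why the logger sits first and the error handler last.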

When to Use Each

Use middy when:

  • You're building raw Lambda handlers (not migrating an existing Express app)
  • You need a composable middleware pipeline for validation, auth, error handling
  • You want the official @middy/* packages for SSM, Secrets Manager, and other integrations
  • Your team thinks in middleware terms (similar to Express patterns)

Use @aws-lambda-powertools when:

  • You need structured logging that plays well with CloudWatch Logs Insights
  • You want X-Ray tracing without manual segment management
  • You need idempotency for payment processing or email sends
  • You're processing SQS/SNS/Kinesis batches and want partial batch failure handling

Use serverless-http when:

  • You have an existing Express/Fastify/Hono app you want to deploy to Lambda
  • Your team prefers standard HTTP framework patterns over Lambda-specific code
  • You're lifting-and-shifting an existing application to serverless
  • You want to run the same app locally (via Express) and on Lambda (via serverless-http)

Methodology

  • Download data from npmjs.com API, March 2026 weekly averages
  • Cold start measurements from AWS Lambda documentation and community benchmarks (us-east-1, Node.js 22 runtime, 512MB)
  • Versions: middy v5.x, @aws-lambda-powertools v2.x, serverless-http v3.x
  • Sources: middy.js.org, docs.powertools.aws.dev, AWS Lambda documentation

Compare Lambda and serverless packages on PkgPulse — health scores, download trends, and dependency analysis.

Related: Best Node.js Background Job Libraries 2026 · Hono vs Fastify vs Express API Framework 2026 · OpenTelemetry API Observability 2026
