TL;DR
n8n is the workflow automation platform — visual builder, 400+ integrations, code nodes (JavaScript/Python), webhooks, self-hosted or cloud, and the most popular open-source Zapier alternative. Automatisch is the simpler open-source alternative — basic trigger-action flows, self-hosted, privacy-focused, with a growing integration list. Node-RED is the flow-based programming tool — IoT-focused, visual wiring, MQTT/HTTP/WebSocket nodes, function nodes, a dashboard, and runs anywhere from a Raspberry Pi to the cloud. In 2026: n8n for business workflow automation, Automatisch for simple Zapier-like flows, Node-RED for IoT and data pipelines.
Key Takeaways
- n8n: 50K+ GitHub stars — 400+ integrations, code nodes, AI workflows
- Automatisch: 6K+ GitHub stars — Zapier alternative, simple flows, privacy-first
- Node-RED: 20K+ GitHub stars — IoT, flow programming, MQTT, dashboard
- n8n has the largest integration ecosystem and most advanced workflow features
- Automatisch provides the simplest Zapier-like experience for self-hosting
- Node-RED excels at IoT, hardware, and real-time data processing
n8n
n8n — workflow automation platform:
Installation
# Docker:
docker run -d \
  --name n8n \
  -p 5678:5678 \
  -v n8n-data:/home/node/.n8n \
  n8nio/n8n
# Docker Compose (production):
# docker-compose.yml
version: "3.8"
services:
  n8n:
    image: n8nio/n8n:latest
    ports:
      - "5678:5678"
    environment:
      - N8N_HOST=n8n.example.com
      - N8N_PORT=5678
      - N8N_PROTOCOL=https
      - WEBHOOK_URL=https://n8n.example.com/
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres
      - DB_POSTGRESDB_PORT=5432
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=n8n
      - DB_POSTGRESDB_PASSWORD=secret
      - N8N_ENCRYPTION_KEY=your-encryption-key
    volumes:
      - n8n-data:/home/node/.n8n
    depends_on:
      - postgres
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: n8n
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: n8n
    volumes:
      - postgres-data:/var/lib/postgresql/data
volumes:
  n8n-data:
  postgres-data:
Workflow via API
// n8n REST API — create and manage workflows:
const N8N_URL = "https://n8n.example.com"
const N8N_API_KEY = process.env.N8N_API_KEY!
const headers = {
  "X-N8N-API-KEY": N8N_API_KEY,
  "Content-Type": "application/json",
}
// Create workflow:
const workflow = await fetch(`${N8N_URL}/api/v1/workflows`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    name: "Package Download Monitor",
    settings: {},
    nodes: [
      {
        parameters: {
          rule: { interval: [{ field: "hours", hoursInterval: 6 }] },
        },
        name: "Schedule Trigger",
        type: "n8n-nodes-base.scheduleTrigger",
        position: [250, 300],
      },
      {
        parameters: {
          url: "https://api.npmjs.org/downloads/point/last-week/react",
          method: "GET",
        },
        name: "Fetch Downloads",
        type: "n8n-nodes-base.httpRequest",
        position: [450, 300],
      },
      {
        parameters: {
          conditions: {
            number: [{
              value1: "={{ $json.downloads }}",
              operation: "larger",
              value2: 25000000,
            }],
          },
        },
        name: "Check Threshold",
        type: "n8n-nodes-base.if",
        position: [650, 300],
      },
      {
        parameters: {
          channel: "#alerts",
          text: "=🔥 React downloads exceeded 25M: {{ $json.downloads }}",
        },
        name: "Slack Alert",
        type: "n8n-nodes-base.slack",
        position: [850, 200],
      },
    ],
    connections: {
      "Schedule Trigger": {
        main: [[{ node: "Fetch Downloads", type: "main", index: 0 }]],
      },
      "Fetch Downloads": {
        main: [[{ node: "Check Threshold", type: "main", index: 0 }]],
      },
      "Check Threshold": {
        main: [
          [{ node: "Slack Alert", type: "main", index: 0 }],
          [],
        ],
      },
    },
  }),
}).then((r) => r.json())
// Activate the workflow (the public API treats `active` as read-only on
// create, and has no manual-execute endpoint — activated workflows run
// on their trigger, or can be invoked via a webhook trigger):
await fetch(`${N8N_URL}/api/v1/workflows/${workflow.id}/activate`, {
  method: "POST",
  headers,
})
// List executions:
const executions = await fetch(
  `${N8N_URL}/api/v1/executions?workflowId=${workflow.id}&limit=10`,
  { headers }
).then((r) => r.json())
Code node (JavaScript)
// n8n Code Node — runs JavaScript/Python:
// Process incoming data:
const items = $input.all()
const processed = items.map((item) => {
  const { name, downloads, version } = item.json
  return {
    json: {
      name,
      downloads,
      version,
      downloads_formatted: new Intl.NumberFormat().format(downloads),
      trend: downloads > 1000000 ? "popular" : "growing",
      updated_at: new Date().toISOString(),
    },
  }
})
return processed
// Access environment variables:
// const apiKey = $env.MY_API_KEY
// Access previous node data:
// const webhookData = $('Webhook').first().json
// HTTP request in code (via the built-in request helper):
// const response = await this.helpers.httpRequest({
//   method: 'GET',
//   url: 'https://api.example.com/data',
//   headers: { Authorization: 'Bearer token' },
// })
Webhook workflow
// n8n Webhook — receive external events:
// Webhook node listens at:
// POST https://n8n.example.com/webhook/package-updates
// Example: GitHub webhook → process → notify:
// 1. Webhook Trigger (receives GitHub push event)
// 2. Code Node (extract package info)
// 3. HTTP Request (fetch npm stats)
// 4. IF Node (check if downloads changed significantly)
// 5. Slack/Email notification
// Call n8n webhook from your app:
await fetch("https://n8n.example.com/webhook/package-updates", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    package: "react",
    event: "version_released",
    version: "19.1.0",
    timestamp: new Date().toISOString(),
  }),
})
Automatisch
Automatisch — open-source Zapier alternative:
Installation
# Docker Compose:
git clone https://github.com/automatisch/automatisch.git
cd automatisch
cp .env.example .env
docker compose up -d
# docker-compose.yml
version: "3.8"
services:
  automatisch-main:
    image: automatischio/automatisch:latest
    ports:
      - "3000:3000"
    environment:
      - APP_ENV=production
      - APP_SECRET_KEY=your-secret-key
      - POSTGRES_HOST=postgres
      - POSTGRES_PORT=5432
      - POSTGRES_DATABASE=automatisch
      - POSTGRES_USERNAME=automatisch
      - POSTGRES_PASSWORD=secret
      - REDIS_HOST=redis
      - ENCRYPTION_KEY=your-encryption-key
    depends_on:
      - postgres
      - redis
  automatisch-worker:
    image: automatischio/automatisch:latest
    command: npm run worker
    environment:
      - APP_ENV=production
      - APP_SECRET_KEY=your-secret-key
      - POSTGRES_HOST=postgres
      - POSTGRES_DATABASE=automatisch
      - POSTGRES_USERNAME=automatisch
      - POSTGRES_PASSWORD=secret
      - REDIS_HOST=redis
      - ENCRYPTION_KEY=your-encryption-key
    depends_on:
      - postgres
      - redis
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: automatisch
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: automatisch
    volumes:
      - postgres-data:/var/lib/postgresql/data
  redis:
    image: redis:7-alpine
    volumes:
      - redis-data:/data
volumes:
  postgres-data:
  redis-data:
Flow configuration
// Automatisch flows follow a trigger → action pattern:
// Example flow: New GitHub issue → Slack notification
// Configuration via the web UI:
// Trigger: GitHub — New Issue
// Config:
// Repository: myorg/my-project
// Event: issues.opened
// Action: Slack — Send Message
// Config:
// Channel: #github-issues
// Message: "New issue: {{trigger.issue.title}}\n{{trigger.issue.html_url}}"
// Example flow: Webhook → Database → Email
// Trigger: Webhook — catch incoming data
// Action 1: PostgreSQL — Insert row
// Table: events
// Columns: { type: "{{trigger.body.type}}", data: "{{trigger.body}}" }
// Action 2: Email — Send notification
// To: team@example.com
// Subject: "New event: {{trigger.body.type}}"
API
// Automatisch GraphQL API:
const AUTOMATISCH_URL = "https://automatisch.example.com"
const AUTOMATISCH_TOKEN = process.env.AUTOMATISCH_TOKEN!
const headers = {
  Authorization: `Bearer ${AUTOMATISCH_TOKEN}`,
  "Content-Type": "application/json",
}
// List flows:
const { data } = await fetch(`${AUTOMATISCH_URL}/graphql`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    query: `
      query GetFlows {
        getFlows {
          edges {
            node {
              id
              name
              active
              status
              steps {
                id
                type
                appKey
                status
              }
              createdAt
              updatedAt
            }
          }
        }
      }
    `,
  }),
}).then((r) => r.json())
// Get execution history:
const executions = await fetch(`${AUTOMATISCH_URL}/graphql`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    query: `
      query GetExecutions($flowId: String!) {
        getExecutions(flowId: $flowId) {
          edges {
            node {
              id
              status
              createdAt
              executionSteps {
                id
                status
                dataOut
              }
            }
          }
        }
      }
    `,
    variables: { flowId: "flow-id" },
  }),
}).then((r) => r.json())
Supported integrations
// Automatisch currently supports:
// Communication:
// - Slack, Discord, Microsoft Teams, Twilio
// Developer:
// - GitHub, GitLab, Webhooks
// Productivity:
// - Google Sheets, Google Calendar, Google Drive
// - Notion, Todoist
// CRM/Marketing:
// - HubSpot, Mailchimp, SendGrid
// Database:
// - PostgreSQL, MySQL, MongoDB
// Other:
// - HTTP Request, RSS, Schedule/Cron
// - SMTP Email, Typeform, Stripe
// Custom apps:
// Community app development follows a structured pattern:
// - Trigger definitions
// - Action definitions
// - Authentication configuration
// - Data mapping
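As an illustration only — the field names below are assumptions for exposition, not the actual Automatisch app API (check the Automatisch developer docs for the real module shape) — a custom app definition might look roughly like this:

```javascript
// Illustrative shape of a custom Automatisch-style app: auth fields,
// polling triggers, and actions. All property names here are assumed.
const packageRegistryApp = {
  name: "Package Registry",
  key: "package-registry",
  auth: {
    fields: [
      { key: "apiKey", label: "API Key", type: "string", required: true },
    ],
  },
  triggers: [
    {
      name: "New Release",
      key: "newRelease",
      pollInterval: 15, // minutes — assumed polling cadence
    },
  ],
  actions: [
    {
      name: "Fetch Package Stats",
      key: "fetchPackageStats",
    },
  ],
}
```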
Node-RED
Node-RED — flow-based programming:
Installation
# npm (global):
npm install -g node-red
node-red
# Docker:
docker run -d \
  --name node-red \
  -p 1880:1880 \
  -v node-red-data:/data \
  nodered/node-red:latest
# docker-compose.yml
version: "3.8"
services:
  node-red:
    image: nodered/node-red:latest
    ports:
      - "1880:1880"
    volumes:
      - node-red-data:/data
    environment:
      - TZ=America/Los_Angeles
    restart: unless-stopped
volumes:
  node-red-data:
Flow definitions (JSON)
[
  {
    "id": "webhook-trigger",
    "type": "http in",
    "url": "/api/package-update",
    "method": "post",
    "name": "Package Update Webhook",
    "wires": [["parse-payload"]]
  },
  {
    "id": "parse-payload",
    "type": "function",
    "name": "Parse Package Data",
    "func": "const { name, version, downloads } = msg.payload;\nmsg.package = { name, version, downloads };\nmsg.payload = { name, version, downloads_formatted: downloads.toLocaleString() };\nreturn msg;",
    "wires": [["check-threshold"]]
  },
  {
    "id": "check-threshold",
    "type": "switch",
    "name": "Check Downloads",
    "property": "package.downloads",
    "rules": [
      { "t": "gt", "v": "1000000", "vt": "num" },
      { "t": "else" }
    ],
    "wires": [["slack-notify"], ["log-only"]]
  },
  {
    "id": "slack-notify",
    "type": "http request",
    "method": "POST",
    "url": "https://hooks.slack.com/services/xxx",
    "name": "Slack Notification"
  },
  {
    "id": "log-only",
    "type": "debug",
    "name": "Log Only"
  }
]
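A flow definition like the one above can also be pushed programmatically through the Node-RED Admin HTTP API (`POST /flows`). `deployFlows` is our own wrapper; it assumes `adminAuth` is enabled and a bearer token has already been obtained:

```javascript
// Deploy a flow definition via the Node-RED Admin HTTP API.
// baseUrl e.g. "http://localhost:1880"; flows is the JSON array above.
async function deployFlows(baseUrl, token, flows) {
  const res = await fetch(`${baseUrl}/flows`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
      // "full" replaces the whole deployment; "nodes" restarts only changed nodes
      "Node-RED-Deployment-Type": "full",
    },
    body: JSON.stringify(flows),
  })
  if (!res.ok) throw new Error(`Deploy failed: ${res.status}`)
  return res.json()
}
```

A "full" deploy restarts every node, so prefer "nodes" or "flows" deployment types when updating a live instance that processes continuous traffic.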
Function nodes (JavaScript)
// Node-RED Function Node — full JavaScript:
// Process incoming messages:
const payload = msg.payload
// Fetch npm data (fetch is built in on Node.js 18+ runtimes):
const response = await fetch(
  `https://api.npmjs.org/downloads/point/last-week/${payload.name}`
)
const data = await response.json()
msg.payload = {
  name: payload.name,
  weekly_downloads: data.downloads,
  trend: data.downloads > payload.previous_downloads ? "up" : "down",
  change_pct: (
    ((data.downloads - payload.previous_downloads) / payload.previous_downloads) *
    100
  ).toFixed(1),
  checked_at: new Date().toISOString(),
}
// Set topic for MQTT:
msg.topic = `packages/${payload.name}/stats`
return msg
// Node-RED Function Node — multiple outputs:
const package_data = msg.payload
if (package_data.downloads > 10000000) {
  // Output 1: High traffic packages
  return [{ payload: { ...package_data, tier: "high" } }, null, null]
} else if (package_data.downloads > 1000000) {
  // Output 2: Medium traffic
  return [null, { payload: { ...package_data, tier: "medium" } }, null]
} else {
  // Output 3: Low traffic
  return [null, null, { payload: { ...package_data, tier: "low" } }]
}
HTTP API and MQTT
// Node-RED HTTP endpoint flow:
// HTTP In → Function → HTTP Response
// Creates: GET /api/packages
// Function node:
const packages = flow.get("packages") || []
msg.payload = {
  count: packages.length,
  packages: packages.slice(0, 20),
  updated_at: flow.get("last_update"),
}
msg.headers = { "Content-Type": "application/json" }
return msg
// MQTT integration:
// MQTT In node subscribes to: packages/+/updates
// Function node processes the message
// MQTT Out node publishes to: packages/processed
// Dashboard (node-red-dashboard, succeeded by @flowfuse/node-red-dashboard):
// Gauge, Chart, Text, Button, Form nodes
// Auto-generates a web dashboard at /ui
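In an MQTT topic filter like packages/+/updates, `+` matches exactly one topic level and `#` matches the remainder. A minimal matcher — our own helper for illustration, not a Node-RED API — makes the subscription semantics concrete:

```javascript
// Minimal MQTT topic-filter matcher ('+' = one level, '#' = all remaining).
// Shows which topics an MQTT In node subscribed to a filter will receive.
function topicMatches(filter, topic) {
  const f = filter.split("/")
  const t = topic.split("/")
  for (let i = 0; i < f.length; i++) {
    if (f[i] === "#") return true // '#' swallows the rest of the topic
    if (i >= t.length) return false // topic too short
    if (f[i] !== "+" && f[i] !== t[i]) return false // literal mismatch
  }
  return f.length === t.length
}

// topicMatches("packages/+/updates", "packages/react/updates") → true
// topicMatches("packages/+/updates", "packages/react/stats")   → false
```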
Custom nodes
// Create custom Node-RED node:
// package-checker.js
module.exports = function (RED) {
  function PackageCheckerNode(config) {
    RED.nodes.createNode(this, config)
    const node = this
    node.on("input", async function (msg) {
      const packageName = config.package || msg.payload.name
      try {
        const response = await fetch(
          `https://registry.npmjs.org/${packageName}`
        )
        const data = await response.json()
        msg.payload = {
          name: data.name,
          version: data["dist-tags"].latest,
          description: data.description,
          license: data.license,
        }
        node.status({
          fill: "green",
          shape: "dot",
          text: `${data.name}@${data["dist-tags"].latest}`,
        })
        node.send(msg)
      } catch (error) {
        node.status({ fill: "red", shape: "ring", text: error.message })
        node.error(error.message, msg)
      }
    })
  }
  RED.nodes.registerType("package-checker", PackageCheckerNode)
}
// package-checker.html
// <script type="text/javascript">
//   RED.nodes.registerType('package-checker', {
//     category: 'function',
//     color: '#a6bbcf',
//     defaults: { name: { value: "" }, package: { value: "" } },
//     inputs: 1,
//     outputs: 1,
//     label: function() { return this.name || "package-checker" }
//   })
// </script>
Feature Comparison
| Feature | n8n | Automatisch | Node-RED |
|---|---|---|---|
| License | Fair-code (SEL) | AGPL v3 | Apache 2.0 |
| Interface | Visual workflow | Visual flow | Visual wiring |
| Integrations | 400+ | 30+ | 4000+ (community) |
| Code execution | ✅ (JS/Python) | ❌ | ✅ (JS) |
| Webhooks | ✅ | ✅ | ✅ |
| Scheduling | ✅ (cron) | ✅ (cron) | ✅ (inject node) |
| Error handling | ✅ (retry, fallback) | Basic | ✅ (catch nodes) |
| Sub-workflows | ✅ | ❌ | ✅ (subflows) |
| AI nodes | ✅ (LangChain, OpenAI) | ❌ | Via community |
| Version control | ✅ | ❌ | ✅ (projects) |
| Multi-user | ✅ | ✅ | Via admin |
| MQTT | Via node | ❌ | ✅ (native) |
| Dashboard | ❌ | ❌ | ✅ (built-in) |
| IoT support | Limited | ❌ | ✅ (core strength) |
| Self-hosted | ✅ | ✅ | ✅ |
| Cloud hosted | ✅ | ❌ | Via FlowFuse |
| Resource usage | Medium | Low | Low |
When to Use Each
Use n8n if:
- Want the most integrations and advanced workflow automation
- Need code nodes (JavaScript/Python) for custom logic
- Building business automation (CRM, marketing, ops workflows)
- Want AI-powered workflows with LangChain/OpenAI nodes
Use Automatisch if:
- Want the simplest Zapier-like self-hosted experience
- Need basic trigger-action flows without code
- Prefer privacy-focused, fully open-source automation
- Building straightforward integrations between SaaS tools
Use Node-RED if:
- Building IoT, hardware, or real-time data processing flows
- Need MQTT, serial, or hardware protocol support
- Want a visual dashboard for monitoring data streams
- Building data transformation pipelines with custom nodes
Self-Hosting Security and Credential Management
Running workflow automation platforms internally raises important security considerations because these tools connect to dozens of external services and store credentials for all of them. n8n encrypts stored credentials using the N8N_ENCRYPTION_KEY environment variable — losing this key means losing access to all stored credentials, so it must be backed up separately from the database. n8n supports RBAC through its enterprise tier and basic user authentication in the community edition. Automatisch similarly encrypts connection credentials using the ENCRYPTION_KEY environment variable. For both platforms, credentials should never be stored in environment variables that become part of the workflow definition — they should be managed through the platform's credential vault and referenced by ID. Node-RED's credential storage is simpler: secrets are stored in an encrypted flows_cred.json file alongside the flow definitions, encrypted using the credentialSecret from settings.js. For production Node-RED deployments, this file should be in a directory that is backed up and not committed to source control.
TypeScript and Code Quality in Automation
Workflow automation platforms frequently require custom JavaScript logic, and the quality of that execution environment affects maintainability. n8n's Code Node runs in a separate VM context with access to predefined globals ($input, $env, $items, $http) and supports async/await for HTTP requests. The code runs server-side in Node.js, so npm packages available to the n8n server process can be imported using require(). n8n does not provide TypeScript support in Code Nodes — the code is executed as JavaScript without compilation. Node-RED's Function nodes also run JavaScript in a sandboxed Node.js context, with a curated set of globals available through the global, flow, and msg objects. Automatisch has no code execution capability at all — logic must be expressed through the visual trigger-action flow model. For automation tasks that require custom transformation or business logic that can't be expressed in a low-code flow, n8n or Node-RED are the appropriate choices, and writing complex logic as separate Node.js scripts called via n8n's HTTP Request node is often cleaner than inline Code nodes.
Observability and Error Handling in Production
Production workflow automation requires robust monitoring and error handling to ensure critical business automations don't silently fail. n8n maintains a full execution log in its PostgreSQL database, with the ability to review the input and output of every node in every execution. Failed executions are flagged and accessible through the n8n UI, and you can configure error workflows that execute when a specific workflow fails — useful for sending Slack alerts or creating tickets when critical automation breaks. Node-RED provides a debug panel in its UI for local development but relies on the built-in logging system and catch nodes for production error handling. The node-red-contrib-prometheus-exporter community node can expose metrics for Prometheus scraping, which is the standard approach for integrating Node-RED into production observability stacks. Automatisch's error handling is more basic, with failed flow executions visible in the execution history but limited options for automated error notification or retry configuration.
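One lightweight monitoring approach is polling the n8n public API for failed executions from an external cron job. A sketch — `summarizeFailures` is our own helper, and the `status=error` filter is assumed to be available in your n8n version:

```javascript
// Poll the n8n public API for recent failed executions, e.g. from a cron
// job that alerts when failures spike for a particular workflow.
async function fetchFailedExecutions(baseUrl, apiKey) {
  const res = await fetch(`${baseUrl}/api/v1/executions?status=error&limit=50`, {
    headers: { "X-N8N-API-KEY": apiKey },
  })
  if (!res.ok) throw new Error(`n8n API error: ${res.status}`)
  return (await res.json()).data
}

// Group failures by workflow id so the alert names the broken automation.
function summarizeFailures(executions) {
  const counts = {}
  for (const e of executions) {
    counts[e.workflowId] = (counts[e.workflowId] || 0) + 1
  }
  return counts
}
```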
Integration Ecosystem and Long-Tail Connector Support
The breadth of pre-built integrations determines how much custom code you need to write for each new automation use case. n8n's 400+ nodes cover the major SaaS tools (Salesforce, HubSpot, Stripe, GitHub, Google Workspace, AWS services) and provide abstractions for common operation types (create record, update record, list records) that work consistently across integrations. Node-RED's 4000+ community nodes represent a much larger ecosystem but with significantly more quality variance — popular nodes like the Slack, MQTT, and HTTP Request nodes are well-maintained, while niche integrations may be abandoned or incompatible with recent Node-RED versions. Automatisch's 30+ integrations are fewer but curated for quality and consistency, making it a reasonable starting point for simple SaaS-to-SaaS automation flows that don't require the long-tail connector coverage that n8n provides.
Scaling and High-Availability Deployment
Workflow automation platforms need to scale when the volume of triggered workflows grows beyond what a single instance can handle. n8n supports horizontal scaling through queue mode, where a Redis-backed work queue distributes workflow executions across multiple worker processes. The main n8n instance handles the UI and webhook ingestion, while worker instances pull jobs from the queue and execute them. This architecture allows scaling workflow execution independently of the UI server. Node-RED's single-threaded Node.js process handles one flow at a time by default, though the async nature of JavaScript means many concurrent HTTP requests and MQTT messages can be in flight simultaneously. For high-throughput Node-RED deployments (IoT sensor data from thousands of devices), running multiple Node-RED instances behind a load balancer with shared MQTT broker state is the standard scaling approach. Automatisch's worker architecture uses a similar Redis queue pattern to n8n, with separate main and worker containers supporting horizontal execution scaling.
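A minimal queue-mode sketch (Redis and Postgres services omitted for brevity; every worker needs the same database connection and the same N8N_ENCRYPTION_KEY as the main instance):

```
# Sketch of n8n queue mode — main instance serves UI/webhooks,
# workers pull executions from the Redis-backed queue.
services:
  n8n-main:
    image: n8nio/n8n:latest
    environment:
      - EXECUTIONS_MODE=queue
      - QUEUE_BULL_REDIS_HOST=redis
  n8n-worker:
    image: n8nio/n8n:latest
    command: worker
    environment:
      - EXECUTIONS_MODE=queue
      - QUEUE_BULL_REDIS_HOST=redis
```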
Webhook Ingestion and Trigger Reliability
The reliability of workflow triggers is as important as the reliability of workflow execution, because a missed trigger means a workflow never runs regardless of how robust the execution engine is. n8n's webhook nodes expose unique URLs that external systems post events to, and n8n buffers incoming webhook payloads in its database before processing them. This means that if n8n is temporarily unavailable during a brief deployment or restart, webhooks received after the service restores will still be processed — but webhooks sent during the outage are lost unless the sending service retries. For critical triggers (Stripe payment events, GitHub merge events, form submissions), configuring the sending service to retry failed webhook deliveries with exponential backoff is essential. Node-RED's HTTP-in nodes receive webhooks synchronously within the Node.js event loop, which means high webhook volume can saturate the event loop and increase processing latency for all concurrent operations. Automatisch uses a queue-backed webhook ingestion pattern similar to n8n, storing incoming events before processing them, which provides better resilience to brief processing delays and allows prioritizing webhook types when the queue backs up.
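The sender-side retry loop can be sketched in a few lines — `deliverWebhook` is our own helper, not part of any of the three platforms:

```javascript
// Deliver a webhook with exponential backoff, so transient downtime on
// the receiving automation platform doesn't silently lose events.
async function deliverWebhook(url, payload, { retries = 5, baseDelayMs = 500 } = {}) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      const res = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(payload),
      })
      if (res.ok) return true
      // Non-2xx: fall through and retry. A production version would treat
      // 4xx responses as permanent failures instead of retrying.
    } catch {
      // Network error — fall through to retry.
    }
    if (attempt < retries) {
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt))
    }
  }
  return false // caller should dead-letter the event
}
```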
Methodology
GitHub stars as of March 2026. Feature comparison based on n8n v1.x, Automatisch v0.x, and Node-RED v4.x.
Compare automation tools and developer platforms on PkgPulse →
See also: AVA vs Jest, unplugin vs Rollup Plugin vs Vite Plugin, and Coolify vs CapRover vs Dokku (2026).