n8n vs Automatisch vs Node-RED: Workflow Automation (2026)
TL;DR
n8n is the workflow automation platform — visual builder, 400+ integrations, code nodes (JavaScript/Python), webhooks, self-hosted or cloud, and the most popular open-source Zapier alternative. Automatisch is a simpler open-source Zapier alternative — trigger-action flows, self-hosted, privacy-focused, with a growing integration list. Node-RED is the flow-based programming tool — IoT-focused visual wiring, MQTT/HTTP/WebSocket nodes, function nodes, a built-in dashboard, and it runs anywhere from a Raspberry Pi to the cloud. In 2026: n8n for business workflow automation, Automatisch for simple Zapier-like flows, Node-RED for IoT and data pipelines.
Key Takeaways
- n8n: 50K+ GitHub stars — 400+ integrations, code nodes, AI workflows
- Automatisch: 6K+ GitHub stars — Zapier alternative, simple flows, privacy-first
- Node-RED: 20K+ GitHub stars — IoT, flow programming, MQTT, dashboard
- n8n has the largest integration ecosystem and most advanced workflow features
- Automatisch provides the simplest Zapier-like experience for self-hosting
- Node-RED excels at IoT, hardware, and real-time data processing
n8n
n8n — workflow automation platform:
Installation
# Docker:
docker run -d \
--name n8n \
-p 5678:5678 \
-v n8n-data:/home/node/.n8n \
n8nio/n8n
# Docker Compose (production):
# docker-compose.yml
version: "3.8"
services:
  n8n:
    image: n8nio/n8n:latest
    ports:
      - "5678:5678"
    environment:
      - N8N_HOST=n8n.example.com
      - N8N_PORT=5678
      - N8N_PROTOCOL=https
      - WEBHOOK_URL=https://n8n.example.com/
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres
      - DB_POSTGRESDB_PORT=5432
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=n8n
      - DB_POSTGRESDB_PASSWORD=secret
      - N8N_ENCRYPTION_KEY=your-encryption-key
    volumes:
      - n8n-data:/home/node/.n8n
    depends_on:
      - postgres
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: n8n
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: n8n
    volumes:
      - postgres-data:/var/lib/postgresql/data
volumes:
  n8n-data:
  postgres-data:
Workflow via API
// n8n REST API — create and manage workflows:
const N8N_URL = "https://n8n.example.com"
const N8N_API_KEY = process.env.N8N_API_KEY!

const headers = {
  "X-N8N-API-KEY": N8N_API_KEY,
  "Content-Type": "application/json",
}

// Create workflow:
const workflow = await fetch(`${N8N_URL}/api/v1/workflows`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    name: "Package Download Monitor",
    nodes: [
      {
        parameters: {
          rule: { interval: [{ field: "hours", hoursInterval: 6 }] },
        },
        name: "Schedule Trigger",
        type: "n8n-nodes-base.scheduleTrigger",
        position: [250, 300],
      },
      {
        parameters: {
          url: "https://api.npmjs.org/downloads/point/last-week/react",
          method: "GET",
        },
        name: "Fetch Downloads",
        type: "n8n-nodes-base.httpRequest",
        position: [450, 300],
      },
      {
        parameters: {
          conditions: {
            number: [{
              value1: "={{ $json.downloads }}",
              operation: "larger",
              value2: 25000000,
            }],
          },
        },
        name: "Check Threshold",
        type: "n8n-nodes-base.if",
        position: [650, 300],
      },
      {
        parameters: {
          channel: "#alerts",
          // Leading "=" marks the string as an n8n expression:
          text: "=🔥 React downloads exceeded 25M: {{ $json.downloads }}",
        },
        name: "Slack Alert",
        type: "n8n-nodes-base.slack",
        position: [850, 200],
      },
    ],
    connections: {
      "Schedule Trigger": {
        main: [[{ node: "Fetch Downloads", type: "main", index: 0 }]],
      },
      "Fetch Downloads": {
        main: [[{ node: "Check Threshold", type: "main", index: 0 }]],
      },
      "Check Threshold": {
        main: [
          [{ node: "Slack Alert", type: "main", index: 0 }],
          [],
        ],
      },
    },
    active: true,
  }),
}).then((r) => r.json())
// Activate the workflow (the public REST API exposes activate/deactivate;
// manual execution is done from the editor UI or via a webhook call):
await fetch(`${N8N_URL}/api/v1/workflows/${workflow.id}/activate`, {
  method: "POST",
  headers,
})

// List executions:
const executions = await fetch(
  `${N8N_URL}/api/v1/executions?workflowId=${workflow.id}&limit=10`,
  { headers }
).then((r) => r.json())
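The `connections` object above is an adjacency list keyed by node name, with one array of wires per output. A small sketch (plain TypeScript, not an n8n API) of walking it to list the nodes a trigger reaches, in execution order:

```typescript
// Walks an n8n-style connections map breadth-first from the trigger.
// Illustrative only — this helper is not part of n8n itself.
type Connections = Record<string, { main: { node: string }[][] }>

function executionOrder(connections: Connections, trigger: string): string[] {
  const order: string[] = []
  const queue = [trigger]
  const seen = new Set<string>()
  while (queue.length > 0) {
    const name = queue.shift()!
    if (seen.has(name)) continue
    seen.add(name)
    order.push(name)
    // Each output ("branch") holds zero or more wires to downstream nodes.
    for (const branch of connections[name]?.main ?? []) {
      for (const target of branch) queue.push(target.node)
    }
  }
  return order
}

const connections: Connections = {
  "Schedule Trigger": { main: [[{ node: "Fetch Downloads" }]] },
  "Fetch Downloads": { main: [[{ node: "Check Threshold" }]] },
  "Check Threshold": { main: [[{ node: "Slack Alert" }], []] },
}

console.log(executionOrder(connections, "Schedule Trigger"))
// → ["Schedule Trigger", "Fetch Downloads", "Check Threshold", "Slack Alert"]
```

Note how the IF node's second (false) branch is an empty array — nothing is wired to it, so the walk ends at the Slack node.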
Code node (JavaScript)
// n8n Code Node — runs JavaScript or Python:
// Process incoming data:
const items = $input.all()

const processed = items.map((item) => {
  const { name, downloads, version } = item.json
  return {
    json: {
      name,
      downloads,
      version,
      downloads_formatted: new Intl.NumberFormat().format(downloads),
      trend: downloads > 1000000 ? "popular" : "growing",
      updated_at: new Date().toISOString(),
    },
  }
})

return processed

// Access environment variables:
// const apiKey = $env.MY_API_KEY

// Access previous node data:
// const webhookData = $('Webhook').first().json

// HTTP request in code (via the built-in helper):
// const response = await this.helpers.httpRequest({
//   method: 'GET',
//   url: 'https://api.example.com/data',
//   headers: { Authorization: 'Bearer token' },
// })
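Because Code-node logic is ordinary JavaScript, it can be pulled into a plain function and unit-tested outside n8n by stubbing the `{ json }` item shape that `$input.all()` returns. A sketch of the transform above as a standalone function:

```typescript
// Standalone version of the Code-node transform, testable without n8n.
// The Item shape mirrors what $input.all() yields inside a Code node.
interface Item {
  json: { name: string; downloads: number; version: string }
}

function transform(items: Item[]) {
  return items.map((item) => {
    const { name, downloads, version } = item.json
    return {
      json: {
        name,
        downloads,
        version,
        // Fixed locale so the formatted string is deterministic:
        downloads_formatted: new Intl.NumberFormat("en-US").format(downloads),
        trend: downloads > 1000000 ? "popular" : "growing",
      },
    }
  })
}

const out = transform([
  { json: { name: "react", downloads: 25000000, version: "19.1.0" } },
])
console.log(out[0].json.trend) // → "popular"
console.log(out[0].json.downloads_formatted) // → "25,000,000"
```

Keeping the transform pure (no `$env`, no `new Date()` inside) is what makes it trivially testable; inject timestamps and secrets from the caller.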
Webhook workflow
// n8n Webhook — receive external events:
// Webhook node listens at:
// POST https://n8n.example.com/webhook/package-updates
// Example: GitHub webhook → process → notify:
// 1. Webhook Trigger (receives GitHub push event)
// 2. Code Node (extract package info)
// 3. HTTP Request (fetch npm stats)
// 4. IF Node (check if downloads changed significantly)
// 5. Slack/Email notification
// Call n8n webhook from your app:
await fetch("https://n8n.example.com/webhook/package-updates", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
package: "react",
event: "version_released",
version: "19.1.0",
timestamp: new Date().toISOString(),
}),
})
Automatisch
Automatisch — open-source Zapier alternative:
Installation
# Docker Compose:
git clone https://github.com/automatisch/automatisch.git
cd automatisch
cp .env.example .env
docker compose up -d
# docker-compose.yml
version: "3.8"
services:
  automatisch-main:
    image: automatischio/automatisch:latest
    ports:
      - "3000:3000"
    environment:
      - APP_ENV=production
      - APP_SECRET_KEY=your-secret-key
      - POSTGRES_HOST=postgres
      - POSTGRES_PORT=5432
      - POSTGRES_DATABASE=automatisch
      - POSTGRES_USERNAME=automatisch
      - POSTGRES_PASSWORD=secret
      - REDIS_HOST=redis
      - ENCRYPTION_KEY=your-encryption-key
    depends_on:
      - postgres
      - redis
  automatisch-worker:
    image: automatischio/automatisch:latest
    command: npm run worker
    environment:
      - APP_ENV=production
      - APP_SECRET_KEY=your-secret-key
      - POSTGRES_HOST=postgres
      - POSTGRES_DATABASE=automatisch
      - POSTGRES_USERNAME=automatisch
      - POSTGRES_PASSWORD=secret
      - REDIS_HOST=redis
      - ENCRYPTION_KEY=your-encryption-key
    depends_on:
      - postgres
      - redis
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: automatisch
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: automatisch
    volumes:
      - postgres-data:/var/lib/postgresql/data
  redis:
    image: redis:7-alpine
    volumes:
      - redis-data:/data
volumes:
  postgres-data:
  redis-data:
Flow configuration
// Automatisch flows follow a trigger → action pattern:
// Example flow: New GitHub issue → Slack notification
// Configuration via the web UI:
// Trigger: GitHub — New Issue
//   Config:
//     Repository: myorg/my-project
//     Event: issues.opened
// Action: Slack — Send Message
//   Config:
//     Channel: #github-issues
//     Message: "New issue: {{trigger.issue.title}}\n{{trigger.issue.html_url}}"
// Example flow: Webhook → Database → Email
// Trigger: Webhook — catch incoming data
// Action 1: PostgreSQL — Insert row
//   Table: events
//   Columns: { type: "{{trigger.body.type}}", data: "{{trigger.body}}" }
// Action 2: Email — Send notification
//   To: team@example.com
//   Subject: "New event: {{trigger.body.type}}"
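Under the hood, `{{trigger.issue.title}}`-style variables resolve a dotted path against the trigger payload. A minimal illustrative resolver (not Automatisch's actual implementation):

```typescript
// Resolves {{dotted.path}} placeholders against a data object — a sketch
// of Zapier/Automatisch-style templating, not Automatisch's real code.
function renderTemplate(template: string, data: Record<string, unknown>): string {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (_match, path: string) => {
    // Walk the dotted path, tolerating missing intermediate keys.
    const value = path.split(".").reduce<unknown>(
      (obj, key) => (obj as Record<string, unknown> | undefined)?.[key],
      data,
    )
    return value === undefined ? "" : String(value)
  })
}

const rendered = renderTemplate("New issue: {{trigger.issue.title}}", {
  trigger: { issue: { title: "Fix login bug" } },
})
console.log(rendered) // → "New issue: Fix login bug"
```

A missing path simply renders as an empty string here; a production templating engine would usually surface an error instead.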
API
// Automatisch GraphQL API:
const AUTOMATISCH_URL = "https://automatisch.example.com"
const AUTOMATISCH_TOKEN = process.env.AUTOMATISCH_TOKEN!

const headers = {
  Authorization: `Bearer ${AUTOMATISCH_TOKEN}`,
  "Content-Type": "application/json",
}

// List flows:
const { data } = await fetch(`${AUTOMATISCH_URL}/graphql`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    query: `
      query GetFlows {
        getFlows {
          edges {
            node {
              id
              name
              active
              status
              steps {
                id
                type
                appKey
                status
              }
              createdAt
              updatedAt
            }
          }
        }
      }
    `,
  }),
}).then((r) => r.json())

// Get execution history:
const executions = await fetch(`${AUTOMATISCH_URL}/graphql`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    query: `
      query GetExecutions($flowId: String!) {
        getExecutions(flowId: $flowId) {
          edges {
            node {
              id
              status
              createdAt
              executionSteps {
                id
                status
                dataOut
              }
            }
          }
        }
      }
    `,
    variables: { flowId: "flow-id" },
  }),
}).then((r) => r.json())
Supported integrations
// Automatisch currently supports:
// Communication:
// - Slack, Discord, Microsoft Teams, Twilio
// Developer:
// - GitHub, GitLab, Webhooks
// Productivity:
// - Google Sheets, Google Calendar, Google Drive
// - Notion, Todoist
// CRM/Marketing:
// - HubSpot, Mailchimp, SendGrid
// Database:
// - PostgreSQL, MySQL, MongoDB
// Other:
// - HTTP Request, RSS, Schedule/Cron
// - SMTP Email, Typeform, Stripe
// Custom apps:
// Community app development follows a structured pattern:
// - Trigger definitions
// - Action definitions
// - Authentication configuration
// - Data mapping
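The overall shape of such an app definition can be sketched as a plain object — the field names below are hypothetical, chosen for illustration only; consult the Automatisch docs for the real interface:

```typescript
// Hypothetical sketch of a community app definition's shape.
// Field names (key, auth.fields, pollInterval, …) are illustrative —
// they are NOT Automatisch's actual interface.
interface AppDefinition {
  name: string
  key: string
  auth: {
    fields: { key: string; label: string; type: "string" | "password" }[]
  }
  triggers: { key: string; name: string; pollInterval?: number }[]
  actions: { key: string; name: string }[]
}

const packagePulseApp: AppDefinition = {
  name: "Package Pulse",
  key: "package-pulse",
  auth: {
    fields: [{ key: "apiKey", label: "API Key", type: "password" }],
  },
  triggers: [{ key: "newRelease", name: "New Release", pollInterval: 15 }],
  actions: [{ key: "fetchStats", name: "Fetch Download Stats" }],
}

console.log(packagePulseApp.triggers[0].key) // → "newRelease"
```

The point is the structure: authentication, triggers, and actions are declared separately, and the runtime wires them together with the user's data mapping.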
Node-RED
Node-RED — flow-based programming:
Installation
# npm (global):
npm install -g node-red
node-red
# Docker:
docker run -d \
--name node-red \
-p 1880:1880 \
-v node-red-data:/data \
nodered/node-red:latest
# docker-compose.yml
version: "3.8"
services:
  node-red:
    image: nodered/node-red:latest
    ports:
      - "1880:1880"
    volumes:
      - node-red-data:/data
    environment:
      - TZ=America/Los_Angeles
    restart: unless-stopped
volumes:
  node-red-data:
Flow definitions (JSON)
[
  {
    "id": "webhook-trigger",
    "type": "http in",
    "url": "/api/package-update",
    "method": "post",
    "name": "Package Update Webhook",
    "wires": [["parse-payload"]]
  },
  {
    "id": "parse-payload",
    "type": "function",
    "name": "Parse Package Data",
    "func": "const { name, version, downloads } = msg.payload;\nmsg.package = { name, version, downloads };\nmsg.payload = { name, version, downloads_formatted: downloads.toLocaleString() };\nreturn msg;",
    "wires": [["check-threshold"]]
  },
  {
    "id": "check-threshold",
    "type": "switch",
    "name": "Check Downloads",
    "property": "package.downloads",
    "rules": [
      { "t": "gt", "v": "1000000", "vt": "num" },
      { "t": "else" }
    ],
    "wires": [["slack-notify"], ["log-only"]]
  },
  {
    "id": "slack-notify",
    "type": "http request",
    "method": "POST",
    "url": "https://hooks.slack.com/services/xxx",
    "name": "Slack Notification"
  }
]
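The `switch` node evaluates its rules in order and sends the message down the wire of the first match. The routing above boils down to a single comparison, sketched in plain TypeScript (illustrative, not Node-RED internals):

```typescript
// Mirrors the switch node above: the "gt 1000000" rule routes to
// output 0 (slack-notify), the "else" rule to output 1 (log-only).
// Illustrative sketch only — not how Node-RED implements switch.
function routeByDownloads(downloads: number): "slack-notify" | "log-only" {
  return downloads > 1000000 ? "slack-notify" : "log-only"
}

console.log(routeByDownloads(25000000)) // → "slack-notify"
console.log(routeByDownloads(50000))    // → "log-only"
```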
Function nodes (JavaScript)
// Node-RED Function Node — full JavaScript:
// Process incoming messages:
const payload = msg.payload

// Fetch npm data:
const response = await fetch(
  `https://api.npmjs.org/downloads/point/last-week/${payload.name}`
)
const data = await response.json()

msg.payload = {
  name: payload.name,
  weekly_downloads: data.downloads,
  trend: data.downloads > payload.previous_downloads ? "up" : "down",
  change_pct: (
    ((data.downloads - payload.previous_downloads) / payload.previous_downloads) *
    100
  ).toFixed(1),
  checked_at: new Date().toISOString(),
}

// Set topic for MQTT:
msg.topic = `packages/${payload.name}/stats`
return msg

// Node-RED Function Node — multiple outputs:
const package_data = msg.payload
if (package_data.downloads > 10000000) {
  // Output 1: high-traffic packages
  return [{ payload: { ...package_data, tier: "high" } }, null, null]
} else if (package_data.downloads > 1000000) {
  // Output 2: medium traffic
  return [null, { payload: { ...package_data, tier: "medium" } }, null]
} else {
  // Output 3: low traffic
  return [null, null, { payload: { ...package_data, tier: "low" } }]
}
HTTP API and MQTT
// Node-RED HTTP endpoint flow:
// HTTP In → Function → HTTP Response
// Creates: GET /api/packages
// Function node:
const packages = flow.get("packages") || []
msg.payload = {
  count: packages.length,
  packages: packages.slice(0, 20),
  updated_at: flow.get("last_update"),
}
msg.headers = { "Content-Type": "application/json" }
return msg
// MQTT integration:
// MQTT In node subscribes to: packages/+/updates
// Function node processes the message
// MQTT Out node publishes to: packages/processed
// Dashboard (node-red-dashboard):
// Gauge, Chart, Text, Button, Form nodes
// Auto-generates a web dashboard at /ui
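MQTT topic filters use `+` to match exactly one level and `#` to match all remaining levels, so `packages/+/updates` matches `packages/react/updates` but not `packages/react/v19/updates`. A sketch of that matching rule (following MQTT 3.1.1 wildcard semantics; not a full client):

```typescript
// Matches an MQTT topic against a filter with + (single-level) and
// # (multi-level) wildcards. Sketch of the MQTT 3.1.1 matching rules;
// real clients also enforce that # may only appear as the last level.
function topicMatches(filter: string, topic: string): boolean {
  const f = filter.split("/")
  const t = topic.split("/")
  for (let i = 0; i < f.length; i++) {
    if (f[i] === "#") return true           // matches all remaining levels
    if (i >= t.length) return false         // topic ran out of levels
    if (f[i] !== "+" && f[i] !== t[i]) return false
  }
  return f.length === t.length              // no trailing topic levels left
}

console.log(topicMatches("packages/+/updates", "packages/react/updates"))     // → true
console.log(topicMatches("packages/+/updates", "packages/react/v19/updates")) // → false
console.log(topicMatches("packages/#", "packages/react/v19/updates"))         // → true
```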
Custom nodes
// Create custom Node-RED node:
// package-checker.js
module.exports = function (RED) {
  function PackageCheckerNode(config) {
    RED.nodes.createNode(this, config)
    const node = this
    node.on("input", async function (msg) {
      const packageName = config.package || msg.payload.name
      try {
        const response = await fetch(
          `https://registry.npmjs.org/${packageName}`
        )
        const data = await response.json()
        msg.payload = {
          name: data.name,
          version: data["dist-tags"].latest,
          description: data.description,
          license: data.license,
        }
        node.status({
          fill: "green",
          shape: "dot",
          text: `${data.name}@${data["dist-tags"].latest}`,
        })
        node.send(msg)
      } catch (error) {
        node.status({ fill: "red", shape: "ring", text: error.message })
        node.error(error.message, msg)
      }
    })
  }
  RED.nodes.registerType("package-checker", PackageCheckerNode)
}

// package-checker.html
// <script type="text/javascript">
//   RED.nodes.registerType('package-checker', {
//     category: 'function',
//     color: '#a6bbcf',
//     defaults: { name: { value: "" }, package: { value: "" } },
//     inputs: 1,
//     outputs: 1,
//     label: function () { return this.name || "package-checker" }
//   })
// </script>
Feature Comparison
| Feature | n8n | Automatisch | Node-RED |
|---|---|---|---|
| License | Fair-code (Sustainable Use License) | AGPL v3 | Apache 2.0 |
| Interface | Visual workflow | Visual flow | Visual wiring |
| Integrations | 400+ | 30+ | 4000+ (community) |
| Code execution | ✅ (JS/Python) | ❌ | ✅ (JS) |
| Webhooks | ✅ | ✅ | ✅ |
| Scheduling | ✅ (cron) | ✅ (cron) | ✅ (inject node) |
| Error handling | ✅ (retry, fallback) | Basic | ✅ (catch nodes) |
| Sub-workflows | ✅ | ❌ | ✅ (subflows) |
| AI nodes | ✅ (LangChain, OpenAI) | ❌ | Via community |
| Version control | ✅ | ❌ | ✅ (projects) |
| Multi-user | ✅ | ✅ | Via admin |
| MQTT | Via node | ❌ | ✅ (native) |
| Dashboard | ❌ | ❌ | ✅ (built-in) |
| IoT support | Limited | ❌ | ✅ (core strength) |
| Self-hosted | ✅ | ✅ | ✅ |
| Cloud hosted | ✅ | ❌ | Via FlowFuse |
| Resource usage | Medium | Low | Low |
When to Use Each
Use n8n if:
- Want the most integrations and advanced workflow automation
- Need code nodes (JavaScript/Python) for custom logic
- Building business automation (CRM, marketing, ops workflows)
- Want AI-powered workflows with LangChain/OpenAI nodes
Use Automatisch if:
- Want the simplest Zapier-like self-hosted experience
- Need basic trigger-action flows without code
- Prefer privacy-focused, fully open-source automation
- Building straightforward integrations between SaaS tools
Use Node-RED if:
- Building IoT, hardware, or real-time data processing flows
- Need MQTT, serial, or hardware protocol support
- Want a visual dashboard for monitoring data streams
- Building data transformation pipelines with custom nodes
Methodology
GitHub stars as of March 2026. Feature comparison based on n8n v1.x, Automatisch v0.x, and Node-RED v4.x.
Compare automation tools and developer platforms on PkgPulse →