Theory
45 minutes
Lesson 8/15

Backup & Restore

Comprehensive backup strategies for n8n - workflows, credentials, database, and disaster recovery

💾 Backup & Restore

Data Backup

Losing workflows and credentials is a nightmare. This lesson covers everything about backup - from manual exports to automated disaster recovery.

What to Backup

Critical Components:

Text
N8N BACKUP TARGETS
──────────────────

1. WORKFLOWS
   └── All workflow definitions
   └── Active/inactive states
   └── Settings & configurations

2. CREDENTIALS
   └── API keys, tokens
   └── OAuth connections
   └── Database passwords
   └── ⚠️ ENCRYPTED - needs key!

3. DATABASE
   └── execution_entity (history)
   └── workflow_entity (workflows)
   └── credentials_entity (creds)
   └── webhook_entity (webhooks)

4. ENCRYPTION KEY
   └── N8N_ENCRYPTION_KEY
   └── ⚠️ Without this = data loss!

5. ENVIRONMENT CONFIG
   └── .env file
   └── docker-compose.yml
   └── nginx.conf

Critical

N8N_ENCRYPTION_KEY is the most important item! Without the key, the credentials cannot be decrypted!
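
A minimal sketch for copying the key to a separate, restricted location, assuming it lives in /opt/n8n/.env and the container is named n8n (adjust paths to your setup):

Bash
# Copy the encryption key out of .env to a separate file with tight permissions (paths are examples)
grep '^N8N_ENCRYPTION_KEY=' /opt/n8n/.env > /root/n8n-encryption-key.txt
chmod 600 /root/n8n-encryption-key.txt

# If the key was never set explicitly, n8n typically keeps the generated one in its config file
docker exec n8n cat /home/node/.n8n/config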

Backup Methods

Method 1: Export via UI

Workflow Export:

Text
1. Open workflow
2. Click ⋮ menu → Download
3. Save as JSON file

Or export all:
1. Settings → Personal
2. Export workflows
3. Choose "All Workflows"

Credentials (No UI Export):

Text
❌ Cannot export credentials via UI
❌ For security reasons
✅ Must backup database instead
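
For example, a minimal sketch that dumps only the credentials table, assuming the PostgreSQL connection details from Method 3 below and the credentials_entity table name listed above:

Bash
# Dump only the credentials table (connection details and paths are examples)
PGPASSWORD=$POSTGRES_PASSWORD pg_dump \
  -h postgres \
  -U n8n \
  -d n8n \
  -t credentials_entity \
  -F c \
  -f /backups/credentials_table.dump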

Method 2: n8n CLI Export

Bash
# Export all workflows
docker exec n8n n8n export:workflow --all --output=/backup/workflows.json

# Export specific workflow
docker exec n8n n8n export:workflow --id=YOUR_WORKFLOW_ID --output=/backup/workflow.json

# Export credentials (requires encryption key)
docker exec n8n n8n export:credentials --all --output=/backup/credentials.json

# Decrypt credentials for backup
docker exec n8n n8n export:credentials --all --decrypted --output=/backup/credentials-plain.json

Security

The decrypted credentials file contains all your secrets - store it securely!
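
If you keep a decrypted export at all, one option is to encrypt the file at rest immediately; a minimal sketch using gpg symmetric encryption, assuming gpg is installed on the host (passphrase handling is up to your secrets policy):

Bash
# Encrypt the plaintext export and remove the original
gpg --symmetric --cipher-algo AES256 /backup/credentials-plain.json
rm /backup/credentials-plain.json

# Decrypt later when needed
gpg --decrypt /backup/credentials-plain.json.gpg > /backup/credentials-plain.json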

Method 3: Database Backup

PostgreSQL:

Bash
#!/bin/bash
# backup-postgres.sh

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR=/backups/postgres

# Create backup
PGPASSWORD=$POSTGRES_PASSWORD pg_dump \
  -h postgres \
  -U n8n \
  -d n8n \
  -F c \
  -f "$BACKUP_DIR/n8n_$DATE.dump"

# Compress
gzip "$BACKUP_DIR/n8n_$DATE.dump"

# Cleanup old backups (keep 30 days)
find "$BACKUP_DIR" -name "*.dump.gz" -mtime +30 -delete

echo "Backup complete: n8n_$DATE.dump.gz"

SQLite:

Bash
#!/bin/bash
# backup-sqlite.sh

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR=/backups/sqlite
# Path to the n8n data directory as seen from the host (adjust if your volume is mounted elsewhere)
N8N_DATA=/home/node/.n8n

# Stop n8n for consistency (optional)
docker compose stop n8n

# Copy database
cp "$N8N_DATA/database.sqlite" "$BACKUP_DIR/n8n_$DATE.sqlite"

# Compress
gzip "$BACKUP_DIR/n8n_$DATE.sqlite"

# Restart
docker compose start n8n
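
If stopping n8n is not an option, the sqlite3 CLI can take a consistent online copy instead; a minimal sketch, assuming sqlite3 is installed on the host and the database path matches your mount:

Bash
# Online backup without stopping n8n (sqlite3 CLI required; paths are examples)
sqlite3 /home/node/.n8n/database.sqlite ".backup '/backups/sqlite/n8n_online.sqlite'"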

Method 4: Docker Volume Backup

Bash
#!/bin/bash
# backup-volumes.sh

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR=/backups/volumes

# Backup n8n data volume
docker run --rm \
  -v n8n_data:/source:ro \
  -v "$BACKUP_DIR":/backup \
  alpine tar czf /backup/n8n_data_$DATE.tar.gz -C /source .

# Backup postgres volume
docker run --rm \
  -v postgres_data:/source:ro \
  -v "$BACKUP_DIR":/backup \
  alpine tar czf /backup/postgres_data_$DATE.tar.gz -C /source .

Complete Backup Script

Bash
#!/bin/bash
# full-backup.sh - Complete n8n backup

set -e

# Configuration
BACKUP_DIR="/backups"
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_NAME="n8n_backup_$DATE"
RETENTION_DAYS=30

# Create backup directory
mkdir -p "$BACKUP_DIR/$BACKUP_NAME"
cd "$BACKUP_DIR/$BACKUP_NAME"

echo "Starting n8n backup: $BACKUP_NAME"

# 1. Export workflows
echo "Exporting workflows..."
docker exec n8n n8n export:workflow --all --output=/tmp/workflows.json
docker cp n8n:/tmp/workflows.json ./workflows.json

# 2. Export credentials (encrypted)
echo "Exporting credentials..."
docker exec n8n n8n export:credentials --all --output=/tmp/credentials.json
docker cp n8n:/tmp/credentials.json ./credentials.json

# 3. Database backup
echo "Backing up database..."
PGPASSWORD=$POSTGRES_PASSWORD pg_dump \
  -h localhost \
  -p 5432 \
  -U n8n \
  -d n8n \
  -F c \
  -f ./database.dump

# 4. Copy environment files
echo "Copying configuration..."
cp /opt/n8n/.env ./env_backup.txt
cp /opt/n8n/docker-compose.yml ./docker-compose.yml

# 5. Save encryption key
echo "Saving encryption key..."
echo "N8N_ENCRYPTION_KEY=$N8N_ENCRYPTION_KEY" > ./encryption_key.txt
chmod 600 ./encryption_key.txt

# 6. Create archive
echo "Creating archive..."
cd "$BACKUP_DIR"
tar czf "${BACKUP_NAME}.tar.gz" "$BACKUP_NAME"
rm -rf "$BACKUP_NAME"

# 7. Upload to remote (optional)
# aws s3 cp "${BACKUP_NAME}.tar.gz" s3://your-bucket/n8n-backups/

# 8. Cleanup old backups
echo "Cleaning up old backups..."
find "$BACKUP_DIR" -name "n8n_backup_*.tar.gz" -mtime +$RETENTION_DAYS -delete

echo "Backup complete: ${BACKUP_NAME}.tar.gz"
echo "Size: $(du -h "${BACKUP_NAME}.tar.gz" | cut -f1)"

Automated Backup

Cron Setup:

Bash
# Edit crontab
crontab -e

# Add these entries:

# Daily backup at 2 AM
0 2 * * * /opt/n8n/scripts/full-backup.sh >> /var/log/n8n-backup.log 2>&1

# Weekly database optimization at 3 AM Sunday
0 3 * * 0 docker exec n8n-postgres vacuumdb -U n8n -d n8n -v >> /var/log/n8n-backup.log 2>&1

# Monthly full export at 4 AM on the 1st
0 4 1 * * /opt/n8n/scripts/monthly-export.sh >> /var/log/n8n-backup.log 2>&1

Docker Backup Service:

yaml
# docker-compose.yml
services:
  backup:
    image: alpine
    volumes:
      - ./scripts:/scripts:ro
      - ./backups:/backups
      - n8n_data:/n8n_data:ro
      - postgres_data:/postgres_data:ro
    environment:
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
    command: >
      sh -c "
      apk add --no-cache postgresql-client &&
      while true; do
        /scripts/backup.sh;
        sleep 86400;
      done
      "

Restore Procedures

Restore Workflows:

Bash
1# From JSON export
2docker exec -i n8n n8n import:workflow --input=/backup/workflows.json
3
4# Specific workflow
5docker exec -i n8n n8n import:workflow --input=/backup/workflow.json --id=WORKFLOW_ID

Restore Credentials:

Bash
# From encrypted export (needs same encryption key!)
docker exec -i n8n n8n import:credentials --input=/backup/credentials.json
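
Before importing, it is worth confirming that the running instance uses the same encryption key that created the export; a minimal sketch, assuming the key was saved to encryption_key.txt by the full backup script above and is set as an environment variable in the container:

Bash
# Compare the key stored in the backup with the key the running container uses
BACKUP_KEY=$(grep '^N8N_ENCRYPTION_KEY=' /backup/encryption_key.txt | cut -d= -f2)
RUNNING_KEY=$(docker exec n8n printenv N8N_ENCRYPTION_KEY)

if [ "$BACKUP_KEY" != "$RUNNING_KEY" ]; then
  echo "WARNING: encryption keys differ - imported credentials will not decrypt"
fi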

Restore Database (PostgreSQL):

Bash
# Stop n8n
docker compose stop n8n

# Restore database
PGPASSWORD=$POSTGRES_PASSWORD pg_restore \
  -h localhost \
  -U n8n \
  -d n8n \
  -c \
  --if-exists \
  /backups/database.dump

# Start n8n
docker compose start n8n

Full Restore Script:

Bash
#!/bin/bash
# restore.sh - Full n8n restore

set -e

BACKUP_FILE=$1

if [ -z "$BACKUP_FILE" ]; then
  echo "Usage: ./restore.sh backup_file.tar.gz"
  exit 1
fi

echo "Starting restore from: $BACKUP_FILE"

# Extract backup
RESTORE_DIR="/tmp/n8n_restore_$$"
mkdir -p "$RESTORE_DIR"
tar xzf "$BACKUP_FILE" -C "$RESTORE_DIR"

# Find backup directory
BACKUP_DIR=$(ls "$RESTORE_DIR")

# Stop services
echo "Stopping services..."
docker compose stop n8n

# Restore encryption key (CRITICAL!)
echo "Checking encryption key..."
source "$RESTORE_DIR/$BACKUP_DIR/encryption_key.txt"
if [ -z "$N8N_ENCRYPTION_KEY" ]; then
  echo "ERROR: No encryption key in backup!"
  exit 1
fi

# Restore database
echo "Restoring database..."
PGPASSWORD=$POSTGRES_PASSWORD pg_restore \
  -h localhost \
  -U n8n \
  -d n8n \
  -c \
  --if-exists \
  "$RESTORE_DIR/$BACKUP_DIR/database.dump"

# Restore n8n data volume (only present if a volume backup was added to the archive)
if [ -f "$RESTORE_DIR/$BACKUP_DIR/n8n_data.tar.gz" ]; then
  echo "Restoring n8n data..."
  docker run --rm \
    -v n8n_data:/target \
    -v "$RESTORE_DIR/$BACKUP_DIR":/backup:ro \
    alpine sh -c "rm -rf /target/* && tar xzf /backup/n8n_data.tar.gz -C /target"
fi

# Start services
echo "Starting services..."
docker compose start n8n

# Cleanup
rm -rf "$RESTORE_DIR"

echo "Restore complete!"
echo "Please verify workflows and credentials work correctly."

Remote Backup

To S3:

Bash
#!/bin/bash
# Upload to S3

BACKUP_FILE=$1
S3_BUCKET="your-backup-bucket"
S3_PATH="n8n-backups/$(date +%Y/%m)"

aws s3 cp "$BACKUP_FILE" "s3://$S3_BUCKET/$S3_PATH/"

# List backups
aws s3 ls "s3://$S3_BUCKET/n8n-backups/" --recursive

To Backblaze B2:

Bash
#!/bin/bash
# Using rclone

rclone copy \
  /backups/n8n_backup_latest.tar.gz \
  b2:your-bucket/n8n-backups/

To Google Drive:

Bash
#!/bin/bash
# Using rclone

rclone copy \
  /backups/n8n_backup_latest.tar.gz \
  gdrive:n8n-backups/

Backup Verification

Test Restore Script:

Bash
#!/bin/bash
# verify-backup.sh

BACKUP_FILE=$1

echo "Verifying backup: $BACKUP_FILE"

# Extract and check
TEMP_DIR="/tmp/verify_$$"
mkdir -p "$TEMP_DIR"
tar xzf "$BACKUP_FILE" -C "$TEMP_DIR"

# Locate the extracted backup directory
BACKUP_DIR=$(find "$TEMP_DIR" -mindepth 1 -maxdepth 1 -type d | head -n 1)

# Check required files
REQUIRED_FILES=(
  "workflows.json"
  "credentials.json"
  "database.dump"
  "encryption_key.txt"
)

for file in "${REQUIRED_FILES[@]}"; do
  if [ -f "$BACKUP_DIR/$file" ]; then
    echo "✅ $file exists"
  else
    echo "❌ $file MISSING!"
  fi
done

# Check workflow count
WORKFLOW_COUNT=$(jq 'length' "$BACKUP_DIR/workflows.json")
echo "📊 Workflows in backup: $WORKFLOW_COUNT"

# Cleanup
rm -rf "$TEMP_DIR"

echo "Verification complete"

Disaster Recovery Plan

Recovery Time Objectives:

Text
DISASTER RECOVERY PLAN
──────────────────────

SCENARIO              RTO        RPO
────────────────────────────────────────────
Database corruption   1 hour     24 hours
Server failure        2 hours    24 hours
Region outage         4 hours    24 hours
Ransomware/breach     8 hours    24 hours

RTO = Recovery Time Objective
RPO = Recovery Point Objective

DR Checklist:

Text
□ Daily automated backups running
□ Backups stored off-site
□ Encryption key stored separately
□ Restore procedure documented
□ Restore tested monthly (see the cron sketch below)
□ Secondary server ready (optional)
□ Contact list for emergencies
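
A minimal sketch for the monthly restore test, assuming the verify-backup.sh script above is stored under /opt/n8n/scripts and backups land in /backups (adjust paths to your setup):

Bash
# Run the verification script against the newest backup on the 1st of each month at 5 AM
0 5 1 * * /opt/n8n/scripts/verify-backup.sh "$(ls -t /backups/n8n_backup_*.tar.gz | head -n 1)" >> /var/log/n8n-backup.log 2>&1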

Hands-On Exercise

Backup Challenge

Build your backup system:

  1. Create full backup script
  2. Setup automated daily backup (cron)
  3. Configure remote storage (S3/B2/GDrive)
  4. Test complete restore process
  5. Document recovery procedure
  6. Verify backups monthly

Never lose your work! 💾

Key Takeaways

Remember
  • 🔑 Encryption key - Most critical, store separately!
  • 📅 Daily backups - Automate, don't rely on manual
  • ☁️ Off-site storage - Protect against hardware failure
  • 🧪 Test restores - Untested backup = no backup
  • 📝 Document - Clear recovery procedures

Up Next

Next lesson: Security Best Practices - hardening n8n for production with auth, firewall, and secrets management.