💾 Backup & Restore
Losing workflows and credentials is a nightmare. This article covers everything about backup - from manual exports to automated disaster recovery.
What to Backup
Critical Components:
```text
N8N BACKUP TARGETS
──────────────────

1. WORKFLOWS
   └── All workflow definitions
   └── Active/inactive states
   └── Settings & configurations

2. CREDENTIALS
   └── API keys, tokens
   └── OAuth connections
   └── Database passwords
   └── ⚠️ ENCRYPTED - needs key!

3. DATABASE
   └── execution_entity (history)
   └── workflow_entity (workflows)
   └── credentials_entity (creds)
   └── webhook_entity (webhooks)

4. ENCRYPTION KEY
   └── N8N_ENCRYPTION_KEY
   └── ⚠️ Without this = data loss!

5. ENVIRONMENT CONFIG
   └── .env file
   └── docker-compose.yml
   └── nginx.conf
```
Critical
N8N_ENCRYPTION_KEY is the most important item of all! Without the key, your credentials cannot be decrypted!
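Before anything else, confirm you actually have the key. A minimal sketch for capturing it, assuming the Docker setup used throughout this article (if you never set `N8N_ENCRYPTION_KEY` yourself, n8n auto-generates one and stores it in its config file):

```bash
# If the key was set explicitly, read it from the container environment
docker exec n8n printenv N8N_ENCRYPTION_KEY

# If n8n generated it automatically, it lives in the config file
# inside the data directory (default path in the official image)
docker exec n8n cat /home/node/.n8n/config
```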
Backup Methods
Method 1: Export via UI
Workflow Export:
```text
1. Open workflow
2. Click ⋮ menu → Download
3. Save as JSON file

Or export all:
1. Settings → Personal
2. Export workflows
3. Choose "All Workflows"
```
Credentials (Manual Only):
```text
❌ Cannot export credentials via UI
❌ For security reasons
✅ Must backup database instead
```
Method 2: n8n CLI Export
```bash
# Export all workflows
docker exec n8n n8n export:workflow --all --output=/backup/workflows.json

# Export specific workflow
docker exec n8n n8n export:workflow --id=YOUR_WORKFLOW_ID --output=/backup/workflow.json

# Export credentials (requires encryption key)
docker exec n8n n8n export:credentials --all --output=/backup/credentials.json

# Decrypt credentials for backup
docker exec n8n n8n export:credentials --all --decrypted --output=/backup/credentials-plain.json
```
Security
The decrypted credentials file contains all your secrets in plain text - store it securely!
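If you must keep a decrypted export, one option is to re-encrypt it at rest before it leaves the machine - a minimal sketch using gpg symmetric encryption (assumes gpg is installed; the passphrase must be stored as carefully as the encryption key itself):

```bash
# Encrypt the plaintext export, then remove the original
gpg --symmetric --cipher-algo AES256 /backup/credentials-plain.json
rm /backup/credentials-plain.json   # keep only credentials-plain.json.gpg

# To read it back later
gpg --decrypt /backup/credentials-plain.json.gpg > /backup/credentials-plain.json
```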
Method 3: Database Backup
PostgreSQL:
```bash
#!/bin/bash
# backup-postgres.sh

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR=/backups/postgres

# Create backup
PGPASSWORD=$POSTGRES_PASSWORD pg_dump \
  -h postgres \
  -U n8n \
  -d n8n \
  -F c \
  -f "$BACKUP_DIR/n8n_$DATE.dump"

# Compress
gzip "$BACKUP_DIR/n8n_$DATE.dump"

# Cleanup old backups (keep 30 days)
find $BACKUP_DIR -name "*.dump.gz" -mtime +30 -delete

echo "Backup complete: n8n_$DATE.dump.gz"
```
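A dump that was never read back is only a hope. Because `-F c` produces a structured archive, `pg_restore --list` can sanity-check it without touching any database - a hedged addition you could append to the script above:

```bash
# Sanity-check the archive's table of contents without restoring anything
gunzip -k "$BACKUP_DIR/n8n_$DATE.dump.gz"   # -k keeps the .gz
pg_restore --list "$BACKUP_DIR/n8n_$DATE.dump" > /dev/null \
  && echo "Dump verified: n8n_$DATE.dump.gz" \
  || echo "WARNING: dump is unreadable!"
rm "$BACKUP_DIR/n8n_$DATE.dump"
```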
SQLite:
```bash
#!/bin/bash
# backup-sqlite.sh

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR=/backups/sqlite
N8N_DATA=/home/node/.n8n

# Stop n8n for consistency (optional)
docker compose stop n8n

# Copy database
cp "$N8N_DATA/database.sqlite" "$BACKUP_DIR/n8n_$DATE.sqlite"

# Compress
gzip "$BACKUP_DIR/n8n_$DATE.sqlite"

# Restart
docker compose start n8n
```
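Stopping n8n guarantees consistency but costs downtime. If that is unacceptable, SQLite's online backup command can copy a consistent snapshot while n8n keeps running - a sketch assuming `sqlite3` is installed on the host and the data directory is reachable there:

```bash
# Consistent copy without stopping n8n (uses SQLite's online backup API)
sqlite3 "$N8N_DATA/database.sqlite" ".backup '$BACKUP_DIR/n8n_$DATE.sqlite'"
gzip "$BACKUP_DIR/n8n_$DATE.sqlite"
```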
Method 4: Docker Volume Backup
```bash
#!/bin/bash
# backup-volumes.sh

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR=/backups/volumes

# Backup n8n data volume
docker run --rm \
  -v n8n_data:/source:ro \
  -v $BACKUP_DIR:/backup \
  alpine tar czf /backup/n8n_data_$DATE.tar.gz -C /source .

# Backup postgres volume
docker run --rm \
  -v postgres_data:/source:ro \
  -v $BACKUP_DIR:/backup \
  alpine tar czf /backup/postgres_data_$DATE.tar.gz -C /source .
```
Complete Backup Script
```bash
#!/bin/bash
# full-backup.sh - Complete n8n backup

set -e

# Configuration
BACKUP_DIR="/backups"
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_NAME="n8n_backup_$DATE"
RETENTION_DAYS=30

# Create backup directory
mkdir -p "$BACKUP_DIR/$BACKUP_NAME"
cd "$BACKUP_DIR/$BACKUP_NAME"

echo "Starting n8n backup: $BACKUP_NAME"

# 1. Export workflows
echo "Exporting workflows..."
docker exec n8n n8n export:workflow --all --output=/tmp/workflows.json
docker cp n8n:/tmp/workflows.json ./workflows.json

# 2. Export credentials (encrypted)
echo "Exporting credentials..."
docker exec n8n n8n export:credentials --all --output=/tmp/credentials.json
docker cp n8n:/tmp/credentials.json ./credentials.json

# 3. Database backup
echo "Backing up database..."
PGPASSWORD=$POSTGRES_PASSWORD pg_dump \
  -h localhost \
  -p 5432 \
  -U n8n \
  -d n8n \
  -F c \
  -f ./database.dump

# 4. Copy environment files
echo "Copying configuration..."
cp /opt/n8n/.env ./env_backup.txt
cp /opt/n8n/docker-compose.yml ./docker-compose.yml

# 5. Save encryption key
echo "Saving encryption key..."
echo "N8N_ENCRYPTION_KEY=$N8N_ENCRYPTION_KEY" > ./encryption_key.txt
chmod 600 ./encryption_key.txt

# 6. Create archive
echo "Creating archive..."
cd "$BACKUP_DIR"
tar czf "${BACKUP_NAME}.tar.gz" "$BACKUP_NAME"
rm -rf "$BACKUP_NAME"

# 7. Upload to remote (optional)
# aws s3 cp "${BACKUP_NAME}.tar.gz" s3://your-bucket/n8n-backups/

# 8. Cleanup old backups
echo "Cleaning up old backups..."
find "$BACKUP_DIR" -name "n8n_backup_*.tar.gz" -mtime +$RETENTION_DAYS -delete

echo "Backup complete: ${BACKUP_NAME}.tar.gz"
echo "Size: $(du -h ${BACKUP_NAME}.tar.gz | cut -f1)"
```
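Before wiring this into cron, run it once by hand and confirm the archive contains everything it should - for example:

```bash
chmod +x /opt/n8n/scripts/full-backup.sh
/opt/n8n/scripts/full-backup.sh

# List the newest archive's contents: workflows.json, credentials.json,
# database.dump, env_backup.txt, docker-compose.yml, encryption_key.txt
tar tzf "$(ls -t /backups/n8n_backup_*.tar.gz | head -1)"
```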
Automated Backup
Cron Setup:
```bash
# Edit crontab
crontab -e

# Add these entries:

# Daily backup at 2 AM
0 2 * * * /opt/n8n/scripts/full-backup.sh >> /var/log/n8n-backup.log 2>&1

# Weekly database optimization at 3 AM Sunday
0 3 * * 0 docker exec n8n-postgres vacuumdb -U n8n -d n8n -v >> /var/log/n8n-backup.log 2>&1

# Monthly full export at 4 AM on the 1st
0 4 1 * * /opt/n8n/scripts/monthly-export.sh >> /var/log/n8n-backup.log 2>&1
```
Docker Backup Service:
```yaml
# docker-compose.yml
services:
  backup:
    image: alpine
    volumes:
      - ./scripts:/scripts:ro
      - ./backups:/backups
      - n8n_data:/n8n_data:ro
      - postgres_data:/postgres_data:ro
    environment:
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
    command: >
      sh -c "
        apk add --no-cache postgresql-client &&
        while true; do
          /scripts/backup.sh
          sleep 86400
        done
      "
```
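The service runs /scripts/backup.sh once a day via the sleep loop. Start it alongside the rest of the stack and tail its logs to confirm the first run succeeds:

```bash
docker compose up -d backup
docker compose logs -f backup
```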
Restore Procedures
Restore Workflows:
```bash
# From JSON export
docker exec -i n8n n8n import:workflow --input=/backup/workflows.json

# Specific workflow
docker exec -i n8n n8n import:workflow --input=/backup/workflow.json --id=WORKFLOW_ID
```
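The CLI also supports one-file-per-workflow exports, which diff nicely under version control. If you exported that way, import the whole directory instead - a sketch using the `--separate` flag:

```bash
# Export one JSON file per workflow, then re-import the whole directory
docker exec n8n n8n export:workflow --all --separate --output=/backup/workflows/
docker exec -i n8n n8n import:workflow --separate --input=/backup/workflows/
```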
Restore Credentials:
```bash
# From encrypted export (needs same encryption key!)
docker exec -i n8n n8n import:credentials --input=/backup/credentials.json
```
Restore Database (PostgreSQL):
```bash
# Stop n8n
docker compose stop n8n

# Restore database
PGPASSWORD=$POSTGRES_PASSWORD pg_restore \
  -h localhost \
  -U n8n \
  -d n8n \
  -c \
  --if-exists \
  /backups/database.dump

# Start n8n
docker compose start n8n
```
Full Restore Script:
```bash
#!/bin/bash
# restore.sh - Full n8n restore

set -e

BACKUP_FILE=$1

if [ -z "$BACKUP_FILE" ]; then
  echo "Usage: ./restore.sh backup_file.tar.gz"
  exit 1
fi

echo "Starting restore from: $BACKUP_FILE"

# Extract backup
RESTORE_DIR="/tmp/n8n_restore_$$"
mkdir -p "$RESTORE_DIR"
tar xzf "$BACKUP_FILE" -C "$RESTORE_DIR"

# Find backup directory
BACKUP_DIR=$(ls "$RESTORE_DIR")

# Stop services
echo "Stopping services..."
docker compose stop n8n

# Restore encryption key (CRITICAL!)
echo "Checking encryption key..."
source "$RESTORE_DIR/$BACKUP_DIR/encryption_key.txt"
if [ -z "$N8N_ENCRYPTION_KEY" ]; then
  echo "ERROR: No encryption key in backup!"
  exit 1
fi

# Restore database
echo "Restoring database..."
PGPASSWORD=$POSTGRES_PASSWORD pg_restore \
  -h localhost \
  -U n8n \
  -d n8n \
  -c \
  --if-exists \
  "$RESTORE_DIR/$BACKUP_DIR/database.dump"

# Restore n8n data volume
# (expects an n8n_data.tar.gz volume archive in the backup,
#  e.g. one produced by backup-volumes.sh)
echo "Restoring n8n data..."
docker run --rm \
  -v n8n_data:/target \
  -v "$RESTORE_DIR/$BACKUP_DIR":/backup:ro \
  alpine sh -c "rm -rf /target/* && tar xzf /backup/n8n_data.tar.gz -C /target"

# Start services
echo "Starting services..."
docker compose start n8n

# Cleanup
rm -rf "$RESTORE_DIR"

echo "Restore complete!"
echo "Please verify workflows and credentials work correctly."
```
Remote Backup
To S3:
```bash
#!/bin/bash
# Upload to S3

BACKUP_FILE=$1
S3_BUCKET="your-backup-bucket"
S3_PATH="n8n-backups/$(date +%Y/%m)"

aws s3 cp "$BACKUP_FILE" "s3://$S3_BUCKET/$S3_PATH/"

# List backups
aws s3 ls "s3://$S3_BUCKET/n8n-backups/" --recursive
```
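Since old archives are rarely read back, a cheaper storage class can cut costs - an optional variant of the upload above (STANDARD_IA is one choice; pick per your retention policy):

```bash
# Same upload, but into infrequent-access storage
aws s3 cp "$BACKUP_FILE" "s3://$S3_BUCKET/$S3_PATH/" --storage-class STANDARD_IA
```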
To Backblaze B2:
```bash
#!/bin/bash
# Using rclone

rclone copy \
  /backups/n8n_backup_latest.tar.gz \
  b2:your-bucket/n8n-backups/
```
To Google Drive:
```bash
#!/bin/bash
# Using rclone

rclone copy \
  /backups/n8n_backup_latest.tar.gz \
  gdrive:n8n-backups/
```
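Whichever remote you use, verify that the copy actually matches the local file - `rclone check` compares sizes and hashes without transferring data:

```bash
# Confirm local and remote copies match (works for b2:, gdrive:, etc.)
rclone check /backups/ gdrive:n8n-backups/ --one-way
```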
Backup Verification
Test Restore Script:
```bash
#!/bin/bash
# verify-backup.sh

BACKUP_FILE=$1

echo "Verifying backup: $BACKUP_FILE"

# Extract and check
TEMP_DIR="/tmp/verify_$$"
mkdir -p "$TEMP_DIR"
tar xzf "$BACKUP_FILE" -C "$TEMP_DIR"

# Check required files
REQUIRED_FILES=(
  "workflows.json"
  "credentials.json"
  "database.dump"
  "encryption_key.txt"
)

for file in "${REQUIRED_FILES[@]}"; do
  if [ -f "$TEMP_DIR"/*/"$file" ]; then
    echo "✅ $file exists"
  else
    echo "❌ $file MISSING!"
  fi
done

# Check workflow count
WORKFLOW_COUNT=$(cat "$TEMP_DIR"/*/workflows.json | jq '. | length')
echo "📊 Workflows in backup: $WORKFLOW_COUNT"

# Cleanup
rm -rf "$TEMP_DIR"

echo "Verification complete"
```
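File checks catch missing pieces, but only an actual restore proves the dump is usable. Below is a hedged sketch of a monthly restore test: load the dump into a throwaway Postgres container and run a smoke query against `workflow_entity` from the backup targets list. The container name, port, and password are arbitrary test values; adjust the dump path to wherever you extracted the backup.

```bash
# Spin up a scratch Postgres, load the dump, count workflows, tear down
docker run -d --name restore-test \
  -e POSTGRES_USER=n8n -e POSTGRES_PASSWORD=test -e POSTGRES_DB=n8n \
  -p 55432:5432 postgres:16
sleep 10  # crude startup wait; pg_isready is more robust

PGPASSWORD=test pg_restore -h localhost -p 55432 -U n8n -d n8n \
  /tmp/restore-test/database.dump
PGPASSWORD=test psql -h localhost -p 55432 -U n8n -d n8n \
  -c "SELECT count(*) FROM workflow_entity;"

docker rm -f restore-test
```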
Disaster Recovery Plan
Recovery Time Objectives:
```text
DISASTER RECOVERY PLAN
──────────────────────

SCENARIO              RTO       RPO
────────────────────────────────────
Database corruption   1 hour    24 hours
Server failure        2 hours   24 hours
Region outage         4 hours   24 hours
Ransomware/breach     8 hours   24 hours

RTO = Recovery Time Objective
RPO = Recovery Point Objective
```
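Note that the 24-hour RPO follows directly from the daily cron schedule: you can lose at most one day of changes. If that is too much, simply back up more often - for example, running full-backup.sh every 6 hours drops the RPO to 6 hours:

```bash
# Tighter RPO: back up every 6 hours instead of daily
0 */6 * * * /opt/n8n/scripts/full-backup.sh >> /var/log/n8n-backup.log 2>&1
```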
DR Checklist:
```text
□ Daily automated backups running
□ Backups stored off-site
□ Encryption key stored separately
□ Restore procedure documented
□ Restore tested monthly
□ Secondary server ready (optional)
□ Contact list for emergencies
```
Hands-On Practice
Backup Challenge
Build your backup system:
- Create full backup script
- Setup automated daily backup (cron)
- Configure remote storage (S3/B2/GDrive)
- Test complete restore process
- Document recovery procedure
- Verify backups monthly
Never lose your work! 💾
Key Takeaways
Remember
- 🔑 Encryption key - Most critical, store separately!
- 📅 Daily backups - Automate, don't rely on manual
- ☁️ Off-site storage - Protect against hardware failure
- 🧪 Test restores - Untested backup = no backup
- 📝 Document - Clear recovery procedures
Up Next
Next up: Security Best Practices - hardening n8n for production with auth, firewalls, and secrets management.
