
Automated File Backup System Using n8n

August 29, 2025
8 min read

Data loss can be devastating for any business or individual. Whether it's accidental deletion, hardware failure, or a security breach, having a reliable backup system is crucial. In this guide, we'll build a comprehensive automated file backup solution using n8n that can handle scheduled backups, versioning, and multi-location storage.

Why Automate File Backups with n8n?

  • Data Protection: Safeguard against data loss from hardware failures or human error
  • Time Savings: Eliminate manual backup processes
  • Version Control: Maintain multiple versions of important files
  • Cross-Platform: Back up files between different cloud services
  • Customization: Tailor the backup process to your specific needs
  • Encryption: Add an extra layer of security to your backups

Prerequisites

Before we begin, you'll need:

  1. An n8n instance (cloud or self-hosted)
  2. Access to source and destination storage services (e.g., Google Drive, Dropbox, S3)
  3. (Optional) Encryption keys if you plan to encrypt backups

Step 1: Setting Up Storage Services

Source Storage Setup

  1. Google Drive

    • Go to Google Cloud Console
    • Create a new project
    • Enable Google Drive API
    • Create OAuth 2.0 credentials
    • Note your Client ID and Secret
  2. Local Server/FTP

    • Ensure n8n has access to the file system
    • Note the directory paths

Destination Storage Setup

  1. Amazon S3

    • Create an S3 bucket
    • Generate access keys with write permissions
    • Note your bucket name and region
  2. Dropbox

    • Create a Dropbox App
    • Generate an access token
    • Note your App key and secret

Step 2: Creating the Backup Workflow

1. Schedule Trigger

  1. Add a "Schedule Trigger" node
  2. Set your preferred backup frequency (e.g., daily at 2 AM)
  3. Consider off-peak hours to minimize performance impact
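
For reference, "daily at 2 AM" in the Schedule Trigger's Cron mode corresponds to a standard five-field Cron expression:

```
# ┌─ minute (0)
# │ ┌─ hour (2)
# │ │ ┌─ day of month (any)
# │ │ │ ┌─ month (any)
# │ │ │ │ ┌─ day of week (any)
  0 2 * * *
```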

2. List Source Files

For Google Drive:

  1. Add a "Google Drive" node
  2. Authenticate with your Google account
  3. Select "List Files" operation
  4. Configure the folder to back up
  5. Set up filters (e.g., file type, modified date)

For Local Files:

  1. Add a "Read/Write Files from Disk" node
  2. Set the directory path
  3. Configure to list files recursively if needed

3. Filter and Process Files

Add a "Function" node to filter and prepare files for backup:

// Get current date for versioning
const now = new Date();
const timestamp = now.toISOString().replace(/[:.]/g, '-');
const today = now.toISOString().split('T')[0];

// Filter criteria (modify as needed)
const MAX_FILE_SIZE = 100 * 1024 * 1024; // 100MB
const EXCLUDED_EXTENSIONS = ['.tmp', '.log'];
const EXCLUDED_FOLDERS = ['node_modules', '.git'];

// Process each file/directory
const filesToBackup = [];

for (const item of items) {
    const file = item.json;
    
    // Skip directories if needed
    if (file.isDirectory) continue;
    
    // Skip files that are too large
    if (file.size > MAX_FILE_SIZE) {
        console.log(`Skipping large file: ${file.name} (${file.size} bytes)`);
        continue;
    }
    
    // Skip excluded extensions
    // Skip excluded extensions (files without an extension are kept)
    const dotIndex = file.name.lastIndexOf('.');
    const extension = dotIndex > 0 ? file.name.slice(dotIndex).toLowerCase() : '';
    if (extension && EXCLUDED_EXTENSIONS.includes(extension)) continue;
    
    // Skip files in excluded folders
    const path = file.path || '';
    if (EXCLUDED_FOLDERS.some(folder => path.includes(`/${folder}/`))) continue;
    
    // Add to backup list with metadata
    filesToBackup.push({
        json: {
            ...file,
            backupPath: `backups/${today}/${timestamp}/${file.name}`,
            originalPath: path,
            backupDate: now.toISOString(),
            version: timestamp
        }
    });
}

return filesToBackup;

4. File Encryption (Optional)

For added security, add a "Crypto" node:

  1. Add a "Crypto" node after the function
  2. Select "Encrypt" operation
  3. Choose encryption algorithm (e.g., AES-256)
  4. Set your encryption key
  5. Map the file content to be encrypted

5. Upload to Backup Destination

For Amazon S3:

  1. Add an "AWS S3" node
  2. Configure with your AWS credentials
  3. Set operation to "Upload a File"
  4. Map the file content and path
  5. Set appropriate content type and permissions

For Dropbox:

  1. Add a "Dropbox" node
  2. Authenticate with your Dropbox account
  3. Set operation to "Upload"
  4. Map the file content and path
  5. Set the auto-rename option to handle duplicates

6. Verify Backup

Add verification steps to ensure files were backed up correctly:

  1. Add a "Function" node to check backup status:
const results = [];

for (const item of items) {
    const file = item.json;
    const backupResult = {
        originalFile: file.name,
        backupPath: file.backupPath,
        status: 'success',
        timestamp: new Date().toISOString(),
        size: file.size,
        error: null
    };
    
    // Check if the backup was successful
    if (!item.error) {
        results.push({
            json: backupResult
        });
    } else {
        results.push({
            json: {
                ...backupResult,
                status: 'failed',
                error: item.error.message
            }
        });
    }
}

return results;

7. Send Backup Report

  1. Add an "Email" node (e.g., Gmail)
  2. Configure with your email service
  3. Create a detailed report:
    • Successfully backed up files
    • Failed backups
    • Total size of backup
    • Any errors or warnings
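
A Function node placed before the Email node can assemble that report from the verification results. This sketch uses the field names produced by the verification step above, with inline sample data standing in for the real items:

```javascript
// Sample results shaped like the output of the verification step
const results = [
  { originalFile: 'a.txt', status: 'success', size: 1024, error: null },
  { originalFile: 'b.log', status: 'failed', size: 0, error: 'timeout' },
];

const ok = results.filter(r => r.status === 'success');
const failed = results.filter(r => r.status === 'failed');
const totalBytes = ok.reduce((sum, r) => sum + r.size, 0);

// Plain-text body for the email node
const report = [
  `Backed up: ${ok.length} file(s), ${totalBytes} bytes`,
  `Failed: ${failed.length}`,
  ...failed.map(r => `  - ${r.originalFile}: ${r.error}`),
].join('\n');

console.log(report);
```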

8. Clean Up Old Backups

Add a cleanup step to remove old backups and manage storage:

  1. Add a "Function" node to identify old backups:
const KEEP_DAILY = 7;     // Keep daily backups for 7 days
const KEEP_WEEKLY = 4;     // Keep weekly backups for 4 weeks
const KEEP_MONTHLY = 12;   // Keep monthly backups for 12 months

const now = new Date();
const filesToDelete = [];

// Group files by date
const filesByDate = {};

for (const item of items) {
    const file = item.json;
    const date = new Date(file.backupDate);
    const dateStr = date.toISOString().split('T')[0];
    
    if (!filesByDate[dateStr]) {
        filesByDate[dateStr] = [];
    }
    
    filesByDate[dateStr].push(file);
}

// Determine which backups to keep
const sortedDates = Object.keys(filesByDate).sort().reverse();
const keepFiles = new Set();

// Keep most recent backup for each day
for (let i = 0; i < Math.min(sortedDates.length, KEEP_DAILY); i++) {
    const date = sortedDates[i];
    const newestFile = filesByDate[date].sort((a, b) => 
        new Date(b.backupDate) - new Date(a.backupDate)
    )[0];
    keepFiles.add(newestFile.backupPath);
}

// Keep the newest backup from each of the most recent weeks and months.
// (Bucket dates by week-of-year and by month rather than stepping through
// the date list by a fixed offset, which would assume one backup per day.)
const weeksKept = new Set();
const monthsKept = new Set();

for (const dateStr of sortedDates) {
    const d = new Date(dateStr);
    const dayOfYear = Math.floor(
        (Date.UTC(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate()) -
         Date.UTC(d.getUTCFullYear(), 0, 1)) / 86400000
    );
    const week = `${d.getUTCFullYear()}-W${Math.floor(dayOfYear / 7)}`;
    const month = dateStr.slice(0, 7); // YYYY-MM

    const newestFile = filesByDate[dateStr].sort((a, b) => 
        new Date(b.backupDate) - new Date(a.backupDate)
    )[0];

    if (weeksKept.size < KEEP_WEEKLY && !weeksKept.has(week)) {
        weeksKept.add(week);
        keepFiles.add(newestFile.backupPath);
    }
    if (monthsKept.size < KEEP_MONTHLY && !monthsKept.has(month)) {
        monthsKept.add(month);
        keepFiles.add(newestFile.backupPath);
    }
}

// Identify files to delete
for (const item of items) {
    const file = item.json;
    if (!keepFiles.has(file.backupPath)) {
        filesToDelete.push({
            json: {
                ...file,
                deleteReason: 'Old backup',
                deleteDate: now.toISOString()
            }
        });
    }
}

return filesToDelete;

  2. Add nodes to delete old backups from each storage service

Step 3: Advanced Features

1. Incremental Backups

  1. Add a "Function" node to track file hashes:
const crypto = require('crypto');

// Generate a content hash to detect changes
function getFileHash(content) {
    return crypto.createHash('sha256').update(content).digest('hex');
}

// Load the hashes recorded on the previous run. "PreviousHashes" is a
// placeholder node name; adjust it (and this call) to your workflow —
// in the newer Code node the equivalent is $("PreviousHashes").all().
const previousHashes = $items("PreviousHashes");

const changedFiles = [];

for (const item of items) {
    const file = item.json;
    // Assumes the file content arrived as binary data on this item
    const content = Buffer.from(item.binary.data.data, 'base64');
    const currentHash = getFileHash(content);

    const previousFile = previousHashes.find(f => f.json.path === file.path);

    // Unchanged since the last run: skip the backup
    if (previousFile && previousFile.json.hash === currentHash) continue;

    changedFiles.push({
        json: {
            ...file,
            hash: currentHash,
            modified: true
        },
        binary: item.binary
    });
}

return changedFiles;

2. Compression

  1. Add a "Compression" node
  2. Configure to create ZIP archives
  3. Set compression level
  4. Group files by type or date

3. Multi-Destination Backup

  1. Duplicate the upload steps for each destination
  2. Add error handling for each destination
  3. Verify backup in each location

Best Practices

  1. 3-2-1 Backup Rule:

    • Keep 3 copies of your data
    • On 2 different media
    • With 1 copy offsite
  2. Regular Testing:

    • Periodically test restore procedures
    • Verify backup integrity
    • Document recovery process
  3. Security:

    • Use strong encryption for sensitive data
    • Implement proper access controls
    • Rotate encryption keys periodically
  4. Monitoring:

    • Set up alerts for failed backups
    • Monitor storage usage
    • Review backup logs regularly

Common Issues and Solutions

Authentication Failures

  • Check token expiration
  • Verify API permissions
  • Ensure IP whitelisting if applicable

Large File Handling

  • Split large files into smaller chunks
  • Increase timeout settings
  • Consider using multi-part uploads

Storage Limitations

  • Implement retention policies
  • Compress files before backup
  • Consider tiered storage solutions

Conclusion

By implementing this automated backup system with n8n, you've created a robust solution to protect your valuable data. The flexibility of n8n allows you to customize the backup process to fit your specific needs, whether you're backing up a personal photo collection or critical business documents.

Remember to regularly test your backups by restoring files to ensure they're working correctly. A backup is only as good as your ability to restore from it.

Need help setting up a custom backup solution? Contact our automation experts for personalized assistance.