Duplicati Backup to Cloud Storage
Duplicati is a free, open-source backup software enabling secure, compressed, encrypted backups to various cloud storage providers and local destinations. With support for incremental backups, deduplication, and transparent encryption, Duplicati provides enterprise-grade backup capabilities without licensing costs. This guide covers installation, configuring backup jobs targeting cloud storage services, and implementing recovery procedures.
Table of Contents
- Duplicati Features and Architecture
- Installation and Setup
- Cloud Storage Configuration
- Backup Job Creation
- Encryption and Security
- Scheduling and Retention
- Restore Operations
- Monitoring and Maintenance
- Conclusion
Duplicati Features and Architecture
Duplicati distinguishes itself through several key capabilities:
- Multi-backend Support: Amazon S3, Azure, Google Drive, Backblaze B2, SFTP, WebDAV
- Encryption: AES-256 encryption by default with configurable algorithms
- Deduplication: Automatic detection of duplicate data blocks across backups
- Incremental Backups: Only changed data transferred after initial backup
- Compression: Built-in compression reduces storage requirements
- Verification: Automatic backup verification ensures integrity
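The deduplication idea above can be illustrated with standard shell tools: split data into fixed-size blocks, hash each block, and store each unique hash only once. This is a conceptual sketch, not Duplicati's actual implementation (Duplicati's default block size is 100 KB).

```shell
# Conceptual sketch of block-level deduplication (not Duplicati's real code):
# split data into fixed-size blocks, hash them, count unique vs. total blocks.
workdir=$(mktemp -d)

# Build a 400 KB file whose second half repeats the first half.
head -c 204800 /dev/urandom > "$workdir/half"
cat "$workdir/half" "$workdir/half" > "$workdir/data"

# Split into 100 KB blocks (Duplicati's default blocksize).
split -b 102400 "$workdir/data" "$workdir/block_"

# A deduplicating store would keep only the unique blocks.
total=$(ls "$workdir"/block_* | wc -l)
unique=$(sha256sum "$workdir"/block_* | awk '{print $1}' | sort -u | wc -l)
echo "total blocks: $total, unique blocks: $unique"
# prints: total blocks: 4, unique blocks: 2
rm -rf "$workdir"
```

Because the second half of the file duplicates the first, half of the blocks hash identically and would be stored only once.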
Installation and Setup
System Requirements
# Check system specifications
uname -m
free -h
df -h /
# Duplicati requires:
# - .NET runtime (Duplicati 2.1 and later) or Mono (Duplicati 2.0.x)
# - 500 MB disk space minimum
# - Network connectivity for cloud uploads
Installing Duplicati on Ubuntu/Debian
# Download the official .deb package (the version below is an example;
# check https://www.duplicati.com/download for the current release)
wget https://updates.duplicati.com/beta/duplicati_2.0.7.1-1_all.deb
# Install Duplicati with its dependencies
sudo apt-get update
sudo apt-get install -y ./duplicati_2.0.7.1-1_all.deb
# Start Duplicati service
sudo systemctl enable duplicati
sudo systemctl start duplicati
# Verify installation (prints the CLI usage summary)
duplicati-cli help
# Check service status
sudo systemctl status duplicati
Installing Duplicati on CentOS/RHEL
# Install from the official RPM (the version below is an example; check
# https://www.duplicati.com/download for the current release)
sudo yum install -y https://updates.duplicati.com/duplicati-2.0.6.3-1.rhel.x86_64.rpm
# Or build from source
git clone https://github.com/duplicati/duplicati.git
cd duplicati
# Follow the build instructions in the repository for the current steps
# Enable and start service
sudo systemctl enable duplicati
sudo systemctl start duplicati
Accessing Web Interface
# Duplicati provides web interface at:
# http://localhost:8200/
# If running on remote server, use SSH tunnel:
ssh -L 8200:localhost:8200 user@backup-server
# Default login: No authentication required (configure in settings)
# Set authentication
# Visit Settings > Security > Configure authentication
Cloud Storage Configuration
Amazon S3 Configuration
# Generate AWS Access Keys
# Visit AWS Console > IAM > Users > Your User > Security Credentials
# Create S3 bucket for backups
aws s3 mb s3://my-backup-bucket
# In Duplicati web interface:
# 1. Create New Backup
# 2. Select Backend: Amazon S3
# 3. Configure:
# - AWS Access Key ID: [your-access-key]
# - AWS Secret Access Key: [your-secret-key]
# - S3 Bucket: my-backup-bucket
# - Storage Region: us-east-1
# Or via CLI:
duplicati-cli backup "s3://aws_access_key_id:aws_secret_access_key@my-backup-bucket/" /path/to/backup
Azure Blob Storage Configuration
# Create Azure Storage Account
# Via Azure Portal or CLI:
az storage account create --name mybackupaccount --resource-group mygroup
# Get connection string
CONN_STR=$(az storage account show-connection-string -n mybackupaccount -g mygroup -o tsv)
# Create container
az storage container create --name backup-container --connection-string "$CONN_STR"
# In Duplicati web interface:
# 1. Backend: Azure
# 2. Configure:
# - Blob Storage Account: mybackupaccount
# - Blob Storage Key: [storage-account-key]
# - Blob Container: backup-container
Google Drive Configuration
# Google Drive setup in Duplicati:
# 1. Backend: Google Drive
# 2. OAuth will prompt for authentication
# 3. Grant Duplicati access to Google Drive
# 4. Select or create folder for backups
# Alternative: Manual OAuth token
# 1. Go to Google Cloud Console
# 2. Create OAuth 2.0 credentials
# 3. Configure in Duplicati settings
Backblaze B2 Configuration
# Create Backblaze B2 account and bucket
# Visit https://www.backblaze.com/b2/cloud-storage.html
# Generate application key
# Account Settings > App Keys > Create Application Key
# In Duplicati web interface:
# 1. Backend: B2 Cloud Storage
# 2. Configure:
# - B2 Account ID: [account-id]
# - B2 Application Key: [app-key]
# - B2 Bucket Name: my-backup-bucket
SFTP Configuration
# For on-premises backup servers:
# 1. Backend: SSH/SFTP
# 2. Configure:
# - Server: backup.example.com
# - Username: sftp-user
# - Password: [password]
# - Remote Path: /backups/duplicati
# - Port: 22 (or custom)
# Or using key authentication:
# 1. Generate SSH keypair on backup client
ssh-keygen -t ed25519 -f ~/.ssh/duplicati_key
# 2. Copy public key to SFTP server
ssh-copy-id -i ~/.ssh/duplicati_key.pub sftp-user@backup.example.com
# 3. Configure SFTP with key
# Upload private key in Duplicati settings
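Before uploading the private key to Duplicati, it is worth confirming the keypair generated correctly. A quick sketch (using a throwaway scratch directory; in practice use the `~/.ssh/duplicati_key` path from the steps above):

```shell
# Generate a dedicated keypair in a scratch directory and confirm it exists.
keydir=$(mktemp -d)
ssh-keygen -t ed25519 -N "" -C "duplicati-backup" -f "$keydir/duplicati_key" -q

# Both halves of the keypair should now be present.
ls "$keydir"

# Show the fingerprint, which you can compare against the server's logs
# when debugging authentication failures.
ssh-keygen -lf "$keydir/duplicati_key.pub"
```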
Backup Job Creation
Setting Up First Backup Job
# Through web interface:
# 1. Visit http://localhost:8200/
# 2. Click "New" > "Backup"
# 3. Name: "Full Server Backup"
# 4. Backend: Select your configured storage
# 5. Encryption: Enable with strong passphrase
# 6. Add source directories:
# - /home
# - /etc
# - /var/www
# - /opt/applications
# 7. Exclude patterns:
# - **/.cache
# - **/node_modules
# - **/__pycache__
# - /proc
# - /sys
# - /dev
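Exclude patterns like the ones above can be previewed before the first run. The sketch below approximates them with `find`'s `-prune`; it is not Duplicati's filter engine (test real filters with the job's exclude settings), but it shows which files a `node_modules`/`.cache` exclusion leaves in scope:

```shell
# Build a small sample tree containing both wanted and excluded paths.
srcdir=$(mktemp -d)
mkdir -p "$srcdir/app/node_modules/pkg" "$srcdir/app/src" "$srcdir/.cache"
touch "$srcdir/app/node_modules/pkg/index.js" \
      "$srcdir/app/src/main.js" \
      "$srcdir/.cache/tmp"

# List files that WOULD be backed up: prune node_modules and .cache
# directories, print every other regular file.
kept=$(find "$srcdir" \( -name node_modules -o -name .cache \) -prune \
       -o -type f -print)
echo "$kept"
```

Only `app/src/main.js` survives; everything under the pruned directories is skipped.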
CLI Backup Creation
# Create backup job via command line (Duplicati options use --name=value form)
duplicati-cli backup \
  "s3://access-key:secret-key@bucket-name/path/" \
  /home /etc /var/www \
  --encryption-module=aes \
  --passphrase="strong-passphrase" \
  --compression-module=zip \
  --backup-name="Full Server Backup"
# List the versions stored at the destination
duplicati-cli list \
  "s3://access-key:secret-key@bucket-name/path/" \
  --passphrase="strong-passphrase"
# List the files under a path in the most recent version
duplicati-cli list \
  "s3://access-key:secret-key@bucket-name/path/" \
  "/home/*"
Advanced Options
# Enable detailed logging
duplicati-cli backup \
  "s3://credentials@bucket/" \
  /data \
  --log-file=/var/log/duplicati-backup.log \
  --log-file-log-level=Verbose
# Set block size for deduplication (default is 100 KB)
duplicati-cli backup \
  "s3://credentials@bucket/" \
  /data \
  --blocksize=512KB
# Configure bandwidth limits
duplicati-cli backup \
  "s3://credentials@bucket/" \
  /data \
  --throttle-upload=10MB  # Limit uploads to 10 MB/s
# Retention: delete versions older than 7 days
duplicati-cli backup \
  "s3://credentials@bucket/" \
  /data \
  --keep-time=7D
Encryption and Security
Encryption Configuration
# During backup creation, configure encryption:
# 1. Encryption: AES-256 (default)
# 2. Passphrase: Generate strong password
# Example: $(openssl rand -base64 32)
# 3. Salt: Optional, system-generated
# Never lose encryption passphrase - you cannot recover backups without it
# Store passphrase securely:
echo "my-encryption-passphrase" > /root/.duplicati-passphrase
chmod 600 /root/.duplicati-passphrase
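The generate-and-restrict steps above can be combined and verified in one pass. The sketch below uses a throwaway temp file rather than `/root/.duplicati-passphrase`, so it can run as an unprivileged user:

```shell
# Generate a random passphrase and store it with owner-only permissions.
# (Scratch file shown; in production use /root/.duplicati-passphrase.)
passfile=$(mktemp)
head -c 32 /dev/urandom | base64 > "$passfile"
chmod 600 "$passfile"

# Confirm the permissions are exactly 600.
stat -c '%a' "$passfile"
# prints: 600
```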
# Export backup configuration
# Settings > Backup > Advanced > Export as command line
# Saves all configuration including encryption details
SSL/TLS Configuration
# Configure HTTPS for Duplicati web interface
# Settings > Security > Configure HTTPS
# Generate a self-signed certificate and convert it to PKCS#12 (.pfx),
# the format the Duplicati server expects:
sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout /etc/duplicati/key.pem \
  -out /etc/duplicati/cert.pem \
  -subj "/CN=backup-server"
sudo openssl pkcs12 -export -out /etc/duplicati/cert.pfx \
  -inkey /etc/duplicati/key.pem -in /etc/duplicati/cert.pem -passout pass:
# Point the server at the certificate (add to the service's launch options):
# duplicati-server --webservice-sslcertificatefile=/etc/duplicati/cert.pfx
# Restart Duplicati
sudo systemctl restart duplicati
Authentication Setup
# Configure web interface authentication
# Settings > Security > Username and Password
# Or set a password when launching the server:
# duplicati-server --webservice-password=strong_password
# Restart service
sudo systemctl restart duplicati
Scheduling and Retention
Automated Backup Scheduling
# Configure backup schedule in web interface:
# 1. Select backup job
# 2. Settings > Schedule
# 3. Frequency: Daily, Weekly, Monthly, or Custom
# 4. Time: 02:00 (off-peak hours)
# 5. Days: Select specific days if needed
# Example cron-like schedule:
# "Run weekdays at 2:00 AM"
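Duplicati's built-in scheduler only fires while the server (or tray icon) is running. On headless systems, the same "weekdays at 2:00 AM" schedule can instead be driven by cron calling the CLI directly. The URL and paths below are placeholders; mirror your job's exported command line:

```shell
# Crontab entry (add via crontab -e): run the backup weekdays at 02:00.
0 2 * * 1-5 duplicati-cli backup "s3://access-key:secret-key@bucket-name/path/" /home /etc --backup-name="Full Server Backup" >> /var/log/duplicati-cron.log 2>&1
```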
Retention Policies
# Configure automatic cleanup:
# 1. Settings > Cleanup options
# 2. Delete backups:
# - Older than: 30 days
# - Keep at least: 5 recent backups
# - Keep versions: Enable
# Via CLI, retention is applied with --keep-time or --keep-versions on the
# backup command; files left behind by interrupted backups are cleaned with:
duplicati-cli list-broken-files \
  "s3://credentials@bucket/"
duplicati-cli purge-broken-files \
  "s3://credentials@bucket/"
Backup Verification
# Verify backup integrity (downloads and checks a sample of remote files)
duplicati-cli test \
  "s3://credentials@bucket/path/" \
  --dbpath=/var/lib/duplicati/backup.db
# Run full verification (slower but thorough): check every remote volume
duplicati-cli test \
  "s3://credentials@bucket/path/" all \
  --full-remote-verification=true \
  --log-file=/var/log/duplicati-verify.log
# Schedule verification
# In crontab: 0 3 * * 0 duplicati-cli test s3://...
Restore Operations
File-Level Restore
# List backup contents
duplicati-cli list \
  "s3://credentials@bucket/"
# List contents under a specific path
duplicati-cli list \
  "s3://credentials@bucket/" \
  "/home/user/*"
# Restore individual file
duplicati-cli restore \
  "s3://credentials@bucket/" \
  "/home/user/important-file.txt" \
  --restore-path=/restore/location/
# Restore directory
duplicati-cli restore \
  "s3://credentials@bucket/" \
  "/home/user/Documents/*" \
  --restore-path=/restore/location/
Full Restore Procedure
# Extract everything to a temporary location (careful with permissions)
duplicati-cli restore \
  "s3://credentials@bucket/" "*" \
  --restore-path=/tmp/restore/
# Verify restored files
ls -la /tmp/restore/
# Restore to original locations (if server is clean)
rsync -av /tmp/restore/home/ /home/
rsync -av --backup --backup-dir=/etc.backup/ /tmp/restore/etc/ /etc/
# Verify restoration
diff /home/user/file.txt /tmp/restore/home/user/file.txt
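Diffing single files does not scale to a full restore. A sketch of whole-tree verification with checksum manifests (scratch directories stand in for `/home` and `/tmp/restore/home`):

```shell
# Create a source tree and a "restored" copy of it.
srctree=$(mktemp -d); restoredtree=$(mktemp -d)
printf 'payload-1\n' > "$srctree/a.txt"
printf 'payload-2\n' > "$srctree/b.txt"
cp "$srctree/a.txt" "$srctree/b.txt" "$restoredtree/"

# Hash every file relative to its tree root, then compare the manifests.
sums_src=$(mktemp); sums_restored=$(mktemp)
( cd "$srctree" && find . -type f -exec sha256sum {} + | sort ) > "$sums_src"
( cd "$restoredtree" && find . -type f -exec sha256sum {} + | sort ) > "$sums_restored"

if cmp -s "$sums_src" "$sums_restored"; then
  echo "restore verified"
fi
# prints: restore verified
```

Any missing, extra, or corrupted file shows up as a manifest difference, which `diff "$sums_src" "$sums_restored"` would pinpoint.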
Partial Restore with Version Control
# List all backup versions (version 0 is the most recent)
duplicati-cli list \
  "s3://credentials@bucket/"
# Restore a file from a specific version
duplicati-cli restore \
  "s3://credentials@bucket/" \
  "/path/to/file" \
  --version=5 \
  --restore-path=/restore/location/
Monitoring and Maintenance
Backup Monitoring
# List stored versions with their timestamps
duplicati-cli list \
  "s3://credentials@bucket/"
# Compare two versions to see what changed between runs
duplicati-cli compare \
  "s3://credentials@bucket/" 1 0
# Count files in the most recent version
duplicati-cli list \
  "s3://credentials@bucket/" "*" | wc -l
Log Monitoring
# View Duplicati service logs
sudo journalctl -u duplicati -f
# Check application logs
tail -f /var/log/duplicati/*.log
# Monitor specific backup job logs
grep "Full Server Backup" /var/log/duplicati/*.log
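Beyond grepping for a job name, logs can be summarized per run. The log lines below are a made-up sample to keep the sketch self-contained; adjust the patterns to whatever your Duplicati log output actually contains:

```shell
# Summarize successful vs. failed runs from a (sample) backup log.
log=$(mktemp)
cat > "$log" <<'EOF'
2024-05-01 02:00:01 Backup started
2024-05-01 02:14:33 Backup completed: Success
2024-05-02 02:00:01 Backup started
2024-05-02 02:03:10 Backup completed: Error
EOF

ok=$(grep -c 'completed: Success' "$log")
bad=$(grep -c 'completed: Error' "$log")
echo "successful runs: $ok, failed runs: $bad"
# prints: successful runs: 1, failed runs: 1
```

The same pattern, pointed at `/var/log/duplicati/*.log`, makes a quick daily health check for cron or a monitoring agent.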
Database Maintenance
# Recreate or repair the local job database from the remote data
duplicati-cli repair \
  "s3://credentials@bucket/" \
  --dbpath=/var/lib/duplicati/duplicati.db
# Compact remote storage (reclaims space from deleted versions)
duplicati-cli compact \
  "s3://credentials@bucket/" \
  --dbpath=/var/lib/duplicati/duplicati.db
# Vacuum the local database (optimize)
sqlite3 /var/lib/duplicati/duplicati.db "VACUUM;"
Conclusion
Duplicati transforms cloud backup from a complex infrastructure task into an accessible, automated process. By leveraging encryption, deduplication, and flexible scheduling, organizations achieve comprehensive data protection without significant capital investment. Proper configuration of cloud backends, encryption keys, and retention policies ensures your data remains recoverable while managing storage costs effectively. Whether protecting individual systems or enterprise infrastructure, Duplicati's open-source architecture and powerful feature set deliver reliable disaster recovery capabilities suitable for diverse deployment scenarios.


