# Backup & Recovery

Implement backup and recovery procedures for EZ-Console deployments.

## Overview

Regular backups are essential for production deployments. This guide covers database backups, file backups, and recovery procedures for EZ-Console applications.
## Backup Strategy

### Backup Types

- Database Backups: full and incremental database dumps
- File Backups: uploaded files and other application data
- Configuration Backups: configuration files and environment variables

### Backup Frequency

- Database: daily full backups, hourly incrementals (if supported)
- Files: daily backups
- Configuration: on every change
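Assuming each backup type has its own script (the script paths below are illustrative, including the hypothetical incremental script), the frequency above maps onto crontab entries like:

```
# Daily full database backup at 2 AM
0 2 * * * /opt/myapp/scripts/backup-mysql.sh

# Hourly incremental backup (only if your database engine supports it)
0 * * * * /opt/myapp/scripts/backup-mysql-incremental.sh

# Daily file backup at 3 AM
0 3 * * * /opt/myapp/scripts/backup-files.sh
```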
## Database Backups

### MySQL Backup

#### Manual Backup

```bash
#!/bin/bash
# backup-mysql.sh
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups/mysql"
DB_NAME="myapp"
DB_USER="root"
DB_PASS="password"   # prefer a client option file over a plain-text password in production
DB_HOST="localhost"

# Create backup directory
mkdir -p "$BACKUP_DIR"

# Full backup
mysqldump -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" \
  --single-transaction \
  --routines \
  --triggers \
  "$DB_NAME" | gzip > "$BACKUP_DIR/backup_$DATE.sql.gz"

# Keep only the last 30 days
find "$BACKUP_DIR" -name "backup_*.sql.gz" -mtime +30 -delete

echo "Backup completed: backup_$DATE.sql.gz"
```
#### Automated Backup (Cron)

```bash
# Add to crontab
crontab -e

# Daily backup at 2 AM
0 2 * * * /path/to/backup-mysql.sh
```
### PostgreSQL Backup

#### Manual Backup

```bash
#!/bin/bash
# backup-postgres.sh
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups/postgres"
DB_NAME="myapp"
DB_USER="postgres"
DB_PASS="password"   # consumed via PGPASSWORD below; prefer ~/.pgpass in production
DB_HOST="localhost"

mkdir -p "$BACKUP_DIR"

# Full backup (custom format, including large objects)
PGPASSWORD=$DB_PASS pg_dump -h "$DB_HOST" -U "$DB_USER" \
  -F c \
  -b \
  -v \
  -f "$BACKUP_DIR/backup_$DATE.dump" \
  "$DB_NAME"

# Compress
gzip "$BACKUP_DIR/backup_$DATE.dump"

# Keep only the last 30 days
find "$BACKUP_DIR" -name "backup_*.dump.gz" -mtime +30 -delete

echo "Backup completed: backup_$DATE.dump.gz"
```
### SQLite Backup

```bash
#!/bin/bash
# backup-sqlite.sh
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups/sqlite"
DB_FILE="/opt/myapp/ez-console.db"

mkdir -p "$BACKUP_DIR"

# Online backup via SQLite's .backup command (safe while the app is running)
sqlite3 "$DB_FILE" ".backup '$BACKUP_DIR/backup_$DATE.db'"

# Compress
gzip "$BACKUP_DIR/backup_$DATE.db"

# Keep only the last 30 days
find "$BACKUP_DIR" -name "backup_*.db.gz" -mtime +30 -delete

echo "Backup completed: backup_$DATE.db.gz"
```
## File Backups

### Upload Directory Backup

```bash
#!/bin/bash
# backup-files.sh
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups/files"
UPLOAD_DIR="/opt/myapp/uploads"

mkdir -p "$BACKUP_DIR"

# Create archive (relative to the parent directory so extracted paths stay portable)
tar -czf "$BACKUP_DIR/uploads_$DATE.tar.gz" -C "$(dirname "$UPLOAD_DIR")" "$(basename "$UPLOAD_DIR")"

# Keep only the last 30 days
find "$BACKUP_DIR" -name "uploads_*.tar.gz" -mtime +30 -delete

echo "File backup completed: uploads_$DATE.tar.gz"
```
### Configuration Backup

```bash
#!/bin/bash
# backup-config.sh
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups/config"
CONFIG_DIR="/etc/myapp"

mkdir -p "$BACKUP_DIR"

# Backup config files
tar -czf "$BACKUP_DIR/config_$DATE.tar.gz" -C "$CONFIG_DIR" .

# Keep only the last 90 days
find "$BACKUP_DIR" -name "config_*.tar.gz" -mtime +90 -delete

echo "Config backup completed: config_$DATE.tar.gz"
```
## Remote Backup

### S3 Backup

```bash
#!/bin/bash
# backup-to-s3.sh
BACKUP_FILE="backup_$(date +%Y%m%d_%H%M%S).tar.gz"
S3_BUCKET="myapp-backups"

# Create backup
tar -czf "/tmp/$BACKUP_FILE" /backups/

# Upload to S3
aws s3 cp "/tmp/$BACKUP_FILE" "s3://$S3_BUCKET/"

# Cleanup local file
rm "/tmp/$BACKUP_FILE"

echo "Backup uploaded to S3: $BACKUP_FILE"
```
### SFTP Backup

```bash
#!/bin/bash
# backup-to-sftp.sh
BACKUP_FILE="backup_$(date +%Y%m%d_%H%M%S).tar.gz"
REMOTE_HOST="backup-server.example.com"
REMOTE_USER="backup"
REMOTE_DIR="/backups/myapp"

# Create backup
tar -czf "/tmp/$BACKUP_FILE" /backups/

# Upload via SFTP
sftp "$REMOTE_USER@$REMOTE_HOST" <<EOF
put /tmp/$BACKUP_FILE $REMOTE_DIR/
quit
EOF

# Cleanup
rm "/tmp/$BACKUP_FILE"

echo "Backup uploaded to SFTP: $BACKUP_FILE"
```
## Recovery Procedures

### MySQL Recovery

```bash
#!/bin/bash
# restore-mysql.sh
BACKUP_FILE=$1
DB_NAME="myapp"
DB_USER="root"
DB_PASS="password"

if [ -z "$BACKUP_FILE" ]; then
  echo "Usage: $0 <backup_file.sql.gz>"
  exit 1
fi

# Decompress if needed
if [[ $BACKUP_FILE == *.gz ]]; then
  gunzip -c "$BACKUP_FILE" | mysql -u "$DB_USER" -p"$DB_PASS" "$DB_NAME"
else
  mysql -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" < "$BACKUP_FILE"
fi

echo "Database restored from: $BACKUP_FILE"
```
### PostgreSQL Recovery

```bash
#!/bin/bash
# restore-postgres.sh
BACKUP_FILE=$1
DB_NAME="myapp"
DB_USER="postgres"

if [ -z "$BACKUP_FILE" ]; then
  echo "Usage: $0 <backup_file.dump.gz>"
  exit 1
fi

# Decompress if needed (pg_restore reads the custom-format dump from stdin)
if [[ $BACKUP_FILE == *.gz ]]; then
  gunzip -c "$BACKUP_FILE" | pg_restore -U "$DB_USER" -d "$DB_NAME" -v
else
  pg_restore -U "$DB_USER" -d "$DB_NAME" -v "$BACKUP_FILE"
fi

echo "Database restored from: $BACKUP_FILE"
```
### SQLite Recovery

```bash
#!/bin/bash
# restore-sqlite.sh
BACKUP_FILE=$1
DB_FILE="/opt/myapp/ez-console.db"

if [ -z "$BACKUP_FILE" ]; then
  echo "Usage: $0 <backup_file.db.gz>"
  exit 1
fi

# Stop application so the database file is not in use
systemctl stop myapp

# Decompress if needed
if [[ $BACKUP_FILE == *.gz ]]; then
  gunzip -c "$BACKUP_FILE" > "$DB_FILE"
else
  cp "$BACKUP_FILE" "$DB_FILE"
fi

# Restart application
systemctl start myapp

echo "Database restored from: $BACKUP_FILE"
```
### File Recovery

```bash
#!/bin/bash
# restore-files.sh
BACKUP_FILE=$1
RESTORE_DIR="/opt/myapp"

if [ -z "$BACKUP_FILE" ]; then
  echo "Usage: $0 <backup_file.tar.gz>"
  exit 1
fi

# Stop application
systemctl stop myapp

# Restore files
tar -xzf "$BACKUP_FILE" -C "$RESTORE_DIR"

# Restart application
systemctl start myapp

echo "Files restored from: $BACKUP_FILE"
```
## Automated Backup Script

### Complete Backup Script

```bash
#!/bin/bash
# full-backup.sh
set -e

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_ROOT="/backups"
BACKUP_DIR="$BACKUP_ROOT/$DATE"

mkdir -p "$BACKUP_DIR"
echo "Starting backup at $(date)"

# Database backup
echo "Backing up database..."
# ... database backup commands ...

# File backup
echo "Backing up files..."
# ... file backup commands ...

# Config backup
echo "Backing up configuration..."
# ... config backup commands ...

# Create archive
echo "Creating archive..."
cd "$BACKUP_ROOT"
tar -czf "backup_$DATE.tar.gz" "$DATE"

# Remove staging directory
rm -rf "$BACKUP_DIR"

# Upload to remote (optional)
# aws s3 cp backup_$DATE.tar.gz s3://myapp-backups/

# Cleanup old backups
find "$BACKUP_ROOT" -name "backup_*.tar.gz" -mtime +30 -delete

echo "Backup completed at $(date)"
```
## Backup Verification

### Verify Backup Integrity

```bash
#!/bin/bash
# verify-backup.sh
BACKUP_FILE=$1

if [ -z "$BACKUP_FILE" ]; then
  echo "Usage: $0 <backup_file>"
  exit 1
fi

# Check if file exists
if [ ! -f "$BACKUP_FILE" ]; then
  echo "Error: Backup file not found"
  exit 1
fi

# Check file size (stat syntax differs between BSD/macOS and GNU/Linux)
SIZE=$(stat -f%z "$BACKUP_FILE" 2>/dev/null || stat -c%s "$BACKUP_FILE" 2>/dev/null)
if [ "$SIZE" -lt 1000 ]; then
  echo "Warning: Backup file seems too small ($SIZE bytes)"
fi

# Test decompression
if [[ $BACKUP_FILE == *.gz ]]; then
  if gunzip -t "$BACKUP_FILE"; then
    echo "✓ Backup file is valid"
  else
    echo "✗ Backup file is corrupted"
    exit 1
  fi
fi

echo "Backup verification passed"
```
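A gzip test only proves the archive decompresses; recording a checksum at backup time additionally catches silent corruption or tampering in transit. A minimal sketch (the demo paths under `/tmp` are illustrative):

```bash
#!/bin/bash
# Sketch: checksum-based backup verification (demo paths under /tmp).
set -e

# At backup time: create an archive and record its SHA-256 alongside it
printf 'hello' > /tmp/demo_payload
tar -czf /tmp/demo_backup.tar.gz -C /tmp demo_payload
sha256sum /tmp/demo_backup.tar.gz > /tmp/demo_backup.tar.gz.sha256

# Before restoring: verify the archive still matches its recorded checksum
if sha256sum -c /tmp/demo_backup.tar.gz.sha256 >/dev/null; then
  echo "checksum ok"
else
  echo "checksum FAILED" >&2
  exit 1
fi
```

Ship the `.sha256` file to remote storage together with the archive so both survive a local disk failure.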
## Disaster Recovery Plan

### Recovery Steps

1. Assess the situation
   - Identify what was lost
   - Determine backup availability
   - Check system status
2. Prepare the recovery environment
   - Set up a new server if needed
   - Install required software
   - Configure the network
3. Restore the database
   - Select the appropriate backup
   - Restore the database
   - Verify data integrity
4. Restore files
   - Restore uploaded files
   - Restore configuration
5. Verify and test
   - Test application functionality
   - Verify data integrity
   - Monitor for issues
## Best Practices

### 1. Regular Backups

- Daily database backups
- Daily file backups
- On-demand config backups

### 2. Test Restores

Regularly test restore procedures to ensure backups are valid.
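One low-risk way to do this is a scripted drill against scratch data: back up with the same tar invocation the production script uses, restore into a throwaway directory, and diff the result (all paths below are illustrative):

```bash
#!/bin/bash
# Restore drill using scratch data (demo paths under /tmp).
set -e
SRC=/tmp/drill_uploads
REST=/tmp/drill_restore
BAK=/tmp/drill_uploads.tar.gz

# Create scratch data with known content
rm -rf "$SRC" "$REST" "$BAK"
mkdir -p "$SRC" "$REST"
echo "sample" > "$SRC/file.txt"

# Back up the same way the file backup script does
tar -czf "$BAK" -C "$(dirname "$SRC")" "$(basename "$SRC")"

# Restore into a throwaway directory and compare against the original
tar -xzf "$BAK" -C "$REST"
diff -r "$SRC" "$REST/drill_uploads" && echo "restore drill passed"
```

Run the same pattern against a scratch database to exercise the database restore path without touching production.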
### 3. Offsite Storage

Store backups in multiple locations:

- Local server
- Remote storage (S3, SFTP)
- Offsite backup location

### 4. Retention Policy

- Keep daily backups for 30 days
- Keep weekly backups for 3 months
- Keep monthly backups for 1 year
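The single `-mtime +30` prune used in the scripts above keeps only one tier. One way to approximate the three-tier policy is to promote selected backups into separate directories, each with its own retention window; a sketch (the `daily/weekly/monthly` layout and demo paths are assumptions, not part of EZ-Console):

```bash
#!/bin/bash
# Sketch of tiered retention: daily 30d, weekly 90d, monthly 365d.
# The daily/weekly/monthly directory layout is illustrative.
set -e
ROOT=/tmp/demo_backups
mkdir -p "$ROOT/daily" "$ROOT/weekly" "$ROOT/monthly"

# Stand-in for today's real backup archive
LATEST="$ROOT/daily/backup_$(date +%Y%m%d).tar.gz"
: > "$LATEST"

# Promote Sunday backups to the weekly tier, day-01 backups to the monthly tier
if [ "$(date +%u)" = "7" ]; then cp "$LATEST" "$ROOT/weekly/"; fi
if [ "$(date +%d)" = "01" ]; then cp "$LATEST" "$ROOT/monthly/"; fi

# Prune each tier to its own retention window
find "$ROOT/daily"   -name 'backup_*.tar.gz' -mtime +30  -delete
find "$ROOT/weekly"  -name 'backup_*.tar.gz' -mtime +90  -delete
find "$ROOT/monthly" -name 'backup_*.tar.gz' -mtime +365 -delete
echo "retention pass complete"
```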
## Related Topics

- Database Migration - migration procedures
- Troubleshooting - recovery from issues

Need help? Ask in GitHub Discussions.