~3 min read • Updated Feb 27, 2026
1. Using rsync to Back Up /home to a Remote Server
As servers grow and store more data, traditional tar.gz backups become slower. For servers where most data resides in public_html or email folders, using rsync for the entire /home directory is a highly efficient alternative.
Important: You must restore DirectAdmin accounts first. After DA restores users, you can rsync the data back. If needed, run the DA restore twice for updated metadata.
When creating DA backups, deselect the following in Step 4:
- Domains Directory
- E-Mail Data
These items are stored in /home and will be handled by rsync instead.
1.1 Example rsync Script for Remote Backup
#!/bin/bash
# Destination host and login for the off-server copy
BACKUP_HOST="remote.hostname.com"
BACKUP_USER=$(hostname -s)
BACKUP_SOURCE="/home"
BACKUP_DESTINATION="/home/$BACKUP_USER/users"
# Lowest I/O and CPU priority so the backup does not compete with live traffic
ionice -c3 nice -n 19 rsync -q -a --delete -e ssh "$BACKUP_SOURCE" \
    "$BACKUP_USER@$BACKUP_HOST:$BACKUP_DESTINATION" >/var/log/backup.log 2>&1
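Run from cron, the script keeps the remote copy current without manual intervention. A sketch of a cron entry, assuming the script above was saved as /usr/local/sbin/home_backup.sh (the path is hypothetical):

```
# /etc/cron.d/home-backup: nightly rsync backup at 01:30
30 1 * * * root /usr/local/sbin/home_backup.sh
```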
After restoring DA accounts:
- rsync the data back
- adjust /home/USERNAME/domains to /home/USERNAME if needed
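Pulling the data back is the mirror image of the backup command; remote.hostname.com and backupuser below are placeholders. The local demo illustrates rsync's trailing-slash rule, which determines whether the restore lands in /home/USERNAME or an extra subdirectory:

```shell
# Restore sketch (placeholder host/user); the trailing slash on the source
# copies the *contents* of users/ into /home rather than a users/ subdirectory:
#   rsync -a -e ssh backupuser@remote.hostname.com:/home/backupuser/users/ /home/
# Local demonstration of the trailing-slash rule:
mkdir -p /tmp/rsync-demo/src /tmp/rsync-demo/dst
echo data > /tmp/rsync-demo/src/file.txt
rsync -a /tmp/rsync-demo/src/ /tmp/rsync-demo/dst/
ls /tmp/rsync-demo/dst
```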
To include webmail settings:
/var/www/html/squirrelmail/data
/var/www/html/webmail/tmp
2. Keeping Both Local and Remote Backups
DirectAdmin allows either local OR remote FTP backups per job. To keep both, create one FTP backup job and use user_backup_post.sh to copy the file locally before it is removed.
2.1 Script for Admin‑Only Local Copy
#!/bin/bash
# user_backup_post.sh: copy admin's FTP backup locally before DA removes it.
# DirectAdmin exports $file, the full path of the backup just created.
RESELLER=admin
SAVE_PATH=/home/${RESELLER}/admin_backups
BACKUP_PATH=$(echo "$file" | cut -d/ -f1,2,3,4)
# Trailing dot matches DA's per-job directories, e.g. /home/tmp/admin.1234
REQUIRED_PATH=/home/tmp/${RESELLER}.
if [[ "$BACKUP_PATH" == ${REQUIRED_PATH}* ]]; then
    NEW_FILE=${SAVE_PATH}/$(echo "$file" | cut -d/ -f6)
    cp -fp "$file" "${NEW_FILE}"
fi
exit 0
2.2 Script for All Resellers
#!/bin/bash
# user_backup_post.sh: copy every reseller's FTP backups locally.
# DirectAdmin exports $reseller and $file for each backup it creates.
RESELLER=$reseller
SAVE_PATH=/home/${RESELLER}/user_backups
BACKUP_PATH=$(echo "$file" | cut -d/ -f1,2,3,4)
REQUIRED_PATH=/home/tmp/
if [[ "$BACKUP_PATH" == ${REQUIRED_PATH}* ]]; then
    NEW_FILE=${SAVE_PATH}/$(echo "$file" | cut -d/ -f6)
    cp -fp "$file" "${NEW_FILE}"
fi
exit 0
Make executable:
chmod 755 /usr/local/directadmin/scripts/custom/user_backup_post.sh
3. Extracting and Repacking a User Backup File
If a tar.gz backup is corrupted, DirectAdmin may fail to restore it. You can extract and re‑compress it manually.
3.1 Extract
cd /home/admin/admin_backups
cp user.admin.bob.tar.gz user.admin.bob.tar.gz.backup
mkdir temp
cd temp
tar xvzf ../user.admin.bob.tar.gz
3.2 Repack
cd /home/admin/admin_backups/temp
tar cvzf ../user.admin.bob.tar.gz *
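Before handing the repacked file back to DirectAdmin, it is worth confirming that the archive lists cleanly; the scratch directory below stands in for a real backup:

```shell
# Build a scratch archive and verify it the same way you would a repacked backup
mkdir -p /tmp/repack-demo/backup
echo test > /tmp/repack-demo/backup/file.txt
cd /tmp/repack-demo
tar czf user.demo.tar.gz backup
# tar tzf walks the entire archive, so corruption surfaces here
tar tzf user.demo.tar.gz >/dev/null && echo "archive OK"
```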
4. Working with tar.zst Files
4.1 Create
tar --preserve-permissions --use-compress-program /usr/local/bin/zstdmt \
-cf backup.tar.zst backup
4.2 Extract
tar --preserve-permissions --use-compress-program /usr/local/bin/zstdmt \
-xf backup.tar.zst
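If zstdmt is not installed at /usr/local/bin, plain zstd can be told to use all cores with -T0; this sketch skips quietly when zstd is absent:

```shell
mkdir -p /tmp/zstd-demo/backup
echo data > /tmp/zstd-demo/backup/file.txt
cd /tmp/zstd-demo
if command -v zstd >/dev/null 2>&1; then
    # zstd -T0 multithreads across all available cores, like zstdmt
    tar --preserve-permissions --use-compress-program="zstd -T0" \
        -cf backup.tar.zst backup
fi
```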
5. mysqldump Timeout
DirectAdmin includes:
database_dump_timeout=14400
To disable the timeout:
/usr/local/directadmin/directadmin set database_dump_timeout 0
6. “MySQL Server Has Gone Away” During Restore
Possible causes:
- wait_timeout too low
- MySQL restarted mid-restore
- max_allowed_packet too small
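The first cause is addressed in the MySQL/MariaDB configuration; a sketch of a my.cnf fragment with an illustrative value:

```
[mysqld]
# Keep idle connections alive long enough for a slow restore
wait_timeout=600
```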
Increase packet size:
max_allowed_packet=20M
7. Backup Abort (2): Hard Link Found
This occurs when backup_hard_link_check=1 detects a hard link under the user’s home directory.
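Before disabling the check, it can help to see what it is flagging; the command below finds files with more than one hard link, using /tmp/hl-demo as a stand-in for a user's home directory:

```shell
# Create a file plus a hard link to it, then locate multi-link files
mkdir -p /tmp/hl-demo
echo x > /tmp/hl-demo/a
ln /tmp/hl-demo/a /tmp/hl-demo/b
# -links +1 matches files whose link count exceeds 1
find /tmp/hl-demo -type f -links +1
```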
To disable the check:
backup_hard_link_check=0
Ensure strict permissions remain enabled:
strict_backup_permissions=1
Optional optimization:
direct_imap_backup=1
Suppress tar warnings for files removed or changed during the backup:
extra_backup_option=--warning=no-file-removed --warning=no-file-changed
8. Common FTP Backup Errors
8.1 curl: (55) Send failure: Connection reset by peer
- FTP user over quota
- Firewall blocking ports 20, 21, 35000–35999
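To separate quota problems from firewall problems, probe the FTP control port directly; backup.example.com is a placeholder for your backup host:

```shell
# Bash can open TCP sockets via /dev/tcp; timeout bounds a silently dropped connection
if timeout 3 bash -c 'cat < /dev/null > /dev/tcp/backup.example.com/21' 2>/dev/null; then
    echo "port 21 reachable"
else
    echo "port 21 blocked or host unreachable"
fi
```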
8.2 curl: (55) SSL_write() returned SYSCALL
The FTP client does not support the SSL/TLS protocol version required by the server.
8.3 curl: (55) Send failure: Connection timed out
Likely firewall issue.
8.4 curl: (7) Failed to connect: Connection refused
FTP server not accepting connections.
Written & researched by Dr. Shahin Siami