<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=utf-8">
</head>
<body bgcolor="#FFFFFF" text="#000000">
Hey guys,
<br>
<br>
I tried to create a simple rsync script that creates daily
backups of a ZFS storage and puts them into a timestamped folder.
<br>
After the initial full backup, each following backup should only
contain new data, with unchanged files referenced via
hardlinks (--link-dest).
<br>
<br>
This seemed like a simple enough scenario to achieve even with my
pathetic scripting skills. This is what I came up with:
<br>
<br>
<pre>
#!/bin/sh

# rsync copy script for rsync pull from FreeNAS to BackupNAS for Buero dataset

# Set variables
EXPIRED=`date +"%d-%m-%Y" -d "14 days ago"`

# Copy previous timefile to timeold.txt if it exists
if [ -f "/volume1/rsync/Buero/timenow.txt" ]
then
    yes | cp /volume1/rsync/Buero/timenow.txt /volume1/rsync/Buero/timeold.txt
fi

# Create current timefile
echo `date +"%d-%m-%Y-%H%M"` > /volume1/rsync/Buero/timenow.txt

# rsync command
if [ -f "/volume1/rsync/Buero/timeold.txt" ]
then
    rsync -aqzh \
        --delete --stats --exclude-from=/volume1/rsync/Buero/exclude.txt \
        --log-file=/volume1/Backup_Test/logs/rsync-`date +"%d-%m-%Y-%H%M"`.log \
        --link-dest=/volume1/Backup_Test/`cat /volume1/rsync/Buero/timeold.txt` \
        Test@192.168.2.2::Test /volume1/Backup_Test/`date +"%d-%m-%Y-%H%M"`
else
    rsync -aqzh \
        --delete --stats --exclude-from=/volume1/rsync/Buero/exclude.txt \
        --log-file=/volume1/Backup_Buero/logs/rsync-`date +"%d-%m-%Y-%H%M"`.log \
        Test@192.168.2.2::Test /volume1/Backup_Test/`date +"%d-%m-%Y-%H%M"`
fi

# Delete expired snapshots (2 weeks old)
if [ -d /volume1/Backup_Buero/$EXPIRED-* ]
then
    rm -Rf /volume1/Backup_Buero/$EXPIRED-*
fi
</pre>
<br>
<br>
Well, it works, but there is a huge flaw with this approach and I am
unfortunately not able to solve it on my own.
<br>
As long as the backups finish properly, everything is fine, but as
soon as one backup job can't be finished for some reason (for
example, it is aborted accidentally or a power cut occurs),
<br>
the whole backup chain is messed up and the script usually creates a
new full backup, which fills up my backup storage.
<br>
<br>
What I would like to achieve is to improve the script so that a
backup run that wasn't finished properly is resumed the next time
the script triggers.
<br>
Only if that resume was successful should the next incremental
backup be created, so that the files that didn't change since the
previous backup can be hardlinked properly.
<br>
<br>
I did a little bit of research, and I am not sure if I am on the
right track here, but apparently this can be done with return codes;
I honestly don't know how to do this, though.
<br>
Thank you in advance for your help, and sorry if this question seems
foolish to most of you.
<br>
<br>
Regards
<br>
<br>
Dennis
</body>
</html>