Scripted backup to multiple CD-ROMs.

Mark Paine mark at
Tue Aug 20 11:18:25 EST 2002

Quoting Drake Diedrich <dld at>:
>    Yeah, it does.  I wasn't happy with any of the multi-CD packages I
> found.
> They all did something I didn't like:
> 1) didn't compress
> 2) didn't back up more than 1 CD
> 3) couldn't handle files larger than a CD
> 4) used some proprietary or obscure format
> 5) didn't encrypt
> 6) losing one CD made the whole set unrecoverable

Joining in rather late to this discussion, but I wasn't happy with the couple of 
backup solutions I found either, mainly for many of the reasons above.  Glad I 
wasn't the only one unhappy.
>    I just use tar, gpg, and a compressor of your choice (gzip -3 for the
> speedy, bzip if you are *really* cheap and don't mind taking a long time
> to make backups), and fill each CD with compressed multi-volume tarball
> pieces.  There's no reason the same system couldn't be used on larger
> media like tapes or DVDs, just change the media size and the media
> writer strings.
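For anyone following along, that pipeline is short enough to sketch.  This is 
only an illustration of the idea, not Drake's attached encoder: the paths and 
the 650m piece size are made-up, and it runs on a tiny demo tree so you can see 
the pieces it produces.

```shell
#!/bin/sh
# Sketch of the quoted tar | compress | split pipeline, run on a tiny demo
# tree (SRC, OUT and the 650m piece size are illustrative assumptions).
set -e
SRC=$(mktemp -d)                 # stands in for the directory being backed up
OUT=$(mktemp -d)                 # where the CD-sized pieces collect
echo "hello" > "$SRC/file.txt"

tar -cf - -C "$SRC" . \
  | gzip -3 \
  | split -b 650m - "$OUT/backup.tar.gz."
# With encryption, a "| gpg --encrypt --recipient KEYID" stage would sit
# between gzip and split; it needs a real key, so this demo omits it.

ls "$OUT"    # backup.tar.gz.aa, .ab, ... (only .aa here, the input is tiny)
```

Each piece then goes to its own CD, and losing one CD only costs you the files 
spanning that piece, not the whole set.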

I ended up doing a similar thing.  I wrote a shell script with a list of the 
directories I wanted to back up, the controlling code, and a separate exclude 
file (for tar).  The script backs up each directory into a compressed tarball, 
one per directory.  If a tarball is larger than a CD, it's then split.  Once 
all the directories are backed up, I move the tarballs into a number of 
sub-directories, each one corresponding to a CD.  Once done, I burn the CDs, 
one per sub-directory.  I also create an empty file so that when I do an 
incremental backup I can use that file's creation date.  The same bash script 
does both full and incremental backups.  Incrementals I burn to a CD-RW; when 
it gets filled, it's time for a full backup.
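The core of that scheme fits in a few lines.  This is a sketch under stated 
assumptions, not my actual script: the staging paths are temporary directories, 
the source tree is a two-directory stand-in for /etc and /home, and the stamp 
file shows where the incremental date check (tar's --newer) would hook in.

```shell
#!/bin/bash
# Sketch of the per-directory tarball scheme described above. Paths, the
# directory list, and the CD size are illustrative assumptions.
set -e
STAGE=$(mktemp -d)              # staging area where the tarballs collect
CDSIZE=$((650 * 1024 * 1024))   # capacity of one CD, in bytes
STAMP="$STAGE/last-backup"      # empty file whose date drives incrementals
EXCLUDE="$STAGE/exclude"        # tar exclude patterns, one per line
: > "$EXCLUDE"

# Demo source tree standing in for /etc, /home/<user>, and so on.
SRC=$(mktemp -d)
mkdir -p "$SRC/etc" "$SRC/home_alice"
echo "conf" > "$SRC/etc/app.conf"
echo "data" > "$SRC/home_alice/notes.txt"

for dir in "$SRC"/*; do
    name=$(basename "$dir").tar.gz
    # Full backup; an incremental run would add: --newer "$STAMP"
    tar -czf "$STAGE/$name" -X "$EXCLUDE" \
        -C "$(dirname "$dir")" "$(basename "$dir")"
    # Split any tarball too big for one CD into CD-sized pieces
    if [ "$(stat -c%s "$STAGE/$name")" -gt "$CDSIZE" ]; then
        split -b "$CDSIZE" "$STAGE/$name" "$STAGE/$name."
        rm "$STAGE/$name"
    fi
done
touch "$STAMP"   # the next incremental backs up files newer than this
ls "$STAGE"
```

From here the tarballs get shuffled into one sub-directory per CD and burnt.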

Not overly efficient, as it requires a lot of disk space: all the CDs are 
created before being burnt.  I'm currently up to 7 CDs to back up the user and 
system directories on the home server, and will be running out of space 
shortly, so I'll have to do a rethink.  However, I do get full CDs (except for 
the last one), and each CD contains a number of tarballs, each corresponding 
to a directory that I want to back up (i.e. each user in /home is backed up in 
a separate tarball, as are /etc, /var/spool/mail, etc.).  If I need to recover 
a file, I just find the CD with the required tarball and extract as required.  
(Thankfully, I've never had to do it except to make sure that it could be 
done.)  Nothing fancy to recover, just tar.
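Recovery of a split tarball is just concatenating the pieces in order and 
feeding the stream to tar.  A sketch with made-up names (a tiny split archive 
is built first to stand in for what's on the CD):

```shell
#!/bin/sh
# Recovering one tarball from a CD: cat the split pieces in order and hand
# the stream straight to tar. Names here are illustrative; a small split
# archive is created first to stand in for the CD contents.
set -e
SRC=$(mktemp -d); echo "mail" > "$SRC/inbox"
CD=$(mktemp -d)                  # stands in for the mounted CD
tar -czf - -C "$SRC" . | split -b 1k - "$CD/home_alice.tar.gz."

# The actual recovery step:
RESTORE=$(mktemp -d)
cat "$CD"/home_alice.tar.gz.* | tar -xzf - -C "$RESTORE"
ls "$RESTORE"                    # the recovered files
```

The shell glob sorts the .aa, .ab, ... suffixes into the right order, which is 
why split's default naming works out.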

> A CLUG talk this month perhaps?  Not much more to say than what I just
> did, but it's always nice to see it in action and take a code
> walk-through (short enough to survey each line).  A current encoder
> attached.  Decoding is currently done manually, but a similar automated
> decoder could also be written on the day I needed it.

Love to, but CLUGs are on a night I can't attend (child sitting).  However, 
your code does give me some ideas for some tar options I think I should look 
into.  Gotta rethink my backup strategy.  Will do that once I upgrade to Woody 
and see what debs are available.  Arrgh, more work....

Mark P.
.sig - TBA
