Linux backup

Malcolm Kay malcolm.kay at internode.on.net
Thu Aug 19 12:25:41 UTC 2004


Some weeks ago I enquired here about 'dump' for
use with ext3 file systems, and was strongly advised
that Linux and 'dump' don't play well together.

Reading the arguments including Linus Torvalds's comment
'  Right now, the cpio/tar/xxx solutions are definitely 
   the best ones, and will work on multiple filesystems 
   (another limitation of "dump"). Whatever problems they 
   have, they are still better than the _guaranteed_(*)  
   data corruptions of "dump".'
I was and am still convinced that 'dump' is not the way to
go under Linux.

So I've spent some time scripting to marry in 'tar' 
backups for recently acquired Linux machines with a 
backup system that uses 'dump' for our unix machines.
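The per-machine tar step looks roughly like the following sketch. (This is illustrative only, not my exact script: it runs against a scratch directory so it is safe to try, where the real script archives live filesystem paths into /data/pstar.)

```shell
# Sketch of a level-0 tar backup step for one Linux host.
# The scratch directory and file name are assumptions for illustration.
src=$(mktemp -d)
echo data > "$src/file"
# --one-file-system keeps the archive from crossing mount points,
# mirroring how 'dump' works per-filesystem.
/bin/tar czf "$src/root-0-z.tgz" --one-file-system -C "$src" file
echo "tar exit: $?"
```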

Yesterday I ran this for the first time on one of the Linux
machines and found the backup aborted with the following 
error in the log file:
   /bin/tar: /home/thi/OM5438/test.hir1: file changed as we read it
   /bin/tar: Error exit delayed from previous errors
   Backup /data/pstar/root-0-z.tgz FAILED at Wed 18 Aug 2004 15:29:20 CST

So 'dump' leads to corrupt backups, and 'tar' leads to aborted backups.
The abort message is undoubtedly correct -- the file in question is a
temporary file used during circuit simulation analysis. Individual
simulation runs can take from a few seconds up to a week, so it is not
practical to close everything down for backup. (If it were, then
partitions could be unmounted for backup and the principal problem
cited against 'dump' would disappear.) Such files are not crucial to the
backup. If tar simply skipped them, or marked them as corrupt in the
archive while correctly preserving the rest of the file system,
that would be satisfactory -- but instead it aborts.

So is there some way to get 'tar' to continue when an odd file or two
exhibits this sort of problem? I know about the option:
  --ignore-failed-read
              don't exit with non-zero status on unreadable files
but from my reading of the man page it is not relevant to this
problem.
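One possible workaround -- assuming a GNU tar that distinguishes exit
status 1 ("some files differ", which covers "file changed as we read
it") from exit status 2 (fatal errors) -- is to have the wrapper script
accept status 1 as a successful backup. A minimal sketch, again against
a scratch directory:

```shell
# Hedged sketch: treat GNU tar exit 1 (files changed while reading)
# as success, and only exit 2 as a failed backup. Whether your tar
# version makes this distinction is an assumption worth checking.
src=$(mktemp -d)
dst=$(mktemp -d)
echo data > "$src/test.hir1"
/bin/tar czf "$dst/root-0-z.tgz" -C "$src" .
status=$?
if [ "$status" -le 1 ]; then
    echo "backup accepted (tar exit $status)"
else
    echo "backup FAILED (tar exit $status)"
fi
```

In a real run the archive would still contain a possibly inconsistent
copy of the changing file, but the rest of the file system is preserved,
which is the behaviour wanted here.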

Does 'cpio' have the same problem?

Some have suggested 'amanda', but my understanding is that this is
just a wrapper that optionally uses 'dump' or 'tar' so this seems
to take us nowhere.

What else is out there for backup? I am not looking for a backup
system, just a reasonably reliable backup utility that can be used
so that the Linux machines can be incorporated into the backup system
that already works well for our unix machines.

Some advice please.

Malcolm Kay






More information about the redhat-list mailing list