
How to get around the tar 2 GB limit

nplib asked:
I am trying to use tar to back up my whole Linux server to an external source.

The backup works fine but stops at 2 GB.

The command I use is:

tar -zcvpf /mnt/backup/web.tar.gz --directory / ...

followed by a bunch of excludes. Is there a way to split the tar backup into, say, 1.5 GB chunks so that it does a complete backup? Or is there a newer tar?

I'm using tar 1.15.91.

Commented:
Perhaps this will help:

tar -cvlf - / | split -a 2 -b 2000m - /mnt/big_backup.tar

Specify the -l switch for tar so it stays within one file system; otherwise it'll try to back up the backup file that split just created.

FreeBSD 6.0's tar doesn't have the -l option, but this works instead:

tar -cv --exclude "/mnt" -f - / | split -a 2 -b 2000m - /mnt/big_backup.tar
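
Since your original command compressed with -z, the same idea should work with gzip in the stream; a minimal sketch along the same lines (untested):

# same pipeline, but gzip-compressed; pieces come out as big_backup.tar.gz.aa, .ab, ...
tar -cvzp --exclude "/mnt" -f - / | split -a 2 -b 2000m - /mnt/big_backup.tar.gz.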

Commented:
Oh, and I also found this, which splits the backup into separate archives by directory:

tar -cvf backup1.tar / /var /opt
tar -cvf backup2.tar /usr /home
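
Note this only stays under the limit if each directory group is smaller than 2 GB, but the upside is that every piece is an ordinary archive. Restoring is then just a per-archive extraction; a minimal sketch, assuming the two archives above:

# GNU tar strips the leading / from member names at create time,
# so -C / puts everything back under the root
tar -xvpf backup1.tar -C /
tar -xvpf backup2.tar -C /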

Good luck.

Commented:
What filesystem type contains /mnt/backup/? Maybe it has a 2 GB file size limit?
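
One quick way to check, assuming GNU coreutils' df is available:

df -T /mnt/backup    # the Type column shows which filesystem the backup lands on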

Author

Commented:
NTFS.

Author

Commented:
I use

mount -t smbfs -o workgroup=........

to mount it.
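
If the 2 GB ceiling comes from the smbfs client rather than from tar (old smbfs clients often cap files at 2 GB), mounting the share with cifs instead may lift it. A minimal sketch; the server, share, and account names here are placeholders:

# cifs handles files larger than 2 GB
mount -t cifs //server/share /mnt/backup -o user=username,domain=WORKGROUP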

Author

Commented:
Cool, I got this one to work:

tar -cv --exclude "/mnt" -f - / | split -a 2 -b 2000m - /mnt/big_backup.tar

However, it makes

*.taraa
*.tarab
*.tarac

etc. Do I need some special command line to extract them, or is tar smart enough to extract them all?
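
For anyone finding this later: tar by itself won't reassemble split's pieces; the usual approach is to concatenate them with cat and pipe the stream back into tar. A minimal sketch, assuming the pieces are in /mnt as above:

# the ?? glob sorts lexicographically (aa, ab, ac, ...), so cat feeds
# the pieces back in the same order split wrote them
cat /mnt/big_backup.tar?? | tar -xvpf - -C /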

Author

Commented:
This one works:
tar -cv --exclude "/mnt" -f - / | split -a 2 -b 2000m - /mnt/big_backup.tar