How to get around the tar 2 gig limit

I am trying to use tar to back up my whole Linux server to an external source.

The backup works fine but stops at 2 GB.

The command I use is
tar -zcvpf /mnt/backup/web.tar.gz --directory / ........then a bunch of excludes.
Is there a way to split the tar backup into, say, 1.5 GB chunks so that it does a complete backup? Or is there another, newer tar?

I'm using tar 1.15.91
nplib asked:

swampf0x commented:
Perhaps this will help:

tar -cvlf - / | split -a 2 -b 2000m - /mnt/big_backup.tar

Specify the -l switch so tar stays within one file system; otherwise it'll try to back up the backup file that split just created.

FreeBSD 6.0's tar doesn't have the -l option, but this works instead:

tar -cv --exclude "/mnt" -f - / | split -a 2 -b 2000m - /mnt/big_backup.tar
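
For what it's worth, GNU tar can also split the archive itself via its multi-volume mode, with no pipe through split; a rough sketch, untested against this exact setup (multi-volume archives can't be compressed with -z, and tar prompts before starting each new volume):

# --tape-length is in units of 1024 bytes, so this is roughly 1.5 GB per volume
tar -cvpM --tape-length=1536000 -f /mnt/backup/web.tar --directory /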
 
swampf0x commented:
Oh, and I also found this:

tar -cvf backup1.tar / /var /opt
tar -cvf backup2.tar /usr /home

Good luck.
 
ravenpl commented:
What filesystem type contains /mnt/backup/? Maybe it has a 2 GB file size limit?
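
For reference, a quick way to see what /mnt/backup is actually mounted as:

df -T /mnt/backup

The second column of the output reports the filesystem type of whatever holds that path.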
 
nplib (author) commented:
NTFS.
 
nplib (author) commented:
I use
mount -t smbfs -o workgroup=........ to mount it.
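
If the share is mounted with the old smbfs client, that is very likely the cause: smbfs predates large-file support and caps individual files at 2 GB no matter what the server's filesystem allows. Mounting via cifs instead should lift that limit; a sketch with placeholder server, share, and credential values:

mount -t cifs //server/share /mnt/backup -o username=user,workgroup=WORKGROUP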
 
nplib (author) commented:
Cool,

I got this one
"tar -cv --exclude "/mnt" -f - / | split -a 2 -b 2000m - /mnt/big_backup.tar"
to work.

However, it makes
*.taraa
*.tarab
*.tarac
etc.
Do I need some special command line to extract them, or is tar smart enough to extract them all?
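
split isn't tar-aware, so tar won't reassemble the pieces by itself; you concatenate them back in order and pipe the stream to tar. A sketch, assuming the piece names above and restoring relative to /:

cd / && cat /mnt/big_backup.tar?? | tar -xvpf -

The ?? glob expands in aa, ab, ac order, which is exactly the order split wrote them.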

 
nplib (author) commented:
This one works:
tar -cv --exclude "/mnt" -f - / | split -a 2 -b 2000m - /mnt/big_backup.tar
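
To sanity-check the pieces without extracting anything, the same concatenation can be piped through tar's list mode, e.g.:

cat /mnt/big_backup.tar?? | tar -tvf - | tail

If the listing reaches the end of the archive without errors, the chunks are intact and in order.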