tar to multiple files

We have a directory on Solaris which has thousands of subdirectories and files.

We need to tar this directory and copy it over the WAN. Since this directory is quite big (around 600 GB), we can't just create one tar file.
Is there a way to create multiple tar files based on size, say 10 GB each?

One approach would be to select a few files and directories at a time and tar them separately, but this could be error prone (we could miss something) and requires a lot of manual work.

Is there a command that creates tar files based on size (say 10 GB each), so that as soon as the 10 GB limit is reached a new tar file is started, and so on?
ank5 asked:

Sjizzel commented:
I found this article:

Sometimes, when you want to store your backup or any other large set of files online, or want to share them with someone else, you need to find a way to compress and split the files into chunks of 100 or more megabytes. I felt the need for this recently when I wanted to store my backups online and the online storage service had a cap of 100 MB per file. I found a really neat solution based on the tar command. Using this method I split my backup of about 1 GB into 10 chunks of 100 MB each, with incremental filenames.

The 1 GB file I wanted to split was called dbbackup.db. Here’s the command I ran to create multiple tar files of 100 MB each out of it:

# tar -cf - dbbackup.db | split -b 100m - db_backup.tar

This command took a long time to run. Once it was done running I was left with ten files of 100 MB each, named db_backup.taraa, db_backup.tarab, db_backup.tarac, and so on.

Now I can copy these files to my external storage or ship them with ease. To stitch the 1 GB file back together, all I need to do is run the following command:

# cat db_backup.tara* | tar -xf -

And voila, I get my original file again.


http://www.simplehelp.net/2009/05/25/how-to-create-a-multi-part-tar-file-with-linux/
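
Adapting that to the question here is mostly a matter of changing the chunk size and the names. The directory and prefix below are just placeholders, and on Solaris /usr/bin/split typically accepts the -b size only with a k or m suffix (g is a GNU extension), so a 10 GB chunk would be written as 10240m. Something like:

# cd /export/data       # placeholder: parent of the directory to archive
# tar cf - bigdir | split -b 10240m - bigdir.tar.

For a roughly 600 GB tree this should leave about sixty pieces named bigdir.tar.aa, bigdir.tar.ab and so on, well within what the default two-character suffix can cover.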

torakeshb commented:
To create:

tar -cf - <path to tar> | split -b 1000m - test.tar

To extract

cat test.tara* | tar xvf
omarfarid commented:
I think you missed the - in your command

cat test.tara* | tar xvf -
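
With that fix, and reusing the placeholder names from the sketch above, reassembling and unpacking on the far side of the WAN copy would look something like:

# cat bigdir.tar.* | tar xvf -

If you would rather keep the archive around instead of extracting straight away, the chunks can first be joined back into a single file with cat bigdir.tar.* > bigdir.tar and extracted later.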