Archiving backup

Hi All,

We have an application running all the time, and we can afford only very minimal downtime for backups. During that downtime we have to take a backup of the application's database-related files. It's not an RDBMS; it's a kind of flat-file database.

The directory size is 18 GB and the application runs on SunOS. We use the tar and compress commands to archive, but this takes around 30 minutes. We want to minimise the archiving time. Is there a more efficient, time-saving way to archive?
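For reference, this is roughly what we run today (simplified; the paths are placeholders):

    tar cf - /app/data | compress > /backup/appdata.tar.Z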

Please help.

Cheers!
naga1979 asked:
 
Duncan Meyers commented:
Can you tar the directory straight to a high-speed tape technology such as LTO-2? LTO-2 writes at around 1.2 GB/min, depending on how fast the server hardware is, the file types (lots of small files take longer to write out than a few big files), the SCSI interface, and so on.
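A minimal sketch of that on SunOS, assuming the drive appears as /dev/rmt/0 (the device name and data path are placeholders; check yours):

    # Write the archive straight to tape; LTO drives compress in
    # hardware, so dropping the separate compress step saves CPU time.
    tar cf /dev/rmt/0 /app/data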
 
alanclos commented:
Why not do image-level backups, which copy only the blocks of data that change (even with flat files)? Then build a "full" from previous fulls and these incrementals/differentials.
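If the data sits on its own UFS filesystem, the native ufsdump utility offers dump levels for this, though it tracks changes per file rather than per block. A sketch, assuming the filesystem is mounted at /app/data and the tape is /dev/rmt/0 (both placeholders):

    # Level 0 = full dump; the u flag records the dump date in /etc/dumpdates
    ufsdump 0uf /dev/rmt/0 /app/data
    # A later level 1 dumps only files changed since the last level 0
    ufsdump 1uf /dev/rmt/0 /app/data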