
Archiving backups

Hi All,

We have an application that runs all the time, and we can afford only very minimal downtime for backups. During that downtime we have to take a backup of the application's database-related files. It's not an RDBMS; it's a kind of flat-file database.

The directory is 18 GB and the application runs on SunOS. We use the tar and compress commands to archive, but this takes around 30 minutes. We want to minimise the archiving time. Is there a more efficient way to archive, or any other time-saving method?
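
For concreteness, the current pipeline is roughly the following, along with one possible faster variant. The paths and archive names are placeholders, and the availability of gzip on the SunOS box is an assumption:

    # Current approach (sketch): tar the tree and pipe it through compress.
    # /data/appdb and /backup are placeholder paths.
    tar cf - /data/appdb | compress > /backup/appdb.tar.Z

    # Possible variant: gzip at its fastest level typically compresses
    # faster than the old compress utility and still produces smaller output.
    tar cf - /data/appdb | gzip -1 > /backup/appdb.tar.gz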

Please help.

Cheers!
Asked by naga1979
2 Solutions
 
Duncan Meyers commented:
Can you tar the directory to a high-speed tape technology such as LTO-2? LTO-2 writes at around 1.2 GB/min, depending on how fast the server hardware is, the file mix (i.e. lots of small files take longer to write out than a few big files), the SCSI interface and so on.
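
If a tape drive is an option, something along these lines would stream the directory straight to tape. The device name, blocking factor and data path below are assumptions, so check /dev/rmt on your own system:

    # Sketch: write the tree directly to a no-rewind tape device.
    # A large blocking factor (126 x 512-byte blocks here) helps keep
    # a streaming drive fed; /dev/rmt/0n and /data/appdb are placeholders.
    tar cbf 126 /dev/rmt/0n /data/appdb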
 
alanclos commented:
Why not do image-level backups, which copy only the blocks of data that change (even with flat files)? Then build a "full" from previous fulls plus these incrementals/differentials.
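
On Solaris, the closest native equivalent of this scheme is level-based dumps. A sketch using ufsdump follows; note it works per changed file rather than per changed block, but it gives the same full-plus-incremental pattern. The device and filesystem paths are placeholders:

    # Level 0 (full) dump; the u flag records the dump date in /etc/dumpdates.
    ufsdump 0uf /dev/rmt/0n /data

    # A later level 1 dump picks up only what changed since the level 0,
    # so the downtime window shrinks to the size of the changes.
    ufsdump 1uf /dev/rmt/0n /data

On Solaris 8 and later, fssnap can also take a point-in-time UFS snapshot, so the application would only need to pause while the snapshot is created rather than for the whole dump.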