
Solved

Need to speed up Backup Exec when backing up small image files

Posted on 2009-07-10
Medium Priority
901 Views
Last Modified: 2013-12-01
We are using Backup Exec 12.5, backing up to disk on a data vault.  Most backup jobs move very quickly (1800 MB/min).  However, we store over a terabyte of image files (mostly TIFF) that are all under 200 KB.

When Backup Exec hits these folders it slows to a crawl.  Is there a way to speed up the backup of all of these small files?  Looking for any suggestions.

Thanks
Question by:keagle79
3 Comments
 
LVL 12

Expert Comment

by:Serge Fournier
ID: 24822407
Do you have many files in the same subdirectory?

Never put more than 500 files in a directory.

Beyond that, accessing those files slows to a crawl, even for Backup Exec.
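
For what it's worth, a quick sketch like this (Python; the root path and the 500-file threshold are placeholders based on this comment, not anything Backup Exec provides) will flag any folder over the limit:

import os

ROOT = r"D:\ScannedImages"   # hypothetical path -- point at your image store
THRESHOLD = 500              # rule-of-thumb limit from this comment

# Walk the tree and report any directory holding more files than the threshold.
for dirpath, dirnames, filenames in os.walk(ROOT):
    if len(filenames) > THRESHOLD:
        print(f"{dirpath}: {len(filenames)} files")

Run it against the share before the backup window; any directory it prints is a candidate for splitting.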
 

Author Comment

by:keagle79
ID: 24822438
No, actually they are split up into many directories with no more than 200 each.  They are scanned documents that are separated by date.
 
LVL 21

Accepted Solution

by:
SelfGovern earned 500 total points
ID: 24830265
Small files will always be a problem -- I have seen fast servers slow to under 10 MB/sec when sequentially reading a bunch of 10 KB files.  (There's a rough benchmark sketch after the list below if you want to see the effect yourself.)

Things to do:
- Put the files as close to the root directory as possible, without being in root, to minimize directory tree walking.
- Use the fastest possible source disk -- RAID 1+0 on a *good* *hardware* RAID controller
- Keep the disk defragmented (hopefully by a process that is not running during the backup!)
- Keep other applications from accessing this disk while backup is running, including background processes like indexing and defrag.
- If it's not a huge amount of data (i.e., tens to 100 GB, vs. 100s of GB to TB+), you could put this data on SSD (solid state disk), which will give you the fastest possible read times, since there's no physical movement involved in reads.  The challenge is that SSD has a high, but limited, number of write/erase cycles... so if these files are rewritten heavily, you might wear out SSD disks faster than you'd like.  (And remember to TURN OFF defragmentation processes on SSD!)
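
Here is that benchmark sketch (Python; every name in it is made up for illustration, and it writes ~400 MB of temporary data). Note that the OS cache will flatter both numbers since the files were just written; the gap is most dramatic against a cold cache on real spinning disks:

import os, time, tempfile

N, SIZE = 2000, 200 * 1024              # 2000 files of 200 KB, like the TIFFs
payload = os.urandom(SIZE)

with tempfile.TemporaryDirectory() as d:
    # Write N small files, then one large file of the same total size.
    small = [os.path.join(d, f"img{i}.tif") for i in range(N)]
    for p in small:
        with open(p, "wb") as f:
            f.write(payload)
    big = os.path.join(d, "big.bin")
    with open(big, "wb") as f:
        for _ in range(N):
            f.write(payload)

    # Time a sequential read of all the small files...
    t0 = time.perf_counter()
    for p in small:
        with open(p, "rb") as f:
            f.read()
    t1 = time.perf_counter()
    # ...then a sequential read of the single large file.
    with open(big, "rb") as f:
        while f.read(1024 * 1024):
            pass
    t2 = time.perf_counter()

    total_mb = N * SIZE / (1024 * 1024)
    print(f"{N} small files: {total_mb / (t1 - t0):.0f} MB/s")
    print(f"one large file: {total_mb / (t2 - t1):.0f} MB/s")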

If none of those help (and they may not help much; this small-file thing is a NTFS/FAT attribute), then the question is, "How often do you have to restore single files?"  If the answer is "Rarely" or "Never", then the best solution is to put the files on a disk (stripe set, probably) of their own, and perform an image backup.  An image backup will read the disk sectors sequentially, and can give much faster speeds than a file-by-file backup (which you're doing now) that has to read the directory, walk the tree, read one file, go back to root, read the directory, walk the tree, read one file....

The problem with image backup is that restores will take significantly longer... but if restores are rare and this is just for archive in case of disaster, then that is probably not a problem.  Note that, even if there is "some" other data on this disk that does need occasional single-file restores, you will still back that up as part of the disk image (which gets *everything*), but you can also back up those other files separately as part of a file-by-file backup (specify that particular directory in a standard backup).

If an image backup is not practical for some reason, you've got one other choice, which is to use a disk target, then move that to tape.  With D2D2T (Disk to Disk to Tape), you use disk as the first target of your backup job, which creates a huge single file that is mostly contiguous (and is your backup job in the same format as if it had been written to tape)... then step 2 is to use your backup application to copy that to physical tape.  Since it's coming from a huge file (hopefully close to root!), you can get good backup speeds to tape (but this will not improve the original backup speed, since source disk is the bottleneck).
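
As a minimal illustration of the aggregation principle behind D2D2T (not Backup Exec's actual on-disk format), packing the images into a single archive with Python's tarfile shows why one huge file streams so much faster than thousands of tiny ones. Paths here are placeholders:

import tarfile

SOURCE = r"D:\ScannedImages"         # hypothetical image store
STAGE = r"E:\Staging\images.tar"     # hypothetical D2D staging target

# "w" = uncompressed; scanned TIFFs compress poorly anyway, and skipping
# compression keeps the write sequential and fast.
with tarfile.open(STAGE, "w") as tar:
    tar.add(SOURCE, arcname="ScannedImages")
# The resulting single large file can be read (and copied to tape)
# sequentially, instead of one directory lookup per 200 KB image.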

If you're going to use D2D2T, the cheapest method is to use the backup application to create a D2D target on your server's hard disk.  Make sure it's big enough to hold the complete backup job.  The problems are that there is much more server overhead, you have to manage the space manually, and it's a server-by-server task, not something you can do for all servers easily.
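
Since that staging space has to be managed manually, a tiny pre-flight check along these lines (Python; the paths and sizes are placeholders, not values from this thread) can catch a too-small staging volume before the job starts:

import shutil

STAGE_VOLUME = "E:\\"           # hypothetical staging disk
JOB_SIZE_GB = 1200              # roughly the 1+ TB of TIFFs from the question

# Fail early if the staging volume can't hold the whole job plus headroom.
free_gb = shutil.disk_usage(STAGE_VOLUME).free / 1024**3
if free_gb < JOB_SIZE_GB * 1.1:         # keep ~10% headroom
    raise SystemExit(f"only {free_gb:.0f} GB free; job needs ~{JOB_SIZE_GB} GB")
print(f"OK: {free_gb:.0f} GB free on {STAGE_VOLUME}")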

Then the more expensive but much more scalable solution is to purchase a D2D backup system (a type of virtual tape library, or VTL), a typically Linux-based appliance that mimics multiple tape libraries and acts as a backup target for multiple servers at once.  The best VTLs allow you to perform some sort of automigration, where the D2D system itself can copy or move the data to physical tape, so you don't have to go back over the network.  Different VTLs are available; they can have either iSCSI (simple, free, decent performance) or Fibre Channel (more expensive, high performance) connectivity to your servers.

I'm pretty sure those are your options.  If you do look at VTLs or D2D backup systems, please consider the HP D2D2500 or D2D4000 series (see http://www.hp.com/go/d2d ).  Obligatory Disclaimer: Yes, I do work for HP -- but everything in this post up to this paragraph is as vendor-neutral as you can get.
