Solved

Storage Craft backup too large - comparison tool

Posted on 2013-12-09
530 Views
Last Modified: 2013-12-12
Have a Windows 2012 Standard 64-bit Server, with StorageCraft v5.14

If I run an incremental backup during the day, the file size is less than 1 GB. If the incremental backup runs at its 8 AM scheduled time, it is over 7.5 GB. I have picked over the event log for any conflicting processes and rescheduled them all. A defrag now runs AFTER the backup (if the drive is over 30% fragmented). If I compare the two different backups, the file and folder sizes are within 0.5 GB of each other.

Is there a comparison utility or tool out there that I can use to compare these .spi files (and/or .md5 files) to see why there is such a huge difference? The general consensus is that some process running at night is causing this.

Goal:  I want to bring the file size down, so it is feasible to use the ImageManager to transfer the .spi file offsite.
Question by:ITIExperts1
6 Comments
 
LVL 46

Accepted Solution

by:
noxcho earned 500 total points
ID: 39708505
There is no free or open tool that can do this. Maybe the StorageCraft engineers can give you a development kit for it.
As for the file size: is compression enabled on your server? Or encryption?
Your guess about defragmentation is correct. Are you sure it is disabled between two incremental runs?
Note that if you take a full, then defrag, then an incremental, the increment will be big. I think you know this.
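Absent a dedicated .spi diff tool, a rough do-it-yourself comparison can at least show *where* two backup files diverge. The sketch below is a hypothetical, format-agnostic Python script (it knows nothing about the StorageCraft .spi layout; it only reports which fixed-size blocks differ between any two files):

```python
import hashlib

BLOCK = 4096  # compare in 4 KiB blocks (arbitrary choice)

def block_hashes(path):
    """Return one MD5 digest per fixed-size block of the file."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(BLOCK):
            hashes.append(hashlib.md5(chunk).hexdigest())
    return hashes

def diff_blocks(path_a, path_b):
    """Return (indices of differing blocks, size difference in blocks)."""
    a, b = block_hashes(path_a), block_hashes(path_b)
    common = min(len(a), len(b))
    changed = [i for i in range(common) if a[i] != b[i]]
    return changed, abs(len(a) - len(b))
```

This will not explain *why* an increment grew, but a mostly-changed block map versus a sparsely-changed one is a quick hint that data was rewritten wholesale (as defrag does) rather than modified in a few places.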
 

Author Comment

by:ITIExperts1
ID: 39709989
Have gone back and forth with StorageCraft for a while now, which is why I am now seeking other help. Have done everything they requested.

The only file compression and encryption is being done per user, on individual files, and by StorageCraft.

The defrag is kicking off after the backup completes. That is, if you can believe the Server Event Log files.

The full was only taken in the very beginning.

Your help is appreciated, and hopefully someone else out there has come across this.
 
LVL 46

Expert Comment

by:noxcho
ID: 39710007
Can you keep Defrag off for 2 days and see if the size changes? Or have you done this already?
 

Author Closing Comment

by:ITIExperts1
ID: 39711254
Turning the Defrag off worked.  Thank you!
 
LVL 46

Expert Comment

by:noxcho
ID: 39711662
Thanks for the feedback. I was trying to understand whether you always take a new full backup after a defrag and only then an increment. If you take an increment and then a defrag, the next increment will always be bigger than it should be, no matter whether the defrag ran before or after it, because the increment compares the sectors that changed, and defrag changes the position of data on disk.
 
LVL 16

Expert Comment

by:gurutc
ID: 39713831
noxcho is right. Your backup software was looking at the drive bitmap, which changes when you defrag. Nice work, noxcho!

- gurutc
