

Using Windows XP Pro Backup

Posted on 2011-09-27
Medium Priority
Last Modified: 2012-05-12
I've been trying to set up a backup that will be scheduled to run at night.  It's of one folder going into one folder ... so that part's simple enough.
BUT, when I've set up the backup task I set it up to do an Incremental backup so that it will:
1) backup everything the first time
2) only back up new files thereafter

But, what it does instead is that it backs up the entire folder each time and I get this HUGE backup space growing daily.

How do I get what I want?
Question by:Fred Marshall
LVL 93

Expert Comment

ID: 36714894
Use SyncBack - fast and good: www.2brightsparks.com/SyncBack 
LVL 26

Author Comment

by:Fred Marshall
ID: 36897637
I meant *with* Windows XP Pro Backup ....  Surely it must be possible.
LVL 22

Expert Comment

ID: 36908156
You need 2 jobs.  1 job to do the First Full Backup.  2nd job to do the Incremental backups.

This will guarantee what you want to happen.

1.  Do a manual Full (Normal) backup (to ensure you really have a Full backup and to also clear any file archive attributes).  By manual, I mean just make a backup job and run it.

2.  For the Incremental backup, at the Scheduled Jobs tab, click Add Job, select your source folder and destination file, choose the Incremental method and also the "Append this backup..." mode.

That should be it.  You will have a Full (Normal) backup that you made manually.  Let's call that file:  FullBackup.bkf
And you should have the Incremental backups in some file called (for example):  DailyIncrBackup.bkf

The incremental backup file will contain individual backups for each time it runs (they will append/add to the file).


LVL 25

Assisted Solution

dew_associates earned 300 total points
ID: 36908718
First, I recommend that you read this great article at Microsoft on the backup utility and proper installation of Ntbackup.msi.


Second, the following document fully explains how to schedule/automate the backup process on Windows XP

LVL 26

Author Comment

by:Fred Marshall
ID: 36911142
dew_associates:  Thank you for the links.  At least now I have better insight into what to expect.  
From what I've read, it appears that the Windows backup won't do what I want.  Here are the specs:
1) Do a full (Normal) backup the first time.  (Any backup can do that).
2) Do an Incremental backup thereafter.
3) Don't use more space than necessary for the entire set of backups.
4) Avoid having to rummage around to do a restore.
5) Save files in native format.

With Windows backup it appears that 3 and 4 are mutually exclusive.
It appears that Incremental backup will force dealing with 4.
With Windows backup it appears that 5 isn't possible.  Is that right or did I miss a setting to "not compress" or ..... ?

chakko:  So, I guess one might do a single Normal backup and then schedule an Incremental backup?  Is that what you mean?
LVL 25

Expert Comment

ID: 36911241
I think maybe you have missed something in the explanation of the backup settings.

Indeed, while you may have to do the first full backup by creating the settings and running the backup utility yourself, after that you just set up the backup job and then set up the scheduler to run it.

Thus, your #1 and #2 are covered.

Your #3 cannot be done, as adding incremental backups will cause the backup space to grow incrementally.

#4 is covered by scheduling the backup as explained.

#5 - I'm not sure why you seem to think that compression changes the native format, as it does not.
LVL 26

Author Comment

by:Fred Marshall
ID: 36911355

The #3 objective is to not use more space than *necessary*.  It is necessary to add new files - so that kind of incrementalism is fine.  But creating multiple copies of the same file is what I'm trying to avoid by this.  And, this is what the Windows backup seemed to be doing in Incremental mode when I tried it!  Specifically I was backing up a 29GB folder and was getting a single daily backup file that grew 29GB per day while the actual data added each day was way less than that.

The #4 objective is not met according to the Microsoft paper on the subject.  It says clearly that restoration is going to be a chore.  So, either that's correct or it isn't.  I've never done it so I'm asking.

A compressed format is definitely not native.  If I see .zip (or whatever) then I don't see .doc and .xls, etc.  I need an *app* to make the underlying files visible and accessible.  Now, I grant that a Windows backup or a .zip would likely be so readily available that this isn't likely a huge issue.  But I still prefer native format.  Compression is only useful when it's really needed and can be a nuisance when it's not needed.  An example of why this objective matters:
Let's say we use Backup My PC.  The resulting backup (as I recall) is in a proprietary format.  Let's say the backed-up computer dies.  Backup My PC dies with it.  Now we have to get an installation of Backup My PC (which may no longer exist, or the file format may no longer be supported, etc.).  I want to avoid that kind of step.  So *native* format please.  That said, I guess I would back off if the format were quite universal like .zip or .7z or .....
LVL 25

Expert Comment

ID: 36911493
The first issue you mention is a settings issue. You need to be particularly careful in your selections so that only files that have changed since the last increment are included.

Indeed, making a full backup followed by incremental backups will create a chore should you need to restore your system. This was your original request, although not a recommended procedure today given the large number of inexpensive USB backup drives.

In all honesty, the best way to handle the issue would be with two inexpensive drives: do full backups and alternate the drives, verifying each backup as it completes. Why? If one backup is corrupted for any reason, you at least have the previous one to work with, and at worst you lose only one day's work.

Last but not least, while the backup compression does append a form of zip to the file, it remains readable within the Windows environment. Backup My PC is not what I would use.

If you are working with critical documents, and/or your time is important to you, I would follow another approach entirely. I would run a pair of drives in a basic RAID 1 configuration and then schedule regular backups as a safeguard. The mirrored drives will forestall most major events other than maybe total corruption via a virus, etc.
LVL 22

Assisted Solution

chakko earned 300 total points
ID: 36914437
If you schedule the Incrementals then it will back up (per your schedule) any files which have been changed since the previous incremental.
If you edit a file every day, then that file will be in the backup every time the backup runs.  You will get many copies of that file in the backup (since we selected the Append option for the Incremental backups, the file you edit daily will be in every Incremental backup within the .bkf file).

What you want sounds more like a file sync program.  Take a look at Microsoft SyncToy (free) to copy your folder/files (for example).

There are many file/folder sync programs.  It will better accomplish what you want.  The files will stay in their original format (doc, xls, pdf, etc) as individual files.  Only 1 copy will be kept in the backup (copy to) location.  You should look for any option such as 'mirror' and be careful of that.  You probably don't want any option that will delete the destination side if the source side was deleted.

Sounds like you only need:    copy from source (left side) to destination (right side)

Most will have the feature that only changed files are copied (so it is faster on subsequent uses), which is similar to an incremental backup.


LVL 24

Accepted Solution

yo_bee earned 400 total points
ID: 36914473
It sounds like you are looking to make a complete copy of a certain root directory that is accessible without having to do a restore job.

I use Robocopy for these types of jobs. It is a free download for XP (part of the Windows Server 2003 Resource Kit Tools).


Here is the command line you need to run

Robocopy "source" "dest" /e /xo /w:1 /r:1 /z /log:filename.txt /NP /tee

/e = copy subdirectories, including empty ones (recursive)
/xo = exclude older: skip source files that are the same age as or older than the destination copy (meaning if a file has not changed, it is skipped)
/z = copy in restartable mode (an interrupted copy can resume)
/w:n = wait time when a file is not accessible or locked; :n is the number of seconds Robocopy waits between retries
/r:n = number of retries after the wait time is reached; :n is the retry count
/log:filename.txt = write output to a log file; optional
/np = no progress percentage (recommended when logging, to prevent extremely large logs)
/tee = show output on the console while also writing to the log
LVL 24

Expert Comment

ID: 36914478
To schedule it, create a scheduled task that runs a similar command.
LVL 26

Author Closing Comment

by:Fred Marshall
ID: 37001410
Thanks all

Question has a verified solution.
