Best method to migrate local data to external SCSI array?

We simply need to migrate (move) data from local drives on a server to an external array connected to the same server. The total amount of data is approximately 600 GB. What is the best, comprehensive method to move the data to the SCSI-connected array while retaining security, integrity, etc.?

I suppose XCOPY is an option, but there is likely a better solution. Also, are there any pitfalls to using XCOPY?

Thanks for any insight!
davisAsked:
dooleydogCommented:
Either XCOPY or ROBOCOPY should do the trick.
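With XCOPY, something along these lines should carry the NTFS security over with the files (the paths are just placeholders):

    xcopy D:\Data F:\Data /E /C /H /K /O /X /Y

/E takes subdirectories (including empty ones), /C keeps going on errors, /H grabs hidden and system files, /K keeps the read-only attribute, /O copies ownership and ACLs, /X copies auditing settings, and /Y suppresses the overwrite prompts.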

Remember, if you are moving a shared folder, it will not be shared after the move; you will have to re-share it.

Good Luck,
 
Lee W, MVP (Technology and Business Process Advisor) Commented:
Do you have a block of HOURS of downtime to do this?  If so, then XCOPY or ROBOCOPY should do fine.  Many people like ROBOCOPY because it can retry files, but I dislike it because the retry is sequential: if you set it to retry 10 times, it retries the same file 10 times in a row and only then moves on to the next file.  The retry would be a LOT better if it "marked the file" for retry and came back to it AFTER everything else was backed up; that way the file would have time to be closed.
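For reference, the retry behavior is controlled by the /R and /W switches, something like this (paths are placeholders):

    robocopy D:\Data F:\Data /E /COPYALL /R:10 /W:30

/R:10 retries each failed file 10 times and /W:30 waits 30 seconds between attempts, so a single stubborn open file can stall the copy for several minutes before ROBOCOPY moves on.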

If you DO NOT have a large block of time to do this in, I would do a backup (most likely to tape), then restore it to the new drive.  Then take the server offline, do a differential backup, and restore that.  The differential should be small and quick, so the total downtime would be minimal.  As a side benefit, you get to test your backups and know they are OK.
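A rough sketch of the two backup passes with the built-in NTBACKUP (job and file names here are made up for illustration; the restores are then run through the NTBACKUP GUI, since its command line only does backups):

    ntbackup backup D:\ /j "Full pass" /f "E:\full.bkf" /m normal
    rem ...restore the full set to the array, take the server offline, then:
    ntbackup backup D:\ /j "Diff pass" /f "E:\diff.bkf" /m differential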
 
Brick-TamlandCommented:
If the data is on a separate partition from the OS, you can just move the whole partition; that way the only thing that changes is the hardware. I like using Partition Commander for this.
 
NJComputerNetworksCommented:
"Do you have a block of HOURS of downtime to do this?  If so, then XCOPY or ROBOCOPY should do fine.  Many people like Robocopy because it can retry files, but I dislike it because the retry is sequential, meaning if you set it to retry 10 times, it retries 10 times then moves on to the next file.  This retry would be a LOT better if it "marked the file" for retry and tried AFTER it backed up everything else - in that way it would give time for the file to be closed."

Actually, you can use the /MIR switch with Robocopy.  Run Robocopy with /MIR to copy the data to the new location during the day; with hundreds of GB, this first pass will take a long time.  Later, off hours, re-run the same Robocopy command with /MIR.  In a much shorter time frame, Robocopy will sync the target directory with the source (deleting items in the target that users deleted during the day, and adding new or changed files).  This technique lets you pre-stage your data and then quickly sync it at cutover; the two passes look like the sketch below.  (Note: before the second /MIR run, it is best to lock users out, which means no locked files.  You can do this by changing permissions on the share.)
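Something along these lines, with placeholder paths and a log file added for good measure:

    rem First pass, during the day (pre-stage the bulk of the data)
    robocopy D:\Data F:\Data /MIR /COPYALL /R:1 /W:1 /LOG:C:\robo-pass1.log

    rem Second pass, off hours with users locked out (quick final sync)
    robocopy D:\Data F:\Data /MIR /COPYALL /R:1 /W:1 /LOG+:C:\robo-pass2.log

/MIR mirrors the whole tree (including deletions), /COPYALL brings the security info along, and the low /R and /W values keep retries from dragging out the run.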

 
Lee W, MVP (Technology and Business Process Advisor) Commented:
Fair enough, but I still think the retry as it is implemented is not very good.
 
NJComputerNetworksCommented:
With this much data, I would probably go with the backup method you mention anyway...
 
davisAuthor Commented:
Amazing feedback! Got the answer and plenty more helpful info...

Thanks!