Solved

Best method to migrate local data to external SCSI array?

Posted on 2006-06-08
Medium Priority
279 Views
Last Modified: 2010-04-18
We simply need to migrate (move) data from local drives on a server to an external array (connected to the same server). The total amount of data is approx 600GB. What is the best, most comprehensive method to move the data to the SCSI-connected array while retaining security, integrity, etc.?

I suppose XCOPY is an option, but there is likely a better solution. Also, are there any pitfalls to using XCOPY?

thanks for any insight!
Question by:davis
7 Comments
 
LVL 9

Accepted Solution

by:dooleydog
dooleydog earned 200 total points
ID: 16860881
XCOPY or Robocopy - either should do the trick.

Remember, if you are moving a shared folder, it will not be shared after the move, you will have to re-share it.
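
For example (just a rough sketch - D:\Data and E:\Data are placeholder paths, so substitute your own source and target), either of these should carry the NTFS permissions and attributes across:

rem XCOPY: /E = subfolders (including empty ones), /H = hidden/system files,
rem /K = keep attributes, /O = copy ownership and ACLs, /X = audit settings
xcopy D:\Data E:\Data /E /H /K /O /X /Y

rem Robocopy (from the Server 2003 Resource Kit): /COPYALL = data, attributes,
rem timestamps, security, owner and auditing info; short /R and /W values keep
rem retries on locked files from stalling the copy
robocopy D:\Data E:\Data /E /COPYALL /R:1 /W:1 /LOG:C:\robocopy.log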

Good Luck,
 
LVL 97

Assisted Solution

by:Lee W, MVP
Lee W, MVP earned 200 total points
ID: 16860907
Do you have a block of HOURS of downtime to do this?  If so, then XCOPY or ROBOCOPY should do fine.  Many people like Robocopy because it can retry files, but I dislike it because the retry is sequential, meaning if you set it to retry 10 times, it retries 10 times then moves on to the next file.  This retry would be a LOT better if it "marked the file" for retry and tried AFTER it backed up everything else - in that way it would give time for the file to be closed.

If you DO NOT have a large block of time to do this in, I would do a backup (most likely to tape), then restore it to the new drive.  Then, take the server offline and do a differential backup - then restore that.  The differential should be small and quick - so the total downtime would be minimal.  As a side benefit, you get to test your backups and know they are ok.
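
On Server 2003 the two backup passes could look something like this (a sketch only - the paths and job names are made up, and it backs up to a .bkf file rather than tape; a tape target would use the /T and /P switches instead. Also note that NTBACKUP can only back up from the command line, so the restores have to be done through its GUI):

rem Full backup of the data drive while the server is still online
ntbackup backup D:\ /J "Data full" /F "E:\Backups\data-full.bkf" /M normal

rem After restoring the full set to the array and taking the data offline,
rem capture only what changed since the full backup, then restore that too
ntbackup backup D:\ /J "Data diff" /F "E:\Backups\data-diff.bkf" /M differential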
 
LVL 3

Assisted Solution

by:Brick-Tamland
Brick-Tamland earned 100 total points
ID: 16861061
If the data is on a separate partition from the OS, you can just move the whole partition. That way the only thing that changes is the hardware. I like using Partition Commander for this.
 
LVL 33

Expert Comment

by:NJComputerNetworks
ID: 16861698
"Do you have a block of HOURS of downtime to do this?  If so, then XCOPY or ROBOCOPY should do fine.  Many people like Robocopy because it can retry files, but I dislike it because the retry is sequential, meaning if you set it to retry 10 times, it retries 10 times then moves on to the next file.  This retry would be a LOT better if it "marked the file" for retry and tried AFTER it backed up everything else - in that way it would give time for the file to be closed."

Actually, you can use the /MIR switch with Robocopy. Use /MIR to copy the data to the new location during the day; this first copy will take a long time if you have hundreds of GB of data. Then later, during off hours, re-run the same Robocopy command with the /MIR switch. In a much shorter time frame, Robocopy will sync the target directory with the source (deleting items in the target that were deleted by users during the day, and adding new or changed files). This technique allows you to prestage your data and then quickly sync it at a later time. (Note: before the second run of Robocopy with the /MIR switch, it is best to lock users out so there are no locked files. You can do this by changing permissions on the share.)
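
A rough sketch of that two-pass approach (the paths are placeholders):

rem Pass 1 (during the day): prestage everything; short retry/wait values so
rem locked files don't stall the run
robocopy D:\Data E:\Data /MIR /COPYALL /R:1 /W:1 /LOG:C:\prestage.log

rem Pass 2 (off hours, after locking users out of the share): re-run the same
rem command; only new/changed files are copied, and files deleted from the
rem source are removed from the target
robocopy D:\Data E:\Data /MIR /COPYALL /R:1 /W:1 /LOG:C:\final-sync.log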

 
LVL 97

Expert Comment

by:Lee W, MVP
ID: 16861731
Fair enough - but I still think the retry, as it is currently implemented, is not very good.
 
LVL 33

Expert Comment

by:NJComputerNetworks
ID: 16861808
With this much data, I would probably choose to do the backup method you mention anyway...
 
LVL 1

Author Comment

by:davis
ID: 16862969
Amazing feedback!  Got the answer and plenty more helpful info...

thanks
