Transferring files


I need to transfer a large amount of data from one server to another. Some of this data is in users' home drives, which they connect to over the network. It is a large enough amount of data that the move will take a while, and I can see an issue coming: I cannot lock the users out of their data for very long, and if a user changes a file in that folder while the move is in progress, the change will not be carried over to the new server.

Is there an easy way to move this data without causing significant downtime for the users?

Thank You
tigermatt commented:

Have you considered using DFS Replication to replicate the data between the servers?

The process might be a little slower, but because replication is in use rather than a one-time move or copy operation, you can be sure that any changes made during the replication process will be replicated to the new destination too.

Then, at some point in the future (out of hours) you can disable the replication and move all users to the share on the new server - if that is your intention.

Take a look at Microsoft's DFS Replication documentation for further information.
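On Windows Server 2012 or later the replication group can be created from PowerShell with the DFSR module; a rough sketch of the idea, where the group, folder, server names and paths below are all placeholders you would substitute for your own:

New-DfsReplicationGroup -GroupName "HomeDriveMove"
New-DfsReplicatedFolder -GroupName "HomeDriveMove" -FolderName "Files"
Add-DfsrMember -GroupName "HomeDriveMove" -ComputerName "OLDSERVER","NEWSERVER"
Add-DfsrConnection -GroupName "HomeDriveMove" -SourceComputerName "OLDSERVER" -DestinationComputerName "NEWSERVER"
Set-DfsrMembership -GroupName "HomeDriveMove" -FolderName "Files" -ComputerName "OLDSERVER" -ContentPath "D:\Files" -PrimaryMember $true
Set-DfsrMembership -GroupName "HomeDriveMove" -FolderName "Files" -ComputerName "NEWSERVER" -ContentPath "D:\Files"

Marking the old server as the primary member ensures its copy of the data wins during the initial sync. On Server 2003/2008 the same setup is done through the DFS Management console or the dfsradmin tool instead.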

You can always copy the files now and then, at a later date, use XCOPY, Robocopy, SyncToy or SyncBack to synchronise the data; the synchronisation will only re-copy files that have changed.
e.g. using XCOPY:

XCOPY \\Oldserver\files\*.* \\NewServer\Files /M

/M copies only files with the archive attribute set, and clears the attribute after copying, so each subsequent run picks up just the files that are new or have been modified since the previous run.
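Robocopy (included with Vista/Server 2008, or available in the Resource Kit for older versions) is generally more robust than XCOPY for large jobs. A sketch using the same placeholder paths as above:

ROBOCOPY \\OldServer\Files \\NewServer\Files /E /COPY:DATS /R:2 /W:5 /LOG:C:\Logs\sync.log

/E copies subdirectories (including empty ones), /COPY:DATS also copies NTFS security permissions (important for home drives), /R and /W limit retries on locked files, and /LOG records the run. Robocopy skips files that are already identical at the destination, so repeated runs only transfer what has changed.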
Can you script it and have it run overnight when no one is around? Another option is FastSCP, a fast, secure copy tool for transfers between servers. There is a free version from Veeam that I use to move lots of data.
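An overnight run can be scheduled with the built-in Task Scheduler; the script path, task name and start time here are only examples:

SCHTASKS /Create /TN "NightlySync" /TR "C:\Scripts\sync.cmd" /SC DAILY /ST 02:00 /RU SYSTEM

where sync.cmd would contain your XCOPY or Robocopy command. Running as SYSTEM (or a service account with rights to both shares) avoids the task failing when no one is logged on.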