Process/Update files in a folder even when in use

You may already have needed to update a whole folder structure using a script. Robocopy does the job well and can even provide a list of non-updated files in a log (if asked to): generally those files that were locked by a user or a process at the time the update script ran.

When only a few files are in use, you usually go to the "Manage open files" GUI on the server that provides the shared disk to force the files closed and re-run your script.

Now you may have many such files or, even worse, you may want to schedule your script during the night, and those locked files prevent it. The OPENFILES /disconnect command immediately comes to mind, but it does not let you close accessed files or folders recursively. You either close one file or one folder at a time, OR you disconnect by username, in which case you no longer control which folder you target.
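For reference, this is what OPENFILES offers out of the box; the server name, session ID and username below are made-up examples:

ECHO REM List the files currently open on the server (the IDs shown are
ECHO REM what /disconnect /id expects)
OPENFILES /query /s myservername

ECHO REM Close ONE open file by its ID (26843546 is a made-up example)
OPENFILES /disconnect /s myservername /id 26843546

ECHO REM Or close everything a given user has open, regardless of folder
OPENFILES /disconnect /s myservername /a someusername

Neither form lets you say "close everything under this folder, recursively", which is exactly the gap the FOR loops in the script below fill.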

I looked for a decent workaround on the Internet and found nothing but complicated or manual procedures.
So I came up with my own piece of batch code that fits in just a few lines. Clean, simple, robust.

After defining your "source" and "destination" variables, %source% and %dest% respectively, just insert this piece of code into your script, where "myservername" stands for the name of the server that holds the shared folder or disk. We assume that D:\ is the data drive on this server and that "sharename" is the name of your shared path.
ECHO Closing open shared files and folders from destination folder
PushD "%dest%\"
FOR /R %%G IN (*) DO (OPENFILES /disconnect /s myservername /a * /op "D:\sharename%%~pnxG")
FOR /D /R %%G IN (*) DO (OPENFILES /disconnect /s myservername /a * /op "D:\sharename%%~pnxG")
PopD
ECHO Copying updated files from source to destination
ROBOCOPY "%source%" "%dest%" *.* /LOG:"%dest%\robocopylog.txt"


Explanation:

  • The PushD command changes the current directory to the destination folder.
  • The first FOR loop runs recursively through all the files in the destination folder and releases the locks on them.
  • The second FOR loop does the same for the folders themselves, in case a folder holds a lock.
  • PopD restores the current directory to the original location.
  • Now you can run ROBOCOPY with no fear that a locked file will stop your update.

One limitation, though, is that the two FOR loops take some time to run (about a minute for a few thousand files in my case, but beware if you have a big set of files), and the copy is only launched after that. So while the FOR loops run, a user may lock a freshly unlocked file again. This solution is therefore better suited to nightly scripts, or any time range when no user is working with the files, run through the Windows Task Scheduler.

One solution to that situation is to run the disconnection part of the script in a parallel loop (i.e. a separate script), so that the ROBOCOPY command, if configured in the second script to retry a failed copy, will keep retrying a locked file until the looping disconnection script frees it. That would be too cumbersome.
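If you wanted to try that parallel approach anyway, the retry behaviour would come from ROBOCOPY's /R and /W switches; the values here are only examples:

REM Retry each failed file up to 60 times, waiting 10 seconds between
REM attempts (at most 10 minutes per locked file). The defaults are
REM /R:1000000 /W:30, which can hang a script for a very long time
REM on a single locked file.
ROBOCOPY "%source%" "%dest%" *.* /R:60 /W:10 /LOG:"%dest%\robocopylog.txt"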

Another solution would be to merge the copy into the disconnect loop, but then you have to reference each file from the server's point of view (for the OPENFILES command) AND from the client's point of view (for the ROBOCOPY command) at the same time. That would also be too difficult to implement. Moreover, the very useful LOG switch that comes with ROBOCOPY becomes useless, and you would have to create your own log.

My solution takes a different approach. The script I run must have elevated privileges, something I could not find a way to obtain through a scheduled task in my environment, and I did not want to log in as administrator, for obvious security reasons. So I wrote a small batch script that acts as a launcher. It delays the execution of my main script, but not too much, so that it still runs before our nightly backup. I run the launcher, with an argument that gives the name of my update script, in a Command window with elevated privileges. It looks like this:
@ECHO OFF

REM Time at which the main script should start (must be before midnight)
SET hour=22
SET min=59
SET sec=00

IF "%1" == "" ECHO No Parameter && EXIT /b

:Label1
TIMEOUT 1 > nul

REM Have we reached the planned execution time yet?
if %time:~0,2% GEQ %hour% (if %time:~3,2% GEQ %min% (if %time:~6,2% GEQ %sec% (GOTO Label2)))

REM Strip a leading zero so SET /a does not read "08" or "09" as octal
IF %time:~6,1%==0 (SET /a secr=%time:~7,1%) ELSE (SET /a secr=%time:~6,2%)
IF %time:~3,1%==0 (SET /a minr=%time:~4,1%) ELSE (SET /a minr=%time:~3,2%)

SET /a hourr=%time:~0,2%
REM Remaining time in seconds, then split into hours, minutes, seconds
SET /a secs=(%sec%-%secr%)+(%min%-%minr%)*60+(%hour%-%hourr%)*3600
SET /a remh=%secs%/3600 %% 24
SET /a sect=(%secs%-(%remh%*3600))
SET /a remm=%sect%/60
SET /a sect=(%sect%-(%remm%*60))
SET /a rems=%sect%

REM Clear first, then echo, so the countdown stays visible during the pause
CLS
echo Remaining time to execution: %remh%h%remm%m%rems%s
GOTO Label1

:Label2
ECHO It's time to execute the scheduled script: %time%
ECHO Executing %1 ...
\\myserver\myshare\path\%1


Explanation:

  • We first set the time at which we want the script to start (it must be before midnight).
  • At the end of the file, we adapt the code to point to our real server name, share name and path to the script to be executed.
  • We check that a script file name was passed as an argument.
  • We pause for one second, then check whether we have passed the planned execution time; if so, we jump to the label that executes the script passed as an argument.
  • We calculate the remaining time in seconds and convert it to HH:MM:SS.
  • We loop.

We could refine this script further to pass the execution time and the path to the script as arguments too, but that would be another subject: the scope of this article is closing accessed files rather than batch scripting skills.
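To see the remaining-time arithmetic at work, here is the same SET /a sequence with sample values (the current time of 20:30:15 is made up for illustration, against the 22:59:00 target):

REM Target 22:59:00, current time 20:30:15
SET /a secs=(00-15)+(59-30)*60+(22-20)*3600
REM   secs = -15 + 1740 + 7200 = 8925
SET /a remh=8925/3600 %% 24
REM   remh = 2 hours
SET /a sect=8925-(2*3600)
REM   sect = 1725 seconds left over
SET /a remm=1725/60
REM   remm = 28 minutes
SET /a rems=1725-(28*60)
REM   rems = 45 seconds
REM Displayed as: Remaining time to execution: 2h28m45s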

A limitation of this whole approach is that a running application using an updated file may crash when the file is freed, or when the user starts using the application again. So be a kind IT admin and send an email to users when you update their files :)
For information, Windows File Explorer puts a lock on any folder a user currently has on display, and if that folder is freed, Windows will lock it again after a few seconds. Fortunately, that does not prevent the copying or renaming of files and folders.

Conclusion: we can easily work around the missing feature of the OPENFILES command, which does not provide a switch to recursively close all open files in a specified folder. With a bit of work, we can even build a practical script that merges that missing function with a routine to recursively update the files in that same folder. Thanks to ROBOCOPY, we even get a full log of the update process.

I hope this article helps some IT admins keep their update scripts simple and functional.