SHardy
asked on
Compressing Files via VBScript
Hello,
I have come across the need to compress some SQL backups as they are made. As such, I needed to create a script that will find the appropriate files, zip them & then delete the original file.
I found the following which seemed to help:
http://groups.google.com/group/microsoft.public.scripting.vbscript/browse_thread/thread/036c926787e64221
I had a completed script, calling the function below, which seemed to be working perfectly. Until, that is, I swapped the test .bak files with something a little bigger in size (from a few bytes to approx 6MB). Now I get the following error:
Error: Object required: 'shl.namespace(...)'
Code: 800A01A8
If I put a MsgBox just before the Do...Loop, I can see the compression dialog, and if I wait for this to finish before pressing OK, then there is no error.
As such, I need to find a different way to test that the compression is complete. This is where I have now come up against a brick wall.
Is anyone able to suggest an alternative method for checking that this has completed before letting the code move on?
For your information, both strSource & strTarget are provided as filenames with full paths. strTarget being the zip file.
Any help would be greatly appreciated.
Function ZipFile(strSource, strTarget)
    Const ForReading = 1, ForWriting = 2, ForAppending = 8
    ' fso is a Scripting.FileSystemObject created by the calling script
    ' Write the empty-zip file header
    Set file = fso.OpenTextFile(strTarget, ForWriting, True)
    file.Write "PK" & Chr(5) & Chr(6) & String(18, Chr(0))
    file.Close
    ' Copy the source file into the zip file (CopyHere is asynchronous)
    Set shl = CreateObject("Shell.Application")
    shl.NameSpace(strTarget).CopyHere(strSource)
    ' Wait for the compression to finish
    Do Until shl.NameSpace(strTarget).Items.Count = 1
        WScript.Sleep 100
    Loop
    Set shl = Nothing
End Function
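One alternative to counting items through Shell.NameSpace (which can fail with "Object required" while Explorer is still busy writing the archive) is to poll the target file's size until it stops growing. The same idea translates directly to VBScript with FileSystemObject's GetFile plus WScript.Sleep. A minimal sketch of the polling logic in Python — the helper name and parameters are hypothetical, purely to illustrate the approach:

```python
import os
import time

def wait_until_stable(path, interval=1.0, checks=3, timeout=600):
    """Wait until `path` exists and its size stops changing.

    Returns True once the size has been identical for `checks`
    consecutive polls, False if `timeout` seconds elapse first.
    """
    deadline = time.monotonic() + timeout
    last_size = -1
    stable = 0
    while time.monotonic() < deadline:
        try:
            size = os.path.getsize(path)
        except OSError:
            size = -1  # file not created yet (or briefly inaccessible)
        if size == last_size and size >= 0:
            stable += 1
            if stable >= checks:
                return True
        else:
            stable = 0
            last_size = size
        time.sleep(interval)
    return False
```

The stable-size check is a heuristic — a very slow writer could pause longer than `checks * interval` — so the interval should be generous for multi-gigabyte archives.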
ASKER
Oh. Spoke too soon.
Seemed to work on the 1st group of files I tested it against. However, I am getting the same error again now, after having run it a few times. I have even increased the sleep to 3 seconds and then to 5 seconds, but without any luck.
I am a bit confused why it worked the 1st time, but then started to fail on the same files.
ASKER
This is purely guesswork, but when adding a file into a zip folder, is it given some meaningless name until the compression is completed? Is it possible to look in the zip file/folder and check the name of the item/file compared against strSource? If possible, would this be a better way to check for completion?
ASKER
Sorry, I jumped the gun a bit at closing this off. Could you possibly re-instate it as open? Thanks.
Hi there
Sorry, was away from the computer a while. Is it possible to post the whole code? Then I can run it to see what is happening for you. Are you using a 3rd-party zip tool, or...?
Regards
Krystian
ASKER
Hi,
I am not using a 3rd-party zip tool. As this is to run on one of our servers, it is preferred to use tools that are available natively. Full code is below. However, I "think" (again) that I "may" have found the issue...
I tried manually creating a zip/compressed folder & copying the largest backup file into there. I then received a message saying that the resulting zip folder would be too big. A quick search told me that the maximum size is 2GB. Presumably this limit relates to the final zip file size? The backup file currently stands at approx 28GB. Based upon the compression rate achieved on the other backup files, this would probably be compressed to about 4GB.
So, after all this, it looks like it won't be possible through the standard Windows compression tool. So I am after an alternative. This would preferably be free, but commercial would obviously be considered. It MUST be usable via command line and VBScript. There must be no size limit, or at least a reasonably large size limit (source file of 30GB+). I would NOT want to split the archive into multiple files; for manageability, I would want single-file archives. Any suggestions?
Thanks
Dim fso, vPath, oFolder, vFullPath, oSubFolder, oFiles, vSource, vTarget

vPath = "F:\Backup\"
Set fso = CreateObject("Scripting.FileSystemObject")
Set oFolder = fso.GetFolder(vPath)

' Skip the system database folders; zip today's .bak files in the rest
For Each SubFolder In oFolder.SubFolders
    If SubFolder.Name <> "master" And SubFolder.Name <> "model" And SubFolder.Name <> "msdb" Then
        vFullPath = vPath & SubFolder.Name & "\"
        Set oSubFolder = fso.GetFolder(vFullPath)
        Set oFiles = oSubFolder.Files
        For Each oFile In oFiles
            If DateValue(oFile.DateLastModified) = DateValue(Date) And Right(oFile.Name, 3) = "bak" Then
                vSource = vFullPath & oFile.Name
                vTarget = Replace(vSource, ".bak", ".zip")
                ZipFile vSource, vTarget
            End If
        Next
    End If
Next

Set oFiles = Nothing
Set oSubFolder = Nothing
Set oFolder = Nothing
Set fso = Nothing
WScript.Quit
Function ZipFile(strSource, strTarget)
    Const ForReading = 1, ForWriting = 2, ForAppending = 8
    ' Write the empty-zip file header
    Set file = fso.OpenTextFile(strTarget, ForWriting, True)
    file.Write "PK" & Chr(5) & Chr(6) & String(18, Chr(0))
    file.Close
    ' Copy the source file into the zip file (CopyHere is asynchronous)
    Set shl = CreateObject("Shell.Application")
    shl.NameSpace(strTarget).CopyHere(strSource)
    ' Wait for the compression to finish
    Do Until shl.NameSpace(strTarget).Items.Count = 1
        WScript.Sleep 1000
    Loop
    Set shl = Nothing
End Function
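For context on the size ceiling discussed above: the ~2GB limit belongs to the Windows compressed-folders Shell implementation rather than to the zip format itself, which, with ZIP64 extensions, can hold far larger archives and members. As an illustration only — not a drop-in replacement for the VBScript above — Python's standard zipfile module creates such archives when ZIP64 is allowed (the function name here is hypothetical):

```python
import os
import zipfile

def zip_large_file(source, target):
    """Create a zip archive `target` containing `source`.

    allowZip64=True lets the library emit ZIP64 extensions when a
    member or the archive grows past the classic format's limits.
    """
    with zipfile.ZipFile(target, "w",
                         compression=zipfile.ZIP_DEFLATED,
                         allowZip64=True) as zf:
        # Store just the file name inside the archive, not the full path
        zf.write(source, arcname=os.path.basename(source))
```

Any tool reading the result must also understand ZIP64; older unzip utilities may reject archives over 4GB even though they are valid.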
ASKER
Hi,
Thanks for the link. I will take a look at that.
I am currently carrying out tests with 7-Zip (http://www.7-zip.org/). This is an open-source compression tool with a command-line option. In fact, it is possible to use it without installing, by using a standalone exe for command-line zipping. However, I think (if all my testing goes OK) I would still have to install it; otherwise, un-compressing, if ever needed, becomes a job in itself.
I will update this thread when I have reached any conclusions.
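For anyone following along, the basic 7-Zip command line for creating a zip archive is `7za a -tzip target.zip source.bak`: `a` adds files to an archive, and `-tzip` forces zip output instead of the default .7z format (worth double-checking against the docs for your 7za version). A small sketch of assembling that command — the exe path is hypothetical:

```python
# Hypothetical install location -- adjust for your environment.
SEVEN_ZIP = r"C:\Tools\7za.exe"

def build_7za_command(source, target):
    """Build the 7-Zip command line as an argument list:
    'a' adds to an archive; '-tzip' selects zip format output."""
    return [SEVEN_ZIP, "a", "-tzip", target, source]

# From VBScript the equivalent call could be made synchronously with
# WScript.Shell, e.g.:
#   shl.Run """C:\Tools\7za.exe"" a -tzip target.zip source.bak", 0, True
# The final True makes Run wait for 7za to exit, which sidesteps the
# whole "is the compression finished yet?" problem.
```

Because a spawned 7za process can simply be waited on, the completion-polling loop from the Shell.Application approach is no longer needed.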
ASKER
Krystian / Jeff: So long as there are no objections(?), I will split the points between you 150/100. Although there were no solutions given, your comments were helpful to me, especially in the decision to give up on the Windows compression and to use a 3rd party app.
BTW, Jeff, the link you gave does not appear to point to a compression tool, but rather to an opensource IDE. I no longer need the correct link, as 7Zip seems to do the required job. But it would have been good to be able to compare your chosen app to this one.
Thanks again,
Simon
ASKER
The sleep, at 100 milliseconds, was not enough. Having increased this to a second seems to have fixed the problem.
Presumably with a bigger file it takes a little longer to finalise the copy, despite it already telling the OS that there is a file in the zip folder.
I should have realised this earlier. How many other problems are resolved simply by extending a sleep command? :)
Thanks,
Simon