Solved

# ActiveX script to check for a file in a directory, log the filename and then rename it

Posted on 2008-10-10
Hi Experts,

I currently have the attached script working as a .vbs file on my desktop, but I didn't write it and I don't have any experience with this kind of thing, which is a problem because I now need to modify it.

Basically, I need to include an ActiveX script in a DTS package I'm working on that will check a folder to make sure there is a file in there (the name will always start with the same string but end with a random one). There should only ever be one file in there. Once the script has confirmed the file is there, it needs to make a note of the original filename - ideally stored in a database table, as I'll need to reference it later. Then it needs to rename the file to 'feed.in'.

Can anyone help with modifying this script to do what I need, or possibly writing a new one that will do the job??

Please note that this is just a script I found elsewhere on EE so the fact that it renames all of the files in a directory doesn't relate to what I actually need it to do, it's just that this was the closest I could get to what I wanted!

SWD
Const folder = "C:\documents and settings\swd\my documents\testfolder"
Const logfile = "logfile.log"

' ===

If folder = "" Or logfile = "" Then
    WScript.Quit
End If

Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")

Set objfolder = fso.GetFolder(folder)

' Start with a fresh log file each run
If fso.FileExists(folder & "\" & logfile) Then
    fso.DeleteFile(folder & "\" & logfile)
End If
Set objlog = fso.CreateTextFile(folder & "\" & logfile, True)

Call LoopFolder(objfolder)

' Recurse into subfolders first, then rename every file (except the log)
' to <foldername><n>.<ext>, logging each rename
Sub LoopFolder(folderobj)

    Set objfiles = folderobj.Files
    Set objsubfolders = folderobj.SubFolders

    For Each subfolder In objsubfolders
        Call LoopFolder(subfolder)
    Next

    i = 1

    For Each file In objfiles
        If Not file.Name = logfile Then
            objlog.WriteLine("[" & Now() & "] Renaming File " & file.Name & " to " & folderobj.Path & "\" & folderobj.Name & i & "." & fso.GetExtensionName(file.Name))
            file.Move folderobj.Path & "\" & folderobj.Name & i & "." & fso.GetExtensionName(file.Name)
        End If
        i = i + 1
    Next

End Sub

Question by:SurreyWebDesigner

LVL 50

Accepted Solution

Dave Brett earned 250 total points
ID: 22691327
something like this

Change your file path and log path to suit. The code appends the file name to a CSV.

Cheers

Dave
Const FolderPath = "C:\excelfiles"
Const logfile = "C:\excelfiles\log\logfile.csv"

Dim fso, objFile, objfolder
Set fso = CreateObject("Scripting.FileSystemObject")
Set objfolder = fso.GetFolder(FolderPath)

If objfolder.Files.Count = 1 Then
    For Each objFile In objfolder.Files
        ' Create the log on first run, append thereafter (8 = ForAppending)
        If fso.FileExists(logfile) = False Then
            Set objlog = fso.CreateTextFile(logfile, True)
        Else
            Set objlog = fso.OpenTextFile(logfile, 8, True)
        End If
        objlog.WriteLine "[" & Now() & "] Renaming File " & objFile.Name & " to " & FolderPath & "\feed.in"
        objFile.Name = "feed.in"
    Next
End If


LVL 51

Expert Comment

ID: 22696872
Dave (aka brettdj) looks like he has fixed your VB Script for you, but you can also do it all using T-SQL if you wanted to (using xp_cmdshell to run a batch job)... Do you want to explore that? Or are you happy with brettdj's solution? If you are, you will still need to give him some points for getting the script thingy done...


Author Comment

ID: 22697233
I've yet to check out the improved script but it certainly looks like what I need so happy to assign points. However the suggestion of doing it using a batch job sounds interesting. Would be keen to hear more on that.

I'll review the script tomorrow and will post again once I have.

Many thanks to both of you. Very much appreciate it.

SWD

LVL 51

Expert Comment

ID: 22697441
OK, so here is some SQL to get filenames from a DOS dir command into a table (just a memory table), then loop through all those entries: rename each file to feed.in, process that file, then move on to the next (if any)...

--
-- declare a few variables and a memory table (could also create table #tbl etc)
--
declare @doscommand varchar(1000)
declare @filename varchar(100)
declare @tbl table (id int identity,files varchar(200),newname varchar(100))
--
-- now get the DIR contents and store the output in our memory table
--
insert @tbl (files)
exec master..xp_cmdshell 'dir /b c:\ee\ct*.txt'

-- have a quick peek to see what it has captured...

select * from @tbl
--
--  now, there could be multiple results as well...
--  so, safer to set up a loop through that table
--
select top 1 @filename = files from @tbl order by files desc
while @filename is not null
begin
--
--  Now we have a real file to process, we can do the "feed.in" business, just copy for now,
--  but would be REN instead of COPY /Y if same path, else MOVE, or just copy and kill later...
--
set @doscommand = 'copy /Y c:\ee\'+@filename+' c:\ee\feed.in'
exec master..xp_cmdshell @doscommand
--
--  process feed.in e.g. call a stored procedure, in-line, run dtsrun script etc...
--  then update that filename to say "finished"
--
update @tbl set newname = 'c:\ee\feed.in' where files = @filename
--
--  get next unprocessed file (if there is one); reset first so the
--  loop ends when nothing is left
--
set @filename = null
select top 1 @filename = files from @tbl where newname is null
end


Author Comment

ID: 22701364
Hi Mark,

That certainly looks like a good solution, but currently getting this error:

Server: Msg 197, Level 15, State 1, Line 15
EXECUTE cannot be used as a source when inserting into a table variable.

Any ideas?

Many thanks
SWD

Author Comment

ID: 22701479
Hi brettdj,

Thanks again for the script you posted the other day - I've tried it out and it's renaming the file as expected and logging the result to a csv file, which is great. The task does fail though giving an error message of "Invalid Task Result value" ... I'm guessing I need to tell it at which point the task is deemed a success???

Cheers,
SWD

LVL 50

Expert Comment

ID: 22701528
Hi,

On which line is it producing the error message - the renaming is close to the end so I'm wondering if you have added any lines?

Cheers

Dave

LVL 51

Expert Comment

ID: 22701571
Sorry, that was SQL 2005 syntax. You can still do it, but you need to use a temp table, so simply change the @tbl to #tbl.

The temp table will auto-drop if running through a stored procedure or similar, but interactively you can add the drop table #tbl if you like...

declare @doscommand varchar(1000)
declare @filename varchar(100)
create table #tbl (id int identity,files varchar(200),newname varchar(100))

insert #tbl (files)
exec master..xp_cmdshell 'dir /b c:\ee\ct*.txt'

select * from #tbl

select top 1 @filename = files from #tbl order by files desc
while @filename is not null
begin

set @doscommand = 'copy c:\ee\'+@filename+' c:\ee\feed.in'
exec master..xp_cmdshell @doscommand
--
--  process feed.in: call a stored procedure, in-line, whatever
--  then update that filename to say "finished"
--
update #tbl set newname = 'c:\ee\feed.in' where files = @filename

--  reset and get the next file so the loop can terminate
set @filename = null
select top 1 @filename = files from #tbl where newname is null
end

select * from #tbl

drop table #tbl



Author Comment

ID: 22701637
brettdj,

This is what I currently have (but it doesn't give a line number with the error):

'**********************************************************************
'  Visual Basic ActiveX Script
'************************************************************************

Function Main()
    Const FolderPath = "c:\inetpub\wwwroot\in"
    Const logfile = "c:\inetpub\wwwroot\in\log\logfile.csv"

    Dim fso, objFile, objfolder
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set objfolder = fso.GetFolder(FolderPath)

    For Each objFile In objfolder.Files
        If fso.FileExists(logfile) = False Then
            Set objlog = fso.CreateTextFile(logfile, True)
        Else
            Set objlog = fso.OpenTextFile(logfile, 8, True)
        End If
        objlog.WriteLine Now() & "," & objFile.Name
        objFile.Name = "feed.in"
    Next

End Function
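[Editor's note, for anyone hitting the same "Invalid Task Result value" error: a DTS ActiveX Script task expects Main to return one of the DTSTaskExecResult constants, and the script above never sets one. A minimal sketch of the missing return value, assuming success should be reported whenever the loop completes:]

```vbscript
' At the end of Main, before End Function: tell the DTS package the
' task succeeded. Without this, the ActiveX Script task fails with
' "Invalid Task Result value".
Main = DTSTaskExecResult_Success
```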

Author Comment

ID: 22702227
I know this solution is very nearly completed (for which I'm very grateful), but could I ask another question on the same subject? If I need to raise another question then please let me know.

Now that I've renamed my file to feed.in, I run a load of checks on the data and export it to a file called feed.out. I've also imported the data from logfile.csv into a table. The question is, how do I rename feed.out to be the same filename as the original name of feed.in (but with a .out extension instead of .in)?

So for instance, the original feed file is called 'something123.in' and ultimately I need to be saving out a file called 'something123.out' - but obviously I won't know the filename beforehand hence the need to save it out to a csv file.

Many thanks
SWD

LVL 51

Assisted Solution

Mark Wills earned 250 total points
ID: 22703125
Well, using the script procedure, the file name is "known" and can be passed around (ie as @filename); in the VB script, similarly, the logfile has the filename (though I did think it was going to be variable).

So, if we were to use a stored procedure to process / run the ETL process by way of an example, (or could simply xp_cmdshell 'dtsrun mypackage.dts'  instead of a stored proc), then your script could be readily altered to :
1) find filename.csv
2) rename to feed.in
3) process (ie load and do something)
4) rename to filename.out (and again, just an example, you may need to export from step 3, which is a little bit more work)
5) clean up and go home...

but first, create a stored procedure...

--
-- First have to create the stored proc to do the processing
--
-- doesn't do much, just a bulk insert - or whatever - it is just an example
--
-- creating is a "once-off" process, do not have to do it again, once the stored proc is there, can use as needed...
--

create procedure uProcess_Feed_In (@result int output)
as
begin

create table #tmp_staging_feed_in (col1 varchar(100), col2 varchar(100), col3 varchar(100), col4 varchar(100), col5 varchar(100), col6 varchar(100), col7 varchar(100), col8 varchar(100))

bulk insert #tmp_staging_feed_in from 'c:\ee\feed.in' with (fieldterminator=',', firstrow = 2)

select * from #tmp_staging_feed_in

set @result = 0

end
GO

--
-- now we can use that proc to do the actual ETL process, and if successful, rename it to .out (using copy at the moment)
--
declare @doscommand varchar(1000)
declare @filename varchar(100)
declare @result int
create table #tbl (id int identity,files varchar(200),newname varchar(100))

insert #tbl (files)
exec master..xp_cmdshell 'dir /b c:\ee\ct*.txt'

select * from #tbl

select top 1 @filename = files from #tbl order by files desc
while @filename is not null
begin

set @doscommand = 'copy c:\ee\'+@filename+' c:\ee\feed.in'
exec master..xp_cmdshell @doscommand
--
--  process feed.in: call a stored procedure, in-line, whatever
--
exec uProcess_Feed_In @result output
--
--  then copy that file across as "ready" (ie ext of .out) or "errored" (ie ext of .err)
--  note: charindex includes the dot, so we append 'out' / 'err' without a leading dot
--
if @result = 0
begin
print 'processed successfully, renaming to .out '+@filename
update #tbl set newname = 'c:\ee\'+left(@filename,charindex('.',@filename))+'out' where files = @filename
set @doscommand = 'copy c:\ee\feed.in c:\ee\'+left(@filename,charindex('.',@filename))+'out'
exec master..xp_cmdshell @doscommand
end
else
begin
print 'something went wrong in processing, renaming to .err '+@filename
update #tbl set newname = 'c:\ee\'+left(@filename,charindex('.',@filename))+'err' where files = @filename
set @doscommand = 'copy c:\ee\feed.in c:\ee\'+left(@filename,charindex('.',@filename))+'err'
exec master..xp_cmdshell @doscommand
end
--
--  then reset and get the next file in the list (loop ends when none left)
--
set @filename = null
select top 1 @filename=files from #tbl where newname is null
end

select * from #tbl

GO


Author Comment

ID: 22703301
Couple of things that go wrong when I try do this:

1. When I parse the query 'EXEC sp_RenameFiles' to run the stored proc from the dts package (the second stored proc that you listed previously), I get the error message 'Invalid object name #tbl'.
2. When I actually run that stage of my dts package I get a different error message: 'Could not bulk insert because file 'c:\inetpub\wwwroot\in\feed.in' could not be opened. Operating system error code 3'. I've had this before and I think it's because our instance of SQL Server is on a virtual machine but the machine we create the dts on is not. Most of the time, when I refer to a file on the local machine it's not a problem, but Bulk Insert seems to look at a different machine (one I don't have access to).

Does that put a complete block on doing things this way??

Thanks
SWD

LVL 51

Expert Comment

ID: 22703565
1) Would need to see the proc. Should be OK as long as #tbl is being created within that proc.
2) Given 1 above, did the file actually get copied? The server has to be able to see the directory. More importantly, the "user" running the DTS must be able to - how are you running it? Via dtsrun and a scheduled batch job, or via SQL Server Agent? If the latter, then it will most likely be the user profile that the service runs under.

No, it shouldn't put a complete block, just need to chase through the security hurdles...

Author Comment

ID: 22703704
1) Well I created 2 stored procedures and then a sql task in the dts package to kick off the second one (and all I've changed is the file location).
2) The files are all there and I have all the necessary access, but it just seems to be the way our SQL Server instance is set up: certain functions (such as Bulk Insert) look for files on a completely different server. Most task types in SQL Server look at the local drives, but Bulk Insert looks at the drives on the VM. For instance, when I add a Bulk Insert task to my package, go to the properties and click the browse button to find the source data file, I'm shown a completely different file structure to our local server. But when I do the same thing to find a source file in any other type of task, I'm shown the local file structure. I hope that makes sense!

I have brettdj's solution 'almost' working perfectly - do you think it would be easier to continue down that route given this issue I'm having??

Many thanks
SWD

LVL 51

Expert Comment

ID: 22703808
Sure thing...

LVL 51

Expert Comment

ID: 22707047
Curious - how are you doing the import using brettdj's solution? If not bulk insert, then why are you trying bulk insert with the scripted solution? Shouldn't you be using whatever your import is? That bulk insert was only an example after all...

Author Comment

ID: 22709410
I'm just using a Transform Data task. Apologies if I'm being a little dumb here but I can't figure out how to implement your code without using that first stored proc.

I've managed to sort out brettdj's original solution so my package is now succeeding, however it doesn't quite do what I need it to yet. If I try and explain exactly where I'm at with this now perhaps you can guide me in the right direction??

1. Rename file in the 'in' folder to feed.in and save the original filename to Logfile.csv
2. Clear T_IncomingFeedData
3. Import data from feed.in to T_IncomingFeedData
4. Check the data for errors and update each row with either a success or failure value
5. Run various update tasks within the db for each row that successfully passed the checks in Step 4
6. Clear T_FileNameLog
7. Import data from Logfile.csv to T_FileNameLog
8. Export data from T_IncomingFeedData to a file called feed.out

So basically I now need to look at the data in T_FileNameLog, order it by descending date order and take the latest filename, then rename feed.out to match it (but keeping the .out extension instead of the .in extension that would come from the db table).

I could also do with only importing the latest line from logfile.csv, or alternatively clearing logfile.csv before running Step 1. I can't figure out how to do this, though I'm sure it's pretty easy. I couldn't even get the thing to delete, but maybe I was just having a bad day yesterday!!

I realise this question is dragging on now so I really am grateful for your continued help.

Many thanks
SWD

LVL 51

Expert Comment

ID: 22710534
I would have made it an SQL task - just paste the SQL in - but that's probably what you are doing with the ActiveX script. Then do the Transform Data task etc...

Didn't really mean to confuse by having it all in the one T-SQL script, though it can be done...

Confused a bit by step 7 being kind of a replica of step 3...

But why not create a global variable and stick the original name in that?
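[Editor's note: a minimal sketch of the global-variable idea applied to the Step-1 rename loop in brettdj's script. The variable name "FeedFileName" is just an example - any name works, as long as a global variable of that name exists in the DTS package:]

```vbscript
' Inside the Step-1 rename loop: remember the original name in a DTS
' global variable before renaming, so later steps can read it back
' instead of round-tripping it through logfile.csv
DTSGlobalVariables("FeedFileName").Value = objFile.Name
objFile.Name = "feed.in"
```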

Author Comment

ID: 22712587
Well Step 7 was to import the contents of the logfile into a table so that I could get back the filename, whereas Step 3 was to import the actual data from the feed. But I can see now that Global Variables were the better solution here so I have removed those steps. Thanks!

So I'm 99.9% done with this now and I'm more than happy to assign points at this point, but if I can get an answer to this final piece of the puzzle I'll be even happier!! :-)

So I've now managed to create the outbound feed file with the same name as the incoming feed but they're obviously being generated as a .in file, whereas I need them to be .out files.

I was hoping there would be a simple ActiveX script to change the extension of a file but I can't find one. Can you help??? I guess another way would be to strip the final 3 characters from the whole filename (i.e. the '.in' bit) and then add .out to the end.
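[Editor's note: a minimal VBScript sketch of that strip-and-append idea, using FileSystemObject's GetBaseName (which drops the extension) rather than counting characters. origName here is a hypothetical placeholder for whatever holds the original filename:]

```vbscript
' Swap the extension: "something123.in" -> "something123.out"
' GetBaseName strips the extension (and any path) for us
Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")
origName = "something123.in"                  ' example value only
newName = fso.GetBaseName(origName) & ".out"  ' "something123.out"
```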

Many thanks again
SWD

LVL 51

Expert Comment

ID: 22712677
Yep, just strip and put... Say your variable is @filename (yes I know it is not quite correct)...

left(@filename,charindex('.',@filename))+'out'

Author Comment

ID: 22712827
Cool - so where do I put that exactly (sorry if that's a dumb question!).

This is my final step so far (renaming the feed.out file) - would I put it in here somewhere? When I tried to bung it in I got a syntax error.
Function Main()
    Const FolderPath = "c:\inetpub\wwwroot\out"
    Const logfile = "c:\inetpub\wwwroot\out\log\logfile.csv"

    Dim fso, objFile, objfolder
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set objfolder = fso.GetFolder(FolderPath)

    For Each objFile In objfolder.Files
        If fso.FileExists(logfile) = False Then
            Set objlog = fso.CreateTextFile(logfile, True)
        Else
            Set objlog = fso.OpenTextFile(logfile, 8, True)
        End If
        objlog.WriteLine Now() & "," & objFile.Name
        objFile.Name = DTSGlobalVariables("FeedFileName").Value
    Next
End Function


Author Comment

ID: 22713337
Not to worry!! I found a script that does what I need - so I'm finally done!!!!!!

Huge thanks for your help - how do you suggest I split points?? I'm guessing 50/50???

Cheers
SWD

LVL 51

Expert Comment

ID: 22713637
50/50 sounds good - or you can accept brettdj as the answer and give him 40% if you think there was more dialogue / more hand-holding from me :)

LVL 50

Expert Comment

ID: 22738845
I'm not fussed re the points, I have enough :)

Your final code is pretty much the same as my initial post; mark_wills was tenacious in providing further input.

So whatever makes sense to you

Cheers

Dave

LVL 51

Expert Comment

ID: 22738934
Now you are making me feel bad... was not so much tenacious as attentive... Give it to the Aussie then...


LVL 50

Expert Comment

ID: 22748620
It wasn't at all intended that way - your input was valuable, and by tenacious I meant that you stuck with the question, ie I agree fully with the attentive.

To me it is a clear 50/50

Cheers

Dave

Author Closing Comment

ID: 31505054
Many thanks to both of you for all of your help on this.
