  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1673
  • Last Modified:

Execute batch files Remotely - IBM AS400 ->NT server

Hi

On our NT Server (ver 4.0) we have business intelligence (Cognos) software/applications, which build OLAP cubes using data derived from various data sources.

However, we wish to build the cubes remotely on the server, driven from our IBM AS400 machine when the data is available (e.g. after overnight processing), rather than relying on the Cognos scheduler's fixed times.

Background
Currently we schedule our cube builds on our NT server; however, they rely on the data being available at the scheduled time.

On occasions this has become an issue, when overnight processing on the AS400 takes longer than expected and runs past the scheduled time, even though we have built in a buffer of 1½–2 hrs to allow for minor delays/processing.

i.e.:
Data normally available: 02:30
Cube build starts: 04:00
Required by the business: 07:00

So, we thought, rather than relying on fixed times, let's get the AS400 to drive/execute the jobs when the data is available.

- On the whole, the cubes will be available earlier
- Building will be complete before the first user accesses the system, saving resources, etc.
- If the data is unavailable at the normal build time, this is no longer an issue, as the AS400 drives the cube build process, earlier or later as needed

Currently we have several processes taking place (macros being run, SQL Server DTS jobs, etc.) and we have managed the following:

1) AS400 to execute remote DOS commands
2) AS400 to execute remote SQL Server DTS jobs (import routines/table builds)
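For context, item 1 is the sort of thing the OS/400 RUNRMTCMD command does: it sends a command to a remote-execution (rexec/rsh) service on the NT box. A hedged sketch, where the host name, user, and command below are hypothetical and the exact parameters accepted vary by OS/400 release:

```
RUNRMTCMD  CMD('dir d:\scripts')           +
           RMTLOCNAME('ntserver' *IP)      +
           RMTUSER('builduser')            +
           RMTPWD('secret')
```

This only works if a remote-execution daemon is actually listening on the NT server, which may explain why plain commands succeed but heavier GUI-dependent jobs do not.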

However, we are unable to get the AS400 to build remote cubes, either via DOS batch files or via a macro.

We even tried scheduling the existing cube build batch files using the 'at' command. The jobs get scheduled but will not execute; security is at the Admin level for the account running the 'at' command, and the process starts but is never given any CPU or memory.
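For reference, the sort of 'at' invocation described above might look like the following; the server name and script path are illustrative, not taken from this thread:

```
at \\NTSERVER 04:00 /interactive "d:\scripts\povendorcubebuild.bat"
```

Note that 'at' jobs run under the Schedule service's account (typically LocalSystem) rather than the account that submitted them, which is a common cause of jobs that appear scheduled but never do useful work.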

Sample of the batch file

rem ** povendorcubebuild.bat
rem ** K Jones
rem ** ver 1.1
rem ** 26 Nov 2002
rem **
rem ** Amended so that the models are not saved on build completion
rem ** -s switch removed.
rem **
rem ** Builds PO cube
"D:\Program Files\Cognos\cer2\bin\trnsfrmr.exe" -i -n1 -r4 -nologo "E:\NEW COURTS BI\new po vendor.pyi"

**** Note the above works fine when scheduled or executed on the server ****

 
Bottom line is that we just want to execute the above batch file remotely. Comments please.

IBM AS400 – OS/400 operating system (version 5) – using Client Access Express for remote tasks

Thanks
Keith

Asked by: keithjones

1 Solution
 
pbarrette commented:
Hi Keith,

How about running the batchfile from the NT server, but adding logic that determines if the data is available first?

pb

Flash828 commented:
start an rsh server on all of the machines.

Flash828 commented:
One place this is available, btw, is Windows Services for Unix, although it also exists out there.

Flash828 commented:
Okay... apparently an RSH server doesn't exist in the Services for Unix package. Whoops, disregard the above. However, since that is the case, you can make use of another solution (I'm sure about this one).

You may use the "at" command to schedule jobs on your machine. However, the "at" command may also be used to schedule jobs on remote machines. So what you can do is create a file called "cubes.lst" on the server, listing the names of all the machines you would like to run the remote job on. Then use good old shell scripting on the server; it should look something like this:

for /f %i in (cubes.lst) do at \\%i 3:30pm cmd "/c c:\myscriptoncube.bat"

The only thing here is you need to find out how to have the job run immediately, or you can have a script that writes this statement with the current time plus a minute or so...

Flash828 commented:
If time is not an issue, you can have the cubes running one minute behind the server, and use the current time on the server... but I'm not sure if that's a good idea.

Flash828 commented:
Or you can have the following setup:

File Cubes.lst:
cube1
cube2
cube3

File Script.bat
rem inside a batch file the for variable must be written %%i
for /f %%i in (cubes.lst) do at \\%%i %1 cmd "/c c:\myscriptoncube.bat"

Please note that myscriptoncube.bat should exist on each machine named in cubes.lst (where the scheduled job actually runs), not on the machine running script.bat.

Now you can call script.bat like this:

script 3:30pm

and you should have the same effect.  This way you can have something supply the current time plus a minute to the script.

keithjones (Author) commented:
Gentlemen

Thanks for the comments so far; the problem is that we have already investigated the 'at' command as well.

The batch files work fine when run on their own on the NT Server. However, when scheduled from the remote IBM AS400 machine (OS/400 version 5, via the Client Access Express software), an entry is placed into the scheduler but it doesn't execute: it either errors, or changes the execute date/time to tomorrow once the execute time passes.

I have also created a small DOS looping script on the NT server that kicks the cube build process off once a flag file arrives after all of the data has been FTPed over.
Again, if the NT server itself starts the loop script, it all works. But when we get the IBM AS400 machine to remotely start the looping script one minute before the flag file arrives (just to save having numerous looping scripts constantly running), the processes (PIDs) get allocated when viewed through Task Manager, but nothing else seems to happen.
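For illustration, a looping script of the kind described might look like the sketch below. The flag-file path, build-script path, and 60-second polling interval are assumptions, not details from the thread (NT 4 has no built-in sleep command, so the ping trick is used to pause):

```
rem waitforflag.bat -- hypothetical sketch of the flag-file polling loop
:checkflag
if exist "e:\flags\data_ready.flg" goto build
rem pause roughly 60 seconds between checks (61 pings, 1 second apart)
ping -n 61 127.0.0.1 > nul
goto checkflag
:build
rem remove the flag so the next run waits for fresh data
del "e:\flags\data_ready.flg"
call "d:\scripts\povendorcubebuild.bat"
```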

keithjones (Author) commented:
******         ******
AS400          NT Srv
Data    -->    Cube Build
ftp'ed              
******         ******

keithjones (Author) commented:
The only thing that currently seems to work is the following, but it's not pretty.

The AS400 machine holds all of the data; overnight, new data comes in (yesterday's sales, etc.) and is FTPed over before being processed and added to the base tables, which saves time.

On the other side of the network we have our NT Server that builds the cubes. I started a looping DOS script that looks for a flag file sent over after all of the new data arrives.
New data arrives and then the flag file is received from the AS400; the looping script on the NT server sees the flag file, builds the cube, etc.

In reality there is a bit more to it, i.e. 20-odd files are FTPed, product and store structures get refreshed via SQL Server (imported into a SQL Server holding db), the incremental daily sales data first gets appended to a large SQL Server table (holding a couple of years' worth; this takes seconds), and then the cube builds commence (several of these).

Initially, because the AS400 team have 24hr support, we wanted them to drive and be in control of all the jobs, i.e. they would monitor the data being sent, then watch the builds commence, etc.
We first got the AS400 to remotely copy data from one directory to another, then also successfully got the AS400 to remotely run SQL Server import routines; the final part was to get it to run the above batch file to build the cubes.

When that didn't work directly, we started to explore the AT command, hoping that would get around/fool NT. This is where the problems started to occur; it's almost as if the jobs don't have the right permissions, although they use the admin login, etc.

Sorry to waffle on, and for the long narratives, which hopefully make sense.

pbarrette commented:
Hi Keith,

You may want to look into this:
http://www.sysinternals.com/ntw2k/freeware/psexec.shtml

It's meant to run processes as a specific user on a remote machine, but you could just as easily run it against the NT Server from your AS400 connection in the same way you currently run the AT command.
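A hedged sketch of what that PsExec invocation might look like; the server name, account, and script path below are illustrative, not taken from this thread:

```
psexec \\NTSERVER -u DOMAIN\builduser -p secret "d:\scripts\povendorcubebuild.bat"
```

Because PsExec runs the command under the credentials you supply rather than the Schedule service's account, it can sidestep the permission problems that plague remotely scheduled 'at' jobs.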

pb

keithjones (Author) commented:
pb

Thanks for that. However, we have been running the AT command with the interactive switch.

Now we are being prompted for authentication and ODBC logins, even though these are embedded in the cube model.

So the batch file has been amended to include security; just the authentication left to crack...

pbarrette commented:
Hi Keith,

So does this mean you solved your own problem? It seems like part of the issue is that you are using ComputerA to log into ComputerB, which then must pull data back from ComputerA. That roundabout authentication seems to be breaking down somewhere in the middle.

What you might consider doing, is pushing the data from the AS400 to the NT Server, then remotely launching a job on the NT Server which acts upon the local data.

Either way, the data is going to be transferred over the network, so you aren't duplicating any data transfers, or wasting any time in the process.

So:
AS400 pushes data to NT Server,
AS400 launches build-script on NT Server,
NT Server build-script acts upon data found on NT Server.
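The three steps above could be sketched as a CL program on the AS400 side. Everything here is an assumption for illustration (library, member, host name, and script path are hypothetical), not something from the thread:

```
/* PUSHBUILD: hypothetical CL sketch of the push-then-launch flow     */
PGM
  /* 1. push the data: scripted FTP reads its subcommands (user,      */
  /*    password, PUT statements) from a prepared source member       */
  OVRDBF     FILE(INPUT)  TOFILE(MYLIB/QFTPSRC) MBR(PUTDATA)
  OVRDBF     FILE(OUTPUT) TOFILE(MYLIB/QFTPSRC) MBR(FTPLOG)
  FTP        RMTSYS('ntserver')
  /* 2. launch the build script on the NT server                      */
  RUNRMTCMD  CMD('d:\scripts\povendorcubebuild.bat') +
             RMTLOCNAME('ntserver' *IP)
  /* 3. the build script then acts on the data now local to the NT    */
  /*    server, so no credentials are needed back to the AS400        */
ENDPGM
```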

Am I missing something here?

pb

keithjones (Author) commented:
In essence that's what we are doing now; we are just fine-tuning it.

Yes, we have answered our own question, but I do appreciate people's comments.

Regards
Keith

pbarrette commented:
Hi Keith,

No problem.

Since you have solved it yourself, you may wish to post a 0 point question in "Community Support" requesting that this question be deleted and your points refunded to you.

Be sure to post a link back here, so they know which question you are referring to.

pb

keithjones (Author) commented:
PB, thanks for that useful comment; thanks also extended to everyone for their comments.

Request to have points refunded:

http://www.experts-exchange.com/Community_Support/Q_20557066.html

SpideyMod commented:
PAQ'd and all 300 points refunded.

SpideyMod
Community Support Moderator @Experts Exchange