exhuser

asked on

Close Open files in Library

How can I list and close all open files in a particular library?
Member_2_2484401

You can run WrkObjLck on a specific object, but I'm not aware of how to do it for an entire library.

I suppose you could dump all the file-names to an out-file (a table) and write a little CL program to loop through that table and do a WrkObjLck with output(*outfile). That might work.

HTH,
DaveSlash
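
A minimal CL sketch of that idea, assuming made-up names (QTEMP/OBJLIST for the work table) and using DSPOBJD to build the list of files.  I use OUTPUT(*PRINT) on WRKOBJLCK here; substitute OUTPUT(*OUTFILE) if your release supports it:

PGM        PARM(&LIB)
  DCL        VAR(&LIB) TYPE(*CHAR) LEN(10)
  DCLF       FILE(QSYS/QADSPOBJ)          /* model outfile for DSPOBJD */

  /* Dump the names of all *FILE objects in the library to a work table */
  DSPOBJD    OBJ(&LIB/*ALL) OBJTYPE(*FILE) OUTPUT(*OUTFILE) +
               OUTFILE(QTEMP/OBJLIST)
  OVRDBF     FILE(QADSPOBJ) TOFILE(QTEMP/OBJLIST)

READ:       RCVF                          /* read one object name */
  MONMSG     MSGID(CPF0864) EXEC(GOTO CMDLBL(DONE))   /* end of file */
  WRKOBJLCK  OBJ(&LIB/&ODOBNM) OBJTYPE(*FILE) OUTPUT(*PRINT)
  GOTO       CMDLBL(READ)

DONE:       DLTOVR     FILE(QADSPOBJ)
ENDPGM
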
ASKER CERTIFIED SOLUTION
Gary Patterson, CISSP

Member_2_276102

I'm not aware of any such facility, though one could be created. Gary's comments give a good reason why it's commonly not a very useful idea. I have to second the thought that work management is a much better approach.

Tom
I recently completed a long-term consulting assignment in a very large IBM i environment as part of a dedicated performance testing team.  Our performance test system was shared by quite a number of people and connected to a significant number of applications running on other systems.  We repeatedly ran into problems doing isolated performance testing on large batch applications, especially when we needed to clear files and reload them with specific test data, because of locks held by never-ending jobs.

Ending the subsystems was not always a viable alternative, because other testing might be going on at the time.  We scheduled tests so that we didn't overlap on critical tables and processes, but it was still difficult at times to get dedicated access to the system for isolated testing.

So I wrote a quick and dirty tool that did what you want to do:  you gave it a list of files, and it found jobs locking those files, and ended them.  Sure enough, some of the client/server jobs would reconnect pretty quickly.  As a result, I modified the process to submit a second job to end all the lock holders for a given file, and then attempt to allocate each file exclusively.  When the file was allocated, the necessary operations were performed, the lock released, and the program proceeded to the next file in the list.

That approach generally worked.  I wouldn't use the same approach in a production environment, though, due to the risk of creating data inconsistencies in the database.  Better to end user jobs gracefully, end the user subsystems, perform the necessary operations, and resume normal work.
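
For what it's worth, here's a bare-bones CL sketch of the allocate-exclusively-and-retry step described above (library, file, and retry values are made up, and the piece that finds and ends the lock-holding jobs is omitted):

PGM        PARM(&LIB &FILE)
  DCL        VAR(&LIB)   TYPE(*CHAR) LEN(10)
  DCL        VAR(&FILE)  TYPE(*CHAR) LEN(10)
  DCL        VAR(&TRIES) TYPE(*DEC)  LEN(3 0) VALUE(0)

RETRY:      CHGVAR     VAR(&TRIES) VALUE(&TRIES + 1)
  /* Wait up to 30 seconds for an exclusive lock on the file */
  ALCOBJ     OBJ((&LIB/&FILE *FILE *EXCL)) WAIT(30)
  MONMSG     MSGID(CPF1002) EXEC(DO)      /* lock not granted */
    IF         COND(&TRIES *LT 5) THEN(GOTO CMDLBL(RETRY))
    SNDPGMMSG  MSG('Could not get an exclusive lock') MSGTYPE(*DIAG)
    RETURN
  ENDDO

  /* File is now allocated exclusively: do the clear/reload work here */
  CLRPFM     FILE(&LIB/&FILE)

  DLCOBJ     OBJ((&LIB/&FILE *FILE *EXCL))
ENDPGM

The WAIT value on ALCOBJ gives the remaining lock holders a chance to finish on their own before you resort to ending their jobs.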

One final note:

If this is related to backups, you may want to take a look at the Backup and Recovery guide for your OS version and read up on Save-While-Active.  SWA lets you perform save operations on files that are locked, and it saves the files in a consistent state relative to each other.  SWA is a widely used, stable, and reliable technology.
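
For example (library and save file names are made up; check the parameters against your release):

/* Save-while-active: objects are saved once they reach a checkpoint, */
/* so jobs holding locks do not have to be ended first.               */
SAVLIB     LIB(MYAPPLIB) DEV(*SAVF) SAVF(QGPL/MYAPPSAVF) +
             SAVACT(*SYNCLIB) SAVACTWAIT(120) SAVACTMSGQ(QSYSOPR)

SAVACT(*SYNCLIB) brings all of the objects in the library to a common checkpoint, so the saved copies are consistent with each other, and SAVACTMSGQ tells you where the checkpoint-reached message is sent.
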
Don't get me started on frequently-omitted "orderly shutdown" handling in long-running processes.  You SHOULD be able to do a controlled shutdown of a job or a subsystem - but how many applications have shutdown detection built-in?  

It couldn't be much easier:  RPG, for example, supplies the %SHTDN built-in function to detect when a job is in the process of being ended, CL has RTVJOBA ENDSTS, and Java (IBM Toolbox for Java) has Job.getStatus().
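
A bare-bones CL illustration of the RTVJOBA approach (the loop structure and delay are arbitrary):

PGM
  DCL        VAR(&ENDSTS) TYPE(*CHAR) LEN(1)

LOOP:       RTVJOBA    ENDSTS(&ENDSTS)    /* '1' = this job is being ended */
  IF         COND(&ENDSTS *EQ '1') THEN(DO)
    /* Finish the current unit of work, commit, and exit cleanly */
    RETURN
  ENDDO

  /* ... process the next unit of work ... */
  DLYJOB     DLY(5)
  GOTO       CMDLBL(LOOP)
ENDPGM

The same check can sit at the top of each iteration of a batch program's main processing loop instead of being driven by a timer.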

And of course applications with properly implemented commitment control usually tolerate job/subsystem shutdown pretty well.
Ditto. Lots can be said, but lots has been said in many forums for many years.

Tom