pari205

asked on

Sed exit status

Hi All,
         this is my code template:

                             sed -e 's/AAA/aaa/' -e 's/BBB/bbb/' -e 's/CCC/cccc/' inputfile > outputfile.$$
                             RC=$?
                             if [ $RC -ne 0 ]
                             then
                                    echo "Error with exit status $RC"
                             fi

The above shell script runs concurrently, with at most 4 instances at a time. Sometimes sed returns exit status 2 and the script fails with the error message "Error with exit status 2".
It happens some of the time but not always. What could the problem be?
What does exit status 2 mean for the sed command?
Please explain,
or give me some pointers or websites that describe exit status 2 for the sed command.

-Pari.
cjjclifford


Hi,

try the following from the cmdline:

$ sed 's/aaa/AAA/' file.that.does.not.exist
sed: can't read file.that.does.not.exist: No such file or directory
$ echo $?
2
$

So, it looks to me like occasionally the input file does not exist (try a "if [ ! -f inputfile ]; then echo File missing; fi" before the sed command). Perhaps, in the way the concurrency is being done (I'm not sure what you meant by that...), some files are being listed for processing more than once, or something like that...
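For example, a rough sketch using the filenames from your template (this only rules out a missing input file, not other causes of exit status 2):

if [ ! -f inputfile ]; then
    echo "inputfile is missing - skipping sed" >&2
else
    sed -e 's/AAA/aaa/' -e 's/BBB/bbb/' -e 's/CCC/cccc/' inputfile > outputfile.$$
    RC=$?
    if [ $RC -ne 0 ]; then
        echo "Error with exit status $RC" >&2
    fi
fi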

Cheers,
C
In errno.h, error 2 is:

#define ENOENT          2               /* No such file or directory */

So, apparently sed does not find the inputfile for some reason.
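If you want to see the definition on your own system, you can grep the system headers for it. This is just a sketch, and the exact header path varies between systems (on AIX the value typically lives under /usr/include/sys/errno.h):

grep ENOENT /usr/include/sys/errno.h      # header path varies by system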

brett, could it be that one process opens the file for reading and it is momentarily locked to other processes? I really don't know enough about locking...

The other issue I can think of is that programs may not use errno.h, so the return codes would be whatever the programmer decided they should be...Is there a way of working out what standard include files were used?

pari205, what Unix (& version) do you use? It may help us work out where the problem lies... or at least get a definitive answer on what sed return code 2 means.
> brett, could it be that one process opens the file for reading and it is momentarily
> locked to other processes?  I really don't know enough about locking...

Possibly, although I would expect an error of 13 (EACCES).
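If contention is the suspicion, one cheap way to rule it out is to serialize the runs around the sed call. This is only a sketch: mkdir is atomic, so only one process can hold the lock at a time, and /tmp/sedlock.d is an arbitrary name:

until mkdir /tmp/sedlock.d 2>/dev/null
do
    sleep 1                   # another process holds the lock; wait and retry
done
sed -e 's/AAA/aaa/' -e 's/BBB/bbb/' -e 's/CCC/cccc/' inputfile > outputfile.$$
RC=$?
rmdir /tmp/sedlock.d          # release the lock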


> The other issue I can think of is that programs may not use errno.h, so the return codes
> would be whatever the programmer decided they should be...Is there a way of working out
> what standard include files were used?

The common unix utilities usually return error codes from errno.h for consistency.
If you are using Linux, the full source to the utilities that shipped with your distribution
should be available. But it is not possible to determine from a shipped (stripped) executable
which header files were included during compilation.


pari205

ASKER

But I am sure the input file will always be there.

However, as tfewster said, "could it be that one process opens the file for reading and it is momentarily locked to other processes? I really don't know enough about locking..."

That may be the problem. But why isn't it showing an error like "file is not ready" or "No such file or directory"? How can I solve this problem? Do I need to put a lock on the input file, and only run the sed command after checking the lock file? Can you please give me the solution.

Is there any material to study about this kind of locking on UNIX?

My Unix flavour is AIX Version 5.

-pari


From this and your other question, I suspect that somewhere in your script the error messages are being lost. Try:

sed -e 's/AAA/aaa/' -e 's/BBB/bbb/' -e 's/CCC/cccc/' inputfile > outputfile.$$ 2> errors.$$            (or 2>> all_errors.log)
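Folded into your template it might look something like this (just a sketch; errors.$$ is only an example name for the per-run error log):

sed -e 's/AAA/aaa/' -e 's/BBB/bbb/' -e 's/CCC/cccc/' inputfile > outputfile.$$ 2> errors.$$
RC=$?
if [ $RC -ne 0 ]
then
    echo "Error with exit status $RC:"
    cat errors.$$             # show what sed actually complained about
fi
rm -f errors.$$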

You have two problems in your code:
1. $$ doesn't always guarantee a unique filename.
2. You did not mention how you are launching the simultaneous processes.

For #2, if you are using a loop and launching them from a script, then you need to know
the limit on the number of handles a process can use.

Check it using the unix commands limit / ulimit.

You can confirm the problem by printing the value of $$ and the iteration of your loop, as in the sketch below.

The reason you don't always see the problem is both #1 and #2.

Handles / descriptors limit the number of files / processes a process can have open or running at the same time.
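For example, something along these lines (a sketch only; the loop, the input names and run.log are assumptions about how the jobs get launched) makes each output name unique per job and records $$ and the iteration so collisions can be spotted:

i=0
for f in input1 input2 input3 input4          # hypothetical input file names
do
    i=`expr $i + 1`
    OUT=outputfile.$$.$i                      # PID plus loop iteration
    echo "iteration $i: PID=$$ output=$OUT" >> run.log
    sed -e 's/AAA/aaa/' -e 's/BBB/bbb/' -e 's/CCC/cccc/' "$f" > "$OUT" &
done
wait                                          # wait for the background seds to finish

ulimit -n                                     # current per-process open file limit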
ASKER CERTIFIED SOLUTION
NovaDenizen