SEGV: segmentation violation

We have a program here that intermittently gives out segmentation violations. The program is intended to run continuously, monitoring a certain directory for new files and processing them. We are using version 5.004 on AIX. Does anyone know if this is a particular problem with Perl's memory allocation? The program works fine for a couple of hours, then crashes at random times.
Asked by Cabbage

b2pi commented (accepted solution):
I'm still looking at why it's happening. OK, actually, that's not true... I've moved to 5.005, which doesn't seem to have the problem.

b2pi commented:
Most likely a memory leak.  Also most likely it's due to your code rather than perl, since, depending on which version of 5.004 you are using (check perl -v), almost all memory leaks were fixed. Does the process appear to stay the same, size-wise?

1.) Upgrade your perl to 5.00502 (the performance increases alone are worth it)
2.) Inspect code for obvious memory leaks
3.) Whimper.  These things can be hard to find:)
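To answer the size question concretely, the script can log its own size from inside the main loop. This is a sketch, not code from the thread; it assumes a ps that accepts -o vsz (true on AIX and most modern Unixes):

```perl
#!/usr/bin/perl
use strict;

# Report this process's virtual size in KB via ps.
# $$ is the current PID; "vsz=" suppresses the header line.
sub report_size {
    my ($vsz) = `ps -o vsz= -p $$` =~ /(\d+)/;
    return $vsz;
}

my $before = report_size();
my @junk = (1) x 100_000;        # allocate something noticeable
my $after = report_size();
print "before: ${before}K  after: ${after}K\n";
```

If the reported size climbs steadily from one pass of the main loop to the next, a leak is the likely culprit.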

tozimek commented:
I use AIX as well and encountered a similar memory problem.  I modified my program to report periodically how much memory it consumed.  I then ran it on multiple AIX machines, some with upwards of 4 GB of physical memory. It always failed at approximately the 400 MB mark, and this occurred on each of the machines.

If I recall correctly, there is no theoretical limit to the size of Perl variables, in this case hashes.  I scanned through perlguts and didn't see any reference to this, but I may have missed it.

b2pi commented:
Nope.  No theoretical limits.
What modules are you using? (Tk can have problems with core sometimes, in particular).  Also, occasionally (you probably are aware of this) one needs to recompile all modules with a perl upgrade.  Such was the case between 5.003 and 5.004 (IIRC).

I'm out 'til Sunday

Cabbage (author) commented:
The problem appears to occur when a variable is assigned from a backtick expression, for example "my $variable = `ps -af`;", but as I said before, other than when this problem occurs, the assignment works with no problems.
This assignment is made within a function that is called repeatedly as the program monitors processes on the machine.

Looking at the program running now, it does seem to be increasing in size as it runs.

Cabbage (author) commented:
Oops, make that "my @variable = `ps -af`;" - the results are assigned to an array!

b2pi commented:
Is that in a loop ?

Cabbage (author) commented:
Yes, but not a tight loop. It is within a function that is called from the main program loop.

b2pi commented:
Bingo.  Just discovered this one someplace else... move the 'my' outside of the loop, or explicitly assign the variable to something tiny (like an empty list).  I'm still trying to figure out what's happening with this (which is biting me with a database), but try this:

sub Whatever {
    my(@variable) = `ps -af`;
    ...
    @variable = ();    # empty the array before leaving the sub
}

Alternatively, try this:

my(@variable);
sub Whateverelse {
    @variable = `ps -af`;
    ...
}


Cabbage (author) commented:
Well, the program is (basically):

sub Main {
    ...
    do {
        ...
        &dosomething;
        ...
    } while ($Condition == 1);
    ...
}

sub dosomething {
    my (@Processes) = `ps -af`;
    ...
}

The program is intended to run (in theory) forever, or until someone brings it down. The ps -af assignment is not directly in a loop itself; rather, the function containing it is called from within a loop. Surely a my variable's memory will be freed when it goes out of scope once the function completes?

b2pi commented:
I would have said so, but I've got a vaguely similar situation now...

Really, just try taking it outside of the subroutine... It's worthwhile as a debug step.
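A minimal sketch of that debug step, using the structure Cabbage posted (the loop count and the ps size check are illustrative additions, not part of the original program):

```perl
#!/usr/bin/perl
use strict;

# Hoist the array out of the sub, so the same storage is reused on
# every call instead of a fresh lexical being allocated and freed.
my @Processes;

sub dosomething {
    @Processes = `ps -af`;       # the suspect backtick assignment
    return scalar @Processes;
}

# Drive it from a loop and log the process size now and then.
for my $i (1 .. 50) {
    dosomething();
    if ($i % 10 == 0) {
        my ($vsz) = `ps -o vsz= -p $$` =~ /(\d+)/;
        print "pass $i: ${vsz}K\n";
    }
}
```

If the size holds steady in this version but grows with the my inside the sub, that localizes the leak.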

b2pi commented:
By the way, did you try upgrading your perl?

Cabbage (author) commented:
Upgrading is a long process with the bureaucracy around here - the unix admin guys are on holiday too! However, we removed some of the offending code and rewrote it. The program seems to be working, and is not leaking any memory. We're still not sure exactly why it occurred, though.

Thanks for your help. Your tips were very useful.

Cabbage (author) commented:
After we tested the new code, we found further problems. It seems to be a general problem with backticks and assigning their output to variables - i.e.

    my (@Files) = `ls -1`;

and the like. Some of the system calls in this program are unavoidable and need to be made, so getting rid of them is not an option.

Once again, the problem is intermittent, as the calls can be made a few hundred times or so before the segmentation fault occurs. The program does not increase in size anymore.


b2pi commented:
(Hmmm, I'll bet your problem (backtick assigns) and mine (unfreed my vars) are related).

Anyhow, any reason you can't replace

my(@Files) = `ls -l`;

with

my @Files;
open(PROC, "ls -l |") or die "can't run ls: $!";
while (<PROC>) { push(@Files, $_); }
close(PROC);
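As an aside: on later perls (5.8 and up, so not the 5.004 in this thread) the three-argument, list form of a piped open does the same job without involving a shell at all. A sketch:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# List-form pipe open: '-|' runs the command and hands us its
# stdout; passing the arguments separately bypasses the shell.
open(my $proc, '-|', 'ls', '-l') or die "can't run ls: $!";
my @Files = <$proc>;
close($proc) or warn "ls exited with status $?";

print scalar(@Files), " lines captured\n";
```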

Cabbage (author) commented:
That seems to have done the trick! The problem seems to lie in backtick commands generally (not just backtick assignments), but only sometimes, and only with certain commands - some still exist in other parts of the program, which seems to work fine regardless. I'm not sure if it's an operating system problem or a Perl problem.
Thanks for your help! BTW, how do I grade this thing? :)

Cabbage (author) commented:
Thanks b2pi. We worked around the problem with the open...while, though I don't think any of us is clear on what the problem actually is! Let me know if you find the root cause. We'll upgrade as soon as we are allowed.

snything commented:
(This is my first comment, so forgive me if I look noob-ish...)

I have tried:
my(@Files) = `ls -l`;

and

open(PROC, "ls -l |");
while (<PROC>) { push(@Files, $_);}

and found both to still cause random segmentation faults ("Attempt to free unreferenced scalar", or something like that).

Finally, as a last resort, I am using something along the lines of:

system("ls -l > /tmp/.foobah.tmp");
open(FILE, "</tmp/.foobah.tmp");
@Files = <FILE>;
close(FILE);

which seems to be quite reliable - no faults as of yet!

I know what you are thinking: ugh, you're using the file system... Yes, I expect it to be slow; luckily I don't use it much. And @Files will contain a linefeed at the end of every element.

But, on the other hand, it works and doesn't crash my Perl script, and that's all I care about.

It also proves the point that the problem is related to perl bringing the data back into the array (something to do with it not freeing up the scalar properly on the previous run through).
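Two small, hedged refinements to that workaround: File::Temp (a core module since 5.6.1, so again not available on the 5.004 in this thread) avoids a fixed /tmp name, and chomp strips the linefeeds mentioned above:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Same temp-file trick, but with a unique, auto-deleted file and
# the trailing newlines removed from each element.
my ($fh, $tmpname) = tempfile(UNLINK => 1);
system("ls -l > $tmpname") == 0 or die "ls failed: $?";

open(my $in, '<', $tmpname) or die "can't read $tmpname: $!";
my @Files = <$in>;
close($in);

chomp @Files;                    # drop the linefeeds
print scalar(@Files), " entries\n";
```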