santsingh

asked on

Segmentation fault(Core Dumped)

Hi,

I am trying to read a very large file (32 GB) in C on the Solaris platform. The program processes up to 30 GB and then crashes with this message:

"Segmentation fault (core dumped)"

I need some expert help with this. Any help in this regard will be appreciated.

[Note : The program is compiled with gcc on Solaris.]

Thanks
Sant Singh
Kent Olsen


That's a really strange place in the file I/O for it to blow up!  You've obviously read millions of records and the program is still plugging along.  But here are some things to check:

1)  Are you reading fixed-length blocks or lines of text?  If you're reading lines of text, MAKE SURE that the buffer is at least 2 characters longer than the longest line you expect to read, and that the record length passed to fread() is shorter than the buffer length.  Also make sure that you use fread() and NOT fgets().

2)  How much memory management (malloc(), free()) are you performing between reads?  While there is no problem mixing I/O and memory management, you could be allocating a buffer that's too small.

3)  Do you know the contents (or line number) where the application is bombing?  Make sure that the data at that location is not corrupt.


Sorry I can't give specifics, but these are pretty good general places to start.

Kdo
santsingh

ASKER

Thanks Kdo

I am using fread(buffer, 1, sizeof(usage_struct), pFile) to read the record. The record is of fixed length.

Memory for the structure is allocated only once, before entering the loop, and the same structure is reused inside the loop.

The contents are in a binary file and I cannot verify them by looking at it manually. Is there some way to verify this?

I tried a smaller file and found that it processes up to the end but crashes at the last record. The 30 GB that gets processed is actually the size of the output file (an ASCII file generated as output by the program).

Thanks
Sant Singh



"The last record" -- that's usually a pretty telling event.

Check your end-of-file processing.  Make sure that you close the file and drop any buffers after processing the data.

Perhaps end-of-file processing uses your input structure (buffer) for some other purpose and overflows it?

I don't know what trace capabilities you've got.  I think I'd start by creating a test file that consists of the last 3 or 4 records and processing it.  Then add a few printf() statements to the code until you isolate the problem.


Good Luck,
Kent
No comment has been added lately, so it's time to clean up this TA.
I will leave a recommendation in the Cleanup topic area that this question is to:

Be PAQ'd / Points Not Refunded

Please leave any comments here within the next seven days.

PLEASE DO NOT ACCEPT THIS COMMENT AS AN ANSWER!

Paul
EE Cleanup Volunteer
ASKER CERTIFIED SOLUTION
Mindphaser
