• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 639

Segmentation fault (core dumped)

Hi,

I am trying to read a very large file (32 GB) in C on the Solaris platform. The program processes up to 30 GB and then crashes with this message:

"Segmentation fault (core dumped)"

I need some experts' help for this. Any help in this regard will be appreciated.

[Note : The program is compiled with gcc on Solaris.]

Thanks
Sant Singh
1 Solution
 
Kent Olsen (Data Warehouse Architect / DBA) commented:

That's a really strange place in the I/O of the file for it to blow up!  You've obviously read millions of records and are still plugging along.  But here are some things to check:

1)  Are you reading fixed-length blocks or lines of text?  If you're reading lines of text, MAKE SURE that the buffer is at least two characters longer than the longest line that you expect to read, and that the record length passed to fread() is shorter than the buffer length.  Also make sure that you use fread() and NOT fgets().

2)  How much memory management (malloc(), free()) are you performing between reads?  While there is no problem mixing I/O and memory management, you could be allocating a buffer that's too small.

3)  Do you know the contents (or record number) where the application is bombing?  Make sure that the data at that location is not corrupt.


Sorry I can't give specifics, but these are pretty good general places to start.

Kdo
 
santsingh (Author) commented:
Thanks, Kdo.

I am using fread(buffer, 1, sizeof(usage_struct), pFile) to read each record. The records are of fixed length.

Memory for the structure is allocated only once, before entering the loop, and the same structure is reused inside the loop.

The contents are in a binary file, and I cannot verify them by looking at it manually. Is there some way to verify this?

I tried a smaller file and found that it processes all the way to the end but crashes on the last record. The 30 GB that gets processed is actually the size of the output file (an ASCII file generated by the program).

Thanks
Sant Singh


 
Kent Olsen (Data Warehouse Architect / DBA) commented:

"The last record" -- that's usually a pretty telling clue.

Check your end-of-file processing.  Make sure that you close the file and release any buffers after processing the data.

Perhaps the end-of-file processing uses your input structure (buffer) for some other purpose and overflows it?

I don't know what trace capabilities you've got, but I'd start by creating a test file that consists of the last three or four records and processing that.  Then add a few printf() statements to the code until you isolate the problem.


Good Luck,
Kent
 
paullamhkg commented:
No comment has been added lately, so it's time to clean up this TA.
I will leave a recommendation in the Cleanup topic area that this question is to:

Be PAQ'd / points not refunded

Please leave any comments here within the next seven days.

PLEASE DO NOT ACCEPT THIS COMMENT AS AN ANSWER!

Paul
EE Cleanup Volunteer
 
Mindphaser commented:
Force accepted

** Mindphaser - Community Support Moderator **
