Solved

Memory Error

Posted on 1998-12-05
180 Views
Last Modified: 2010-03-05
My perl program builds an HTML table from data it pulls from Oracle.  Is there any way I can either avoid the error or catch it and end the table before the process abruptly blows up?

Thanks
Question by:annakellers_pg
17 Comments
 
LVL 84

Expert Comment

by:ozo
If you can post the code, perhaps we can suggest a way to use your memory more efficiently.
 

Author Comment

by:annakellers_pg
The code works fine for about 20,000 rows.  What I need is a method that checks memory.  Basically I cycle (fetch) through a cursor of data pulled from an Oracle database, building an HTML table.  The problem is that the PC runs out of memory when my customer asks for too much (like someone asked for data yesterday that pulled 1,200,000 rows .. that is over a million).  Is there a method out there to get the current level of memory?  If it is low I can issue a print "</TABLE>" and tell the user memory is maxed.  Do you know of a method/function out there I can use as a memory safety net?

Thanks
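There is no portable built-in for this; one hedged possibility, on Linux-like systems only (not the Windows PCs mentioned above), is to read the process's own footprint from /proc. The name mem_kb and the 100,000 KB threshold below are illustrative, not from the thread:

```perl
# Hedged sketch (not from the thread): on Linux-like systems a script
# can read its own memory size from /proc/self/status.  Returns undef
# where /proc does not exist (e.g. Windows).
sub mem_kb {
    open(STATUS, "<", "/proc/self/status") or return undef;
    while (<STATUS>) {
        if (/^VmSize:\s+(\d+)\s+kB/) { close(STATUS); return $1; }
    }
    close(STATUS);
    return undef;
}

# Inside the fetch loop one could then bail out early, e.g.:
#   my $kb = mem_kb();
#   if (defined($kb) && $kb > 100_000) {
#       print "</TABLE>\nMemory limit reached; output truncated.\n";
#       last;
#   }
```

Polling like this only works if the check threshold is well below the point where allocation actually fails, since the loop keeps growing between checks.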
 

Author Comment

by:annakellers_pg
Please help
 
LVL 84

Expert Comment

by:ozo
It sounds like an obvious solution would be to stop at 20,000 rows.
But it's still not clear why you need all 20,000 rows in memory at once.
If your perl was compiled with -DPERL_EMERGENCY_SBRK, you might try setting $^M.
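For reference, the $^M mechanism ozo mentions works roughly like this (a sketch; it only has an effect on a perl built with -DPERL_EMERGENCY_SBRK, per perlvar, and the pool size here is illustrative):

```perl
# Sketch of ozo's $^M suggestion.  $^M holds an emergency memory pool
# that perl can free when malloc fails, leaving enough headroom for a
# __DIE__ handler to run and close the table cleanly.
$^M = 'a' x (1 << 16);          # reserve a 64KB emergency pool

$SIG{__DIE__} = sub {
    my($err) = @_;
    if ($err =~ /Out of memory/) {
        print "</TABLE>\n<P>Sorry, ran out of memory; output truncated.</P>\n";
        exit;
    }
    die $err;                   # re-throw anything else unchanged
};
```

On a perl built without -DPERL_EMERGENCY_SBRK the assignment is harmless but the pool is never used, so this cannot be relied on across installations.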
 

Author Comment

by:annakellers_pg
I need a function/subroutine that checks memory.

POINTS UPPED
Anna
 

Author Comment

by:annakellers_pg
I've upped my points as high as possible.

I need to check PC resources.  Is there a system method/function to check memory, or a subroutine that someone has used to check memory?

Please help.  I've given all the points I can spare.

Pleaseeeeeeeeeeeeeeee
 

Expert Comment

by:orchid2
If I understand correctly, people use your program (presumably CGI) to query a database.  Then the results are supposed to be returned to them in an HTML table.  Sometimes the results are huge and can deplete the machine of memory.

If this is correct, it seems to me that you should have your program run the query against the database, but only return a reasonable number of results to the user (say, 100 or possibly 1000), with the option to go to the "next page" to see the next 100 or 1000 results.  Or, maybe just return a summary of the query results.

If the users really need ALL the results all at once (for example, if they want to drop the results into a spreadsheet or another program), you could have the query results written to a file on the server that the user could download without opening it in their browser, where they'll have memory problems.

Does this make sense?

-Orchid2
 

Author Comment

by:annakellers_pg
The following is a good very-high-level solution.  Can you give me a code example?
Here is your reply:

[Only return a reasonable number of results to the user (say, 100 or possibly 1000), with the option to go to the "next page" to see the next 100 or 1000 results.  Or, maybe just return a summary of the query results.]

Here is our code loop as of today:

   print "<TABLE BORDER=2>\n";
   $sql = "SELECT * FROM MYtable";
   $cursor = $db->prepare($sql) or die "Unable to Prepare SQL: $DBI::errstr";
   $cursor->execute or die "Unable to Execute SQL: $DBI::errstr";
   while (@row = $cursor->fetchrow_array)
   {
      print "<TR>";
      foreach $Fields (@row)
      {
         print "<TD ALIGN=RIGHT>$Fields</TD>";
      }
      print "</TR>";
      $countit++;
   }
   print "</TABLE>";

Points Increased
Thanks
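orchid2's paging idea could be grafted onto a loop like this roughly as follows (a hypothetical sketch: page_bounds, $PAGE_SIZE, and the "page" parameter are illustrative names, not from the thread, and the nested-ROWNUM query is the usual workaround for Oracle versions that lack LIMIT/OFFSET):

```perl
# Hypothetical paging sketch.  Only one page of rows is ever fetched,
# so memory use is bounded regardless of the total result size.
sub page_bounds {
    my($page, $size) = @_;
    my $hi = $page * $size;
    return ($hi - $size + 1, $hi);      # first and last row of this page
}

my $PAGE_SIZE = 1000;
my $page      = 1;                      # e.g. taken from a "page" CGI parameter
my($lo, $hi)  = page_bounds($page, $PAGE_SIZE);

# Nested ROWNUM query: the inner SELECT caps the scan at $hi rows,
# the outer one trims off everything before $lo.
my $sql = q{
    SELECT * FROM (
        SELECT t.*, ROWNUM rn FROM MYtable t WHERE ROWNUM <= ?
    ) WHERE rn >= ?
};
# Then, with the $db handle from the posted code:
#   my $cursor = $db->prepare($sql) or die "Unable to Prepare SQL: $DBI::errstr";
#   $cursor->execute($hi, $lo)      or die "Unable to Execute SQL: $DBI::errstr";
```

The fetch loop itself can stay exactly as posted; a "next page" link simply re-invokes the CGI with the page parameter incremented.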
 

Expert Comment

by:orchid2
Hmm.  Well, I'm new to experts-exchange.  But quite honestly, and in all fairness, it seems to me that my answer, though high-level, was appropriate to the question (which was rather general) and accurate.  I think you should give me the points for the answer you've received.

If you want help writing the actual code, I think you should make that a new question, and I or someone else can help out with it.

Remember that responding to questions does take time.  If questioners consistently insist on responses to detailed follow-ups, the result will be fewer experts willing to devote the time to answer the questions.  This will diminish the value of the forum for all of us.

I think that's fair.  Don't you agree?

Regards,

Orchid2
 

Author Comment

by:annakellers_pg
I have not been given a Perl programming memory solution.  I will not willingly give the points when the question has not been answered.  Give me a solution for Perl.  You gave me a general solution to a complex question.  I did not offer 175 points for a manager's answer to a developer's question.
 
LVL 5

Expert Comment

by:b2pi
Something seems odd here... Why are you running out of memory?  (I just don't want to get into the philosophy aspects.)

1.) I would find it ugly to receive more than 20k rows from a single query, and you'd better hope your server is fast enough to get it all out within 5 seconds (or you'd hit timeout problems on CGI), but... let's assume you want all the rows out.  If we look at the code, we have to wonder what it is that's eating memory...

First, try moving @row up above that loop (could it be an autoallocation problem?).
To do this, add

my(@row);

to the program, above the

print "<TABLE BORDER=2>\n";

(which I'm assuming is NOT in a loop).  If that doesn't work, just as a debug exercise, comment out the code within the loop.  I'd really like to find out if you can execute the entire thing under those circumstances without a memory error...
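Applied to the posted loop, b2pi's suggestion looks roughly like this (a sketch; $db is assumed to be the already-connected DBI handle from the original code, and wrapping the loop in a named sub is my addition for illustration):

```perl
use strict;

# b2pi's suggestion applied to the posted loop: @row is declared once,
# above the loop, rather than springing into existence inside it.
sub print_table {
    my($db) = @_;                        # assumed: a connected DBI handle
    my(@row);                            # declared above the loop, per b2pi
    my $countit = 0;

    print "<TABLE BORDER=2>\n";
    my $sql = "SELECT * FROM MYtable";
    my $cursor = $db->prepare($sql) or die "Unable to Prepare SQL: $DBI::errstr";
    $cursor->execute or die "Unable to Execute SQL: $DBI::errstr";
    while (@row = $cursor->fetchrow_array) {
        print "<TR>";
        print "<TD ALIGN=RIGHT>$_</TD>" for @row;
        print "</TR>";
        $countit++;
    }
    print "</TABLE>\n";
    return $countit;
}
```

Note that fetchrow_array returns one row at a time, so the loop itself should only ever hold a single row; that is why b2pi's debug exercise (commenting out the loop body) is worth running to isolate what is actually accumulating.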
 

Author Comment

by:annakellers_pg
Have b2pi lock the question so he can be awarded the points.
 
LVL 5

Expert Comment

by:b2pi
I'm not sure I understand.  Did you find an answer?
 

Author Comment

by:annakellers_pg
b2pi gave me an answer, which I gave to my developer... they said it was fine.  Have b2pi lock the question and I'll give up the points.
 
LVL 5

Accepted Solution

by:b2pi earned 170 total points
I'd presume that it was the undeclared @row which was auto-revivifying?
 
LVL 5

Expert Comment

by:b2pi
Oh, after complaining loudly that you felt the answers you were getting weren't worth anything, thank you so much for expending the time and energy to answer this question, preventing it from autograding.  I'm sure all experts appreciate this just as much as I do.
 
LVL 5

Expert Comment

by:b2pi
Hmm, re-reading that, for some reason it sounds quite snide.  Not intended.
