
Out of memory error using memory datasets with a Delphi application

I have an application that reads data from CSV files, imports it temporarily into a memory dataset, and then sends the data to a physical database (which RDBMS doesn't matter).

This application has worked for ten years, with users loading data every day in small chunks. Ten years straight.
Data volumes have grown over that time, but in all those years only by about 10% to 20%.

But this week, due to an internal change in the source of the CSV files (more detail was included), the size of these files roughly doubled (a 100% to 150% increase), and my users started getting an "Out of memory" error in the middle of the loading process.

Note that I've tried several memory dataset components, like the ones from Rx, DevExpress, Jedi, and so on. All of them lead to the same error, at approximately the same point, about halfway through. So changing the component didn't help.

I know I could rewrite the whole thing and avoid a memory dataset.
I know I could just tell the users to split the files into smaller chunks.
I know that somewhere, somehow, there are memory limitations, and that having several gigabytes of RAM doesn't help in those situations.


But what I don't know is whether there's a way to overcome these limitations.
A compiler directive, maybe?
Something to make Delphi aware of the huge amount of available memory so it doesn't throw these annoying errors?

(This application is compiled with Delphi 7)

Asked by fischermx
2 Solutions
 
jimyX commented:
Delphi is giving that message based on your system's limitations/configuration. I am not sure, but I doubt there is such a compiler directive.
What you can do is change the system parameters and see how it works:
http://support.microsoft.com/kb/126962
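
The ceiling involved here is usually the per-process address space rather than installed RAM: a 32-bit executable, which is what Delphi 7 produces, can address roughly 2 GB of user-mode memory no matter how many gigabytes the machine has. A minimal console sketch (it assumes nothing beyond the Windows unit) prints both figures so you can see the gap:

    program AddressSpaceCheck;

    {$APPTYPE CONSOLE}

    uses
      Windows;

    var
      MS: TMemoryStatus;
    begin
      MS.dwLength := SizeOf(MS);
      GlobalMemoryStatus(MS);
      // Installed physical RAM (this legacy API wraps above 4 GB)
      WriteLn('Physical RAM       : ', MS.dwTotalPhys div (1024 * 1024), ' MB');
      // Total user-mode address space for this process (~2 GB for 32-bit)
      WriteLn('Process addr space : ', MS.dwTotalVirtual div (1024 * 1024), ' MB');
    end.

If the second number sits near 2048 MB while the first is much larger, the process is hitting the address-space limit, and adding RAM won't help.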
 
Ephraim Wangoya commented:

There really is no easy way to overcome this except by rewriting your code to be more efficient; there is certainly no compiler directive for it.
It's not only the memory dataset that holds the data in memory: the CSV file itself is also loaded into memory.

Here is what I would do (see the sketch after this list):
1. First, use a memory-mapped file instead of loading the whole file into memory.
2. Get rid of the memory dataset; use query objects with transactions instead.
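
A rough sketch combining both suggestions: the CSV is scanned through a read-only memory-mapped view (so only the touched pages need to be resident), and each parsed line goes straight to the database through a parameterized query, with a commit every few thousand rows. The ADO classes, the two-column target_table, and the CommaText-based field split are assumptions for illustration only; substitute your own data-access components and CSV parsing.

    // uses Windows, SysUtils, Classes, ADODB
    { Sketch only: error handling is minimal; a real version should
      call Conn.RollbackTrans on failure. }
    procedure LoadCsvMapped(const FileName: string; Conn: TADOConnection);
    const
      BatchSize = 5000;  // keep transactions small and memory use flat
    var
      hFile, hMap: THandle;
      Base: PAnsiChar;
      Size, LineStart, I: Cardinal;
      Line: string;
      Fields: TStringList;
      Qry: TADOQuery;
      RowsInBatch: Integer;

      procedure InsertLine(const S: string);
      begin
        if S = '' then Exit;
        Fields.CommaText := S;  // naive split: no quoted/embedded commas
        if Fields.Count < 2 then Exit;
        Qry.Parameters.ParamByName('a').Value := Fields[0];
        Qry.Parameters.ParamByName('b').Value := Fields[1];
        Qry.ExecSQL;
        Inc(RowsInBatch);
        if RowsInBatch >= BatchSize then
        begin
          Conn.CommitTrans;
          Conn.BeginTrans;
          RowsInBatch := 0;
        end;
      end;

    begin
      hFile := CreateFile(PChar(FileName), GENERIC_READ, FILE_SHARE_READ, nil,
        OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
      if hFile = INVALID_HANDLE_VALUE then RaiseLastOSError;
      try
        Size := GetFileSize(hFile, nil);  // fine for files under 4 GB
        hMap := CreateFileMapping(hFile, nil, PAGE_READONLY, 0, 0, nil);
        if hMap = 0 then RaiseLastOSError;
        try
          Base := MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, 0);
          if Base = nil then RaiseLastOSError;
          try
            Fields := TStringList.Create;
            Qry := TADOQuery.Create(nil);
            try
              Qry.Connection := Conn;
              Qry.SQL.Text := 'insert into target_table (a, b) values (:a, :b)';
              Conn.BeginTrans;
              RowsInBatch := 0;
              LineStart := 0;
              if Size > 0 then
                for I := 0 to Size - 1 do
                  if Base[I] = #10 then
                  begin
                    SetString(Line, Base + LineStart, I - LineStart);
                    InsertLine(Trim(Line));  // Trim drops a trailing #13
                    LineStart := I + 1;
                  end;
              if LineStart < Size then
              begin
                SetString(Line, Base + LineStart, Size - LineStart);
                InsertLine(Trim(Line));      // last line without a newline
              end;
              Conn.CommitTrans;
            finally
              Qry.Free;
              Fields.Free;
            end;
          finally
            UnmapViewOfFile(Base);
          end;
        finally
          CloseHandle(hMap);
        end;
      finally
        CloseHandle(hFile);
      end;
    end;

You would call it as LoadCsvMapped('import.csv', MyConnection), ideally inside a try/except that rolls the transaction back on failure. Peak memory then depends on the batch size, not the file size.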
 
fischermx (author) commented:
Thanks for your help.

I found the problem, and it actually wasn't the memory dataset's fault :(
It was a supposedly tiny array in a record that never used to grow much, but the business logic changes to the source CSV files caused it to grow.
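
In case it helps anyone hitting the same thing, here is a purely hypothetical illustration of the kind of structure that bites this way (the real record isn't shown above). The total footprint is roughly RowCount * Length(Details) * SizeOf(TDetail) plus the string data, so richer CSV lines that add detail entries can double memory use even when the row count barely changes:

    type
      TDetail = record
        Code: Integer;
        Amount: Double;
      end;

      TImportRow = record
        Key: string;
        Details: array of TDetail;  // grows with the extra detail per CSV line
      end;

    var
      Rows: array of TImportRow;  // one entry per imported line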
Question has a verified solution.
