fischermx (Mexico)

asked on

Out of memory error using memory datasets with a Delphi application

I have an application which reads data from CSV files, imports it temporarily into a memory dataset, and then sends the data to a physical database (it doesn't matter which RDBMS).
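For context, the flow is roughly the following (a minimal Delphi 7-style sketch using TClientDataSet; the field name and the PostToDatabase helper are hypothetical, not the actual application code):

```pascal
// Hypothetical sketch of the load path: the whole CSV is read into memory,
// and every row is then buffered a second time in the memory dataset.
procedure ImportCsv(const FileName: string; MemData: TClientDataSet);
var
  Lines, Row: TStringList;
  I: Integer;
begin
  Lines := TStringList.Create;
  Row := TStringList.Create;
  try
    Lines.LoadFromFile(FileName);   // entire file held in memory here
    for I := 0 to Lines.Count - 1 do
    begin
      Row.CommaText := Lines[I];
      MemData.Append;               // each row also lives in the dataset buffer
      MemData.FieldByName('Field1').AsString := Row[0];
      // ... remaining fields ...
      MemData.Post;
    end;
  finally
    Row.Free;
    Lines.Free;
  end;
  PostToDatabase(MemData);          // hypothetical: flush rows to the RDBMS
end;
```

Note that with this shape, two full copies of the data (the string list and the dataset buffers) coexist in the 32-bit address space during the load.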

This application has worked for ten years, users loading data every day, in small chunks. Again, ten years.
Data loads have increased over time, but in all that time only by about 10% to 20%.

But this week, due to an internal change in the source of the CSV files (more detail was included), the size of these files doubled (a 100% to 150% increase), and my users started getting an "Out of memory" error in the middle of the loading process.

Note that I've tried several memory dataset components, such as the ones from RX, DevExpress, JEDI, and so on. All of them lead to the same error at approximately the same point, around the middle of the load. So changing the component didn't help.

I know I could rewrite the whole thing and avoid a memory dataset.
I know I could just tell the users to split the files into smaller chunks.
I know that somewhere, somehow, there are memory limitations, and having several gigabytes of RAM doesn't help in those situations.


But what I don't know is whether there's a way to overcome these limitations.
A compiler directive, maybe?
Something to make Delphi aware of the huge amount of available memory so it doesn't throw these errors?

(This application is compiled with Delphi 7)
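For what it's worth, Delphi 7 produces 32-bit executables, which are limited to a 2 GB user address space regardless of installed RAM. There is a compiler directive that marks the EXE as large-address-aware, letting it use up to 4 GB on 64-bit Windows (or 3 GB on 32-bit Windows booted with the /3GB switch). This only raises the ceiling rather than removing it, so it may just delay the failure point:

```pascal
// In the project (.dpr) file, before the uses clause.
// $0020 = IMAGE_FILE_LARGE_ADDRESS_AWARE; $SetPEFlags is supported in Delphi 7.
{$SetPEFlags $0020}
```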

ASKER CERTIFIED SOLUTION
jimyX

SOLUTION
fischermx (ASKER)
Thanks for your help.

I found the problem, and it actually wasn't the memory dataset's fault :(
It was a supposedly tiny array in a record, which never used to grow much, but the business logic changes to the source CSV files caused it to grow.
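For anyone hitting the same symptom, the failure mode can be sketched like this (hypothetical names; not the actual record from the application): a per-row dynamic array that was always small suddenly gets longer, and since one copy exists for every buffered row, total memory use is multiplied across the whole load.

```pascal
type
  TImportRec = record
    Key: Integer;
    Details: array of string;  // grew with the new, more detailed CSV columns
  end;

var
  Rows: array of TImportRec;   // one record per CSV row, all held at once

// Memory use is roughly RowCount * (fixed part + Length(Details) * avg size),
// so a longer "tiny" array plus doubled files can blow past the 32-bit limit.
```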