I have an application that reads data from CSV files, imports it temporarily into an in-memory dataset, and then sends the data to a physical database (it doesn't matter which RDBMS).
This application has worked for ten years, with users loading data every day in small chunks. Over that whole time, data loads have grown only by about 10% to 20%.
But this week, due to an internal change in the source of the CSV files (more detail was included), the size of these files doubled (a 100% to 150% increase), and my users started getting an "Out of memory" error in the middle of the loading process.
Note that I've tried several in-memory dataset components, such as the ones from Rx, DevExpress, JEDI, and so on. All of them lead to the same error, at approximately the same point, around the middle of the load. So changing the component didn't help.
I know I could rewrite the whole thing and avoid a memory dataset entirely.
I know I could just tell the users to split the files into smaller chunks.
I know that somewhere, somehow, there are memory limitations, and that having several gigabytes of RAM doesn't help in those situations.
But what I don't know is if there's a way to overcome these limitations.
A compiler directive, maybe?
Something to make Delphi aware of the huge amount of available memory, so it doesn't throw these errors?
(This application is compiled with Delphi 7)
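The only directive I've come across that seems related is `$SetPEFlags` with the large-address-aware flag, which Delphi supports since Delphi 6. A minimal sketch of what I mean, assuming the flag goes in the project's .dpr file (the constant `IMAGE_FILE_LARGE_ADDRESS_AWARE` = `$20` may not be declared in Delphi 7's `Windows` unit, so the literal is used):

```pascal
program CsvLoader;  { hypothetical project name for illustration }

{ Mark the EXE as large-address-aware. A 32-bit process can then address
  up to 4 GB on 64-bit Windows (or up to 3 GB on 32-bit Windows booted
  with the /3GB switch) instead of the default 2 GB. }
{$SetPEFlags $20}  { IMAGE_FILE_LARGE_ADDRESS_AWARE }

begin
  { ... existing application code ... }
end.
```

I understand this would only raise the ceiling, not remove it: a dataset needing more than ~4 GB would still fail, and components not written to handle pointers above the 2 GB boundary might misbehave. So I'm not sure whether this is the right approach, or whether there is something better.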