shawn857

asked on

access violation at address in module ntdll.dll

Hi Experts, I got one of these errors during a run of my app while processing a large datafile. Oddly enough, during a similar run with same file 10 minutes prior, this error did not occur and my app ran clean. I rebooted after this error and ran my app again on this same large datafile (5 or 6 times) and the error did not happen again. Should I be concerned about my app, or does it look like just some kind of OS glitch that was cleared up upon reboot?

Also, regarding these "access violation" errors (and similar errors causing crashes), would there happen to be some sort of magical Delphi component that I can add to my project which would output some sort of log file indicating where in my app (source code line number?) the crash occurred?

Thanks
    Shawn
ASKER CERTIFIED SOLUTION
Geert G
It seems that you have a memory leak - when the OS has been running long enough and resources run low, this error can occur. It might also be a collision with some other process.

There is EurekaLog:
http://edn.embarcadero.com/article/39115
http://delphi.about.com/od/productreviews/ss/eurekalog.htm
https://www.eurekalog.com/

or madExcept:
http://madshi.net/

... tools/components/units to help you out.
Personally, I think madExcept is a very good option - it does exactly what you want. (A minimal hand-rolled sketch of the same idea follows after this comment.)

How do you access/process the large file? Do you access it exclusively, or could another process be using it at the same time?

---------

EDITED: it seems I was a little late....
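For context, the point of madExcept/EurekaLog is that they hook unhandled exceptions and log a full call stack with unit and line numbers. A minimal hand-rolled sketch of the same idea, using only the VCL's Application.OnException hook, would give just the exception class and message (the TCrashLogger name and log file location are illustrative, not part of either tool):

uses
  System.SysUtils, Vcl.Forms;

type
  TCrashLogger = class
  public
    procedure HandleException(Sender: TObject; E: Exception);
  end;

procedure TCrashLogger.HandleException(Sender: TObject; E: Exception);
var
  LogName: string;
  Log: TextFile;
begin
  // append one line per unhandled exception to a log file next to the EXE
  LogName := ChangeFileExt(ParamStr(0), '.crash.log');
  AssignFile(Log, LogName);
  if FileExists(LogName) then
    Append(Log)
  else
    Rewrite(Log);
  try
    WriteLn(Log, Format('%s  %s: %s', [DateTimeToStr(Now), E.ClassName, E.Message]));
  finally
    CloseFile(Log);
  end;
end;

// at application startup, e.g. in the main form's OnCreate:
//   CrashLogger := TCrashLogger.Create;                      // keep this instance alive for the app's lifetime
//   Application.OnException := CrashLogger.HandleException;

madExcept and EurekaLog do this hooking automatically and, because they post-process the debug information, they can add the call stack with unit names and line numbers - which is what actually answers the "which source line crashed" question.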
shawn857

ASKER

Thanks guys! I already use FastMM4 in my project - would using madExcept as well be okay, or would they "compete" with each other?

Sinisa: I read and process my large data file exclusively. No other process accesses the datafile.

Thanks!
    Shawn
You never know who might want access... (virus, anti-virus, FBI... :-)
FastMM is good for memory leaks + fast string manipulation, but for exceptions madExcept is the one. A few times I had problems with FastMM and external DLLs (both the EXE and the DLLs need to use FastMM) - use the latest version.
Are you loading the complete (very large) file? How? Into a stringlist?
Geert: To read in the very large file, I just read one record at a time into a normal string variable, process that record, then read another record in. No stringlist. Very plain vanilla.

Guys: I am trying madExcept and I think it will be what I need to catch exceptions... but when I have it enabled, it seems to slow down quite a lot the process of reading my large datafile. Is this normal?

Thanks
   Shawn
Geert - update to what I wrote earlier. After more testing, it seems my program runs about 10% slower when madExcept is enabled. Not the end of the world, but strange that it slows it down at all.

Thanks
   Shawn
I never leave it on for long - just initially, to find bugs and AVs.

I don't really have mission-critical software, and if something is slow, the process gets chopped up and spread across app servers.

Whenever I need to process large amounts of data:
> check if the writer of the large file can create a secondary file with only the items I'm interested in
> check if I can extract the data I need myself

FWIW, there is no point in analyzing a log file: the logfile was written based on processed data > read the processed data instead.
Maybe you can run faster. How do you read the file? With Readln, or do you use allocated memory as a buffer? The second one is the right direction.
Sinisa - I read a file of variable-length text lines (i.e. a CSV file), so I think I have to use Readln, no?

Shawn
Not necessarily. Read a buffer (let's say 10 KB), parse the CSV and do the work; when you have analyzed the whole buffer, read another portion, and so on until the whole file has been read. When you detect the CRLF characters, you have one record (everything since the previous CRLF).
SOLUTION
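The example Sinisa posted above is members-only, but a rough sketch of the approach he describes - read the file in fixed-size chunks, split complete CRLF-terminated records out of the buffer, and carry any partial record over to the next chunk - might look like this (the buffer size, names and the ProcessRecord call are illustrative, not taken from the hidden code):

uses
  System.SysUtils, System.Classes;

procedure ProcessLargeFile(const FileName: string);
const
  BufSize = 64 * 1024;            // read in 64 KB chunks; tune to taste
var
  FS: TFileStream;
  Buf: TBytes;
  BytesRead: Integer;
  Carry, Chunk, Line: AnsiString;
  P: Integer;
begin
  FS := TFileStream.Create(FileName, fmOpenRead or fmShareDenyWrite);
  try
    SetLength(Buf, BufSize);
    Carry := '';
    repeat
      BytesRead := FS.Read(Buf[0], BufSize);
      if BytesRead <= 0 then
        Break;
      SetString(Chunk, PAnsiChar(@Buf[0]), BytesRead);
      Chunk := Carry + Chunk;
      // peel off every complete CRLF-terminated record in this chunk
      P := Pos(#13#10, Chunk);
      while P > 0 do
      begin
        Line := Copy(Chunk, 1, P - 1);
        // ProcessRecord(Line);    // hypothetical: your CSV parsing goes here
        Delete(Chunk, 1, P + 1);
        P := Pos(#13#10, Chunk);
      end;
      Carry := Chunk;              // partial record, completed by the next chunk
    until BytesRead < BufSize;
    // if Carry <> '' then ProcessRecord(Carry);   // last record without a trailing CRLF
  finally
    FS.Free;
  end;
end;

The repeated Delete call keeps the sketch short; for maximum speed you would scan the buffer with an index instead of rebuilding the string each time.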
Thank you for that example code Sinisa, I'm going to try to play around with it.

Right now in my parsing engine, the bottleneck appears to be the string manipulation and concatenation... not so much the file I/O of reading and writing.

Shawn
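(Side note on that bottleneck, before it moves to the other question: repeated s := s + ... in a tight loop reallocates the string over and over; a TStringBuilder from System.SysUtils, or pre-sizing with SetLength, usually helps. A hypothetical sketch - BuildCsvLine and its Fields parameter are placeholders, not Shawn's code:)

uses
  System.SysUtils;

function BuildCsvLine(const Fields: array of string): string;
var
  SB: TStringBuilder;
  I: Integer;
begin
  SB := TStringBuilder.Create;
  try
    for I := Low(Fields) to High(Fields) do
    begin
      if I > Low(Fields) then
        SB.Append(',');            // re-insert the CSV separator between fields
      SB.Append(Fields[I]);
    end;
    Result := SB.ToString;         // one final allocation instead of one per concatenation
  finally
    SB.Free;
  end;
end;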
Guys, I think this question and another I'm asking ("Faster string concatenation") have evolved away from the initial questions and started to overlap. As I just mentioned in that other question, what I really should be doing is finding the fastest method to parse a CSV file, rather than splitting hairs looking for faster string manipulation routines. I shall open a new question addressing just that, then award points and wrap up this one (madExcept and EurekaLog were two very good answers to the original spirit of this question).

Thanks
    Shawn
Thank you Gentlemen!