Memory usage in a DLL.

abesoft asked
Medium Priority
973 Views
Last Modified: 2013-12-03
I need some tips on tracking memory (and handle) usage in a DLL.  Our system uses 100+ apps (mostly CGIs) that all share some common library code.  I have just finished moving the library from a static lib into a dynamic lib, figuring that this should produce a major savings in memory and disk footprint.

Wrong!  The size of most apps shrunk from ~700K to ~35K, but the memory usage when they are loaded jumps from ~800K to ~10Mb.  They have also moved from two handles to 60+.  When we are running 10-20 apps at a time, this is a fairly major amount of memory.  (I don't think it will stand up to thousands of simultaneous users!)

I realize that in the DLL form, all of the globals need to be initialized, whereas in the lib, they could potentially be excluded from the link.  I assume that this is what is going on here.

What I need to know is: are there any utilities that will track the memory and resource (handle) usage for an app?  I'm not too sure what the NT Task Manager is using to calculate the VM size and mem usage.  I would love to know how much RAM is used at various points in the apps' lives, and also what handles (how many of each type, etc.) are being used by the app.

Commented:
For tracking the opened handles try HandleEx by Mark Russinovich.
http://www.sysinternals.com/nthandlex.htm

Commented:
>>  I have just finished moving the library from a static lib into a dynamic lib,
>> figuring that this should produce a major savings in memory and disk
>> footprint
Not necessarily.  The only memory "kept in common" will be the code memory.  That is, the 100 or so apps will share one copy of the DLL's code.  But each will have its own copy of the data.

Commented:
That suggests there will still be a savings, just not as large as you might think.  However, there might be no savings at all, because using a DLL adds memory usage for the tables used to implement dynamic linking (and if this is C++, those tables can be massive!).

Commented:
>> but the memory usage when they are loaded jumps from  ~800K to ~10Mb
How are you measuring that?  That seems suspicious.

Commented:
You might try importing and exporting from the DLL and apps using ordinals.  This is much more memory efficient (and links faster) than importing and exporting by name.
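For example, ordinals can be assigned in a module definition (.def) file, where NONAME also keeps the function's name string out of the export table entirely (the library and function names below are made up):

```
; common.def -- hypothetical exports for the shared library DLL.
; NONAME suppresses the exported name string; clients then import
; by ordinal only, and none of those name bytes end up in the image.
LIBRARY common
EXPORTS
    InitLibrary     @1 NONAME
    OpenDatabase    @2 NONAME
    FormatReport    @3 NONAME
```

The trade-off is that ordinals must then stay stable across versions of the DLL, since clients are bound to the numbers rather than the names.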

Author

Commented:
>> but the memory usage when they are loaded jumps from  ~800K to ~10Mb
>How are you measuring that?  That seems suspicious

I agree!  This comes from the NT Task Manager, adding up the "Memory Usage" and "VM Size" fields, both of which are in the 4-6 Mb range.  I am not sure what is added to get those figures, but previously (in the statically linked version) they were both <100Kb.

Commented:
You could use the Performance Monitor that comes with NT.  Bring up your application (the one that uses the DLL) and Perfmon (under Administrative Tools), then press the '+' button in the toolbar; it stands for 'Add counter'.  This brings up a dialog.  Select the 'Process' object; this lists the running processes as instances.  Click your process and select a counter (you'll see a lot of them).  'Handle Count' and 'Page File Bytes' will track the handles used and the memory.  Click Add and then Done.

Watch the trend of these counters.  Handle Count will vary, but it should never consistently increase; if it does, your DLL or app is leaking handles (check your CloseHandle calls).  Page File Bytes should normally start at some number, climb to a maximum, and level off; it gives roughly the total virtual memory usage.  In any case it shouldn't constantly increase for, e.g., a server app, since that implies a virtual memory leak (shared memory, etc.).

Also check MSDN for the performance counter objects that you can implement in your own app.  That makes your app appear under 'Objects' in Perfmon, and you can expose a bunch of custom counters.

Gopal
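As an aside, roughly the same figures Perfmon shows can also be read programmatically through the PSAPI call GetProcessMemoryInfo, which could be dropped into the apps themselves for logging.  A minimal sketch (Windows-only, link with psapi.lib, error handling omitted):

```c
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

/* Print this process's memory counters -- roughly the figures that
   Task Manager's "Mem Usage"/"VM Size" columns and Perfmon's
   working-set / Page File Bytes counters report. */
int main(void)
{
    PROCESS_MEMORY_COUNTERS pmc;

    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof pmc)) {
        printf("working set:    %lu bytes\n",
               (unsigned long)pmc.WorkingSetSize);
        printf("pagefile usage: %lu bytes\n",
               (unsigned long)pmc.PagefileUsage);
    }
    return 0;
}
```

Calling this at a few points in an app's life would show where the usage jumps, which is what the question is really after.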

Author

Commented:
agopal: Thanks.  I have already used perfmon, and can confirm that the problem is not with an ongoing leak.  Instead it is an issue of a whole load of resources getting allocated during the load of the DLL.  I am hoping to track down which parts of the DLL are allocating these handles and memory, to see if I can clean up the situation.  

agreen: Thanks!  This utility has given me a whole lot of information on the actual handles in use for my app.  There seem to be a whole lot of anonymous "Event" handles, which I am going to try to track down.  This is much more informative than the perfmon approach, which only gives you a handle count.

nietod: Thanks also for your comments.  (Do you do E-E as a full-time occupation, or are you just very quick at typing up your comments?  I think I've seen you in every question I've read...)
As to the vtables taking up a lot of memory space, I would have thought that the various vtables could be stored in the code segment instead of the data segment, or at least in a single constant data segment, so that all running copies of the DLL could then share that one segment.  I may be a bit naive here, though.  (I'm using Microsoft's tools, and I'm afraid I don't know enough of the inner workings of the development tools to know how they arrange such things.  I'm looking into it, though.)

Commented:
>> You might try importing and exporting from the DLL and apps using ordinals.  This is much more memory efficient (and links faster) than importing and exporting by name.

Reference please?

Commented:
Alex, I'm not sure what you are asking about.  The reason that ordinals are better is that you don't have to have huge tables that list the function names.  These tables must appear in both the DLL and in any EXE or DLL that uses it.  With C++'s decorated names these tables get huge!  I have a DLL with hundreds of K of exported function names.

abesoft,
>>(Do you do e-e as a full time occupation, or are you just very quick
>> at typing up your comments.  I think I've seen you in every question
>> I've read...)
In other words  nietod, get a life.

>> As to the vtables taking up a lot of memory space, I would have thought
>> that the various vtables could be stored in the code segment instead of
>> the data segment, or at least in a single constant data segment, so
>> that all running copies of the DLL could then share this one segment
I'm not sure what you are thinking of here.  If you are talking about the virtual function tables used in C++, that is not what I was talking about.  Since those are constant, they will be stored once per DLL.

What I was talking about is the tables that list all the exported or imported functions by name.  See my explanation to Alex above.

Commented:
Todd, there is a difference between dynamic linking to a DLL (the LoadLibrary() / GetProcAddress() approach) and "static" linking to it (using an import library).

The first approach would definitely benefit from ordinals.  The second, I'm not so sure.
If you have a definite answer (or a reference for one), please post it.

Commented:
Alex, we're not communicating somehow.  My point has nothing to do with the way the library is being linked (static or dynamic).  Well, it has a little to do with it.  If the DLL exports by name, it must contain every one of the names exported.  That can be huge.  Furthermore, any DLL or EXE that links statically to it will also have a copy of every one of those names.  So both approaches benefit from using ordinals, although the dynamic-linking approach benefits less.  As for references, I have none, but you remember my question a few days ago about how there are no exported functions from the MFC DLLs?  Well, there is a reason I was curious.  I have a 700K DLL with about 300K of exported names.  The exported names take up more space than the entire (assembly language) accounting system the code is replacing.  I am going to have to switch to ordinals.  I'll let you know what happens.
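For anyone wanting to measure this on their own DLL: the dumpbin tool that ships with Microsoft's toolchain can list the export table, which makes the name overhead visible (the file name here is hypothetical):

```shell
# List the export table: each entry shows ordinal, hint, RVA, and
# (if present) the exported name string.
dumpbin /exports common.dll
```

Exports declared NONAME show up with an ordinal but no name string, so comparing the output before and after switching shows exactly how many bytes of names were removed from the image.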

Commented:
>> I'll let you know what happens.
Thanx

Author

Commented:
Sorry for not keeping up on this, but I had a few crises to manage.  Anyway, thank you all for your input.  I have accepted vladip's answer, although it really didn't answer my direct question: what tools can track this stuff (memory allocation) for you, on a module-by-module (DLL) basis?  I'm not even sure such a thing could exist, since DLLs are so hard to distinguish from their EXEs.

Thanks also to agreen, who provided a most useful tool in his first suggestion.  I have used it, and it is great.  (It didn't really help me to track down my problem, but it did rule out some of my hypotheses...)

Thanks again.

Commented:
I think this was quite an interesting discussion, and I would like to invite you to a similar subject:

https://www.experts-exchange.com/Programming/Programming_Platforms/Win_Prog/Q_20536762.html
Which dynamic memory allocation method to use when?

--Filip