CreateFileMapping granularity

Is it true that every time I call CreateFileMapping with -1 as the file handle, I allocate 64 KB in the paging file?
I found a mention of this in Richter's book but could not find any concrete evidence.
If it is true, is there any way around it short of writing a private memory manager (assuming I need many chunks of shared memory of varying sizes)?
gilg asked:
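For reference, the call being asked about looks roughly like this; it is only a sketch, with a made-up object name and size:

#include <windows.h>

HANDLE CreateSharedChunk(void)
{
    /* Passing (HANDLE)-1, i.e. INVALID_HANDLE_VALUE, asks for a mapping
       backed by the paging file rather than by a file on disk. */
    return CreateFileMapping(
        INVALID_HANDLE_VALUE,    /* -1: back the mapping with the paging file   */
        NULL,                    /* default security                            */
        PAGE_READWRITE,
        0, 4096,                 /* requested size: high/low DWORDs (4 KB here) */
        TEXT("MySharedChunk"));  /* optional name so other processes can open it */
}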

nietod commented:
I don't know about the 64K issue.  Never heard of it.

However, the smallest chunk that could be shared is 4K, because memory pages are 4K. So if the OS is giving you 64K, it is giving you 16 times the minimum possible. That is wasteful, but not incredibly so. How many chunks are you really going to have, and how large are they really going to be? If it is only a few dozen, I wouldn't worry about it. If it is hundreds, I'd rethink things a bit.
chensu commented:
The documentation says:
If hFile is (HANDLE)0xFFFFFFFF, the calling process must also specify a mapping object size in the dwMaximumSizeHigh and dwMaximumSizeLow parameters. The function creates a file-mapping object of the specified size backed by the operating-system paging file rather than by a named file in the file system. The file-mapping object can be shared through duplication, through inheritance, or by name.
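To illustrate the "shared ... by name" part of that quote, a second process could open the same pagefile-backed object along these lines (again only a sketch; "MySharedChunk" is the made-up name from the example above):

#include <windows.h>

/* Opens an existing named, pagefile-backed mapping and maps a view of it.
   Returns the view pointer, or NULL on failure. */
void *OpenSharedChunk(HANDLE *phMap)
{
    *phMap = OpenFileMapping(FILE_MAP_ALL_ACCESS, FALSE, TEXT("MySharedChunk"));
    if (*phMap == NULL)
        return NULL;
    /* Map the entire object (a length of 0 means "to the end"). */
    return MapViewOfFile(*phMap, FILE_MAP_ALL_ACCESS, 0, 0, 0);
}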
nietod commented:
I think gilg's question is whether, regardless of the size you request, it always uses at least 64K. It certainly uses at least 4K if you request less than 4K.
dabbler commented:
Yes, allocations are made with a granularity of 64K. This is an OS requirement, and there is no workaround. Note that the granularity is actually determined by calling GetSystemInfo() (sample code below). On all existing Windows versions, however, the value is 64K.

#include <windows.h>
#include <string.h>

/* Returns the system allocation granularity (64K on current Windows versions). */
DWORD GetAllocGran (void)
{
    SYSTEM_INFO si;

    memset (&si, 0x00, sizeof (si));
    GetSystemInfo (&si);
    return si.dwAllocationGranularity;
}
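A trivial way to check the value on a given machine, assuming it is compiled in the same file as GetAllocGran() above:

#include <stdio.h>

int main (void)
{
    /* Prints 65536 on the Windows versions discussed in this thread. */
    printf ("Allocation granularity: %lu bytes\n", (unsigned long) GetAllocGran ());
    return 0;
}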

gilg (Author) commented:
dabbler, I need some hard documented proof of that. The only reference I found was a not-so-clear sentence in Richter's "Advanced Windows".
I'd expect the help for CreateFileMapping to carry a large-font warning: 'Each time you call me I take up at least 64K of your swap file.' Wouldn't you?
So where do you get your information from?
Thanks
nietod commented:
From the VC++ online help for GetSystemInfo()
**********************************************************************
dwAllocationGranularity

Specifies the granularity with which virtual memory is allocated. For example, a VirtualAlloc request to allocate 1 byte will reserve an address space of dwAllocationGranularity bytes. This value was hard coded as 64K in the past, but other hardware architectures may require different values.
*********************************************************************
This suggests that the granularity is currently 64K but that it may change in the future (or on different OSes). If you are hoping they will decrease the size in the future, you haven't been paying attention for the last 10 years; if it changes, it probably won't get smaller.
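As a rough way to see that behaviour for yourself (a sketch only): two one-byte reservations land on separate allocation-granularity boundaries, so each one ties up a full 64K slot of address space.

#include <windows.h>
#include <stdio.h>

int main (void)
{
    SYSTEM_INFO si;
    void *a, *b;

    GetSystemInfo (&si);

    /* Reserve 1 byte twice; each reservation starts on its own
       dwAllocationGranularity (64K) boundary. */
    a = VirtualAlloc (NULL, 1, MEM_RESERVE, PAGE_NOACCESS);
    b = VirtualAlloc (NULL, 1, MEM_RESERVE, PAGE_NOACCESS);

    printf ("granularity = %lu\n", (unsigned long) si.dwAllocationGranularity);
    printf ("first  reservation: %p\n", a);
    printf ("second reservation: %p\n", b);   /* at least 64K away from the first */

    VirtualFree (a, 0, MEM_RELEASE);
    VirtualFree (b, 0, MEM_RELEASE);
    return 0;
}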
nietod commented:
Note that 64K is not really all that much these days. Your swap file is at least a meg and probably much more, usually several times the RAM in your machine. Even if you are only on an 8 meg machine (well below the minimum required to run Office 97!), your swap file is probably at least 16 meg. How many mappings are you going to be creating? If you create 100 such mappings, that's about 6.4 meg out of 16 meg, so roughly 40% of that minimal swap file (and far less of a typical one). That is not great, but it is not necessarily terrible.
nietod commented:
Sorry for the typos!  Hopefully you can understand what I said.
dabbler commented:
See nietod's comment for the documentation (thanks, nietod! :) ). I read the same information in Richter's book.
gilg (Author) commented:
And there are no software kits for memory management? I would think someone would have the need.
I'm allocating at least a thousand chunks. Most of them are very small, but I don't want to manage the memory myself because they change often (you know, fragmentation and all that crap).
nietod commented:
I think you need to rethink your design. For 1000 allocations you will be using around 64 meg of virtual memory. That might have a serious impact on performance (although it will probably still work). What is it that you are trying to do?
gilg (Author) commented:
It IS working. The problem is that I already wrote the stuff. This is a DLL that supplies dictionary services. I wanted the dictionaries themselves to reside in shared memory so they can be passed from one process to another just by passing a handle, so I used CreateFileMapping for each new dictionary. Now I'll have to change the whole thing, I guess.
nietod commented:
I don't understand why you would need something like 1000 mappings. You should be able to create a few mappings (potentially just one) and use different regions within them.
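A sketch of that idea (the names, sizes, and fixed-slot layout are all made up): one large pagefile-backed mapping, with each dictionary living at its own offset inside it, so a consumer is handed the shared mapping plus an offset rather than one mapping handle per dictionary.

#include <windows.h>

#define POOL_SIZE  (4 * 1024 * 1024)   /* one 4 MB mapping instead of ~1000 small ones */
#define SLOT_SIZE  (4 * 1024)          /* example: a fixed 4 KB slot per dictionary    */

static HANDLE g_hPool;
static BYTE  *g_pPool;

/* Create (or, if it already exists, open) the single shared pool. */
BOOL InitPool(void)
{
    g_hPool = CreateFileMapping(INVALID_HANDLE_VALUE, NULL, PAGE_READWRITE,
                                0, POOL_SIZE, TEXT("DictionaryPool"));
    if (g_hPool == NULL)
        return FALSE;
    g_pPool = (BYTE *)MapViewOfFile(g_hPool, FILE_MAP_ALL_ACCESS, 0, 0, 0);
    return g_pPool != NULL;
}

/* A dictionary is identified by a slot index; its address in this process
   is simply base + index * SLOT_SIZE. */
void *DictionaryAt(DWORD slot)
{
    return g_pPool + (SIZE_T)slot * SLOT_SIZE;
}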
gilg (Author) commented:
That is what I meant when I asked about a memory management tool. If I create one mapping I must manage it somehow, and I have a lot to think about: defragmentation, reallocation with size changes, and similar curses. I really don't want to confront that mess at this stage of my life (I wrote this kind of stuff in the past, but it was simpler and I was younger and full of energy then).
nietod commented:
You could write a very inefficient (but simple) memory manager and still get a huge memory saving (and therefore a likely performance increase) over your current design. Without knowing more details I can't say much more.
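One way to be inefficient but simple, sketched below (all names are illustrative, and a real version would need a named mutex around it plus some alignment): a bump allocator whose cursor lives in the first bytes of the shared mapping itself, so every process that maps it sees the same state. It never reuses freed space, but even so it is far cheaper than 64K per small dictionary.

#include <windows.h>

typedef struct {
    SIZE_T cursor;   /* next free offset within the mapping */
} POOL_HEADER;

/* pPool points at the mapped view; poolSize is the size of the mapping.
   Returns an offset (stable across processes) or (SIZE_T)-1 when full. */
SIZE_T PoolAlloc(BYTE *pPool, SIZE_T poolSize, SIZE_T bytes)
{
    POOL_HEADER *hdr = (POOL_HEADER *)pPool;
    SIZE_T offset;

    if (hdr->cursor == 0)
        hdr->cursor = sizeof(POOL_HEADER);   /* first use: skip the header */

    offset = hdr->cursor;
    if (offset + bytes > poolSize)
        return (SIZE_T)-1;                   /* pool exhausted */

    hdr->cursor = offset + bytes;
    return offset;
}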
gilg (Author) commented:
Well, I think that sums up my options. I guess I'll change my attitude and get on with it.
Thanks a lot for the good advice.