Mem.Allocated & Mem.Allocated.Shared stay at roughly 40% of Physical RAM

Environment: Windows 2000 Server

My users have been complaining about performance, and I noticed that all of my servers, including production (500+ users, 2 GB RAM) and development (10-15 users, 1 GB RAM), show close to 40% of physical RAM being allocated.

Development server (1 GB RAM), LN 6.5 server:
>sh stat mem
  Mem.Allocated = 422841130  <<<<<<<<< 39.4 % of 1072693248 (Physical RAM)
  Mem.Allocated.Process = 16019496
  Mem.Allocated.Shared = 406821634
  Mem.Availability = Plentiful
  Mem.Free = 2,145,120,256
  Mem.PhysicalRAM = 1072693248

Production server (2 GB RAM), LN 5.0.10 server:
>sh stat mem
  Mem.Allocated = 866,651,526      <<<<<<<<< 40.4 % of  2,147,483,600 (Physical RAM)
  Mem.Allocated.Process = 16,727,346
  Mem.Allocated.Shared = 883,378,872
  Mem.Availability = Plentiful
  Mem.Free = 5,840,863,231
  Mem.PhysicalRAM = 2,147,483,600

To experiment in the development environment, I tried PercentAvailSysResources=75, and it changed the values:
>sh stat mem
  Mem.Allocated = 321703016     <<<<<<<<< 40 % approx of  804257792 (Physical RAM)
  Mem.Allocated.Process = 16014950
  Mem.Allocated.Shared = 305688066
  Mem.Availability = Plentiful
  Mem.Free = 2823741440
  Mem.PhysicalRAM = 804257792

I also tried values of 80 and 90 percent, and the figures changed proportionally.
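
For reference, the experiment above amounts to a one-line notes.ini change followed by a server restart. A minimal sketch (the arrow annotations mirror the console-output style above; notes.ini itself has no comment syntax):

  PercentAvailSysResources=75   <<< percentage of installed RAM Domino should treat as available

  >sh stat mem
  Mem.PhysicalRAM = 804257792   <<< roughly 75% of the 1072693248 bytes actually installed

As a sanity check, 1072693248 * 0.75 = 804519936, within a few hundred KB of the 804257792 reported above, so the setting only rescales the RAM figure Domino works from; Mem.Allocated then sits near 40% of that rescaled figure.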

Now, my question is: why have I never seen Mem.Allocated or Mem.Allocated.Shared go above 40%, even on a busy day on the production server? I have read that Notes allocates its memory itself, but ...

Is there anything I can do to allocate more memory to the Notes server?

It seems like the Notes server processes have never utilized even half of the total physical memory. Can I do something to tweak the allocated memory, given that Notes on Windows can use up to 2 GB max?
navgup asked:
 
qwaletee commented:
Sorry for getting back so late. 92 is low; it should be in the 97-or-better range. You may have to play with NSF_Buffer_Pool_Size. Domino normally handles this pretty well by itself, unless you have partitioned servers.
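
To make "play with NSF_Buffer_Pool_Size" concrete, a hedged sketch of the notes.ini line involved. The 512 MB figure is purely illustrative and not a recommendation from this thread (the author's own experiment below uses 819 MB); the _MB variant of the parameter is the spelling used later in the thread:

  NSF_Buffer_Pool_Size_MB=512   <<< size the NSF buffer pool explicitly, in megabytes

A larger pool generally pushes Database.BufferPool.PercentReadsInBuffer toward the 97%+ range mentioned here, at the cost of memory left for everything else on the box.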
 
brwwiggins (IT Manager) commented:
I don't think I've ever seen a way to make Domino take up more memory. I usually only see methods to reduce the amount of memory available to Domino.

I've seen some people use ConstrainedSHM=1 in R6, which will force a 2 GB limit on Windows, but I don't think that will force the server to use more of that value.
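
For completeness, a minimal sketch of that R6-era setting as described above; as noted, it acts as a ceiling on shared memory rather than a target, so it would not by itself push Mem.Allocated above the ~40% being observed:

  ConstrainedSHM=1   <<< constrain Domino shared memory to the 2 GB Windows limit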
 
qwaletee commented:
Are you running partitioned servers?

Anyway, you can't convince Domino to use memory that it does not need. On my unloaded dev server it is currently using 20%; on my (not unreasonably) loaded prod server, I see 44%.
 
navgup (author) commented:
qwaletee: I have the same situation, except for one server that gets heavily loaded during peak hours, and even on that box I have not seen memory use go above 40%. That is why I started wondering when my users began complaining about response time.

Now, when I experimented with a few things in the dev environment (1 GB of RAM), I found that if I leave everything at the defaults, memory utilization never goes above 40%. But when I set NSF_Buffer_Pool_Size_MB=819 (80% of 1 GB) and started loading the server with compact, updall, fixup, and many users logging in to use resources, I found Mem.Allocated went up to 910334162, which was higher than 80% of physical RAM (what I specified in NSF_Buffer_Pool_Size_MB).

I jumped with joy, seeing that I am now utilizing the previously unused resources on the box. But one thing is still unknown: what IMPACT might this have when I push it to production with 500+ users? Should I do it or not? What are the risks?

My production environments are moving to ND6 soon, and Lotus suggests not using NSF_Buffer_Pool_Size_MB but instead PercentAvailSysResources, which did not work in this case. Any ideas/suggestions?
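
To summarize the two tuning routes being weighed here as configuration, a hedged sketch using only values already tried in this thread (alternatives, not meant to be combined, and not recommendations):

  NSF_Buffer_Pool_Size_MB=819    <<< R5-style: hard-size the buffer pool (80% of the 1 GB dev box)
  PercentAvailSysResources=80    <<< ND6-style: derive pool sizes from a percentage of RAM

Per the experiments above, the first pushed Mem.Allocated past 80% of physical RAM under load, while the second only rescaled Mem.PhysicalRAM and left allocation near 40% of it, which is exactly the discrepancy being asked about.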
 
qwaletee commented:
I have always played this by ear. When the percentage of reads from the buffer pool starts dropping below the high 90s, that's usually when there were issues.
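
A quick way to watch that figure, assuming the console accepts a statistic-name prefix the same way "sh stat mem" does above:

  >sh stat database.bufferpool
  Database.BufferPool.PercentReadsInBuffer = ...   <<< comfort zone is the high 90s per this comment
  Database.BufferPool.Maximum.Megabytes = ...      <<< the configured pool ceiling

The values the author posts below come from this same family of statistics.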
 
navgup (author) commented:
In my case, production server 1 shows:
Database.BufferPool.PercentReadsInBuffer = 92.26%,
Database.Dbcache.CurrentEntries = 17
Database.Dbcache.HighWaterMark = 166
Database.DbCache.Hits = 277596
Database.DbCache.InitialDbOpens = 497932
Database.DbCache.Lookups = 498124
Database.DbCache.MaxEntries = 2241
Database.DbCache.OvercrowdingRejections = 0
Database.BufferPool.Maximum.MegaBytes = 747
Database.BufferPool.MM.Reads = 1460
Database.BufferPool.MM.Writes = 33
Database.BufferPool.Peak.Megabytes = 746

My percent reads from the buffer pool is between 92-94%, as you can see. Do you think that might be a problem?