Windows Server 2008 memory use - how does it work?

I have an HP DL380 server (July 2009) with 4GB of RAM and two 2.27 GHz quad-core Xeon processors, running Windows Server 2008 R2 Standard 64-bit.  It is a basic file server hosting an electronic medical records package that uses Ctree to access the databases stored on the server.  At any given time, a single thick client running the electronic medical records software can have 400 files open.  There are usually 35 clients online, meaning at least 14,000 files are open and in use by thick clients of the server.

The problem I am running into is that I cannot figure out where all of the RAM is going.  I noticed today that the server was really slow while I was administering it over Terminal Services.  I went to the Performance tab in Task Manager and saw that RAM usage was at 100%.  I have attached a screenshot of the Memory tab from Resource Monitor.

If I add up all of the processes, I only come up with about 1.6GB of RAM in use.  Where is the rest of the RAM being allocated, and how can I tell?  I know the OS itself uses some, but I would like some way of seeing the exact allocation.  I am going to add RAM to the server anyway, but I want to understand what is happening here.

Thanks for any insight.

P.S. I wanted to mention that in the screenshot, I had already freed up 275MB of RAM by addressing an issue with DNS consuming a ton of memory.  Before that, Available was always at 0.  I still wonder where all my RAM is, though.  :)
Asked by Steve Bantz, IT Manager
In general, Windows 2008/Vista and above will use all available RAM for the file cache, which is exactly what you want on a file server. If your server also holds a database in memory, that needs access to RAM as well. The idea is that the RAM might as well be doing something useful instead of sitting idle as "free". Look at the following thread. The links to articles and blog posts are particularly useful.
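To see why the per-process numbers do not add up to the total, it helps to do the accounting the other way around: whatever is neither in a process working set nor truly free is held by the kernel (standby/file cache, paged and nonpaged pools, drivers). A minimal sketch with round numbers similar to those in the question (the exact figures are hypothetical; on a real box you would read them from Resource Monitor or Performance Monitor):

```python
# Rough accounting for the "missing" RAM on a file server.
# All figures are illustrative, not measured values.

total_mb        = 4096   # installed RAM
working_sets_mb = 1600   # sum of process working sets from Task Manager
free_mb         = 50     # truly free (unused) RAM

# RAM that is neither in a process working set nor free is being used
# by the kernel: the file/standby cache, paged and nonpaged pools, etc.
cache_and_kernel_mb = total_mb - working_sets_mb - free_mb
print(f"Cache + kernel structures: ~{cache_and_kernel_mb} MB")  # ~2446 MB
```

That unattributed ~2.4GB is the memory Task Manager's process list will never show you directly; tools like Resource Monitor (Standby/Modified) break it down further.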
By default, SQL Server will allocate just about all available memory for its own cache. Most of this is not shown in the process working set. If SQL Server is the only thing running on the server, this is a good thing. If it causes a problem for other applications, this usage can be reduced.
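If you did need to rein SQL Server in, the standard way is the `max server memory (MB)` setting. A sketch using `sqlcmd` against a local instance (the server name and the 2048 MB cap are placeholders; pick a value that leaves enough RAM for the OS and other workloads):

```shell
# Cap SQL Server's buffer pool at 2048 MB (example value).
# -S . = local default instance, -E = Windows authentication.
sqlcmd -S . -E -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'max server memory (MB)', 2048; RECONFIGURE;"
```

Note this applies only if SQL Server is actually in the picture; a Ctree-based package manages its own cache differently.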

The concept of putting all of the system's memory to work is not new to Vista. The same principle goes right back to NT 3.1; Vista and later are just more efficient at it, and Task Manager in XP and earlier tended to hide it.