Solved

Server 2012 memory management

Posted on 2014-03-07
496 Views
Last Modified: 2014-03-12
I have a VM running Server 2012 Standard with 16 GB of RAM allocated to it. Right now it is running DHCP and DNS, it is a DC, and SQL Server 2012 is installed; no other applications are installed. SQL is used for an electronic document storage application named PowerFlow. Based on the vendor's recommendation, the maximum memory for SQL is set at 6 GB. The OS seems to be keeping a large amount of memory in reserve; Task Manager indicates that 96% is in use. This seems to be affecting the performance of PowerFlow, which is very sluggish. I also installed Process Explorer; screen shot attached.
Procexp-scrnsht..docx
Question by:rettif9
17 Comments
 
LVL 7

Expert Comment

by:Lee Ingalls
Rebooting is a sure-fire way, but inconvenient. Are you running a large number of queries?
SQL Server's automatic memory management will request memory from the OS and attempt to keep it as long as it can. SQL is supposed to release it on request from the OS.

Here are two commands:
DBCC DROPCLEANBUFFERS: flushes data pages from memory.
DBCC FREEPROCCACHE: flushes execution plans from the procedure cache.

Reference: http://sqlanddotnetdevelopment.blogspot.com/2012/05/releasing-unused-memory-in-sql-server.html
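A minimal sketch of running the two commands above. Note this is just an illustration: issuing a CHECKPOINT first writes dirty pages to disk so they become clean and droppable, and running these on a busy server will temporarily hurt performance while the caches repopulate.

```sql
-- Write dirty pages to disk so they become clean buffers that can be dropped
CHECKPOINT;
GO
-- Flush data pages from the buffer pool
DBCC DROPCLEANBUFFERS;
GO
-- Flush cached execution plans from the procedure cache
DBCC FREEPROCCACHE;
GO
```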
 
LVL 17

Expert Comment

by:OriNetworks
The default for SQL Server is basically to use as much memory as possible, for performance reasons. This can cause a tug of war when running other services, which is why it's not good to mix roles, such as putting SQL Server on a DC. In the server properties in SQL Server Management Studio you should be able to set a max memory value.
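The same cap can also be applied with T-SQL instead of the Management Studio dialog; a sketch, assuming the 6 GB value discussed in this thread:

```sql
-- Enable advanced options so 'max server memory' is visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Cap SQL Server's memory at 6 GB (the value is in MB)
EXEC sp_configure 'max server memory (MB)', 6144;
RECONFIGURE;
```

The change takes effect immediately; no restart is required.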
 
LVL 7

Expert Comment

by:Lee Ingalls
I believe rettif9 mentioned that he set the maximum memory value to 6 GB based on his PowerFlow vendor's recommendation; that was my interpretation at least. Frankly, I doubled the application vendor's requirements for our SQL.

As OriNetworks suggested, also keep the AD, DC, and DNS roles separate from your SQL server.
 
LVL 7

Author Comment

by:rettif9
I tried rebooting this morning, but memory went right back to maxed out as soon as I checked after rebooting. I'm not a SQL specialist, more of an AD guy. I also read this, but it got over my head: http://blogs.msdn.com/b/sqljourney/archive/2013/11/02/10402729.aspx

@OriNetworks: as I said, the max memory in SQL Management Studio is set to 6 GB. I tried stopping all of the SQL services, but that only freed up about 10-15%; the rest was still held in reserve by the OS. I also tried Computer >> Properties >> Advanced >> Settings >> Adjust for best performance, but it made no difference.

@Lee Ingalls: I don't think you would say a "large" number of queries. They mostly just save scanned documents to the database and occasionally view them. The PowerFlow software engineer said that 6 GB should be more than adequate.

I should also mention that yesterday I was on the phone with the software engineer. He directed me to the SQL memory management window; maximum memory had not been configured, and he told me to set it to 6 GB. At first the application worked very fast (in Task Manager, memory usage was down to about 20%), but later in the day it was back to being sluggish, and Task Manager showed memory usage back up to about 95%. I have verified that the 6 GB setting did save.
 
LVL 10

Expert Comment

by:PadawanDBA
Are you worried about the performance of the PowerFlow application?  If so, what I would be most concerned about is memory pressure from the other roles installed on the system impacting the performance of SQL Server.  My thought would be to take a peek at the hard/soft page fault counts on the system and see if those are high.

In perfmon, you'll be looking at Memory: Page Faults/sec, which counts both hard and soft faults (soft faults are when the page is still on the standby list in physical memory).  Memory: Page Reads/sec will give you hard faults, which is when the VMM (virtual memory manager) has to go to disk to pull the data back into your working set (physical memory).  If you notice that these are exorbitant, specifically a high Memory: Page Faults/sec *and* a lot of Page Reads/sec, then you're looking at memory pressure.

In that case you have a couple of options: relocate the other roles on the server in an attempt to alleviate the memory pressure, increase the amount of memory on the server, or investigate the Lock Pages in Memory option (which prevents the VMM from paging out SQL Server's allocated memory).  The latter sounds fantastic at first glance, but could cause issues with AD and DNS.  If you want more info on any of that, feel free to ask; happy to help!
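As a complement to the perfmon counters above, SQL Server itself exposes the OS-level memory state; a quick sketch using the sys.dm_os_sys_memory DMV (available in SQL Server 2008 and later):

```sql
-- Physical memory totals and whether Windows is signaling memory pressure
SELECT
    total_physical_memory_kb / 1024     AS total_mb,
    available_physical_memory_kb / 1024 AS available_mb,
    system_memory_state_desc            -- e.g. 'Available physical memory is low'
FROM sys.dm_os_sys_memory;
```

If system_memory_state_desc reports low available memory while the faulting counters are high, that corroborates external memory pressure on SQL Server.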
 
LVL 26

Expert Comment

by:Zberteoc
SQL Server has a very simple memory-setting process: right-click the server name/node in Management Studio, and in the window that pops up, choose Memory in the left panel; in the right panel you can specify the maximum memory you want to allocate to it. After you do that, SQL Server will use as much memory as it needs, and it will NEVER give it back to the OS unless its process is set with lower priority, which it should not be. It will, however, stop at the maximum value that was specified.

SQL Server itself doesn't "care" what application connects to it, so the PowerFlow recommendations are based on the vendor's experience and their application's needs, and have nothing to do with SQL itself. As a matter of fact, the recommendation for SQL is AS MUCH memory as possible, with no other processes running on the same box.

In the case where you have a dedicated SQL Server box, the percentage to allocate to the SQL service is somewhere around 80-90%; see here:

http://sqlserverperformance.wordpress.com/2009/10/29/suggested-max-memory-settings-for-sql-server-20052008/

So your problem could be caused by two factors:

1. Insufficient memory for SQL Server, in which case it has to go to disk very often to load pages as they are required. Bad queries or missing indexes could also lengthen SQL's response time.
2. The other processes are filling up the remaining memory, in which case they have to do basically the same thing: swapping pages to the hard drive as virtual memory.

In either case your bottleneck is most likely the disk I/O and the fact that it is overloaded. It would be interesting to monitor the I/O to see what values you get.

The remedy would be to separate SQL onto a different box or, if that is not possible, to install more memory on the current box. Also check whether some queries are slow.
 
LVL 17

Expert Comment

by:OriNetworks
Sorry, it seems I wasn't the only one who missed the already-attempted step of setting the memory limit. I want to point out that the same behavior applies to a domain controller: it will try to use as much memory as possible, which is why mixing the DC role with SQL is a bad idea.

Does the vendor support installing the software on a domain controller? If they don't, maybe you shouldn't, unless a workaround is possible. Also, if you have the ability to move the system page file to a fast dedicated array of hard drives (possibly RAID 0), you may see less of a performance impact.

Another possibility is a memory leak, but that would be a little difficult to pinpoint.

Using Task Manager, you could sort by Memory (Private Working Set) and let us know the top few processes and how much memory they are using.
 
LVL 26

Expert Comment

by:Zberteoc
To be honest, I wonder why somebody would install any software on a DC. I personally have never heard of such a thing.
 
LVL 7

Author Comment

by:rettif9
Just so everyone has the right perspective: this is a small business with about a dozen workstations, and about 8 of the users use this SQL application. This server recently replaced a Server 2003 Standard machine and is the only server on site. The application worked fine on the old server, which was also the only server on site. I am aware that the best practice is to install SQL on a member server rather than a DC, but I've been told by MS support that it usually works without difficulty on a DC. As I said, it worked on the old server, although that was an older version of SQL (2005).

@PadawanDBA: I will attach a screenshot of perfmon displaying the items you mentioned.
@Zberteoc: I also included disk read/write queue length counters; if there is another parameter that would be more useful, let me know. I also included a couple of Task Manager screen shots while I was at it.
Perfmon-scrnsht..docx
 
LVL 7

Author Comment

by:rettif9
@Zberteoc: I'm a bottom feeder; I support very small businesses that can't afford their own full-time IT staff or the luxury of multiple servers.
 
LVL 56

Expert Comment

by:Cliff Galiher
I also support small (even micro) businesses. I am an SBS MVP, and the number of points I have in the SBS category here can attest to my experience. And yet I will tell you I agree with the sentiment here: loading SQL on a DC is just a bad idea. This isn't the 2003 era, before Microsoft took security seriously. How these applications run has changed DRAMATICALLY (just look at UAC as an example of that).

2012 Standard has 1+2 virtualization rights: that is, one VM for a DC and one VM for SQL. And it will STILL perform better than loading it all onto one installed OS, with no change in hardware and no extra expense to the small business.
 
LVL 10

Expert Comment

by:PadawanDBA
I was going to say, let's not beat him up; it doesn't necessarily make sense (from a cost perspective) to separate everything if you're a small company =)  If it's a VM, though, you could definitely split things onto separate VMs with scaled-back resource allocations - just food for thought!  On that screenshot, can you separate the disk queues out so the y-axis labels are specific to the faults?  I suspect they're pretty low, but I can't tell for sure with the scales being so different.  It doesn't look bad on initial perusal, however.  If you tally the processes' memory usage separately and calculate it as a percentage of 16 GB, I suspect you're not at all hitting 93%.  There is some general oddness with the memory counters in VMs.  Can you try looking at the memory-free counters in perfmon?
 
LVL 7

Author Comment

by:rettif9
@Cliff Galiher & Zberteoc: OK, to give some thought to your point: SQL and PowerFlow aren't the only applications the customer needs. They also use QuickBooks, a vertical-market application that requires IIS, a SQL Anywhere application that will be replaced in the near future by a cloud version, and an application that is mostly in the cloud but has some server-based data. This is a collision repair facility that uses software for estimating and for workflow management. If I have a physical host and then one VM for a DC, GC, FSMOs, DHCP, and DNS, that leaves one VM for everything else. Are you opposed to that? What's the best practice for calculating installed memory requirements? I hadn't planned for a 3rd OS, and I'm a little concerned about HDD space.

@PadawanDBA I'll upload some new screen shots for you just a sec.
 
LVL 56

Expert Comment

by:Cliff Galiher
QuickBooks is notoriously evil for stomping on things. Given the updated facts, I'll make a minor revision: add a 2012 Essentials license to the mix. It is relatively low cost, and then you are set to run three VMs: one Essentials and two Standard. Essentials will be your DC, DNS, and DHCP, and can even take light fileserver work, all preconfigured. One VM for your SQL app, since it seems particularly performance sensitive, and one VM for the other stuff, QuickBooks, etc.
 
LVL 7

Author Comment

by:rettif9
@PadawanDBA: I know these random samples may not be representative because they are of short duration, but if you want something longer, just say so.

Just as a fast mental estimate, the memory in use in Task Manager falls far short of 16 GB. This article doesn't apply to 2012, but I found it interesting reading nonetheless: http://rickardnobel.se/windows-2008-r2-memory-counters-in-task-manager/
Perfmon-scrnsht..docx
 
LVL 10

Accepted Solution

by:
PadawanDBA earned 500 total points
I suspect Hyper-V is ballooning the use of your memory to as close to 100% as it deems appropriate (I'm guessing the memory allocation for the VM is static, which is preferred for SQL Server, btw). It does this for memory allocation purposes, I believe. This accounts for the "high" memory utilization and also nicely explains the low faulting. An interesting thing to try would be to set your minimum SQL Server memory to 5 GB to force SQL to claim that memory for its own. Additionally, those write queues are concerning... I would focus on some of the disk perfmon metrics to see whether they correlate more strongly with your performance issues. I am thinking the memory utilization is, in this case, a red herring.
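The minimum-memory suggestion above can be applied in the same Management Studio memory dialog or with T-SQL; a sketch, assuming the 5 GB floor proposed here:

```sql
-- Enable advanced options so the memory settings are visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Set a 5 GB floor; once the buffer pool grows past this,
-- SQL Server will not shrink below it under OS pressure
EXEC sp_configure 'min server memory (MB)', 5120;
RECONFIGURE;
```

Note that the minimum is not pre-allocated at startup; the buffer pool grows into it with use and then holds it.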
 
LVL 7

Author Closing Comment

by:rettif9
Sorry it took so long, but the problem was finally identified. PowerFlow is a document storage application for scanned documents. We finally tracked down the cause of the high memory usage: the scanner software was defaulting to a high resolution, producing very large files. None of you had enough information to effectively troubleshoot the problem, but since PadawanDBA pointed me to look in other directions, I have decided to award the solution to him. My thanks to all for your effort.
