
Solved

Server 2012 memory management

Posted on 2014-03-07
17
Medium Priority
553 Views
Last Modified: 2014-03-12
I have a VM installed on a 2012 Standard server. It has 16 GB of RAM allocated to it. Right now it is running DHCP and DNS, it is a DC, and SQL 2012 is installed. No other applications are installed. SQL is used for an electronic document storage application named PowerFlow. Based on their recommendation, the maximum memory for SQL is set at 6 GB. The OS seems to be keeping a large amount of memory in reserve. Task Manager indicates that 96% is in use. This seems to be affecting the performance of PowerFlow. It is very sluggish. I also installed Process Explorer; screenshot attached.
Procexp-scrnsht..docx
Question by:rettif9
17 Comments
 
LVL 9

Expert Comment

by:Lee Ingalls
ID: 39912407
Rebooting is a sure-fire way, but inconvenient. Are you running a large number of queries?
SQL Server's automatic memory management will request memory from the OS and attempt to keep it as long as it can. SQL is supposed to release it when the OS requests it back.

Here are two commands:
DBCC DROPCLEANBUFFERS: flushes data pages from memory.
DBCC FREEPROCCACHE: flushes execution plans from the procedure cache.

Reference: http://sqlanddotnetdevelopment.blogspot.com/2012/05/releasing-unused-memory-in-sql-server.html
 
LVL 17

Expert Comment

by:OriNetworks
ID: 39912533
The default for SQL Server is basically to use as much memory as possible, for performance reasons. This can cause a tug of war when running other services, which is why it's not good to mix roles, such as SQL Server on a DC, etc. In the SQL Server properties in SQL Server Management Studio you should be able to set a max memory value.
 
LVL 9

Expert Comment

by:Lee Ingalls
ID: 39912595
I believe rettif9 mentioned that he set the maximum memory value to 6 GB based on his PowerFlow vendor's recommendation. That was my interpretation at least. Frankly, I doubled the application vendor's requirements for our SQL.

As OriNetworks suggested, also keep the AD, DC, and DNS roles separate from your SQL server.
 
LVL 7

Author Comment

by:rettif9
ID: 39912635
I tried rebooting this morning, but memory went right back to maxed out as soon as I checked after rebooting. I'm not a SQL specialist, more of an AD guy. I also read this, but it went over my head: http://blogs.msdn.com/b/sqljourney/archive/2013/11/02/10402729.aspx @OriNetworks, as I said, the max memory in SQL Management Studio is set to 6 GB. I tried stopping all of the SQL services, but that only freed up about 10-15%; the rest was still held in reserve by the OS. I also tried Computer >> Properties >> Advanced >> Settings >> Adjust for best performance, but it made no difference. @Lee Ingalls, I don't think you would say a "large" number of queries. They mostly just save scanned documents to the database and occasionally view them. The PowerFlow software engineer said that 6 GB should be more than adequate.

I should also mention that yesterday I was on the phone with the software engineer. He directed me to the SQL memory management window. Maximum had not been configured, so he told me to set it to 6 GB. At first the application worked very fast (in Task Manager, memory usage was down to about 20%), but later in the day it was back to being sluggish, and Task Manager showed memory usage back up to about 95%. I have verified that the 6 GB setting did save.
 
LVL 10

Expert Comment

by:PadawanDBA
ID: 39912751
Are you worried about the performance of the PowerFlow application?  If so, what I would be most concerned about is memory pressure from the other roles that are installed on the system impacting the performance of SQL Server.  My thought would be to take a peek at the hard/soft page fault counts on the system and see if those are high.  In perfmon, you'll be looking at the Memory: Page faults/sec, which is both hard and soft faults (soft faults are when the page is still on the standby list in physical memory).  Memory: Page reads/sec will give you hard faults, which is when the VMM (virtual memory manager) has to go to disk to pull the data back into your working set (physical memory).  If you notice that this is exorbitant, specifically if you have a high amount of Memory: Page faults/sec *and* a lot of Page reads/sec, then you're looking at memory pressure.  In that case you have a couple options: relocate the other roles on the server in an attempt to alleviate the memory pressure, increase the amount of memory on the server, or investigate the Lock Pages in Memory option (which prevents the VMM from paging out SQL Server's allocated memory).  The latter sounds fantastic upon first glance, but could cause issues with AD and DNS.  If you want any more info on any of that feel free to ask, happy to help!
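PadawanDBA's test above amounts to a simple rule: high total faulting is only worrying when a lot of it is *hard* faulting (Page reads/sec, i.e. trips to disk). A minimal sketch in Python; the threshold values are purely hypothetical placeholders, and in practice you would feed in sustained perfmon counter samples, not single readings:

```python
def memory_pressure(page_faults_per_sec: float, page_reads_per_sec: float,
                    fault_threshold: float = 1000.0,
                    hard_fault_threshold: float = 100.0) -> bool:
    """Rough heuristic from the comment above: a high total fault rate
    *and* a high hard-fault (disk read) rate together suggest memory
    pressure. Soft faults alone (pages still on the standby list in
    physical memory) are cheap. Thresholds here are hypothetical."""
    high_total = page_faults_per_sec > fault_threshold
    high_hard = page_reads_per_sec > hard_fault_threshold
    return high_total and high_hard

print(memory_pressure(5000, 10))   # mostly soft faults -> False
print(memory_pressure(5000, 500))  # many hard faults too -> True
```

The point of the sketch is just the *and*: lots of soft faults with few page reads, as in the first call, is normal behavior rather than pressure.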
 
LVL 27

Expert Comment

by:Zberteoc
ID: 39912752
SQL Server has a very simple memory setting process: right-click the server name/node in Management Studio, and in the window that pops up choose Memory in the left panel; in the right panel you specify the maximum value you want to allocate to it. After you do that, SQL Server will use as much memory as it needs, and it will NEVER give it back to the OS unless its process is set with a lower priority, which it should not be. It will stop, however, at the maximum value that was specified.

SQL Server itself doesn't "care" what application connects to it, so the PowerFlow recommendations are based on the vendor's experience and the application's needs, but they have nothing to do with SQL itself. As a matter of fact, SQL's recommendation is for AS MUCH memory as possible, and to make sure you don't have other processes running on the same box.

In the case where you have a dedicated SQL Server box, the percentage to be allocated to the SQL service is somewhere around 80-90%; see here:

http://sqlserverperformance.wordpress.com/2009/10/29/suggested-max-memory-settings-for-sql-server-20052008/
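As a rough illustration of that guidance, here is one common rule of thumb for sizing max server memory on a *dedicated* box: reserve about 1 GB for the OS, plus 1 GB per 4 GB of RAM up to 16 GB, plus 1 GB per additional 8 GB, and give SQL the rest. This is a sketch of a general heuristic, not the linked article's exact table:

```python
def suggested_max_server_memory_gb(total_ram_gb: int) -> int:
    """Hypothetical sizing heuristic for a dedicated SQL Server box:
    reserve ~1 GB for the OS, 1 GB per 4 GB of RAM up to 16 GB,
    and 1 GB per 8 GB of RAM above 16 GB; SQL gets the remainder.
    The reserved share shrinks as RAM grows, approaching 80-90% for SQL."""
    reserve = 1                                  # base OS reserve
    reserve += min(total_ram_gb, 16) // 4        # 1 GB per 4 GB, up to 16 GB
    if total_ram_gb > 16:
        reserve += (total_ram_gb - 16) // 8      # 1 GB per 8 GB above 16 GB
    return total_ram_gb - reserve

print(suggested_max_server_memory_gb(16))  # 16 - (1 + 4) = 11
print(suggested_max_server_memory_gb(32))  # 32 - (1 + 4 + 2) = 25
```

On rettif9's box, which also carries the DC/DNS/DHCP roles, you would reserve considerably more than this, which is presumably why the vendor suggested only 6 GB of the 16.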

So your problem could be caused by 2 factors:

1. Insufficient memory for SQL Server, in which case it has to go to disk very often to load pages as they are required. Some bad queries or missing indexes could also slow SQL's response times.
2. The other processes are filling up the remaining memory, in which case they have to do basically the same thing: swapping pages to the hard drive as virtual memory.

In either case your bottleneck is most likely the disk I/O and the fact that it is overloaded. It would be interesting to monitor the I/O to see what values you get.

The remedy should be to separate SQL onto a different box or, if that is not possible, to install more memory in the current box. Also check whether there are some queries that are slow.
 
LVL 17

Expert Comment

by:OriNetworks
ID: 39912789
Sorry, it seems I wasn't the only one who missed the already-attempted action of setting the memory limit. I want to point out that as a domain controller the same behavior applies: it will try to use as much memory as possible, which is why mixing the DC role with SQL is a bad idea.

Does the vendor support installing the software on a domain controller? If they do, maybe they shouldn't, unless a workaround is possible. Also, if you have the ability to move the system page file to a fast dedicated array of hard drives (possibly RAID 0), you may have less of a performance impact.

Another option might be a possible memory leak, but that would be a little difficult to pinpoint.

Using Task Manager you could sort by Memory (Private Working Set) and let us know the top few processes and how much they are using.
 
LVL 27

Expert Comment

by:Zberteoc
ID: 39912912
To be honest, I wonder why somebody would install any software on a DC. I personally have never heard of such a thing.
 
LVL 7

Author Comment

by:rettif9
ID: 39912918
Just so everyone has the right perspective: this is a small business with about a dozen workstations. About 8 of the users are using this SQL application. This server recently replaced a Server 2003 Standard and is the only server on site. The application worked fine on the old server, which was also the only server on site. I am aware that the best practice is to install SQL on a member server rather than a DC, but I've been told by MS support that it usually works without difficulty on a DC. As I said, it worked on the old server, although that was an older version of SQL (2005).

@PadawanDBA I will attach a screenshot of perfmon displaying the items you mentioned.
@Zberteoc I also included disk read/write queue length counters; if there is another parameter that would be more useful, let me know. I also included a couple of Task Manager screenshots while I was at it.
Perfmon-scrnsht..docx
 
LVL 7

Author Comment

by:rettif9
ID: 39912938
@Zberteoc I'm a bottom feeder; I support very small businesses that can't afford their own full-time IT staff or the luxury of multiple servers.
 
LVL 59

Expert Comment

by:Cliff Galiher
ID: 39912943
I also support small (even micro) businesses. I am an SBS MVP and the number of points I have in the SBS category here can attest to my experience. And yet I will tell you I agree with the sentiment here. Loading SQL on a DC is just a bad idea. This isn't the 2003 era before Microsoft took security seriously. How these applications run has changed DRAMATICALLY (just look at UAC as an example of that.)

2012 Standard has 1+2 virtualization rights. That is one VM for a DC and one VM for SQL. And it will STILL perform better than loading it all on one installed OS with no change in hardware and no extra expense to the small business.
 
LVL 10

Expert Comment

by:PadawanDBA
ID: 39912955
I was going to say, let's not beat him up; it doesn't necessarily make sense to separate everything (from a cost perspective) if you're a small company =)  If it's a VM, though, you could definitely split them onto separate VMs with scaled-back resource allocations; just food for thought!  On that screenshot, can you separate the disk queues out to get the y-axis labels specific to the faults? I suspect they're pretty low, but I can't tell for sure with the scales being so different. It doesn't look bad from an initial perusal, however. If you tally the process memory totals separately and calculate them as a percentage of 16 GB, I suspect that you're not at all hitting 93%. There is some general oddness with the memory counters in VMs. Can you try looking at the memory-free counters in perfmon?
 
LVL 7

Author Comment

by:rettif9
ID: 39913046
@Cliff Galiher & Zberteoc OK, to give some thought to your point: SQL and PowerFlow aren't the only applications the customer needs. They also use QuickBooks, a vertical-market application that requires IIS, a SQL Anywhere application that will be replaced in the near future by a cloud version, and an application that is mostly in the cloud but has some server-based data. This is a collision repair facility that uses software for estimating and for workflow management. If I have a physical host and then one VM for a DC, GC, FSMOs, DHCP, and DNS, that leaves one VM for everything else. Are you opposed to that? What's the best practice for calculating installed memory requirements? I hadn't planned for a 3rd OS, and I'm a little concerned about HDD space.

@PadawanDBA I'll upload some new screenshots for you, just a sec.
 
LVL 59

Expert Comment

by:Cliff Galiher
ID: 39913058
QuickBooks is notoriously evil about stomping on things. Given the updated facts, I'll make a minor revision: add a 2012 Essentials license to the mix. It is relatively low cost, and then you are set to run three VMs: one Essentials and two Standard. Essentials will be your DC, DNS, and DHCP, and can even take light file-server work, all preconfigured. One VM for your SQL app, since it seems particularly performance-sensitive, and one VM for the other stuff, QuickBooks, etc.
 
LVL 7

Author Comment

by:rettif9
ID: 39913125
@PadwanDBA I know these random samples may not be representative because they are of short duration but if you want something longer just say so.

A quick mental tally of the memory in use in Task Manager falls far short of 16 GB. This article doesn't apply to 2012, but I found it interesting reading nonetheless: http://rickardnobel.se/windows-2008-r2-memory-counters-in-task-manager/
Perfmon-scrnsht..docx
 
LVL 10

Accepted Solution

by:
PadawanDBA earned 2000 total points
ID: 39913289
I suspect it's ballooning your memory usage to as close to 100% as Hyper-V deems appropriate (I'm guessing the memory allocation for the VM is static, which is preferred for SQL Server, by the way). It does this for memory allocation purposes, I believe. This accounts for the "high" memory utilization and also nicely explains the low faulting. An interesting thing to try would be to set your minimum SQL Server memory to 5 GB to force SQL to claim that memory for its own. Additionally, those write queues are concerning... I would focus on some of the disk perfmon metrics to see if they have more of a correlation with your performance issues. I am thinking the memory utilization is, in this case, a red herring.
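For reference, the minimum/maximum memory settings discussed in this thread can also be set with sp_configure rather than through the SSMS Memory dialog. A sketch, assuming the 5 GB minimum and 6 GB maximum mentioned above (values are in MB):

```sql
-- Pin SQL Server's memory between 5 GB and 6 GB; these are the same
-- settings as the Memory page of the server properties dialog in SSMS.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 5120;
EXEC sp_configure 'max server memory (MB)', 6144;
RECONFIGURE;
```

Setting the minimum this way forces SQL Server to hold on to that memory once it has acquired it, which is the experiment PadawanDBA proposes.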
 
LVL 7

Author Closing Comment

by:rettif9
ID: 39924083
Sorry it took so long, but the problem was finally identified. PowerFlow is a document storage application for scanned documents. We finally tracked down the cause of the high memory usage: the scanner software was defaulting to a high-resolution setting that produced very large files. None of you had enough information to effectively troubleshoot the problem, but since PadawanDBA pointed me to look in other directions, I have decided to award the solution to him. My thanks to all for your effort.
