SBS 2011 Server CPU Utilization

First... I have always been under the impression that if CPU utilization runs at a constant 20-30% or higher, you have a problem. I have been told that processes can spike to 80 or even 90%, but as long as they drop back to zero quickly it isn't a problem. So what have others heard: at what constant CPU utilization percentage does it become a problem?

We are having some Excel problems, and all the spreadsheets are on the server. It seems like some sort of speed issue. On the server, the System (NT Kernel & System) process is always bouncing between 5 and 20%, the average being maybe 10%. This is constant. It never goes to zero. Isn't that a problem?
LockDown32 (Owner) asked:
Cliff Galiher commented:
Not really a problem. Especially on SBS, which has a ton of background processes. They add up.
LockDown32 (Owner, Author) commented:
Well... I am going to dwell on the part of the question about constant CPU Utilization... at what point is it a concern?
John (Business Consultant, Owner) commented:
I have a Server 2012 here supporting a couple of dozen users (file and print, and so on) but NOT Exchange (outsourced) or SQL (different server), and CPU is 10%, memory is 15%, and all is well. I agree with Cliff's comments as well.

Run Task Manager or Resource Monitor and see what is using the CPU.

LockDown32 (Owner, Author) commented:
So John, what single process on your 2012 Server is running a constant 10% CPU utilization? I run a 2012 Server too, and nothing is running even close to a constant 10% utilization...
David Needham (Freelance Consultant) commented:
I agree. This could be nothing, but if you're concerned, have a look at your physical memory usage and pagefile. Quite often when you see constant CPU usage, these are involved.
John (Business Consultant, Owner) commented:
Background processes = 53

Agent for Windows
Apache HTTP Server processes
Backup Exec processes
DFS
DNS
File Replication
Host Processes
Java
Logmein
QuickBooks
Symantec
Some SQL
Spooler
DHCP
and a bunch more.

Still 10% an hour later.
LockDown32 (Owner, Author) commented:
On all my servers nothing runs a constant 10%. Even overall server utilization will more times than not drop down to 1%. So I want to ask the first part of the question again, because so far no one has addressed it:

First... I have always been under the impression that if CPU utilization runs at a constant 20-30% or higher, you have a problem. I have been told that processes can spike to 80 or even 90%, but as long as they drop back to zero quickly it isn't a problem. So what have others heard: at what constant CPU utilization percentage does it become a problem?
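For what it's worth, the rule of thumb being debated here (brief spikes are harmless, sustained load is worth investigating) can be sketched as a simple check on average versus peak utilization over a sampling window. This is only an illustration; the 25% and 80% thresholds below come from the figures quoted in this thread, not from any official guidance.

```python
# Sketch: classify a series of per-interval CPU samples (percent) as
# "sustained load" vs. "transient spikes", per the rule of thumb in
# this thread. Thresholds are illustrative, not authoritative.

def classify_cpu(samples, sustained_threshold=25, spike_threshold=80):
    """Return 'sustained', 'spiky', or 'idle' for a list of % samples."""
    if not samples:
        return "idle"
    avg = sum(samples) / len(samples)
    peak = max(samples)
    if avg >= sustained_threshold:
        return "sustained"  # constant 20-30%+ -> worth investigating
    if peak >= spike_threshold:
        return "spiky"      # brief 80-90% spikes that drop right back -> usually fine
    return "idle"

# A process that spikes to 90% but drops straight back down:
print(classify_cpu([2, 1, 90, 85, 3, 1, 2, 0]))   # spiky
# A process pinned between 20 and 40% the whole time:
print(classify_cpu([25, 30, 22, 35, 28, 31]))     # sustained
```

The point of averaging over a window is exactly the distinction the thread keeps circling: a peak alone tells you nothing, but a high average means the CPU never gets back to idle.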
John (Business Consultant, Owner) commented:
I have always been under the impression that if CPU utilization runs at a constant 20-30% or higher, you have a problem.

In my world, yes, I would see this as a problem to investigate. I know what I am running, not what you are running, so we do not know your servers' issues.

I have been told that processes can spike to 80 or even 90%, but as long as they drop back to zero quickly it isn't a problem

That is definitely my experience.
LockDown32 (Owner, Author) commented:
The question was not specific to anyone's particular world. No two computers are alike. I don't really need a list of processes you are running. It was a generic question about CPU utilization and processes in general.
John (Business Consultant, Owner) commented:
It depends on what you are running.

In a bland server or workstation, CPU should run at 10% or less more than 95% of the time.

But that will not mean anything if you (or someone else) is running something different.

So as a "generic" question, there is NO one answer to that.
LockDown32 (Owner, Author) commented:
I believe there is a generic answer. Even if you are running SQL with huge databases, the CPU utilization shouldn't run a constant 60%. Nothing should. If it does, then something is amiss. As mentioned before, it is not unusual for processes to shoot up to 80-90%, but as long as they don't stay there and come right back down it isn't a problem. I asked the same question on SuperUser and they are responding with their rough, generic answers as to when a process might need more attention. No two computers are alike. No two computers run the same thing. You yourself just gave a generic answer, even though it is based on overall CPU utilization. I would prefer a generic answer on a per-process basis, but I can do a rough translation.
David Needham (Freelance Consultant) commented:
Lockdown32: You seem to have disregarded my suggestion. Does this mean that you've looked at what I suggested and neither is involved?
John (Business Consultant, Owner) commented:
I give up. Gone.
LockDown32 (Owner, Author) commented:
I haven't disregarded it. Just haven't had time to look at it. What exactly am I looking for?
David Needham (Freelance Consultant) commented:
High physical memory usage and constant paging.
LockDown32 (Owner, Author) commented:
Well... Task Manager says total 16106 MB, cached 486, available 480, and free 4. If you go into Resource Monitor, Used Physical Memory has to be at about the 95% mark. Exchange is by far the biggest hog at about 9 GB.

Page file is Automatically Managed: 16 GB on drive C: and none on drive D:. D: is the data drive with a ton of free space. I am running out of space on C: :)
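For what it's worth, the Task Manager figures quoted above do work out to roughly the 95% mark seen in Resource Monitor. A quick back-of-the-envelope check using the thread's own numbers (Windows' "available" figure already includes most of the cache it can reclaim, so total minus available is a reasonable rough estimate of used memory):

```python
# Back-of-the-envelope memory pressure from the Task Manager figures
# quoted above (all values in MB).

total_mb     = 16106
available_mb = 480

used_mb  = total_mb - available_mb
used_pct = 100 * used_mb / total_mb
print(f"Used: {used_mb} MB ({used_pct:.0f}%)")   # Used: 15626 MB (97%)
```

At ~97% used, the box is right at the point where any additional demand gets served from the pagefile instead of RAM, which is what David's suggestion is probing for.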
David Needham (Freelance Consultant) commented:
Then, as long as there is no reason not to, I would move your pagefile to the other partition.

Also, if it's an issue, there is a hack to limit the amount of RAM that the Information Store can use. This isn't recommended though.
LockDown32 (Owner, Author) commented:
OK. So since physical memory usage is pretty close to 100%, does that represent a problem?

Even when you look at the page file, it says recommended 24 GB, yet only 16 GB is allocated. Should I just make it all "System Managed" on D: (and D: only), or go custom with the recommended 24159 MB as both initial and maximum? What would you recommend?
David Needham (Freelance Consultant) commented:
Yes.  100% usage can cause such issues.

If it were me I would manually define the pagefile on D: as you suggest at 24,576MB.
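As a sanity check, both pagefile numbers in play here fall out of the old 1.5x-RAM rule of thumb that Windows' automatic sizing loosely follows: 1.5x the 16,106 MB Task Manager actually reports gives exactly the 24,159 MB recommendation quoted above, and 1.5x a nominal 16 GB gives the 24,576 MB figure. The multiplier is a convention, not a hard requirement:

```python
# Sanity check on the pagefile numbers in this thread using the
# classic 1.5x-RAM rule of thumb (a convention, not a hard rule).

reported_ram_mb = 16106              # RAM as Task Manager reports it
nominal_ram_mb  = 16 * 1024          # 16 GB nominal

print(int(reported_ram_mb * 1.5))    # 24159 -- Windows' "recommended" figure
print(int(nominal_ram_mb * 1.5))     # 24576 -- the manually suggested size
```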
David Needham (Freelance Consultant) commented:
It might also be worth considering bringing your physical memory up to 24 GB, if you have the budget.
LockDown32 (Owner, Author) commented:
Thanks David. I think you are on the right track. I read an article about limiting the memory SQL uses and limited it. It took the physical memory usage down to 88% and immediately the overall CPU% dropped to 1-2% and hasn't gone above it. I'll manually define the page file tonight after hours and then quote them on 24GB. Let you know what happens in the morning.
David Needham (Freelance Consultant) commented:
No problem.  I know that SBS can be a pain sometimes.
LockDown32 (Owner, Author) commented:
That did the trick, David. The server is old enough that it probably isn't worth putting the money into memory. I will discuss it with the customer. In the short term I did the Exchange hack. It took memory usage down to 52% and everything is running great. The System process that started this question, which was constantly between 5 and 20%, completely fell off the radar, and average CPU utilization is 1-2%. I am sure the Excel issue has probably gone away too. Much appreciated.
David Needham (Freelance Consultant) commented:
I'm glad that it's sorted! :)
Lee W, MVP (Technology and Business Process Advisor) commented:
FYI, so long as you're not hitting 100% and staying there, it's not a problem. 30% usage means 70% of the time the CPUs are sitting IDLE and doing NOTHING. WASTING AWAY. This is the whole point of virtualization - recapture the idle CPU time for other servers... instead of one server running at 10, 20, even 30%, put THREE on one piece of hardware and let it sit at 90%! (well, the general rule is 80% or more and you want to think about upgrading CPUs or the server. USUALLY people's performance problems are related to RAM or disk, and sometimes network. CPU is rarely a problem except where CPU is actually needed.)
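Lee's consolidation argument is easy to sanity-check with arithmetic: you stack guest workloads on one host until the combined load approaches (but stays under) the ~80% planning ceiling he mentions. A sketch of that reasoning; the per-guest utilization figures are made up for illustration:

```python
# Sketch of the virtualization point above: idle CPU on separate hosts
# is wasted capacity, so you stack guests until the combined load nears
# a planning ceiling (~80% here, per the rule of thumb in this thread).
# Guest utilization numbers are illustrative only.

CEILING = 80  # percent; beyond this, think about more CPU or hosts

def guests_that_fit(guest_loads):
    """Greedily place guests (percent CPU each) while staying under the ceiling."""
    total, placed = 0, []
    for load in guest_loads:
        if total + load <= CEILING:
            total += load
            placed.append(load)
    return placed, total

placed, total = guests_that_fit([30, 20, 10, 25, 15])
print(placed, total)   # [30, 20, 10, 15] 75 -- four lightly loaded servers on one box
```

Real capacity planners weigh RAM, disk, and network alongside CPU, exactly as Lee notes; this only illustrates the CPU half of the argument.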
LockDown32 (Owner, Author) commented:
I hate to do this, Lee, but you are saying that a server running a constant 60-90% CPU utilization is running fine? Even John Hurst took issue with that. I have always been told that if you have a constant CPU utilization of 20-40%, you have a problem. That is what I have seen. If a server is running a constant 30% CPU utilization, speed and performance are severely impacted. You are saying that is OK?
Lee W, MVP (Technology and Business Process Advisor) commented:
If you read what I said:
(well, the general rule is 80% or more

I have always been told
By who?

It really looks like you're picking and choosing the words you want to read.
Quoting John:
It depends on what you are running.

In a bland server or workstation, CPU should run at 10% or less more than 95% of the time.

But that will not mean anything if you (or someone else) is running something different.


I agree - in a BLAND server or workstation that does NOTHING, usage should generally be low on any recent hardware and OS combination.

SBS is far from a BLAND server - Exchange, File Services, AD, Web services, WSUS, and the dozens of services supporting those capabilities and basic needs of Windows, it's FAR from a BLAND server.
Cliff Galiher commented:
Apples and oranges. A single OS at a constant 30% *usually* (not always) has another problem. Take YOUR situation. Your problem wasn't CPU. It was memory. Yes, high swapping caused your CPU to be slightly elevated, but that was neither a cause nor an actual symptom (if you believe otherwise then you are ignoring facts and can stop reading now).

But in a virtualized environment, you are intentionally stacking many OS installs. Just as I said above about SBS, those background processes stack up. Running high CPU is an established practice in virtualized environments. As long as you aren't hitting other bottlenecks, you keep adding OS workloads until something bottlenecks. You reduce TCO and increase ROI. That's been stock-standard with virtualization for over a decade.
Cliff Galiher commented:
Basically I am saying Lee is right and you cherry-picked your responses.
LockDown32 (Owner, Author) commented:
I am not cherry picking responses. You aren't answering the questions. 30% is 30%. Even if it is a result of something else the 30% is still a problem.  From what you are saying there is no CPU% Utilization that is a problem. If it is less than 100% it is not a problem. 99% is not a problem. I don't think so. Do you? Really?
Cliff Galiher commented:
Did I say 99%? Yes you cherry pick responses and flip things around to hear what you want. John Hurst, in this question, bailed. You were rude and I don't blame him. You ignored Lee's thoughtful explanation. Generally you and I never got along in any question. So I'm used to it.

But when you actively ignore and anger multiple highly ranked and regarded experts here, not one, not two, but three...all with a proven track record and expertise....maybe...just maybe...it is time to reassess your position. Just saying.

I'm out too.
Lee W, MVP (Technology and Business Process Advisor) commented:
That's not what I'm saying.  For the THIRD time:
(well, the general rule is 80% or more

Further, the KIND of CPU usage also matters. SETI@home (and other similar programs) will make your CPU run at 100% - but that isn't significantly slowing you down, because it runs at very low priority so ANYTHING ELSE can use the CPU.

I feel I've given useful information to those who want to read.  

To quote John Hurst:
I give up. Gone
LockDown32 (Owner, Author) commented:
Sorry, Lee. It was a simple question. Didn't mean to confuse you. This was your comment, not mine: "FYI, so long as you're not hitting 100% and staying there, it's not a problem." How many ways can you interpret that?