LockDown32 (United States of America) asked:

SBS 2011 Server CPU Utilization

First... I have always been under the impression that if CPU utilization runs at a constant 20-30% or higher, you have a problem. I have been told that processes can spike to 80 or even 90%, but as long as they drop back to zero quickly it isn't a problem. So what have others heard: at what percentage does constant CPU utilization become a problem?

We are having some Excel problems, and all the spreadsheets are on the server. It feels like some sort of speed issue. On the server, the System (NT Kernel & System) process is always bouncing between 5 and 20%, averaging maybe 10%. This is constant. It never drops to zero. Isn't that a problem?
Tags: SBS
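The spike-versus-sustained distinction the question draws can be sketched as a simple check over sampled utilization figures (a minimal, hypothetical illustration; the samples themselves would come from Task Manager, Performance Monitor, or a similar tool, and the threshold values are just the rules of thumb discussed in this thread):

```python
def sustained_load(samples, threshold=20.0, min_fraction=0.9):
    """Return True when at least `min_fraction` of the sampled CPU
    readings sit at or above `threshold` percent, i.e. sustained load.
    Brief spikes that fall straight back to idle will not trip it."""
    if not samples:
        return False
    high = sum(1 for s in samples if s >= threshold)
    return high / len(samples) >= min_fraction

# A process that spikes to 80-90% but returns to idle: not a problem.
spiky = [1, 2, 90, 3, 1, 85, 2, 1, 1, 2]
# A process parked at a constant 20-30%: worth investigating.
steady = [25, 30, 22, 28, 26, 31, 24, 27, 29, 25]

print(sustained_load(spiky))   # False
print(sustained_load(steady))  # True
```

The point of the sketch is only that "how high" and "for how long" have to be judged together, which is what the question is really asking.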

Last comment: LockDown32 (8/22/2022, Mon)
Cliff Galiher

Not really a problem. Especially on SBS, which has a ton of background processes. They add up.
LockDown32

ASKER
Well... I am going to dwell on the part of the question about constant CPU utilization: at what point is it a concern?
John

I have a Server 2012 here running and supporting a couple of dozen users, file and print, and so on, but NOT Exchange (outsourced) or SQL (different server), and CPU is 10%, memory is 15%, and all is well. I do agree with Cliff's comments as well.

Run Task Manager or Resource Monitor and see what is using the CPU.
LockDown32

ASKER
So, John, what single process on your 2012 Server is running at a constant 10% CPU utilization? I run a 2012 Server too, and nothing is running even close to a constant 10% utilization....
David Needham

I agree.  This could be nothing, but if you're concerned then have a look at your physical memory usage and pagefile.  Quite often when you see constant CPU usage, these are involved.
John

Background processes = 53

Agent for Windows
Apache HTTP Server processes
Backup Exec processes
DFS
DNS
File Replication
Host Processes
Java
Logmein
QuickBooks
Symantec
Some SQL
Spooler
DHCP
and a bunch more.

Still 10% an hour later.
LockDown32

ASKER
On all my servers nothing runs at a constant 10%. Even the overall server will, more often than not, drop down to 1%. So I want to ask the first part of the question again, because so far no one has addressed it:

First... I have always been under the impression that if CPU utilization runs at a constant 20-30% or higher, you have a problem. I have been told that processes can spike to 80 or even 90%, but as long as they drop back to zero quickly it isn't a problem. So what have others heard: at what percentage does constant CPU utilization become a problem?
John

I have always been under the impression that if CPU utilization runs at a constant 20-30% or higher, you have a problem.

In my world, yes, I would see this as a problem to investigate. I know what I am running, not what you are running, so we do not know your servers' issues.

I have been told that processes can spike to 80 or even 90%, but as long as they drop back to zero quickly it isn't a problem.

That is definitely my experience.
LockDown32

ASKER
The question was not specific to anyone's particular world. No two computers are alike. I don't really need a list of the processes you are running. It was a generic question about CPU utilization and processes in general.
John

It depends on what you are running.

In a bland server or workstation, the CPU should run at 10% or less, more than 95% of the time.

But that will not mean anything if you (or someone else) is running something different.

So as a "generic" question, there is NO one answer to that.
LockDown32

ASKER
I believe there is a generic answer. Even if you are running SQL with huge databases, CPU utilization shouldn't run at a constant 60%. Nothing should. If it does, then something is amiss. As mentioned before, it is not unusual for processes to shoot up to 80-90%, but as long as a process doesn't stay there and comes right back down, it isn't a problem. I asked the same question on SuperUser, and they are responding with rough, generic answers as to when a process might need more attention. No two computers are alike. No two computers run the same things. You yourself just gave a generic answer, even though it is based on overall CPU utilization. I would prefer a generic answer on a per-process basis, but I can do a rough translation.
ASKER CERTIFIED SOLUTION
David Needham

John

I give up. Gone
LockDown32

ASKER
I haven't disregarded it. Just haven't had time to look at it. What exactly am I looking for?
David Needham

High physical memory usage and constant pagefile activity.
LockDown32

ASKER
Well... Task Manager says total 16106MB, cached 486, available 480, and free 4. If you go into Resource Monitor, it says Used Physical Memory is at about the 95% mark. Exchange is by far the biggest hog, at about 9GB.

The page file is automatically managed: 16GB on drive C: and none on drive D:. D: is the data drive, with a ton of free space. I am running out of space on C: :)
David Needham

Then, as long as there is no reason not to, I would move your pagefile to the other partition.

Also, if it's an issue, there is a hack to limit the amount of RAM that the Information Store can use.  This isn't recommended, though.
LockDown32

ASKER
OK. So since physical memory usage is pretty close to 100% does that represent a problem?

Even when you look at the page file, it says recommended 24GB, yet it only allocated 16GB. Should I just make it "System Managed" on D: (and D: only), or go custom with the recommended 24159MB as both initial and maximum size? What would you recommend?
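For what it's worth, the "recommended" figure reported here lines up with the classic 1.5-times-RAM pagefile heuristic (the exact formula Windows uses varies by version, so treat this as an assumption rather than a spec):

```python
ram_mb = 16106                      # total physical memory from Task Manager, in MB
recommended_mb = int(ram_mb * 1.5)  # classic 1.5x-RAM pagefile heuristic

print(recommended_mb)  # 24159 -- the same figure quoted in the dialog
```

That is why the dialog suggests 24159MB against 16106MB of installed RAM.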
David Needham

Yes.  100% usage can cause such issues.

If it were me, I would manually define the pagefile on D: as you suggest, at 24,576MB.
David Needham

It might also be worth considering bringing your physical memory up to 24GB, if you have the budget.
LockDown32

ASKER
Thanks David. I think you are on the right track. I read an article about limiting the memory SQL uses and capped it. That took physical memory usage down to 88%, and immediately the overall CPU% dropped to 1-2% and hasn't gone above that. I'll manually define the page file tonight after hours and then quote them on 24GB. I'll let you know what happens in the morning.
David Needham

No problem.  I know that SBS can be a pain sometimes.
LockDown32

ASKER
That did the trick, David. The server is old enough that it probably isn't worth putting the money into memory; I will discuss it with the customer. In the short term I did the Exchange hack. It took memory usage down to 52%, and everything is running great. The System process that started this question, which was constantly between 5 and 20%, completely fell off the radar, and average CPU utilization is 1-2%. I am sure the Excel issue has probably gone away too. Much appreciated.
David Needham

I'm glad that it's sorted! :)
Lee W, MVP

FYI, so long as you're not hitting 100% and staying there, it's not a problem.  30% usage means 70% of the time the CPUs are sitting IDLE and doing NOTHING.  WASTING AWAY.  This is the whole point of virtualization: recapture the idle CPU time for other servers... instead of one server running at 10, 20, even 30%, put THREE on one piece of hardware and let it sit at 90%!  (well, the general rule is 80% or more and you want to think about upgrading CPUs or the server... USUALLY people's performance problems are related to RAM or disk, and sometimes network.  CPU is rarely a problem except where CPU is actually needed.)
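Lee's consolidation arithmetic can be sketched numerically (the per-guest loads and the 80% rule of thumb are illustrative figures taken from his comment, not a sizing formula):

```python
guest_loads = [30, 30, 30]    # three underused physical servers, % CPU each
host_load = sum(guest_loads)  # naive stacking onto one virtualization host

headroom_limit = 80           # rule of thumb: at ~80%+ start planning an upgrade

print(host_load)                    # 90
print(host_load >= headroom_limit)  # True -- right at the point to watch CPU
```

Stacking three ~30% servers lands the host near 90%, which is exactly where his 80%-or-more rule says to start paying attention to CPU capacity.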
LockDown32

ASKER
I hate to do this, Lee, but you are saying that a server that has a constant 60-90% CPU utilization is running fine? Even John Hurst took issue with that.  I have always been told that if you have a constant CPU utilization of 20-40%, you have a problem. That is what I have seen. If a server is running at a constant 30% CPU utilization, its speed and performance are severely impacted. You are saying that is OK?
Lee W, MVP

If you read what I said:
(well, the general rule is 80% or more

I have always been told
By who?

It really looks like you're picking and choosing the words you want to read.
Quoting John:
It depends on what you are running.

In a bland server or workstation, CPU should run 10% or less more than 95% of the time.

But that will not mean anything if you (or someone else) is running something different.


I agree - in a BLAND server or workstation that does NOTHING, usage should generally be low on any recent hardware and OS combination.

SBS is far from a BLAND server - with Exchange, File Services, AD, Web services, WSUS, and the dozens of services supporting those capabilities and the basic needs of Windows, it's FAR from a BLAND server.
Cliff Galiher

Apples and oranges. A single OS at a constant 30% *usually* (not always) has another problem. Take YOUR situation. Your problem wasn't CPU. It was memory. Yes, high swapping caused your CPU to be slightly elevated, but that was neither a cause nor the actual symptom (if you believe otherwise, then you are ignoring facts and can stop reading now).

But in a virtualized environment, you are intentionally stacking many OS installs. Just as I said above about SBS, those background processes stack up. Running high CPU is an established practice in virtualized environments. As long as you aren't hitting other bottlenecks, you keep adding OS workloads until something bottlenecks. You reduce TCO and increase ROI. That's been stock-standard with virtualization for over a decade.
Cliff Galiher

Basically, I am saying Lee is right and you cherry-picked your responses.
LockDown32

ASKER
I am not cherry-picking responses. You aren't answering the question. 30% is 30%. Even if it is a result of something else, the 30% is still a problem.  From what you are saying, there is no CPU utilization percentage that is a problem: if it is less than 100%, it is not a problem; 99% is not a problem. I don't think so. Do you? Really?
Cliff Galiher

Did I say 99%? Yes you cherry pick responses and flip things around to hear what you want. John Hurst, in this question, bailed. You were rude and I don't blame him. You ignored Lee's thoughtful explanation. Generally you and I never got along in any question. So I'm used to it.

But when you actively ignore and anger multiple highly ranked and regarded experts here, not one, not two, but three...all with a proven track record and expertise....maybe...just maybe...it is time to reassess your position. Just saying.

I'm out too.
Lee W, MVP

That's not what I'm saying.  For the THIRD time:
(well, the general rule is 80% or more

Further, the KIND of CPU usage is also important.  SETI@home (and other similar programs) will make your CPU run at 100% - but that isn't significantly slowing you down, because it runs at very low priority, so ANYTHING ELSE can use the CPU.

I feel I've given useful information to those who want to read.  

To quote John Hurst:
I give up. Gone
LockDown32

ASKER
Sorry, Lee. It was a simple question. Didn't mean to confuse you. This was your comment, not mine: "FYI, so long as you're not hitting 100% and staying there, it's not a problem". How many ways can you interpret that?