Doug Van (Canada) asked:

Power and BTU ratings calculation

Hello all,

Hopefully someone can help with my confusion. I am trying to calculate the power requirements and BTUs of two servers that are identical in everything but their storage devices.

I am building two servers, where the chassis power supply specifications are:
1000W Output @ 100-140V, 12-8A, 50-60Hz
1280W Output @ 180-240V, 8-6A, 50-60Hz

All the additional components (motherboard, CPU, memory, etc.) will come from different manufacturers but will be the same in both servers.

The main difference between the two servers will be:
Server 1: 12 x 4TB traditional 3.5" hard drives
Server 2: 12 x 2TB SSD drives

The confusion I am having is that I would expect the power requirements for Server 1 to be significantly higher than for Server 2, because traditional hard drives draw significantly more power (both at startup and during normal operation), while solid-state drives draw far less. However, the manufacturer is telling me that the power ratings are the same because the power supply dictates how much power is used.

So, assuming the manufacturer is correct about the power requirements, what about the BTUs? That is definitely going to be a very different value. The problem is that BTUs are usually calculated from voltage and current.

Any advice is appreciated.

Thank you,
Doug
ASKER CERTIFIED SOLUTION
CompProbSolv (United States):
Just in case I wasn't clear, the manufacturer's comment that "the power supply dictates how much power is used" is nonsense! The only impacts it has are the total limit of what it can produce (which you should try to stay far away from) and its efficiency.
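To make that concrete, here is a minimal Python sketch of the two effects a PSU actually has; the 90% efficiency and the 500 W load are assumed figures for illustration, not values from this thread:

```python
# A PSU only (a) caps the output it can supply and (b) loses some power
# to conversion inefficiency; the installed components set the actual load.

PSU_RATED_OUTPUT_W = 1000  # chassis rating at 100-140 V input
PSU_EFFICIENCY = 0.90      # assumed; the real value varies with load and model

def wall_draw_watts(dc_load_w: float) -> float:
    """AC power drawn from the outlet for a given DC-side load."""
    if dc_load_w > PSU_RATED_OUTPUT_W:
        raise ValueError("load exceeds the PSU's rated output")
    return dc_load_w / PSU_EFFICIENCY

load = 500.0  # hypothetical DC load: motherboard, CPU, memory, drives
draw = wall_draw_watts(load)
print(f"DC load:       {load:6.1f} W")
print(f"AC wall draw:  {draw:6.1f} W")         # ~555.6 W
print(f"PSU heat loss: {draw - load:6.1f} W")  # ~55.6 W
```

Swapping in twelve HDDs or twelve SSDs changes the load term, not anything about the supply.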
Doug Van (Asker):
So sorry it has taken this long to respond and grade this question.

Thank you so much everyone. Your answers were correct and insightful. :)
BTW, I am curious about one characteristic that I observed and I think Danny Child first predicted this...
"I would also suggest that in real usage, you will not see any significant difference between the power and BTU use of these 2 servers. "

I began this thread because I thought it was unlikely that a server using all mechanical hard drives (12 in total) would consume the same amount of power as an identical server with all flash drives (12 in total). In other words, the two servers are identical except for their storage arrays.

Well, I placed a power meter on both units and was surprised to find that the real (typical) power consumption was roughly the same for both: each typically consumed around 680 watts under a typical workload.

I am puzzled by these results. I always thought the SSD drives consumed significantly less power.

Any thoughts?

Thanks.
What you probably haven't factored in is how much of those 680 watts is directly attributable to the hard drives themselves.
I looked up some Seagate drives and found the following:
ST2000VN0001 (conventional HD) has a 6.4W typical operating power consumption, 4.5W in idle
XF1230-1A1920 (SATA SSD) has a 4.5W max. active average power, 0.7W in idle

With 12 drives (assuming the specs above are realistic in both cases), you should see about 22W more draw with the conventional HDs when active and about 45W more when idling.
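As a quick check, here is a minimal Python sketch of that arithmetic, using only the per-drive figures quoted above:

```python
# Per-drive power figures from the datasheets quoted above
HDD_ACTIVE_W, HDD_IDLE_W = 6.4, 4.5  # ST2000VN0001 (conventional HD)
SSD_ACTIVE_W, SSD_IDLE_W = 4.5, 0.7  # XF1230-1A1920 (SATA SSD)
DRIVE_COUNT = 12

extra_active = (HDD_ACTIVE_W - SSD_ACTIVE_W) * DRIVE_COUNT
extra_idle = (HDD_IDLE_W - SSD_IDLE_W) * DRIVE_COUNT
print(f"Extra HDD draw, active: {extra_active:.1f} W")  # 22.8 W
print(f"Extra HDD draw, idle:   {extra_idle:.1f} W")    # 45.6 W
```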

Those differences aren't dramatic when looking at 680W overall, but still should have been measurable.  Was the power draw varying enough that you'd not have noticed a 22W difference?
You are absolutely correct. Thank you. I just compared the manufacturer's specifications for the SSDs vs. the HDDs and was surprised to learn that there is only about a 3-watt average difference per drive, or about 36 watts in total per chassis. I always thought the difference was much greater.

What puzzles me is that, physically, the mechanical hard drives seem to run at a higher temperature than the solid-state drives. I just checked, and I swear the SSDs feel physically cooler. But the SSDs are encased in plastic or resin and the HDDs are encased in aluminum. Perhaps the different physical properties create an illusion that one is running hotter than the other?

Thank you again. :)
"the mechanical hard drives seem to generate a higher temperature"

I'd bet that part of the HDD (where the motor is) does get significantly hotter, whereas the SSD stays at a pretty consistent temperature throughout.

Keep in mind that the specs I found have the HDDs dissipating about 50% more energy than the SSDs when active. Virtually all of that goes into heat, so you would expect the HDDs to be hotter. 3W is not insignificant when dissipated over a small volume.
Now you are dealing with efficiency. Any energy that ends up as heat doesn't get converted to work, and SSDs convert more of their energy to work. Under a heavy load I've noticed some SSDs getting rather warm. Many years ago I had two 1 GB SCSI 5.25" full-height drives that drew 35 watts, and it was like having two light bulbs inside the case, even at idle.
"SSD's convert more of their energy to work"

I think that if you look at the energy output of almost all devices, nearly 100% of the energy consumed goes into heat. What is the "work" that the SSD produces that doesn't get turned into heat eventually? Unless it is transmitting electromagnetic signals, sound, or mechanical output (which would likely end up as heat eventually anyway), or storing the energy in some fashion (which would be temporary), it all ends up as heat.
This is great information and a good reminder of a basic science principle. So, when calculating BTUs, the formula is P(BTU/hr) = P(W) × 3.412, since 1 watt = 3.412 BTU per hour. This formula correctly assumes that nearly 100% of the consumed power will eventually be transformed into heat (as CompProbSolv stated).
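For reference, a minimal sketch of that conversion in Python, applied to the ~680 W typical draw measured above:

```python
WATTS_TO_BTU_PER_HR = 3.412  # 1 W = 3.412 BTU/hr

def heat_load_btu_per_hr(watts: float) -> float:
    """Heat load in BTU/hr, assuming ~100% of consumed power becomes heat."""
    return watts * WATTS_TO_BTU_PER_HR

print(f"{heat_load_btu_per_hr(680):.0f} BTU/hr")  # ~2320 BTU/hr per server
```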

Thanks again.