Measuring power draw from a computer

How could I go about measuring the actual power consumption of a PC after it is built?

For example, if I always use a 350W power supply, is that number the maximum it can supply? If my hardware specs are light (few if any PCI cards, integrated motherboard, Celeron 2.0GHz CPU), what am I actually using?

The reason I am asking is that I want to use a reasonably sized, inexpensive UPS but do not want to create an unsafe condition. I am only trying to protect against brownouts and momentary power outages.
Asked by wmilliga
Lee W, MVP (Technology and Business Process Advisor) commented:
Callandor - cool link.  

Yes, the wattage on the power supply is the maximum watts it will provide.  I've found that most computers (P3s and P4s) use ~175 watts on average.  If you run SETI@HOME often, then the use will increase.   I wanted a definitive answer to this myself and found a cheap device that gave me answers - http://www.p3international.com/products/special/P4400/P4400-CE.html  You can find them on e-bay (search for "kill-a-watt") for $15-30.  Very handy.  I use it frequently.  I've found that: a P3 1GHz, 512 MB RAM, 3 hard drives, DVD Writer, no monitor, AND a 27" TV, VCR, Cable box, and Stereo ampliflier (on with volume at reasonable levels uses ONLY 250 Watts of power using this device.  I also connected it to a Single outlet that a 700 VA UPS was connected to.  That UPS, with power strips, supported 2 PCs (P4 2.8 & Celeron 2.8), full tower systems, a laser printer in standby mode, Three monitors (one 19" and two 17") as well as a small network switch and a portable phone, 2 PDA chargers, and a small VoIP box from Vonage were using 500 Watts.

Callandor commented:
Wattage calculator
http://www.jscustompcs.com/power_supply/

The wattage rating of a power supply is the maximum number of watts it can output.  This, however, can take the form of a lot of current on some rails and very little on others.  The 12v rail is usually a good indicator of how good the power supply is, because it is difficult to get a lot of output there if the supply is cheap.
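To illustrate the point about rails, here is a toy calculation with made-up amperage figures (not from any real supply's label): the headline wattage is roughly the sum of per-rail maxima (P = V x A), so a supply can hit an impressive total while offering little on the 12 V rail.

```python
# Illustrative per-rail breakdown.  Amperage figures are INVENTED
# for illustration; real labels also list combined-output limits
# that can be lower than the simple sum shown here.

rails = {       # volts: max amps (hypothetical)
    3.3: 28.0,
    5.0: 30.0,
    12.0: 15.0,
}

for volts, amps in rails.items():
    print(f"{volts:>5} V rail: {volts * amps:.1f} W max")

total = sum(v * a for v, a in rails.items())
print(f"Combined (naive sum): {total:.1f} W")
```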

You should consider a UPS from eBay, where you can get an APC 1400VA unit for a little over $200 plus shipping.
RyanCh commented:
Unfortunately I haven't seen this table updated, but there are several ways you can estimate how much power you're using.
http://www.microsoft.com/nz/presscentre/articles/2001/august-01_consumption.aspx

Here is an interesting article to read:
http://www.firingsquad.com/guides/power_supply/page2.asp

I would definitely suggest reading Tom's Hardware's guide to power supplies before purchasing one.
http://www4.tomshardware.com/howto/20021021/index.html
They also have some informative PS comparison material here:
http://www4.tomshardware.com/howto/20030609/index.html
http://www4.tomshardware.com/howto/20040122/index.html

Leew, you should be careful with that estimate; computers now use MUCH more than that.  The specification for the Nvidia 6800 was something like 150W on its own!  With the setup described here I don't think you'll have a problem with a 350 W supply, as long as your other components (video card, HDs, CD drives, etc.) aren't absurd power hogs.  As the first Tom's Hardware guide shows, the rating of a power supply is often not the amount it can actually handle.

UPSes are also very good to have; their batteries usually last about 10 years before a new one is needed.
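In the spirit of the wattage calculators linked above, a component-level estimate can be sketched like this. The per-component figures below are ballpark assumptions for a light system like the asker's, not measurements; the 30% headroom factor is likewise an assumed safety margin.

```python
# Rough component-level power estimate.  All figures are ASSUMED
# ballpark draws, not measured or vendor-specified values.

components = {
    "Celeron 2.0GHz CPU":            60,
    "Integrated motherboard + RAM":  40,
    "Hard drive":                    15,
    "CD/DVD drive":                  20,
    "Fans, misc":                    10,
}

total = sum(components.values())
print(f"Estimated draw: {total} W")                 # 145 W
print(f"With 30% headroom: {round(total * 1.3)} W")
```

A calculator like this only bounds the answer; a metering device such as the Kill-A-Watt mentioned above gives the real number.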
Lee W, MVP (Technology and Business Process Advisor) commented:
There's nothing wrong with my estimate.  I did not say it was fact, and I encouraged the purchase of a device to measure it.  MOST people do not have $400 video cards, especially people who describe their systems as above.  The faster things get, the more power they tend to take, but even the NVidia card is not going to draw 150 watts while generally idle.  When playing Unreal Tournament at 100 fps, sure, but not when idle.
RyanCh commented:
However, one cannot base purchasing decisions on the idle power draw of a PC.  While it is true that very few people have $400 video cards, the purpose was to show an extreme example.  I have seen several problems when people underestimate their power consumption (I thought the device linked was excellent), so it is usually best to err on the safe side.  You are correct about the idle estimates; it is just better to be safe and do your power calculations under load.  It is like overclocking: one measures the maximum temperature under load, not while idle.
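The load-versus-idle point can be sketched numerically. All figures here are hypothetical: the idle and load draws stand in for meter readings, and the UPS capacity assumes a 700 VA unit at an assumed 0.6 power factor.

```python
# Sketch: size for load, not idle.  All numbers are HYPOTHETICAL.

idle_w, load_w = 175, 320   # stand-in meter readings (assumed)
ups_capacity_w = 420        # e.g. 700 VA * 0.6 power factor (assumed)

# Sizing against idle looks comfortable; under load the margin shrinks.
print(f"Idle headroom: {ups_capacity_w - idle_w} W")   # 245
print(f"Load headroom: {ups_capacity_w - load_w} W")   # 100
```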

Lee W, MVP (Technology and Business Process Advisor) commented:
Fair enough - but keep in mind what the question was asking.  
Question has a verified solution.