How much does it cost to run a computer (electricity)?

Hi,

What formula can I use to calculate how much it costs to run my computer per hour?

Electricity costs

$0.15 (15 cents) per kWh (in New Zealand)

The power supply on the computer is 300 W, and I'm not sure about the monitor; it's a 15" CRT.

Please give me the formula for working it out,

Thanks,
Ant
antum asked:
 
Lee W, MVP (Technology and Business Process Advisor) commented:
There's no real formula... you need to know how many watts your computer actually draws. The exact amount of electricity used depends on a variety of factors, including usage. For example, if you run SETI@home (or a similar program), it will use more electricity than if you let the CPU sit idle.

There are calculators on the web, but they are largely INACCURATE, because they use MAXIMUM power draws, and the maximum is often FAR ABOVE the actual. I would expect a device like the "Kill A Watt" meter could be adapted to work in New Zealand. I use one in the USA and have found that my computers (depending on CPU and accessories) draw between about 30 watts (a low-power IBM system with 2 sticks of RAM, a hard disk, and a network card) and about 140 watts (an AMD Athlon 64 X2 Dual Core system with 8 hard drives, a CD-ROM, 3 PCI controllers, and 4 sticks of RAM, no monitor). The 15" LCD I have usually draws, I believe, 55-65 watts.

So if I were to guess, based on my systems, yours PROBABLY (going by my estimate of the "average" computer plus a 15" monitor) draws 150-175 watts, which works out to between $16.00 and $19.00 per month, or about 2.2-2.6 cents per hour (assuming both are left on with NO power savings).
 
nobus commented:
Just adding to leew's comment: if you do not have a watt meter, you can calculate it approximately by measuring the current drawn. The formula is volts x amps = watts. If this works out to 160 W, your PC uses 0.160 kWh for every hour it runs, and the cost will be around 0.160 x $0.15 = $0.024, i.e. about 2.4 cents per hour. Or did I miss a comma, leew?
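A minimal Python sketch of the volts x amps approach above; the 230 V mains figure is an assumption for New Zealand and the 0.7 A reading is a made-up example, so substitute your own measurements:

# Sketch of the volts x amps = watts method (assumed/example values only)
volts = 230.0          # approximate NZ mains voltage (assumption)
amps = 0.7             # example current measured on the PC's power cord
rate_per_kwh = 0.15    # electricity price in NZ dollars per kWh

watts = volts * amps                 # power draw in watts
kwh_per_hour = watts / 1000.0        # energy used in one hour, in kWh
cost_per_hour = kwh_per_hour * rate_per_kwh

print(f"{watts:.0f} W -> ${cost_per_hour:.3f} per hour")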
 
Lee W, MVP (Technology and Business Process Advisor) commented:
24 hours per day x 30 days per month (average) = 720 hours per month
150 watts x 720 hours per month = 108,000 watt-hours per month
108,000 watt-hours per month / 1,000 watt-hours per kWh = 108 kWh per month
108 kWh x $0.15 per kWh = $16.20 per month
$16.20 / 720 hours = $0.0225 per hour, i.e. 2.25 cents per hour
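The same month-long arithmetic, written as a short Python sketch; the 150 W draw is leew's estimate for the asker's PC plus monitor, not a measured figure:

# Monthly cost for a machine left on around the clock (estimated draw)
watts = 150.0              # estimated average draw of PC plus monitor
hours_per_month = 24 * 30  # on 24 hours a day, 30 days a month
rate_per_kwh = 0.15        # NZ dollars per kWh

kwh_per_month = watts * hours_per_month / 1000.0   # 108 kWh
cost_per_month = kwh_per_month * rate_per_kwh      # $16.20
cost_per_hour = cost_per_month / hours_per_month   # about 2.25 cents

print(f"{kwh_per_month:.0f} kWh/month, ${cost_per_month:.2f}/month, "
      f"{cost_per_hour * 100:.2f} cents/hour")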
 
Jbirk1 commented:
You don't know your computer's wattage, though.

You have to measure it. Get an amp meter and connect it in series with the power cord that goes to your computer. Write down the number of amps the system is actually drawing. Do the same for the monitor.

Now switch to a voltmeter and measure the voltage in New Zealand or wherever you are. Multiply the voltage by the current (amperage) to get wattage.

Now add the wattage of your computer and your monitor together.

That is how many watts of power your computer uses when running. Now figure out how long you have it running each day and how much a kilowatt-hour costs.

1 kilowatt-hour is 1,000 watts for 1 hour. So if a kilowatt-hour costs 10 cents and your system draws 250 watts, you will use 1 kilowatt-hour every 4 hours. That will cost you 10 cents per 4 hours, or 2.5 cents per hour. You get the idea; it is just simple math.

The only problem is that you must know what your system actually uses. Just because you have a 300-watt or a 600-watt power supply doesn't mean it draws that much. It probably uses much less, because that rating is the maximum the supply can provide.

Justin
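A small, generic Python helper in the spirit of Justin's method; the wattages, hours, and rate in the example call are placeholders rather than measurements from the asker's machine:

# Daily running cost from measured wattages (all example values)
def running_cost(pc_watts, monitor_watts, hours_per_day, dollars_per_kwh):
    total_watts = pc_watts + monitor_watts
    kwh_per_day = total_watts * hours_per_day / 1000.0
    return kwh_per_day * dollars_per_kwh   # cost per day in dollars

# Example: 160 W PC + 80 W CRT, 8 hours a day, $0.15/kWh
print(f"${running_cost(160, 80, 8, 0.15):.2f} per day")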