So I have two computers: one uses a 450 watt PSU and the other a 220 watt PSU.
In the BIOS, the 220 watt PSU reads 11.91 amps.
HOWEVER, the 450 watt PSU reads 11.55 amps, and my graphics card intermittently throws a "not enough power" message. I just bought that power supply yesterday.
Should I take it back? (It's a 3-hour drive...)
I'm concerned because the tech support guys at the card company (Leadtek) said it should be very, very close to 12 A on the 12 V line. They said 11.6 is way too low, and that it's something I should be concerned about for my whole system, not just the card. Are they right? I don't know anything about this, but I'm learning slowly, and I want to make sure I learn it right. I'm running a GeForce 6800 GT, by the way... it's a hog for energy!
I switched the power supplies between the two machines, just to run the BIOS and test, and got the same result. So this PSU really is only putting out 11.55 amps on the +12 V line.
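For what it's worth, here's a quick sanity check on those numbers. This sketch assumes the BIOS readings are actually rail *voltages* rather than amps (BIOS hardware monitors normally report the +12 V rail in volts), and uses the ATX spec's ±5% tolerance on the +12 V rail; neither assumption comes from the post itself.

```python
# Sanity check: are the two BIOS readings within the ATX +/-5% tolerance
# for the +12 V rail? (Assumption: 11.91 and 11.55 are volts, not amps.)

NOMINAL = 12.0
TOLERANCE = 0.05  # ATX allows +/-5% on the +12 V rail (11.40 .. 12.60 V)

def in_spec(reading, nominal=NOMINAL, tol=TOLERANCE):
    """Return True if the reading is within tolerance of nominal."""
    low, high = nominal * (1 - tol), nominal * (1 + tol)
    return low <= reading <= high

for reading in (11.91, 11.55):
    deviation = (reading - NOMINAL) / NOMINAL * 100
    status = "in spec" if in_spec(reading) else "OUT of spec"
    print(f"{reading:.2f}: {deviation:+.1f}% -> {status}")
```

If the readings are volts, 11.55 is about 3.8% low, which is within the ±5% window, though a BIOS sensor is not a precise instrument.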
Also, is there something like... if you increase the watts, you also increase errors? I mean, the one that runs nearly perfect at 12 A is the 220 W PSU. Since the other one jumps to 450 W, isn't it possible that with all that extra energy, some of the quality (amps) gets dissipated? I'm asking because people say it's better to go with lower watts and better quality. I also thought cache in system memory worked this way: the higher the energy output, the more chances you have to make mistakes (look at the P4 HT EE 3.46 GHz CPU... so much power, and with all that cache it constantly misses its memory location targets... er, that's another topic, sorry). So what I need
is for someone to tell me if I'm going down the right path on this. Should I get a new power supply?
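On the watts question above: a PSU's wattage rating is its maximum *capacity*, not what it constantly pushes out, so a bigger rating doesn't mean energy gets "dissipated in quality". Current drawn depends on the load, and power, voltage, and current relate by P = V × I. A minimal sketch of that arithmetic (the 360 W rail budget below is an illustrative number, not the actual rating of either PSU in the post):

```python
# Power = Voltage * Current, so the maximum current a rail can supply
# is its rated wattage divided by its voltage. The label wattage is a
# ceiling; the PSU only delivers what the components actually draw.

def rated_amps(rail_watts, rail_volts=12.0):
    """Maximum current (A) a rail rated for rail_watts can deliver."""
    return rail_watts / rail_volts

# Illustrative example: a 450 W unit budgeting 360 W for its +12 V rail
# could supply up to 360 / 12 = 30 A on that rail.
print(rated_amps(360))
```

So a 450 W unit is not inherently "noisier" than a 220 W one; build quality and voltage regulation matter, but they are independent of the number on the label.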
What about a multimeter? I don't know what one looks like... I've never even seen one! So how do I use one? I was thinking it might help if I did a physical test of the amps. Any ideas on that, and how to do it? Details please... I don't know how to do this stuff!!