samhalperin...
Power supplies in PCs are rated similarly to the power ratings
of a home stereo system, in that the ratings refer to peak
capabilities. For example, your home stereo may be rated at
100 watts per channel, or 200 watts total output, but the fact
is that almost no one listens at maximum volume, and the
typical listener actually uses less than 10 watts at a
reasonable volume.
So why have a 200 watt-rated system? The analogy to PC power
supplies holds here, as well. Most stereos begin to
introduce distortion at somewhere around half-volume, so
the higher the rating, the lower the volume setting needed
for the same loudness, and the less distortion you'll hear.
Likewise with PCs: if the power supply is rated at 400 watts
and you draw only 100, there are fewer 'glitches' in the
supplied voltages.
An excellent discussion of the subject can be found on the
Target PC website, by William Yaple:
http://www.targetpc.com/hardware/power_supplies/measure/index2.shtml
http://www.targetpc.com/hardware/power_supplies/measure/index3.shtml
The simplest way to determine actual usage is to connect an
ammeter (current meter) between your PC and the outlet. This
will measure the current drawn by your PC in different states,
from idle to full-tilt usage, as occurs with a benchmark
program such as WinTune or a PC game.
Power consumption then follows from the formula
P = I x E, or Power = Amperage (current) x Voltage.
Since your outlet voltage is 120VAC, a current draw of
0.5 amps would mean 0.5 x 120 = 60 watts.
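As a quick sanity check, the P = I x E arithmetic above can be
sketched in a few lines of Python (the function name is just for
illustration):

```python
def power_watts(current_amps, voltage_volts=120.0):
    """Power drawn from the outlet: P = I x E (watts = amps x volts)."""
    return current_amps * voltage_volts

# The example from the text: 0.5 amps at 120 VAC
print(power_watts(0.5))  # 60.0 watts
```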
Another factor to consider is that power supplies use up
some of the power themselves, converting it to heat and
other losses:
"Power supplies in general are very inefficient pieces indeed.
Did you ever notice that, at least for computer rated models,
they have 5-7 amp fuses? For the 120VAC models, this means that
the total input power can be as high as 600-840 watts, even
though the output rating is only in the 250-300 watt range.
What gives? Have you been gypped? Computer switching supplies
have a typical efficiency of around 65%. This means that for
every 100 watts of input power (i.e. 120 volts x 0.833 amps),
about 65 watts can be used at the output. Nearly 35% of the
input power is lost as heat and conversion errors. To find
your actual power usage, you must reduce your power figures
by 35% or simply multiply them by 0.65."
http://www.targetpc.com/hardware/power_supplies/measure/index2.shtml
Mr. Yaple goes on, on the next page, to test three different
systems in both idle and peak states. The one with the greatest
consumption used 109 watts at idle, and 121 watts at peak.
http://www.targetpc.com/hardware/power_supplies/measure/index3.shtml
Taking into account the efficiency conversion formula, this
means that the actual computer components used 71 watts at
idle, and 79 watts at peak. Nonetheless, the computer AND
the power supply together draw 121 watts at peak, the
equivalent of two 60 watt bulbs. Your own PC, of course,
will have a unique figure, but it will be in the same ballpark.
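The 65% efficiency conversion applied to Mr. Yaple's figures can
be sketched the same way (the 0.65 factor comes from the quoted
passage above; the function name is my own):

```python
EFFICIENCY = 0.65  # typical switching-supply efficiency, per the quote

def component_power(wall_watts, efficiency=EFFICIENCY):
    """Watts actually delivered to the components, after supply losses."""
    return wall_watts * efficiency

# The heaviest system in Mr. Yaple's tests: 109 W idle, 121 W peak at the wall
print(round(component_power(109)))  # ~71 watts at idle
print(round(component_power(121)))  # ~79 watts at peak
```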
The Inspiron 8200 is rated at 1.5 amps at 90 volts AC at
full load; this translates to 1.5 x 90 = 135 watts input
power. Output power is rated at 90 watts:
http://www.dell.com/us/en/dhs/products/model_inspn_3_inspn_8200.htm#tabtop
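For what it's worth, the same arithmetic applied to the Inspiron's
rated figures also implies the supply's worst-case efficiency (these
are the ratings from Dell's page, not measured values):

```python
input_watts = 1.5 * 90   # rated input: 1.5 amps at 90 VAC
output_watts = 90        # rated output, per Dell's specifications

# Efficiency implied by the ratings at full load
implied_efficiency = output_watts / input_watts

print(input_watts)                    # 135.0 watts in
print(round(implied_efficiency, 2))   # about 0.67, close to the 65% typical figure
```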
How much of that is used by your laptop can only be determined
by testing the actual current draw, as illustrated above. It is
likely, however, that your laptop actually uses only a fraction
of the rated output power, as did the PCs in Mr. Yaple's
tests, and that the usage is less than that of one 60 watt bulb.
I hope that clears things up for you.
Please do not rate this answer until you are satisfied that
the answer cannot be improved upon by means of a dialog
established through the "Request for Clarification" process.
sublime1-ga
Searches done, via Google:
"power supply" usage rating PC
http://www.google.com/search?q=%22power+supply%22+usage+rating+PC
inspiron 8200
http://www.google.com/search?q=inspiron+8200