I recently bought a little tool that measures the amount of electricity used by your electrical gadgets.  It measures watts, amps, and voltage, in addition to a bunch of stuff I’d never really heard of (frequency, power factor, and VA).  I work for a consulting firm in the electric utility industry, so I let my nerdy/professional curiosity get the better of me.  I tested it on a few devices around the house, and the results are pretty interesting.

By far the most energy-intensive device was the microwave, which draws 1,500 Watts (1.5 kW) when in use and about 3 Watts when not in use (to power the clock).  Next up was the TV/home theater system/PS3, which drew about 270 Watts when in use and about 5 Watts when not.  Even so, during 40 minutes of streaming a show on Netflix, we used only about 0.18 kWh of electricity, which really isn’t very much.  In fact, at the rate we pay our utility, Pacific Gas & Electric, during summer evenings, we spent only about $0.02 on electricity while watching TV.  (According to PG&E’s rate schedule, October still counts as a summer month, because September and October are usually the hottest months in the Bay Area.)
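The kWh figure above falls out of simple arithmetic: energy is power times time. Here's a minimal sketch of that calculation; the per-kWh rate is an assumption chosen to roughly match the $0.02 figure, not PG&E's published tariff.

```python
# Assumed rate for illustration only; actual PG&E rates vary by tier,
# season, and time of day.
RATE_PER_KWH = 0.11  # dollars per kilowatt-hour

def energy_kwh(watts, minutes):
    """Energy in kilowatt-hours for a device drawing `watts` for `minutes`."""
    return watts / 1000 * minutes / 60

tv_kwh = energy_kwh(270, 40)        # the TV setup, streaming for 40 minutes
tv_cost = tv_kwh * RATE_PER_KWH
print(f"{tv_kwh:.2f} kWh, about ${tv_cost:.2f}")  # 0.18 kWh, about $0.02
```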

While the TV and the microwave were fairly constant in their usage, it was interesting to see the spikes in usage when using the computer.  When it’s starting up or performing a task that is particularly CPU-intensive, usage spikes up to about 250 Watts, but when listening to music with the screen turned off, usage drops down to about 140 Watts.

The final device I tested out was an air filter/fan that is continually running in our bedroom.  That device draws about 9 Watts when on, which, again, isn’t very much at all.  If we turned it off, we would only save about a dollar on our monthly electricity bill.
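The "about a dollar" estimate for the always-on fan can be checked the same way: 9 Watts running around the clock for a month. The rate below is again an assumed figure for illustration.

```python
def monthly_cost(watts, rate_per_kwh=0.15):
    """Approximate monthly cost, in dollars, of a device running 24/7.

    rate_per_kwh is an assumed average rate, not an actual tariff.
    """
    hours_per_month = 24 * 30
    return watts / 1000 * hours_per_month * rate_per_kwh

print(f"${monthly_cost(9):.2f}")  # roughly a dollar a month
```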

Generally speaking, I was surprised at how small the draw is even when these devices are on, and that the problem of “shadow load” (power drawn by devices that are turned off but still plugged in) seems almost non-existent.  I think this is mainly because most new appliances are Energy Star certified, meaning they are designed to minimize this problem.

One Response to “Kill-a-watt”

  1. The thing that surprises me the most is the power used by a computer, both when “working” and during non-demanding tasks, basically in standby. 250 Watts is the equivalent of 2.5 VERY strong incandescent bulbs, and 140 Watts is not nothing either. Which makes me think that a computer wastes a ton of energy producing heat as opposed to “thinking”… In a household that doesn’t mean very much, but in just one building like the Kleberg building, where there are hundreds of CPUs running idle overnight to allow for network maintenance, that has to be a substantial electrical bill on a monthly basis. Which brings up the question: is it really necessary to leave our CPUs running all night? Can’t they be powered on and off remotely just long enough to do whatever needs to be done? Whoever comes up with a real solution for this issue can save/make a ton of money… Just my $0.02 worth :)
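The commenter's building-scale concern can be put into rough numbers. Everything below except the 140-Watt idle draw is an assumption for illustration: the machine count, the overnight window, and the commercial electricity rate are all guesses, not figures from the post or the comment.

```python
# Back-of-the-envelope estimate for a building full of idle computers.
MACHINES = 300        # assumed number of computers left on overnight
IDLE_WATTS = 140      # idle draw per machine, from the post's measurement
IDLE_HOURS = 12       # assumed overnight window
RATE_PER_KWH = 0.10   # assumed commercial rate, dollars per kWh

kwh_per_night = MACHINES * IDLE_WATTS / 1000 * IDLE_HOURS
monthly_dollars = kwh_per_night * 30 * RATE_PER_KWH
print(f"{kwh_per_night:.0f} kWh per night, roughly ${monthly_dollars:.0f} per month")
```

Even with conservative inputs, the overnight idle draw comes out to hundreds of kWh per night, which supports the commenter's hunch that waking machines remotely only when needed could save real money.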