Thursday, January 6, 2011

Steph’s Science Corner: How to Understand Your Electric Bill.

 Send me your topic suggestions at

With the holiday season past, two months of winter (and the rhinoviruses they bring) still looming, and the constant concern over the rising cost of energy ever-prevalent, many of us have turned a wary but resigned eye to our utility bills.

In these months of darkness and sub-zero temperatures, it’s hard for me, a raging-environmentally-conscious-hippy-liberal-Jew-who-projects-maximum-Grinch-itude-during-the-holidays-because-I-hate-consumerism-and-waste, to hide my bitterness for those blazing symbols of holiday spirit and cheer that adorn every gutter, rooftop and tree branch. That’s right, I hate Christmas lights. Sure, they’re lovely, calming, and evoke various false senses of community, cheer, holiday spirit, etc. I just can’t get over my gut reaction that they’re a drain on a precious, costly, and limited resource for an ephemeral aesthetic gain. In a world of exponentially spiking demand for consumer electronics, and the strain it’s putting on our natural resources and international relations, I have a sensitivity to the wanton waste of power.

Yet when one of my more “house-ridden” roommates, who typically logs about six to ten hours a day on an Xbox 360 that happens to be linked to a 42-inch high-definition plasma TV, made a derisive comment about how my sister’s LED Christmas lights represented a dramatic and irresponsible cost increase that he didn’t feel he should have to pay, I changed sides.

This week, I embarked on a personal research project to gather information about typical electricity usage in the modern American household. I wanted some hard numbers to back up my arguments with. In the process, it occurred to me that this information might be worth sharing with my loyal and intelligent readers, in the event you ever find yourself similarly entrenched in a bills-battle. And since that aforementioned rhinovirus seems to be following me like a shadow all winter long, and this is a week of post-vacation recovery, it seemed like a good way to fill a post.

Understanding conductors

Electricity is a term for the movement of electrons through a physical medium. In electronics, materials are typically divided into two types: conductors and insulators. Conductors are very good at letting electrons move through them; insulators are very bad at it. Most solids are insulators: their atoms hold very tightly to all of the electrons orbiting their nuclei. Most metals, however, are good conductors, because they don’t have a particularly tight grip on those outermost electrons. Copper is one of the best conductors we have, though it’s not 100% perfect. The outermost electrons orbiting the copper atom’s nucleus are loosely bound, and they can easily be pushed off to hop to other atoms nearby. If an outside influence pushes electrons across a material in a single direction, the electrons propagate along a path in a domino fashion. We make analogies between this movement of electrons and a river, saying that electricity “flows” through a wire.

Understanding current

We quantify electric charge with a unit called a coulomb (C). A single electron carries a charge of 1.602 × 10⁻¹⁹ C. Current is the flow rate of charge through a material in a single direction. It is measured as the total amount of charge (in coulombs) that passes through a cross-section of wire in a second. This unit is called an amp (C/s).

Current kills. The numbers vary, but the general consensus is that a current of around 50-100 milliamps can stop your heart. That’s 0.05-0.1 amps, by the way. A whole amp is bad news. Twenty amps passing through any part of your body is almost certainly lethal. But current isn’t the only thing we talk about when we talk about electricity. You’ve probably also heard of voltage.

Understanding voltage (or electric potential)

Voltage is a measurement of the electric potential energy stored per unit charge (remember, that’s a coulomb). Think back to high school physics and you may remember that “potential energy” is sort of a theoretical quantity – it’s the stored, possible energy that could be released as another form of energy if a system is prompted to release it. There’s a potential energy associated with chemical reactions, with motion, and with electricity. We measure voltage by figuring out the change in electric potential energy over a certain length of wire. It is the work that would have to be done to move a unit of charge (one coulomb) from point A to point B. The unit for [the physics definition of] energy is called a joule, and a volt is a joule per coulomb. All standard American household outlets are 120 V. European outlets put out 230-240 V.

When people say “current kills, voltage doesn’t,” that’s not entirely true. At some point, if you make the potential difference from one end of your body to the other high enough, your body will break down into a pretty good conductor. So don’t just go around telling your kids voltage can’t hurt you.

Understanding [electrical] power

Power, in scientific terms, is the amount of energy used per unit time. Notice that an amp is a (coulomb/second) and a volt is a (joule/coulomb), so multiplying the units for voltage by the units for current gives us the units for power, known as the watt:

volts × amps = (joules/coulomb) × (coulombs/second) = joules/second = watts

(From the incredibly useful and wonderful website Hyperphysics, hosted by the Georgia State University Department of Physics and Astronomy, which is an excellent resource for everything.)

A watt is a rate, just like speed (miles per hour, joules per second). Most electrical devices (like light bulbs) will list their power draw in watts. A 100-watt device uses 100 joules of energy every second. Like the instantaneous speed you drive your car at throughout the day, the number of watts an appliance uses can vary. For example, a refrigerator is always plugged in, but it’s not always drawing current. When it has to cool your food down, it will draw a high current for a short time, and when it has reached its desired temperature it will draw nearly nothing. Because the voltage coming from the wall socket is always the same, the change in current is what changes the power the fridge uses.
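If you want to play with that relationship yourself, here’s a minimal Python sketch. The voltage is the standard 120 V outlet figure from above; the two current values are just made-up illustrations of a fridge compressor running versus idling, not measurements.

```python
# Power (watts) = voltage (volts) x current (amps)

VOLTAGE = 120.0  # volts, a standard American household outlet

def power_watts(volts, amps):
    """Instantaneous power being drawn by a device."""
    return volts * amps

# Illustrative currents only: a fridge compressor running vs. idling.
print(power_watts(VOLTAGE, 3.0))    # 360.0 watts while actively cooling
print(power_watts(VOLTAGE, 0.02))   # about 2.4 watts while idle
```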

Maybe you’ve noticed that the electric companies charge you by something called the kilowatt-hour (kWh). A kilowatt is a thousand watts. The “–hour” piece indicates the total amount of energy used over a period of time, like, say, a month.

The watt is an instantaneous measurement – like your speed at the moment you’re clocked by the state trooper. The watt-hour is not itself a rate; it’s an accumulation of total usage. It’s your power draw multiplied by the total number of hours you had that device running in that month.
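Here’s that conversion as a tiny Python sketch, using a 100-watt bulb as a stand-in example:

```python
def kilowatt_hours(watts, hours):
    """Energy used: power draw (watts) times hours of use, converted to kWh."""
    return watts * hours / 1000.0

# A 100-watt bulb left on 10 hours a day for a 31-day month:
print(kilowatt_hours(100, 10 * 31))  # 31.0 kWh for the month
```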

Paying for your electricity

Rates vary by location, by provider, and sometimes just randomly. Currently, rates range anywhere from 10 to 50 cents per kWh. Some utility rates are tiered, meaning that higher usages are billed at higher rates, similar to income tax brackets. Electric rates in my state are about 10 cents per kWh, or at least close enough to it, so I’ll use that number for the examples that follow.

In the United States, the average household uses around 920 kWh per month (according to the DoE). That means that, as an average household, I would expect an electric bill of about $92.00.

A string of LED Christmas lights (25 to a set) uses about 2.5 watts, total. Let’s say we used ten strings (a gross exaggeration, we were not really that festive), for a total of 25 watts (0.025 kilowatts). If we left them on for 12 hours a day every day in December (372 total hours), we would be charged for 0.025 kW x 372 hours, for a total of 9.3 kWh. At 10 cents per kWh, displaying those 250 lights would cost us 93 cents for the entire month.
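In case you’d like to check my arithmetic, here it is in Python, with my assumed 10-cents-per-kWh rate spelled out:

```python
RATE = 0.10                         # dollars per kWh, my assumed local rate

led_kilowatts = 10 * 2.5 / 1000.0   # ten 2.5-watt strings = 0.025 kW
hours = 12 * 31                     # 12 hours a day, every day in December
energy_kwh = led_kilowatts * hours  # 9.3 kWh

print(round(energy_kwh, 2), "kWh, or", round(energy_kwh * RATE, 2), "dollars")
# 9.3 kWh, or 0.93 dollars
```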

Now let’s compare.

The typical power consumption of a 42-inch plasma TV is around 400 watts (0.4 kW). If the TV were used for five hours a day every day, that would be 62 kWh, which would cost about $6.20 per month. An Xbox 360 uses about 185 watts when it’s turned on. A PS3 draws about 194. At five hours a day that’s an extra $2.00-$3.00. A Wii uses only 18-20 watts. A high-intensity, 100-watt light bulb left on 24 hours a day every day would use 74.4 kWh in a month, or about $7.44. If it were on for the same amount of time as the TV, it would cost $1.55. A clothes dryer uses about 4,500 watts, so each hour of drying in a month costs about 45 cents. A refrigerator uses 200-700 watts, depending on whether it’s actively cooling or not. A dishwasher, which uses hot water, can use around 3,600 watts for just the washing cycle – the drying cycle takes more. A desktop computer draws 150-340 watts when it’s turned on, 1-20 when it’s in sleep mode. A coffee maker uses 900 watts. An electric furnace can use anywhere from 7,900-27,000 watts.
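And here’s the same kWh math run over a handful of the devices above, assuming my 10-cent rate and a 31-day month. The wattages and daily hours are the rough figures quoted in this paragraph, not measurements from my own apartment:

```python
RATE = 0.10   # dollars per kWh (assumed)
DAYS = 31     # days in the billing month

# (device, watts while in use, hours of use per day) -- rough figures only
devices = [
    ("42-inch plasma TV",                  400, 5),
    ("Xbox 360",                           185, 5),
    ("100-watt bulb, never turned off",    100, 24),
    ("Clothes dryer (one load a day)",    4500, 1),
    ("LED Christmas lights (10 strings)",   25, 12),
]

for name, watts, hours_per_day in devices:
    kwh = (watts / 1000.0) * hours_per_day * DAYS
    print(f"{name}: {kwh:.1f} kWh, about ${kwh * RATE:.2f} for the month")
```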

So what’s my point? A week’s worth of showers probably costs more than those lights do.

Some myths, some truths

A hot topic of recent discussion is the realization that some devices use energy even when they’re not turned on. My smartphone kindly reminds me to unplug my charger when I’m done. It’s true that many devices such as printers, televisions, and chargers have “standby power,” but in most cases it’s hardly significant. Here’s a sense of scale: if you leave a cell phone charger plugged into the wall 24 hours a day, you waste about a watt. Just a single one, which works out to about 0.74 kWh in a month, or roughly seven cents. Standby or “vampire” power accounts for less than five percent of typical residential energy use, or about $4.60 of my average monthly bill.
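That seven cents is just the same watt-hour arithmetic again, assuming a one-watt standby draw and my 10-cent rate:

```python
RATE = 0.10              # dollars per kWh (assumed)
standby_watts = 1.0      # roughly what an idle phone charger wastes

kwh = (standby_watts / 1000.0) * 24 * 31   # plugged in all day, all month
print(round(kwh, 2), "kWh, about", round(kwh * RATE, 2), "dollars")
# 0.74 kWh, about 0.07 dollars
```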

Some people also seem to have this idea that turning things off and on will use more power than simply leaving them on. I’ve heard this said about heating systems and about desktop computers. This is a myth. Turning something on doesn’t use more power than leaving it on. That being said, most modern computers have incredibly efficient sleep modes, maybe using only around two watts, which isn’t much more than the vampire power they draw when they’re completely off.

Now that I’ve gotten my vindication, here are my concluding thoughts

I hate that I’ve just written this piece, for a number of reasons. It goes against a lot of my conservational instincts. It’s not necessarily always about the cost; it’s also about the use of an important and limited resource. Energy isn’t endless, nor is the only cost worth considering strictly monetary. Energy has to be produced somewhere, somehow, in order to get into our homes so that it can power all of these devices that we’re so fiercely dependent upon.

Yes, I still secretly hate the Christmas lights that pop up in our neighborhood, despite this argument that I’ve constructed here today, and I hate the amount of cumulative electricity that we use as a country simply so that we can feel warm and fuzzy during one of the most tumultuous and polarizing months of the year. It’s a month in which we use these force-fed distractions to ignore one of our coldest and darkest stretches of the year, as many suffering citizens of this country are painfully aware, in a way that I, in my warm, comfortable, electronically rich home, can’t even begin to comprehend.

That being said, in my petty, simplistic, and overly comfortable world, I want my roommate to stop whining about his wasted nine cents and get off the TV. And if someone’s going to scold me for accidentally leaving the light on in my room for a few hours, I want to be armed with the numbers. Keep them in mind the next time you leave your game running on pause all night just because you can’t find a save point. I’m watching you.

Most of the data used in this article was taken as an average from various sources including personal website data, DoE statistics, product specifications, and electronics review sites.

It’s the new year, and I’m going to need new topic suggestions! If you have any requests, I’d love to hear them. Email me at!