Should you unplug chargers when you're not using them? - Tested

Someone disagreed, but gave no reason. I wonder if he disagreed that they plug into 120 V AC, or that any such device can short out. Either one is provably true, and so the disagreeing person looks rather ridiculous.
Let's try and remember that the 'Disagree' button is no different in concept from the 'Agree' button; it is simply an expression of concurrence or non-concurrence with a statement made by someone else. Please do not take it personally, and just as agreements often come with no reason given, none is required for a disagreement.

I totally agree... when the issue is opinion. What I stated were two absolute facts. To disagree with a provable fact requires an explanation. So although I agree with the agree/disagree policy, in this instance I totally disagree with its application.
 
Considering the millions of street lights pointlessly lighting empty streets, and the yard light at every farmhouse in the country lighting an empty yard, I have a hard time getting excited about this.
Maybe they will make long-range motion sensors for instant-on LED street lights.
 
I admit now that I read the article and then skipped right to the end to make my point, so if I'm repeating something, tough.

Chargers by their nature act like an alternator in a vehicle: they change the AC current from the mains into the DC current the batteries use. So even when nothing is being charged there is something churning in there. Yes, it may be small, but there is resistance, and therefore money is being burned. If you want proof of this, take a pair of scissors to the cable coming from the idle charger, but do expect to be electrocuted.
 
The article stated it was not intended to be "scientific". It then goes about proving it is anything but.

No control devices, no idea of what was actually being measured, no tolerances on anything. No high school sophomore chemistry student would set up such an "experiment".

The vast majority of small appliance chargers today are of the "floating" variety. This terminology applies to a charging device which does not constantly try to charge a fully charged battery.

Older chargers would feed a constant Voltage/Amperage to the battery which, when full, had reached its physical limits. Trying to cram more "charge" into an already full container does exactly what you would expect should you try to put ten pounds of potatoes into a five pound sack. Unfortunately, there is no "overrun" capacity in a battery.

Heat builds up in a fully charged battery during the charging process, which eventually weakens the electrical circuit that is the storage tank of the battery. At some point the heat build-up can be such that the battery risks exploding as more and more Amperage is stuffed into it.

The users of these non-floating chargers were warned to immediately unplug the charger once full charge was indicated, or risk damage to the device being charged and possibly to the charger itself. This was a bit of a PITA and meant the user had to sit by waiting for the charger to complete its task or risk a small but potentially dangerous explosion.

This problem with constant-charge devices led to the development of the floating style charger. When the battery has reached its maximum charge, the charging device shuts down and does not continue to supply Voltage/Amperage, thereby preventing the overcharging that could damage the battery.

Depending on the design and construction of the battery, a fully charged battery can be maintained at full charge indefinitely if it is left installed in the charging device. However, over time any battery will lose some amount of charge simply sitting in a drawer under no load. If the battery were still installed in a charging device, the charging circuit would cycle on once the low-charge status was confirmed, the battery would be topped off, and the cycle of non-charging/charging would repeat until the battery was removed from the charger.
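
To make that non-charging/charging cycle concrete, here is a minimal sketch of the hysteresis logic in Python. The voltage thresholds and the simple on/off decision are illustrative assumptions on my part, not the spec of any real charger.

```python
# Minimal sketch of the "floating" charger cycle described above.
# The thresholds are made-up illustration values, not real specs.

FULL_VOLTS = 4.20      # charger stops supplying current at this level
RESTART_VOLTS = 4.10   # self-discharge below this level restarts charging

def charger_should_run(battery_volts: float, charging: bool) -> bool:
    """Decide whether the charger output should be on this cycle."""
    if charging and battery_volts >= FULL_VOLTS:
        return False   # battery topped off: shut the output down
    if not charging and battery_volts <= RESTART_VOLTS:
        return True    # battery has self-discharged: top it off again
    return charging    # otherwise hold the current state (hysteresis)

# A full battery drifting from 4.20 V down toward 4.08 V stays idle until
# it crosses 4.10 V, gets topped off, and the cycle repeats.
```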

If the Voltage measured by the author was simply that which powered the LED within the charger, then this spec could have been calculated more easily and more accurately than by the techniques the author employed. Voltage, as tirediron said, is more or less consistent in value. The meter on your home does not measure Voltage, because Voltage holds relatively steady within a small plus-or-minus tolerance of 120 Volts.

Voltage, in theory and in explanation, is simply the potential for work to be done. Alone, Voltage can achieve no work.

Amperage is what does the work. This is what the author needed to measure, though even that would still have resulted in an incomplete experiment.

The higher the Amperage (or current) draw, the more electricity is being "used". Obviously, when your refrigerator, HVAC or washing machine kicks in, your electrical draw/usage is higher than when you use your coffee grinder or a USB charger. All of these devices run on 120 VAC if they are being used in the USA. A step down/step up transformer adjusts the incoming Voltage to a usable value for the type of device being powered. Each of these transformers has a certain amount of power loss.
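
As a rough illustration of how current draw turns into energy and cost, here is a back-of-envelope sketch. The 2.5 mA standby current is my assumed example value, not a measurement from the article (though at 120 V it happens to work out to the 0.3 W and 2.628 kWh/yr figures quoted in this thread), and real AC measurements would also need to account for power factor.

```python
# Back-of-envelope: Voltage alone does no work; power is Voltage times
# Amperage (P = V * I, ignoring power factor for this rough sketch).
# The standby current is an assumed example value, not a measurement.

volts = 120.0                  # US mains
standby_amps = 0.0025          # assumed 2.5 mA idle draw for one wall wart
hours_per_year = 24 * 365

watts = volts * standby_amps                     # 0.30 W
kwh_per_year = watts * hours_per_year / 1000.0   # ~2.63 kWh
cost_per_year = kwh_per_year * 0.1298            # author's $/kWh rate

print(f"{watts:.2f} W -> {kwh_per_year:.2f} kWh/yr -> ${cost_per_year:.2f}/yr")
```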

Switching type power supplies use substantially less "power" than the older analog devices due to the way they operate. Most wall warts are considered switching types, which feed "digital" components.

Measuring Voltage from the 120 VAC extension cable alone is no more informative than telling me the meter on your home reads electrical usage when you are using electricity. We have no idea of the Voltage or Amperage being used by the chargers measured in this article, or the state of charge of the devices they are plugged into.

Voltage or Amperage ("electricity") cannot flow if there is no load to complete the circuit. Simply plugging a charger into an outlet does not allow for the flow of "electricity" unless the cumulative load of the LEDs alone is what is being measured. Spec'ing the LEDs would have been much more informative than the measurement techniques used by the author, which actually tell us nothing.
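
To illustrate the "no load, no flow" point, here is a trivial Ohm's-law sketch: as the load resistance grows toward an open circuit, the current drawn falls toward zero. The resistance values are arbitrary examples.

```python
# Ohm's law (I = V / R): as the load approaches an open circuit
# (resistance -> infinity), the current drawn approaches zero.
# The resistance values below are arbitrary examples.

def current_amps(volts: float, load_ohms: float) -> float:
    return volts / load_ohms

for ohms in (10.0, 1_000.0, 1_000_000.0):
    print(f"{ohms:>11,.0f} ohms -> {current_amps(120.0, ohms):.6f} A")
```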

Science is further offended by the author, but it is fair to say a scientific research article does not exist here. It is, in fact, pure BS.



The conclusion made by the author is that the measured (?) Voltage draw is so minimal that you could make much greater gains in power reduction by looking elsewhere in your home.

Indeed!

Power losses occur in all electrical circuits and devices. What goes in is never what comes out. Until you insert "device loss" into the equation, you cannot even begin to accurately predict consumption. Until you insert a time value, you cannot predict consumption at all. Many electrical devices have a large power draw at start-up and then quickly settle into a lower, more constant draw.
Yet it simply stands to reason that an HVAC system will have greater loss at any time than will a 5VDC/2.4 Amp USB type wall wart charger.
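
A quick sketch of the time-value point: consumption is power integrated over time, including any start-up surge. All the figures here are invented for illustration.

```python
# Consumption is power integrated over time. Many devices surge at
# start-up and then settle, so one instantaneous reading misleads.
# All figures below are invented for illustration.

def energy_wh(surge_w: float, surge_s: float,
              steady_w: float, run_hours: float) -> float:
    """Watt-hours for one run: brief surge phase plus steady phase."""
    return surge_w * (surge_s / 3600.0) + steady_w * run_hours

# e.g. a compressor pulling 1500 W for 3 s, then 400 W for half an hour:
print(energy_wh(1500, 3, 400, 0.5))   # ~201 Wh; the surge barely matters
```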

The point of this article should be, IMO: if I assume each of us makes an average of $20 per hour, it has just cost each of us approximately $5.45 to read this stuff. The doctors, lawyers and IT tech folks out there blow that figure well out of the water.
 
Get ready, I'm going to poo-poo this story. Sorry for being a Debbie Downer.

I understand from one person's perspective, it doesn't seem like all that much. It's only $0.34 per person per year. Not bad, right?

Assuming the author's numbers are accurate... now take that 2.628 kilowatt-hours (kWh) per year and multiply it by the approximately 275,000,000 adults (over the age of 18) in the US (source: Kids Count Data Center). (Although I should probably include kids, because they have so many electronics nowadays.) I'm assuming all adults have a vampire power demand similar to the author's.

So now we have 722,700,000 kWh being used by chargers that are not in use. Multiply that by the author's $0.1298 per kWh, and we have a cool $94 million being wasted a year. Ouch.
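
For what it's worth, the arithmetic checks out. Here is the same extrapolation as a few lines of Python, using only the figures quoted above.

```python
# Reproducing the extrapolation above, using only the figures quoted there.

kwh_per_person = 2.628       # author's annual vampire draw per person
adults = 275_000_000         # US adults (Kids Count Data Center, as cited)
rate_per_kwh = 0.1298        # author's electricity price

total_kwh = kwh_per_person * adults      # 722,700,000 kWh
total_cost = total_kwh * rate_per_kwh    # ~$93.8 million

print(f"{total_kwh:,.0f} kWh/yr wasted -> ${total_cost:,.0f}/yr")
```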

Not only is that money wasted, power is wasted, and power generation has quite the negative effect on the environment. We waste a heck of a lot more than just $0.34 a year per person. Billions of gallons of water are withdrawn EVERY DAY here in PA for power plants. Now include the rest of the US. Oof.

Not only that, other natural resources such as coal, oil, natural gas, and uranium have to be extracted and transported to the facilities. Where does all the waste go? Back into the earth. Water withdrawn can impinge/entrain aquatic communities in the intake systems. Heated wastewater has negative impacts on the aquatic community downstream of the facility.

I read a similar article from someone in California a few months ago that essentially said it was ok to leave the water running while brushing your teeth, etc., because 'so little water' is used. Yes, by one single faucet, not a lot is used. For the population as a whole, it's another story. A bad story.

I'll be the first to admit that I don't always unplug when I should, but I think articles such as this give a false impression of our impact on the environment. They pull a very tiny piece of the entire process and say it's ok. We need to look from cradle to grave... look at the process holistically.

These articles try to justify our laziness. IMO, that's not right.

End rant.
You're assuming that all those chargers are really plugged in. You have no idea how many chargers are plugged in. It's a moot point.
 
Of course a single charger doesn't waste a lot of power. You have to look at the bigger picture.
There are MILLIONS of smart devices out there. Personally, I own a smartphone, a tablet and a smartwatch, and my Nikon charger is also always plugged into the wall.
So six chargers use about 0.3 W. That means a single charger uses 0.05 W. That's not a lot, but now imagine a big city. If one million people leave their charger permanently plugged in, that's roughly 50,000 W! That is a LOT of wasted energy, and not a far-fetched scenario for a bigger city.
Not to mention that there are actually roughly 2 BILLION smartphones currently in use.
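
The per-charger figure scales straightforwardly; here is the arithmetic from the two scenarios above, plus the two-billion-smartphone case, as a quick sketch.

```python
# Scaling the ~0.05 W per-charger standby figure from the post.

standby_w = 0.3 / 6            # six chargers drew ~0.3 W total
city = 1_000_000               # chargers left plugged in, one big city
phones = 2_000_000_000         # rough worldwide smartphone count

print(f"{standby_w * city:,.0f} W for the city")        # 50,000 W (50 kW)
print(f"{standby_w * phones / 1e6:,.0f} MW worldwide")  # ~100 MW
```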
 
Instead of worrying about 34¢ of electricity being used by a plugged-in phone charger, why not offset that by installing just ONE single CFL or LED lamp in your house or office?

I think nearly all the bulbs in our house are either LED or CFL types. The few that aren't are bulb types in which LED or CFL versions were not available.
 
Let's try and remember that the 'Disagree' button is no different in concept from the 'Agree' button; it is simply an expression of concurrence or non-concurrence with a statement made by someone else. Please do not take it personally, and just as agreements often come with no reason given, none is required for a disagreement.
Where more than one point is being made, 'Agree' suggests you agree with all of them, or at least with the overall feeling of the post. 'Disagree', however, might concern just one of the points, so some clarification is helpful IMO.

One of Didereaux's two 'facts' is actually not the case here in the UK at all. Any chargers plugged in will have ~240 V applied to them, not 120 V.
The risk of fire is pretty minuscule due to circuit breakers/fuses, which are designed to cut the power in the event of a failure that could otherwise start a fire.
I rather suspect the disagree was due to this second factor, as I think Buckster is based in the US and would have 120 V mains :)
 
