There are articles I won't click on based on the title:
- it asks a yes/no question
- it has the word 'expert' in it
Guess which one this article falls into.
Yes.
Why not both?
I don't know..
You could save yourself cents per year!
They don't actually say how much power a charger uses on standby, and make an unsubstantiated claim that they "wear out" due to "voltage fluctuations". Sure thing.
Absolutely pointless article.
I would argue that you are much more likely to break the cable physically by constant unplugging and replugging.
Or wear out the wall switch.
I have never heard of someone having to replace their wall outlets due to wear and tear
Most wall outlets in a frequent-use location only last a handful of years before they start to get loose, and maybe a decade before shit just doesn't hold at all.
A plug that you plug something into once or twice ever and leave it in can basically last forever.
As someone who just moved into a 1965 house - yep, plugs absolutely wear out. These are some sloppy bois.
Around here, people usually have to replace the wiring in such old houses, since they tend to only have two wires (i.e. no PE). But the Schuko sockets themselves are most likely fine.
I've encountered a number of outlets in American airports that should be replaced due to wear. They have very little friction on the prongs after millions of uses.
Life pro tip: bend the prongs a little to give your device more grip if you encounter outlets like this.
Pro tip: bend the wall where the outlet is to assert dominance.
I have had to do that, so now you have heard of someone... My house was built in 1973, and some of the outlets in locations where I believe previous owners plugged/unplugged often have worn out, so I had to replace them (think kitchen appliances or vacuum cleaners, the same outlets I'm using all the time). There are other outlets that still work, but they don't grip plugs as well as they should anymore, and I am planning to replace them. Despite the above, the vast majority of outlets I've replaced have been perfectly fine, but with young kids around I wanted modern TR outlets anywhere the kids are likely to be playing.
You could save yourself cents per year!
That's pretty much it. Maybe even tens of cents. In the pre-USB era that actually made sense: Nokia chargers with a barrel jack (and other wall-warts of that era) consumed several watts even on idle, but (assuming good quality) modern USB bricks are way more efficient. They still consume a non-zero amount of power when plugged in, but you're not going to see that on your power bill. You'll waste far more energy if you leave your bathroom lights on overnight, even with LED bulbs.
Tbh, you are supposed to ALSO unplug EVERYTHING else you aren't using to actually start saving money.
It doesn't completely generalize that way. I have an old stereo which uses like 7W on standby. And an old pair of computer speakers which don't really care if I press the button to turn them off. I mean, that's not the power brick but the device after it, so a bit out of scope for this article. But if I weren't unplugging them... 10W of standby is 26€ a year, not just a few cents.
Key thing here being that they're old... There are limits today to how much power electrical equipment is allowed to consume in standby mode (at least in the EU): no more than 0.5W in standby, or 0.8W if it has a status display.
Yes, that's why I say it doesn't generalize. They mention this in the article. These old power bricks from the 90s with a heavy copper transformer inside waste a lot of power on standby compared to modern switch-mode power supplies. But times have changed.

On the flip side, we have a lot more electronic gadgets these days, and things do in fact add up. Say you have modern things like 5 smart lightbulbs in the house, then a network switch, an internet router and a wifi extender, plus a few USB chargers at the bedside and in the living room, a TV set with a PS4 and a soundbar plus subwoofer, a few LED strips in the gaming den... Then you might own a dishwasher and washing machine with wifi, the oven has a display, the microwave above it yet another one, the cable TV has some booster in the basement... You're likely paying more than a few cents for that. And the things that run unattended 24/7 for decades, buried somewhere, tend not to get replaced every few years, so you might still own a power brick from the 90s.

So I'd say it's worth looking into... I mean, it's not super important, you can just as well skip it and pay the amount... But it's a thing. And if you're unlike me and buy a new stereo every 10 years or so, that's also not necessarily helping the environment, and they cost money. So it's a bit complicated, a balance. At least I can somehow relate to the article, because the multi-outlet power strips behind the TV and my desk with the computer kind of look like the pictures there...
Tl;dr: Consider unplugging them, they all consume some small amount of standby power and that adds up. Also they wear out.
Though: I've never noticed any of the 24/7 devices I own wear out, I think that might be a myth?
I have a whole bunch plugged in constantly for various synth nonsense. Running off solar, I thought it would be worth seeing how much of a difference it makes turning everything off at night, and basically it wasn’t worth the effort. It’s like a percentage point on top of the things like my fridge that run constantly, and is way less than using my toaster once a week. That said, if you’re on mains, it’s probably a worthy consideration if a lot of people were to do it, but it’s also probably comparable to using ChatGPT once a day or something
Cool cool cool. I'll just continue not using chatgpt and we'll call it a wash.
Yeah, that’s pretty much my attitude. Why am I worrying about a watt here and a watt there when muppets are constantly asking LLMs inane questions and getting them to make dodgy hentai.
The article is very light on details. There are better articles with some real numbers.
Chargers for a phone draw 0.1W roughly. That's 0.9 kWh per year, and with a price of €0.35/kWh would be €0.32 / year / charger you leave plugged in. That's not even a rounding error compared to what my heat pump uses.
Devices with an indicator light barely use anything more. The ones with a display or clock do use more power, usually a few watts, which comes down to maybe €10 / device / year using napkin math.
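That napkin math can be sketched out like this (using the same assumed figures as above: 0.1 W idle draw per charger, €0.35/kWh, a few watts for a display; none of these are measurements):

```python
# Back-of-the-envelope standby cost. The 0.1 W and €0.35/kWh figures
# are the assumptions from the comment above, not measured values.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost_eur(standby_watts: float, price_per_kwh: float = 0.35) -> float:
    """Annual cost in euros of a device drawing standby_watts 24/7."""
    kwh_per_year = standby_watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * price_per_kwh

print(f"Idle phone charger (0.1 W): €{annual_cost_eur(0.1):.2f}/year")
print(f"Device with clock display (3 W): €{annual_cost_eur(3.0):.2f}/year")
```

For the 0.1 W charger that works out to roughly €0.31/year, and about €9.20/year for a 3 W display device, in line with the "not even a rounding error" and "maybe €10" estimates above.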
never noticed any of the 24/7 devices I own wear out, I think that might be a myth?
USB develops rapidly. My fleet of chargers always became outdated well before they wore out.
I have a drawer full of them that aren't useful because half an amp isn't enough to charge anything anymore.
that's because phone makers were pumping out garbage chargers with bare minimum performance for every single phone, isn't it?
I mean, transistors and ICs do degrade over time; however, out of all the power supplies I've repaired, the vast majority had dead caps, and those kinda tend to dry out with time regardless of whether they're in use. So, kinda negligible, just like the power consumption in standby.
The capacitors have a limited lifetime.
Maybe but is that lifetime limited by on/off cycles or by wall time?
Electrolytic capacitors wear out due to age whether they are used or not. Heat makes them wear out significantly faster. If the power supply is not under load, it shouldn't be producing any noticeable amount of heat.
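The heat dependence can be illustrated with the rule of thumb often quoted for electrolytic capacitors: expected life roughly doubles for every 10 °C below the rated temperature. The 2000 h @ 105 °C rating and the two operating temperatures below are illustrative assumptions, not values from any specific datasheet:

```python
# Sketch of the "10-degree rule" for electrolytic capacitor lifetime.
# All numbers here are illustrative assumptions, not datasheet values.

HOURS_PER_YEAR = 8760

def estimated_life_hours(rated_hours: float, rated_temp_c: float,
                         actual_temp_c: float) -> float:
    """Arrhenius-style approximation: life doubles per 10 degC of cooling."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A generic 2000 h @ 105 degC capacitor, idle at 35 degC vs loaded at 65 degC:
print(estimated_life_hours(2000, 105, 35) / HOURS_PER_YEAR)  # decades when idle
print(estimated_life_hours(2000, 105, 65) / HOURS_PER_YEAR)  # a few years hot
```

Under these assumptions the cool, unloaded capacitor lasts on the order of decades, while the same part running 30 °C hotter is down to a few years, which matches the point that an unloaded supply producing little heat wears out much more slowly.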
Nobody considers the risk of them going up in flames at night. They have a temp trip safety, but there is still some risk left. Especially for cheap Chinese power blocks.
Wouldn't temperature be much more of a problem while charging and not while the charger is unplugged from the device it is supposed to charge?
No, it's when not charging that you need to worry. When charging, odds are you are there, so if it starts to burn you will smell it and take action before it starts a larger fire. When you are not charging you might not even be home, and thus the fire will spread.
When things are normal the charger will not burn. When something goes wrong you have to worry. Most of the time you are not charging.
Not sure. Most of the time when I read in the newspaper that some apartment burnt down, or it happens somewhere in my vicinity, it's something that was plugged in. It's super rare that this happens with unplugged chargers. So I'm pretty sure there is some chance this happens, but it's more complicated than that. For example, during sleep the human nose seems to be on standby as well, so you might not notice if the e-bike battery or hoverboard you're charging during the night catches fire. Until it might be too late. You won't notice the unplugged charger either, but it's less likely to fail catastrophically. But you should be worried about both. Especially the hoverboard. And not being present has the downside that the fire is going to spread. But as an upside, you can't die in the fire if you're not there. But yeah, if you're sitting next to it and act quickly, you can stop a situation from escalating.
And firefighters always say people are surprised how quickly a fire goes from something small to all the furniture and plastic stuff burning, at which point it's no longer something a regular person can extinguish or contain. So you really have to be right there. One room away might not be enough.
Unplugged vs plugged in is moving the goal posts. I agree that a device that isn't plugged in is less likely to start a fire (while not impossible, it is very very unlikely), but that is a different situation.
I agree. We already moved the goalposts a few comments further up the thread... (And by my bad phrasing of "unplugged" I meant plugged-in chargers that have nothing attached.) I just wanted to say that some of the ways to mitigate the risk aren't very straightforward. What seems to help a lot is getting smoke detectors and a fire extinguisher, so you don't spend the deciding 2 minutes in the bathroom filling up a bucket. And some risk is always there; almost all of us own quite an amount of electronic devices and batteries. (And then, as people said here: don't use dubious products with fewer failsafes in their design, and entirely unplugged things (without a battery) are safe and will not cause a fire.)
some risk left. Especially for cheap Chinese power blocks.
You can drop the "cheap Chinese" part. I once tried to find some high-quality ones, and it was a super hard task. They are all the same as the cheap Chinese ones, whether it's written somewhere or not.
They're almost all Chinese, but not all cheap. You can get ones with or without protection circuitry.
In the EU they are not allowed to consume more than 0.5 watts on idle. And this regulation has been in force since 2008.
Since mostly everybody designs for that, I expect this norm also benefits other countries. So this is not really an issue, unless you are in a country without such regulation and you buy some cheap off-brand charger.
https://energy-efficient-products.ec.europa.eu/product-list/standby-networked-standby-and-mode_en
Since the standby power is so low, the wear is most likely insignificant too.
A unit idling at 0.5 watts constantly for a month consumes about 1/3 kWh, but since this regulation has been in force since 2008, I suspect idle consumption has improved further for most devices. 0.5 W is the maximum allowed value, and most would prefer to stay below that to avoid getting into trouble.
Yeah, I'm going to remove those from my walls now...
Old power supplies used transformers, which when not loaded behave like inductors, and this causes reactive current to flow. Not a problem for the last 15+ years, because everything uses switch-mode power supplies now.
I unplug the switching power supplies when I'm not using them because I don't want to listen to the RFI that they produce.
But it is particularly concerning for cheap, uncertified chargers. These often lack appropriate levels of protection and can be a fire hazard.
I mean, you could hypothetically have an unsafe charger that plugs into wall power, but I don't think that that's specific to chargers. Any electrical device that plugs into wall power could hypothetically be unsafe.
In the case of chargers, the power supply is external to the device being powered and uses a standard interface, so it's easy to examine and replace. I think that the only thing that comes close are external, semi-standardized power supplies with barrel plugs. So if you want to make sure that you have, say, all UL-marked chargers (in the US; a CE mark isn't really the same thing in the EU but is the closest analog that I'm aware of) you can do that fairly easily compared to ripping an internal power supply out of a device. But I'm not convinced that USB chargers in particular are especially problematic relative to other forms of power supply or wall-power-connected device.
The switch on my power strip broke before any of my USB chargers, from turning it on and off every day. So I stopped turning it off/on and it won't break again; I prefer wasting cents on idle power to having to buy a new power strip every 6 months. Cheaper and easier.
Has any study been done on how efficient they are as heaters? The electricity they use when idle doesn't vanish; it's given off as heat. In the winter it might be worthwhile to not bother to unplug them because what they're giving off could offset what other, more conventional, heat sources might otherwise provide. i.e. you leave a charger plugged in, and your house heating goes off half a second sooner, saving you the pennies there that the charger costs otherwise.
Admittedly, this doesn't apply to summer and hotter climates, so most people, most of the time, probably ought to be unplugging them, but there's a small percentage of cases where the reverse might actually be beneficial.
100% is the typical claimed number, as it is easy to measure watts of electricity in and find that it exactly equals watts of heat out, or if not, the difference is easily explained by measurement error. There is no hypothesis (much less theory!) of where the energy could go if it isn't heat, and conservation of energy is enough to decide it's 100% efficient.
The above isn't the whole story, though. If you could somehow measure watts at the power plant output, you would discover that 4-12% (depending on a bunch of factors) of the energy is lost before it even gets to your house, so your efficiency goes down. If you measure fuel into the power plant, efficiencies range from 10% (old systems from the 1920s, only run in emergencies) to 60% (combined-cycle power plants); I'm not sure how you would measure wind and solar. Eventually the universe will die a heat death, so 0% long term.
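Those upstream numbers combine multiplicatively. A quick sketch, using only the ranges from the comment above (plant efficiency 10-60%, grid losses 4-12%; these are the comment's rough figures, not measurements):

```python
# End-to-end "heater" efficiency of a plugged-in device, measured from
# fuel into the power plant. At the wall it's 100% (all electricity
# becomes heat); upstream losses change the picture. Ranges are the
# rough figures quoted in the comment above, not measured data.

def end_to_end_efficiency(plant_efficiency: float, grid_loss: float) -> float:
    """Fraction of fuel energy that ends up as heat in your room."""
    return plant_efficiency * (1 - grid_loss)

print(end_to_end_efficiency(0.60, 0.04))  # best case: combined-cycle, low losses
print(end_to_end_efficiency(0.10, 0.12))  # worst case: 1920s plant, high losses
```

So from the fuel's point of view the charger-as-heater ranges from roughly 58% down to about 9%, even though it is 100% efficient at the wall.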