Electrical Engineers: some wall-wart advice, please?

Grab a snack:

My wife was gifted a souped-up snow globe for Christmas. The base has a light bulb and a fan for circulating the “snow,” which is made of tiny, shiny pieces that reflect the light brightly. Sparkle, sparkle. There’s also a timer to shut the thing down and start it up twenty hours later (I don’t rely on it; it’s shut off when not in use).

The thing eats batteries, so I thought about hooking it up to a DC adapter. I found one in the Box of Old Wall-warts and checked out the specs on the label – 4.5 volts (the thing uses three AAs), 600 mA (seems ample), tip negative. I hook it up to the voltmeter to make sure it works and the meter says 6.9 volts! WTF? I graft it into the circuit for a test run and everything seems to function. My wife says it’s brighter than before and the fan runs faster, but I can’t really tell.

So, two-part question: Is the adapter damaging (or likely to damage) the thing, and what’s the deal with the labeled voltage differing from the measured one? Is this common practice in the electronics manufacturing world?

TIA

Old (transformer/rectifier/filter) wall-warts were rated for X volts at Y amps. They were unregulated, so drawing less than the rated current means the output voltage runs higher than the label.

And yes, you can burn up something if the voltage gets too high…

Sounds pretty normal for a random unregulated power supply. You could test it under a load to see what voltage is actually supplied, ripple, and so on, but for running a fan, who cares?

It’s probably just a transformer followed by a rectifier with a capacitor on it.
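A rough way to picture an unregulated supply like that: an ideal voltage source (the open-circuit voltage) in series with some internal resistance, so the output sags as the load draws more current. A back-of-envelope sketch; the resistance value here is an assumption picked so the rated load lands on the nameplate voltage, not a measurement:

```python
# Crude model of an unregulated wall-wart: an ideal source at the
# open-circuit voltage, in series with an effective internal resistance.
# R_INTERNAL is an illustrative assumption, not a measured value.

V_OPEN_CIRCUIT = 6.9   # volts, measured with no load (as in the OP)
R_INTERNAL = 4.0       # ohms, assumed so that rated load hits the label

def output_voltage(load_current_a: float) -> float:
    """Voltage at the plug for a given load current in amps."""
    return V_OPEN_CIRCUIT - load_current_a * R_INTERNAL

print(output_voltage(0.0))   # no load: the full 6.9 V
print(output_voltage(0.6))   # at the rated 600 mA: about 4.5 V, the label
```

The real sag curve isn’t perfectly linear (the rectifier and ripple complicate things), but this is the basic reason the meter reads high with nothing connected.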

Sounds like it is an unregulated supply.

Measure the voltage when it is connected to the snow globe. What is it?

If the voltage is ≥ 4.8 V when it is connected to the snow globe, you can easily reduce it to something closer to 4.5 V by inserting a series Schottky diode (around 0.3 V reduction) and/or a series silicon diode (around 0.7 V reduction). It’s not the most efficient solution from an electrical perspective, but it’s cheap & easy.
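The diode arithmetic above is just subtraction, so it’s easy to play with combinations. A quick sketch using the rough per-diode drops from that post (real drops vary a bit with current and temperature); the 6.1 V input is the loaded measurement reported later in the thread:

```python
# Estimate the voltage reaching the snow globe after inserting series
# diodes. Drop values are the rough figures quoted in the post; actual
# diodes vary with current and temperature.

SCHOTTKY_DROP = 0.3  # volts, approximate
SILICON_DROP = 0.7   # volts, approximate

def after_diodes(supply_v: float, schottky: int = 0, silicon: int = 0) -> float:
    """Supply voltage left after the given number of series diodes."""
    return supply_v - schottky * SCHOTTKY_DROP - silicon * SILICON_DROP

# With a loaded supply of 6.1 V, two silicon diodes plus one Schottky
# gets you close to the 4.5 V of three AAs:
print(after_diodes(6.1, schottky=1, silicon=2))  # about 4.4 V
```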

You folks are fast with the info. I will check the voltage under load and report back.

Thanks for the help, Dopers!

This isn’t what you asked and I’m probably stating the obvious, but another solution would be to use rechargeable batteries. No rewiring or worries about things burning up.

Most things that use an external power supply aren’t TOO picky about the voltage, especially if it also takes batteries. Just don’t connect it backwards and keep the voltage within, say, 30% of nominal if you can help it.
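That 30%-of-nominal rule of thumb is easy to check against a measurement. A tiny sketch (the 30% band is the poster’s guideline, not a spec):

```python
# Check whether a measured supply voltage falls within a tolerance band
# around nominal. The default 30% band is the rule of thumb from the
# post above, not a manufacturer spec.

def within_tolerance(measured_v: float, nominal_v: float, tol: float = 0.30) -> bool:
    return abs(measured_v - nominal_v) <= tol * nominal_v

print(within_tolerance(4.8, 4.5))  # True: well inside 30% of 4.5 V
print(within_tolerance(6.1, 4.5))  # False: roughly 36% over nominal
```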

I recommend one of these for this kind of household use.

https://www.amazon.com/3A-72W-Adjustable-Universal-Switching/dp/B0CJLGGJ1C/

The voltage output is going to measure higher than the nameplate rating across the terminals when it isn’t loaded. As suggested see what it measures under load, when running.

Also, older wall warts can be pretty energy inefficient and have some residual current drain even when everything is off. Older transformer-based wall warts can consume 1-3 W while idling (device off). Modern switching power supplies are better, typically under 0.5 W.

This isn’t a lot of power, but it’s up to you to decide if it’s worth it. A wall wart consuming 1 W, if left plugged in all year, will use about 8.8 kWh of energy. If you pay $0.20/kWh, that’s about $1.75 per year to power that wall wart.
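The arithmetic in that estimate, spelled out (the electricity rate is the post’s assumed figure; the post rounds 8.76 kWh up to 8.8):

```python
# Yearly cost of a wall wart idling at 1 W, left plugged in all year.
# The $0.20/kWh rate is the assumed figure from the post.

IDLE_WATTS = 1.0
HOURS_PER_YEAR = 24 * 365        # 8760 hours
RATE_PER_KWH = 0.20              # dollars per kWh, assumed

kwh_per_year = IDLE_WATTS * HOURS_PER_YEAR / 1000   # watt-hours -> kWh
cost_per_year = kwh_per_year * RATE_PER_KWH

print(f"{kwh_per_year:.2f} kWh -> ${cost_per_year:.2f} per year")
```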

You can tell if your wall wart is a transformer type or a switch-mode type by weight. The older transformer based ones are much heavier.

Measured the wall-wart under load and it read 6.1 volts. Way too high. Am chucking the adapter solution in favor of markn_1’s rechargeable-batteries idea and will be kicking myself the rest of the night for not thinking of it on my own (probably because of an incident many years ago with rechargeable batts).

Thanks all for your input.