Say I have a small AC adapter designed for use in North America: 120V/60Hz input, 9V output. And say I try using it at 100V/50Hz input in eastern Japan. What’s the resulting output likely to be, and is it going to go kablooey on me?
I assume that the lower input voltage will result in lower output voltage, but what’s the effect of the lower input frequency (50Hz as opposed to 60) going to be?
Actually, I would expect that a 10% change in input voltage would not make a difference in the output. If it is trying to put out 9 volts, it will still put out 9 volts. Only if it were designed to put out a fixed fraction of the input voltage (9/120 = 7.5%) would the output drop proportionally, but I would be surprised if adapters worked that way.
As to the effect of frequency on the output, I’d think that would also not make a difference. About the only way to know is to try it - if you’ve got a friend with an electronics or physics lab, you can wire it up to 100 V @ 50 Hz and see what it does. I bet it will still work just fine.
Now, if you were to plug it into some 220 V system, I’d suspect it would not work as expected. You may let the magic smoke out.
Yeah, I suspect it’ll work fine. But I still want to know, you know, even if just for the sake of knowing. Surely there must be a handy formula that I can write down and carry around in my wallet. No?
120V AC input --> stepped down to a lower AC voltage by a transformer, usually 15 to 20 VAC --> then rectified (AC converted to a pulsating DC whose peak equals the AC RMS value ÷ 0.707, in this example about 28 VDC (20 ÷ 0.707)) --> then filtered (any remaining AC components removed/suppressed, usually by capacitors) --> then regulated (pure DC voltage clamped down to the correct value for whatever device you’re powering, in this case 9 VDC).
The main job of the regulator is to keep the 9V level constant, regardless of changes in the AC input or loading conditions (how much juice you try to draw out of the thing). In this example, the AC input would have to fall to less than 80V before the regulator no longer had enough DC input to yield 9V output, so you’re probably OK.
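Those steps can be put in rough numbers. Here’s a minimal Python sketch of the chain, assuming a made-up 6:1 turns ratio (120 V down to 20 V), a 1.4 V bridge-rectifier drop, and a 2 V regulator dropout; none of these values come from any actual adapter:

```python
import math

def linear_supply_headroom(line_vac, turns_ratio=6.0, diode_drop=1.4,
                           dropout=2.0, v_out=9.0):
    """Rough headroom check for a transformer + rectifier + regulator chain.
    All component values are illustrative, not from a real adapter."""
    secondary_rms = line_vac / turns_ratio       # step-down transformer
    peak_dc = secondary_rms * math.sqrt(2)       # rectified peak = RMS / 0.707
    filtered_dc = peak_dc - diode_drop           # lose a bit in the bridge diodes
    margin = filtered_dc - (v_out + dropout)     # spare volts above the regulator's needs
    return filtered_dc, margin

for line in (120, 100):
    dc, margin = linear_supply_headroom(line)
    print(f"{line} VAC in -> {dc:.1f} VDC filtered, {margin:+.1f} V to spare")
```

With these invented numbers, even 100 VAC leaves over 10 V of margin at the regulator, which lines up with the “you’re probably OK” conclusion.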
About the line frequency thing - power transformers can operate on 50 or 60 Hz just fine, because the AC component is being filtered out anyway. However, if the device to be plugged in actually needs the line frequency for something (a clock source, perhaps), then you might notice some peculiar behaviour. Off the top of my head I can’t think of any examples of such a device.
I’m not sure that I’d call this a handy formula because there are a couple of variables involved. We don’t know exactly how much your particular transformer steps down the AC before rectification, and we also don’t know what the tolerance of the regulator is (what its minimum input voltage requirement is before it throws up its arms in disgust). Some regulators will continue to furnish a constant 9V output as long as they get at least 9V in, but some need a few volts more than what they are rated to put out. Since you’re only talking about a drop of 20 VAC from the line, I still say you’re plenty safe.
OK, so it won’t fit in my wallet. But that’s a damn cool answer anyway.
I’m still curious about the frequency thing – simply out of curiosity, not for any practical reason.
Let’s say I’ve got a simple step-down transformer (no rectifying the AC into DC, and no regulator). At 120VAC/60Hz, say it yields 10VAC with no load. Now we hook it up to a 120VAC/30Hz power source. What’s going to happen to the output voltage? Will it be affected by the input frequency?
Transformers can lose their efficiency rapidly as frequency changes. Generally, the larger the transformer is in mass, the less the variation, and vice versa.
That change of frequency you speak of is on the order of 16%, which is quite large.
When I worked repairing electronic musical equipment, we had a batch of stuff from the US that refused to work (Digitech, not exactly cheap stuff).
It turned out that the transformer ratios were correct but they were designed for 60Hz not our 50Hz and the smoothing could not cope with the ripple so all the regulators shut down.
Regulators typically need at least 2 volts across them, so to get 9 V out you need 11 V in. But that is not the whole picture: any ripple on the DC that causes that 11 V to fall will shut the regulator off. In practice, on a badly smoothed circuit you might need 15 volts or so.
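The ripple half of that is easy to estimate. For a capacitor-filtered full-wave rectifier, the peak-to-peak ripple is roughly I / (2·f·C). A sketch with made-up values (300 mA load, 1000 µF filter cap, 14 V average DC):

```python
def ripple_pp(i_load, f_line, c_farads, full_wave=True):
    """Approximate peak-to-peak ripple on a rectifier's filter cap:
    V_ripple ~= I / (n * f * C), with n = 2 for full-wave rectification."""
    n = 2 if full_wave else 1
    return i_load / (n * f_line * c_farads)

V_AVG = 14.0  # assumed average filtered DC, volts (illustrative)
for f in (60, 50):
    vr = ripple_pp(0.3, f, 1000e-6)
    print(f"{f} Hz: {vr:.2f} Vpp ripple, troughs dip to {V_AVG - vr:.2f} V")
```

With these numbers, the troughs dip half a volt lower at 50 Hz than at 60 Hz. If a 9 V regulator needs roughly 11 V at minimum, that extra half volt can be exactly what shuts it down, as in the Digitech case.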
These little power units are usually built extremely cheaply - unless it is for a laptop - and are often working near their limits. The transformers are usually very peaky; that is, the output from them falls off quite rapidly as load is increased. If you measure the output while increasing the load you would be surprised at how great a change there is. Many do not have a proper regulator in them at all; they just rely on a couple of diodes and a couple of capacitors, though there might be a zener diode to hold things together.
The change in frequency may well cause it to get hot, because its impedance will fall and thus the primary will draw more current. Again, if it is a cheapie, that may cause it to get hot enough to fail.
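To put the “impedance will fall” point in numbers: the primary’s magnetizing impedance is roughly 2πfL, so at the same line voltage the magnetizing current rises by 60/50 = 20% at 50 Hz. A sketch with a made-up primary inductance:

```python
import math

def magnetizing_current(v_rms, f_line, l_primary):
    """Magnetizing current of a transformer primary modeled as a plain
    inductor: I = V / (2*pi*f*L). The inductance below is made up."""
    return v_rms / (2 * math.pi * f_line * l_primary)

L_PRI = 10.0  # henries, illustrative
i60 = magnetizing_current(120, 60, L_PRI)
i50 = magnetizing_current(120, 50, L_PRI)
print(f"120 V @ 60 Hz: {i60*1000:.1f} mA, 120 V @ 50 Hz: {i50*1000:.1f} mA")
```

Worth noting for the OP’s case, though: core flux scales with volts per hertz, and 100 V / 50 Hz gives the same V/f ratio as 120 V / 60 Hz, so the lower Japanese voltage largely cancels the lower frequency. Running a full 120 V at 50 Hz would be the harsher case.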
The only real way to know is to plug it in and keep an eye on it, or get one designed for international standards, such as for a laptop.
Not to hijack the intent of the thread but since the topic’s got your attention…
How in the world does Targus make their universal notebook PC AC-DC adapter (expensive little devils @ $120 each) http://www.targus.com/accessories_power.asp that is, at about 8 oz, 1/2 to 1/4 the weight and bulk of conventional adapters. It does get real hot even when not being used, a lot more so than most adapters I’ve used. It’s long and flat and skinny and I assume this design is to radiate heat away efficiently.
Are they using regular ferrite core x-former technology or something different. Can’t find anything on their website about it. Any ideas?
It’s not a linear-regulated power supply, it’s a switch-mode power supply. Without going into the details, the transformer in the unit isn’t operating at 50 or 60 Hz, but at some much higher frequency (probably somewhere between 50kHz and 500kHz). This allows the iron transformer core to be much smaller, among other advantages. Special circuitry converts the incoming 50 or 60Hz AC sinewave into DC, then it’s chopped at a much higher frequency to produce the waveform going into the transformer. It’s much more complicated than a cheap wall-adapter with a 60Hz transformer.
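The core-size advantage falls out of the transformer EMF equation, V = 4.44·f·N·A·B: for a given voltage, the required core cross-section shrinks as frequency rises. A rough sketch, with invented turn counts and typical-ish flux limits (around 1.2 T for iron, 0.2 T for ferrite):

```python
def required_core_area(v_rms, freq, turns, b_max):
    """Transformer EMF equation V = 4.44*f*N*A*B, solved for the core
    cross-sectional area A (square metres). All inputs are illustrative."""
    return v_rms / (4.44 * freq * turns * b_max)

a_mains  = required_core_area(120, 60, 1000, b_max=1.2)    # iron core at 60 Hz
a_switch = required_core_area(120, 100e3, 20, b_max=0.2)   # ferrite at 100 kHz
print(f"60 Hz core: {a_mains*1e4:.2f} cm^2 vs 100 kHz core: {a_switch*1e4:.2f} cm^2")
```

Even with far fewer turns and a lower flux limit, the high-frequency core comes out several times smaller in cross-section, which is why a unit like the Targus can be so light.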
The OP’s wall adapter will probably be OK. Unless the transformer and load current are at the hairy edges of working, it should still put out 9V and not get too hot. If it doesn’t work, it’d be because there was too much ripple at 50Hz, as casdave pointed out, or because the transformer itself overheats (not as likely). It also depends on the load you apply to it - a heavy load will be less likely to work. I’ve got the equipment to test it if you want to mail it to me.
Not enough information to give a correct answer. All the previous answers are making too many assumptions. So…
Your basic, cheap, AC adapter is not regulated. Output voltage will go down proportionally with input voltage going down. Note that, being unregulated, the output voltage is only nominal and will vary with current demand.
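For the unregulated case, the estimate is simple proportionality. A one-line sketch (the 9 V / 120 V nominals are from the OP; a real cheap adapter will also sag with load, which this ignores):

```python
def unregulated_output(v_line, v_nom_line=120.0, v_nom_out=9.0):
    """Unregulated adapter: the output just tracks the line voltage.
    Ignores load-dependent sag, which is large on cheap units."""
    return v_nom_out * (v_line / v_nom_line)

print(f"{unregulated_output(100):.1f} V")  # '9 V' adapter on 100 VAC mains
```

That is, roughly 7.5 V nominal out on the OP’s 100 VAC Japanese mains.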
Depending on design, the transformer will probably heat up a bit more running at the lower frequency. If it was running too hot in the first place, this may be a problem.
A somewhat better AC adaptor may have a regulated output and this will be more stable but if the input voltage falls below a certain level, the output voltage will too.
Then you have switch-mode power supplies, which often are designed to accept input voltages from 100 to 240 volts and 50-60 Hz.
In other words, you have to tell us more precisely the AC adaptor and what you are using it for.
I checked this page and was surprised to see Japan does indeed use 100 VAC and both 50 and 60 Hz. That is one lousy system!
I have a lot of experience using devices designed for 120 VAC at 60 Hz in 220 VAC 50 Hz countries. Almost all of them worked just fine running through a transformer (even a microwave oven).
I only had problems with two things: ice cream makers and cordless phones. Computers, printers, scanners, radios, TVs, VCRs, everything else, whether it used 110 50 Hz direct from the 220/110 stepdown transformer or whether it used an AC adaptor plugged into the stepdown transformer worked just fine.
The only problem I had with appliances that ran directly off of 110VAC 50 Hz from a stepdown transformer (without an AC adaptor) were one ice cream maker that ran too slow and got hot and another one that stopped too soon.
The only problem I had with appliances that used AC adaptors plugged into a stepdown transformer producing 110 VAC at 50 Hz was with cordless (not cellular) phones that wouldn’t recharge their batteries. I had this problem with a Sony, a GE, and a PacBell. I tried using 220VAC 50 Hz adaptors that were supposed to put out the correct DC voltage at an adequate amperage. Didn’t work! Now I’m back in the States and at least two of the phones seem to be permanently dead. I don’t know what is going on, but my suggestion to you is DO NOT expect the AC adaptor for a cordless phone to recharge the phone’s battery unless it is plugged into the exact power source it was designed for.
I’d be grateful if someone could explain why the cordless phones were such a problem.
Aside from AC-DC wall power requirements a cordless phone designed for US line interface, ringing and line voltages will work fine in Japan? Had not heard this. Thought the two systems were different somehow.
I should have said that my PHONE experience is limited to Indonesia where US phones and modems work just fine. I wouldn’t be at all surprised to have problems with European phones.
I brought over a few U.S. electronics to the 100V 50Hz area, including a computer with a bunch of peripherals (some of which use AC adapters) and a stereo. Haven’t had a single problem. At first I used a step-up transformer but then I found out that a lot of cheaper PCs sold in Japan have power supplies designed and rated for 115V 60Hz. I haven’t bothered with a transformer since.
As Yeah said, some appliances that run directly off AC power have problems because some types of motors turn faster with a higher line frequency. This includes the motors used to run most timers (e.g. toaster ovens). Such devices sold in Japan sometimes have two sets of markings on the timer dial, one for 50Hz and the other for 60Hz.
>> I’d be grateful if someone could explain why the cordless phones were such a problem.
It has nothing to do with the fact it is a cordless phone. It had a cheap, unregulated, AC adapter to recharge the batteries and just lowering the voltage a bit will prevent the batteries from recharging. Also just increasing the voltage by not much will lead to overcharging them and damaging them. It is just the combination of charger/batteries. The phone part does not matter.
Sailor, your response begs the question, "Why are the AC adaptors sold with phones of such poor quality (much worse than the adaptors sold with any other product I have ever purchased)?"
I think it has more to do with the re-charging aspect than the possibility that phone DC adapters are worse than other AC-DC adapters. Batteries and battery packs are a lot more particular than simple DC motors about the range of voltage/amperage they can stand. A volt or two of difference in DC output can undercharge or overcharge and destroy a battery pack.
Well, most AC adaptors are shitty because that’s all you can afford to put on cheap stuff. Under normal conditions it does the job and you do not need more. All the adaptors I have around here (scanner, printer, speakers, cordless phone, answering machine, etc.) are of this cheap unregulated type: just a transformer, rectifier and capacitor. The transformer is tiny and designed to burn out if overloaded, so voltage regulation is not really good. This may affect some items more than others. My laptop has a nominal input voltage of 18 volts but will work fine anywhere between 10 and 20 volts (probably more, but I did not want to find out the limit). The printer also will work within a wide range of voltages, as will all the other items.
The problem is with charging batteries, where voltages are much more critical. A slightly lower voltage than required will fail to charge a battery, and a slightly greater voltage will overcharge it.
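A toy model shows why chargers are so touchy. Cheap NiCd trickle chargers often just set the current with a series resistor, I = (V_supply − V_batt)/R; all the numbers below are invented for illustration:

```python
def charge_current(v_supply, v_batt, r_series):
    """Crude series-resistor trickle charger: I = (Vs - Vb) / R.
    Returns 0 if the supply can't overcome the battery voltage."""
    return max(0.0, (v_supply - v_batt) / r_series)

# Pretend 3-cell pack at ~4.1 V while charging, 100-ohm series resistor
for v_supply in (9.0, 7.5):
    i = charge_current(v_supply, 4.1, 100.0)
    print(f"{v_supply} V supply -> {i*1000:.0f} mA into the pack")
```

Dropping the supply from 9 V to 7.5 V (a 17% sag) cuts the charge current by about 30% in this model, because only the difference above the battery voltage drives current. The same leverage works the other way for overcharging.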
An expensive item like a laptop can have a more sophisticated charging circuit but a cheap cordless phone cannot afford that.
Regarding the cordless phones, after an extended stay in bad-electricity land the batteries are fried. You can go to an electronics shop or even Rite-Aid or some other store (Wal-Mart?) and find batteries for all types of phones. They run about $10 a pop. You could, if the batteries are made of off-the-shelf ni-cad cells, disassemble the battery pack, figure out the construction, buy the individual cells needed, then reassemble. This may involve spot-welding. You can easily tell if something is off-the-shelf because it will look like a bundle of N standard batteries wrapped in shrink-wrap, or something. A slick-looking injection-molded case for your battery pack means it either conceals an inner off-the-shelf pack, or is proprietary and you must purchase one.
I’ve got a friend who can build battery packs for my phones, but the most recent time one failed I just picked one up at Rite-Aid because I couldn’t imagine asking my friend to spend half an hour futzing around with stuff to save $10.
I was feeling hunky dory until I got to Yeah’s post about ice cream makers (you brought ice cream makers overseas??) and cordless phones, and then astro’s subsequent remark:
My little AC adapter is for a cordless phone with, of course, rechargeable batteries inside the handset. And yes, it looks very much like a $2 adapter (“Made in China”), so I wouldn’t expect the regulating features to be too spectacular.
So, it sounds like it might go kablooey on me after all. The battery pack, anyway. Or did the word “destroy” above only apply to overcharging? Hmm.