Low-voltage DC power supplies -- how do they make them efficient?

It’s common to make DC power from AC by using a transformer (if a voltage change is needed) and a full-wave diode bridge, often followed by capacitance and active regulator circuits. But in silicon that bridge drops about 1.2 V, since two diodes conduct in series on every half-cycle. Even with a center-tapped transformer doing full-wave rectification through a single germanium diode at a time, there’d still be about 0.3 V. If you’re building a power supply for a low voltage, do you just accept that?

I’m thinking, for example, of chargers for individual cells in a large storage system, where you might be delivering a lot of energy at 1.5 or 2 V. There, the diode drops alone would add a loss of roughly 20% to 80% on top of the power delivered, depending on the rectifier and the output voltage.
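To put numbers on that (just the back-of-the-envelope arithmetic, using the nominal drops from above):

```python
# Rectifier drop as a fraction of delivered power, for the two
# configurations described above (nominal forward voltages).
drops = {
    "silicon full-wave bridge (two diodes conducting)": 1.2,
    "germanium center-tapped (one diode conducting)": 0.3,
}
for name, v_drop in drops.items():
    for v_out in (1.5, 2.0):
        # At the same load current, P_diode / P_load = V_drop / V_out.
        print(f"{name}: {v_drop} V at {v_out} V out -> "
              f"{v_drop / v_out:.0%} extra loss")
```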

Perhaps there are even lower voltage uses for serious DC power? Maybe, I dunno, electroplating or other electrochemical industrial processes?

Could they use huge FETs with some appropriate circuit to sense the AC polarity and drive the gates?
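To make the idea concrete, here’s the switching logic I’m imagining as a toy Python sketch (everything here is illustrative: the FET names, the threshold, and the notion of doing this in software at all; a real circuit would use comparators and dead-time logic):

```python
import math

def ideal_bridge_gates(v_ac, threshold=0.05):
    """Gate states for a four-FET bridge, given the instantaneous
    AC input voltage.  Q1/Q4 conduct on the positive half-cycle,
    Q2/Q3 on the negative one.  Real controllers add dead time
    around the zero crossing so opposite pairs never conduct at
    once (shoot-through)."""
    if v_ac > threshold:
        return {"Q1": True, "Q2": False, "Q3": False, "Q4": True}
    if v_ac < -threshold:
        return {"Q1": False, "Q2": True, "Q3": True, "Q4": False}
    return {"Q1": False, "Q2": False, "Q3": False, "Q4": False}

# One simulated mains cycle: the conducting pair swaps at each zero
# crossing, so the diode drops are replaced by I^2 * Rds(on) losses.
for i in range(8):
    v = 10 * math.sin(2 * math.pi * i / 8)
    on = [q for q, state in ideal_bridge_gates(v).items() if state]
    print(f"v_ac = {v:+6.2f} V -> on: {on or ['none']}")
```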

Jeez, I should have thought of ways of finding this before I posted…

Almost everything uses switched-mode power supplies now, and the rectifier is usually upstream of the switching supply, where the bus voltage is high. So a 1.2 V diode drop is only about a 1% inefficiency, regardless of output voltage.
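To see why, run the same kind of arithmetic at the rectified mains bus instead of at the output (rough numbers, assuming nominal mains and ignoring ripple and loading):

```python
import math

# Upstream of a switch-mode converter, the bridge rectifies mains,
# so its 1.2 V drop is measured against the high-voltage DC bus.
for v_mains_rms in (120.0, 230.0):
    v_bus = v_mains_rms * math.sqrt(2)  # approximate bus peak voltage
    print(f"{v_mains_rms:.0f} V mains -> ~{v_bus:.0f} V bus: "
          f"{1.2 / v_bus:.2%} lost in the bridge")
```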