EDFA is the acronym for Erbium-Doped Fiber Amplifier. Erbium is an interesting element: when it's pumped with light at 980 nm, it emits photons at around 1550 nm, which falls in the band where multi-channel systems operate.
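As a back-of-the-envelope check (the constants and function name here are mine, not from the post), you can verify that each 980 nm pump photon carries more energy than the 1550 nm signal photon it helps produce - the excess energy is shed inside the erbium ion:

```python
H = 6.626e-34  # Planck constant, J*s
C = 2.998e8    # speed of light, m/s

def photon_energy_ev(wavelength_nm):
    """Photon energy in electron-volts for a given wavelength in nm."""
    return H * C / (wavelength_nm * 1e-9) / 1.602e-19

pump = photon_energy_ev(980)     # roughly 1.27 eV
signal = photon_energy_ev(1550)  # roughly 0.80 eV
assert pump > signal  # the pump photon has energy to spare
```

Shorter wavelength means higher photon energy, which is why the pump sits at 980 nm rather than somewhere above 1550 nm.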
Optical amplifiers of the EDFA type work through stimulated emission - excited erbium ions, struck by passing signal photons, release additional photons at the same wavelength. (Stimulated Raman Scattering, a fiber nonlinearity in which high-energy channels pump power into low-energy channels, is the basis of a different amplifier class - the Raman amplifier.) The optical amplifier harnesses this phenomenon to amplify optical signals that have weakened over distance. They're really quite simple and reliable devices; a great improvement over the regenerator. An EDFA consists merely of: the input fiber, a pair of optical isolators, an erbium-doped coil of fiber, a pump laser operating at 980 nm, and the output fiber. Thus the optical signal is amplified directly, whereas regenerators required a costly optical-electrical-optical (O-E-O) signal conversion process.
So, when the weak transmitted signal reaches the doped coil, the erbium atoms, now excited by energy from the pump laser, bleed power into the weak signal at precisely the desired wavelength - 1550 nm - amplifying the weakened input. The optical isolators are there simply to prevent undesirable light from back-scattering into the optical conductor, which would create noise.
Interestingly, EDFAs amplify in exactly the same fashion as an electrical amplifier - they will amplify any noise transmitted along with the signal. So there's still a need for periodic regeneration in long-haul systems. Regeneration, through its O-E-O conversion, can filter out unwanted noise.
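To see why accumulated amplifier noise eventually forces regeneration, here's a hedged sketch using the standard rule-of-thumb OSNR formula for a chain of identical amplified spans (the launch power, span loss, and noise-figure values are illustrative assumptions, not from the post):

```python
import math

def osnr_after_spans(n_spans, p_launch_dbm=0.0, span_loss_db=20.0, nf_db=5.0):
    """Approximate OSNR in dB (0.1 nm reference bandwidth) after n_spans,
    where each EDFA exactly makes up the span loss but adds ASE noise."""
    # Rule of thumb: OSNR ~ 58 + P_launch - span_loss - NF - 10*log10(N)
    return (58.0 + p_launch_dbm - span_loss_db - nf_db
            - 10 * math.log10(n_spans))

# Every tenfold increase in span count costs another 10 dB of OSNR,
# so at some point the signal must be regenerated (O-E-O) to clean it up.
print(osnr_after_spans(1))    # single amplified span
print(osnr_after_spans(100))  # long-haul chain
```

The 10*log10(N) term is the whole story: amplification preserves noise, so the noise contributions of N amplifiers simply add.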
Fiber losses - the numbers given so far in this thread don't even approach reality. When I'm designing fiber networks (for HFC or FTTx), I use a general loss figure of 0.35 dB/km at 1310 nm and 0.25 dB/km at 1550 nm. It's not uncommon in the HFC architecture to transmit an optical signal 20 miles from the headend to the optical node - and we could probably go about twice as far if not for the losses inherent in the splices required on a run of that length. I figure on about 1 splice per mile and 0.15 dB loss per splice.
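Those figures plug straight into a span-loss estimate. A minimal sketch using the post's numbers - 0.25 dB/km at 1550 nm, one splice per mile, 0.15 dB per splice (the function name and structure are mine):

```python
KM_PER_MILE = 1.609

def span_loss_db(miles, fiber_db_per_km=0.25, splice_db=0.15,
                 splices_per_mile=1.0):
    """Total optical loss over a span: fiber attenuation plus splice losses."""
    fiber_loss_db = miles * KM_PER_MILE * fiber_db_per_km
    splice_loss_db = miles * splices_per_mile * splice_db
    return fiber_loss_db + splice_loss_db

# The 20-mile headend-to-node run: ~8 dB of fiber loss plus 3 dB of splices.
print(round(span_loss_db(20), 1))
```

Note that at 20 miles the splices already contribute over a quarter of the total loss, which is why splice loss - not fiber attenuation - is what caps the reach of a long run.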
Transatlantic cables - in the early 1990s, when the 1550 nm window became available, TAT-9 and TAT-10 (TAT being the designation for Transatlantic optical cables) were laid. TAT-9 and TAT-10 provided 565 Mbps of bandwidth each, with repeaters spaced at 60 miles. We've come a long way even since then: TAT-12 and TAT-13 have been laid between the USA, England, and France, providing 10 Gbps of bandwidth and employing EDFA technology. Although I'm not sure what the EDFA spacing worked out to be, my assumption would be that the 60 miles still holds true.
Armoring transatlantic cables - those cables near shore, where they're more likely to be struck by human activity, are fully armored. Cables laid farther out - more than 150 miles from shore - are typically not armored.