fiber optic transmission distance

I took electronics (many, many moons ago) and dabbled in fiber optics. As I recall, the transmission distance was limited by the strength of the light source and the transmission medium. I thought that in most cases this limitation was only several hundred feet (at which point there needed to be a repeater/magnifier). Talking with a friend today, he brought up fiber optics being used trans-Atlantic. How is this done? Please dumb down the answer as the brain ain’t what it used to be.

Thanks.

Robert

From Wikipedia.

Alternatively, you get what you pay for.

That 300-500 meter number is low, or old. 10 Gb Ethernet LR optics can generally reach 10 km on single-mode fiber; ER optics are good for up to 40 km.

-lv

And now that I read the cite properly, I feel like a dummy. The 500 is on multi-mode, not single-mode.

-lv

Okay, so regeneration must occur many times under the ocean? Am I reading that right? So that means they must run electricity out there and strategically place amplifiers?

Yep.

Although I’m not sure whether they’re amplifiers or regenerators, or a combination of the two. A regenerator recovers the digital signal, then transmits it again down the line. That was all that was available several years ago. But then they developed EDFAs, devices that amplify the signal right in the fiber, so it doesn’t have to be regenerated.

It’s amazing, if you think about it, that the signal goes through 50 or so miles of glass and can still be detected at the other side. That’s some pretty pure glass.

The attenuation of the commonly-used single mode optical fiber (SMF-28) is typically 0.25 dB/km and frequently lower. Using the typical value, half the input light would come out of a fiber 12 km long. (That’s 7.46 miles.)
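If you want to play with those numbers yourself, here’s a quick back-of-the-envelope sketch in Python (just the dB arithmetic, using the 0.25 dB/km figure above):

```python
# Back-of-the-envelope fiber attenuation math, using the 0.25 dB/km figure above.
ATTEN_DB_PER_KM = 0.25

def remaining_fraction(length_km):
    """Fraction of the input optical power left after length_km of fiber."""
    return 10 ** (-(ATTEN_DB_PER_KM * length_km) / 10)

def distance_for_loss(loss_db):
    """Fiber length (km) that produces the given total loss in dB."""
    return loss_db / ATTEN_DB_PER_KM

print(remaining_fraction(12))    # ~0.50: half the light is left after 12 km
print(distance_for_loss(3.0))    # 12.0 km (3 dB is a factor of two)
print(remaining_fraction(80))    # ~0.01: why 80 km links need sensitive receivers
```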

You can get 10 GbE equipment today which can support 80 km transmission distance over single mode fiber. I have seen equipment which specifies 2 km for 1 GbE over multimode fiber, but this is not very common. Equipment specified to 800 m or 1 km is easily available.

That’s amazing. So how is it maintained? Have they perfected the technology to the point that it needs little maintenance, or are there folks in submarines fixing broken amplifiers and stuff? How big in diameter is the cable? I’m imagining it’s probably thoroughly protected so as not to be easily damaged by whatever might go on on the ocean floor - so does anyone know how exactly they protect it?

Enquiring minds need to know. :slight_smile:

There’s a huge (56 pages) but fascinating article on transatlantic cables in Wired from 1996. Worth reading if you’re into this sort of stuff.

It goes into some fair detail (for a popular magazine, rather than a tech journal) on the cables’ construction and how they’re laid and repaired.

The cable is protected with several layers of steel wire and then a waterproof coating. In shallow areas, especially where there are trawlers, the cable is actually buried in a trench. The regenerators themselves are very reliable, with built-in redundancy.

This is nothing new. When the first telephone cables were laid in the ’50s, they used valves (thermionic tubes) in the amplifiers. Everything was designed for long life. The tubes chosen were of a very old design but had been under life test at the Bell Telephone Labs since the ’30s. The heaters in the tubes were run at half power, and there were parallel circuits so that if one failed the other parts kept on running. The manufacturers of the equipment had it written into their contract that, if there was a complete failure of the kit within the first 15 years, they had to foot the bill for recovering the cable and amplifier, repairing it, and re-installing it. This was a great incentive to get it right the first time.

EDFA is the acronym for Erbium-Doped Fiber Amplifier. Erbium is an interesting element. When it’s struck with light at 980 nm, it emits photons at around 1550 nm, which is one of the wavelengths on which multi-channel systems operate.

These optical amplifiers work through stimulated emission: the pump laser drives the erbium ions into an excited state, and incoming signal photons stimulate those ions to release their stored energy as identical photons, amplifying the signal right in the fiber. They’re really quite simple and reliable devices; a great improvement over the regenerator. An optical amplifier consists merely of the input fiber, a pair of optical isolators, an erbium-doped coil of fiber, a pump laser operating at 980 nm, and the output fiber. Thus the optical signal is amplified directly with these devices, whereas regenerators required a costly optical-electrical-optical signal conversion process.

So, when the weak transmitted signal reaches the doped coil, the erbium atoms, now excited by the energy from the pump laser, bleed power into the weak signal at precisely the desired wavelength - 1550 nm - causing amplification of the weakened 1550 nm input. The optical isolators are there simply to prevent undesirable light from “back-scattering” into the optical conductor, which would create noise.
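A quick sanity check on the physics (standard constants, nothing EDFA-specific): each 980 nm pump photon carries more energy than a 1550 nm signal photon, which is why the excited erbium ions can hand energy down to the signal wavelength:

```python
# Photon energies at the two wavelengths (E = h*c / wavelength).
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s

for wavelength_nm in (980, 1550):
    energy = H * C / (wavelength_nm * 1e-9)
    print(f"{wavelength_nm} nm photon: {energy:.2e} J")

# 980 nm:  ~2.03e-19 J  (pump photon)
# 1550 nm: ~1.28e-19 J  (signal photon) -- the pump photon has energy to spare
```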

Interestingly, EDFAs amplify in exactly the same fashion as an electrical amplifier - they will amplify any noise transmitted along with the signal. So there’s still a need for periodic regeneration in long-haul systems. Regeneration, through the O-E-O conversion, has the capability of filtering out unwanted noise.
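Here’s a toy numeric sketch of that noise build-up - the span loss, gain, and noise numbers are invented purely for illustration, not real EDFA specs:

```python
import math

# Toy chain of fiber spans and EDFAs; all numbers made up for illustration.
SPAN_LOSS = 10 ** (-15 / 10)   # each span attenuates by 15 dB
GAIN      = 10 ** (15 / 10)    # each EDFA exactly makes up the span loss
ASE_NOISE = 1e-4               # noise power each amplifier adds (arbitrary units)

signal, noise = 1.0, 0.0
for amp in range(1, 11):
    signal = signal * SPAN_LOSS * GAIN               # signal fully restored each span
    noise  = noise * SPAN_LOSS * GAIN + ASE_NOISE    # old noise re-amplified, plus fresh noise
    print(f"after amp {amp:2d}: SNR = {10 * math.log10(signal / noise):.1f} dB")

# SNR drops steadily; a regenerator at some point would re-detect the bits
# and reset the noise to ~0 -- that's the O-E-O step described above.
```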

Fiber losses - the numbers given so far in this thread don’t even approach reality. When I’m designing fiber networks (for HFC or FTTx networks), I use a general loss figure of 0.35 dB/km at 1310 nm, and 0.25 dB/km at 1550 nm. It’s not uncommon in the HFC architecture to transmit an optical signal 20 miles from the headend to the optical node - and we could probably go about twice as far if not for the losses inherent in the splices required on a run of that length. I figure on about one splice per mile and 0.15 dB loss per splice.
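For anyone who wants to run their own loss budget, here’s a little helper built on exactly those figures (the constant and function names are my own invention, not from any standard tool):

```python
# Simple loss-budget helper using the figures quoted above.
FIBER_LOSS_DB_PER_KM = {1310: 0.35, 1550: 0.25}
SPLICE_LOSS_DB = 0.15
SPLICES_PER_MILE = 1.0
KM_PER_MILE = 1.609

def link_loss_db(miles, wavelength_nm=1550):
    fiber   = FIBER_LOSS_DB_PER_KM[wavelength_nm] * miles * KM_PER_MILE
    splices = SPLICE_LOSS_DB * SPLICES_PER_MILE * miles
    return fiber + splices

print(f"{link_loss_db(20):.1f} dB")        # ~11.0 dB for a 20-mile run at 1550 nm
print(f"{link_loss_db(20, 1310):.1f} dB")  # ~14.3 dB at 1310 nm
```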

Transatlantic cables - in the early 1990s, when the 1550 nm channel became available, TAT-9 & TAT-10 (TAT being the designation for transatlantic cables) were laid. TAT-9 & TAT-10 provided 565 Mbps of bandwidth each, and the repeaters were spaced at 60 miles. Even since then we’ve come a long way. TAT-12 & TAT-13 have been laid between the USA, England & France, providing 10 Gbps of bandwidth and employing EDFA technology. Although I’m not sure what the EDFA spacing worked out to be, my assumption would be that the 60 miles still holds true.

Armoring transatlantic cables - those cables near shore, where they’re more likely to be struck by human activities, are fully armored. Cables laid further out - more than 150 miles from shore - are typically not armored.

To add insult to injury, I spoke with our electrical engineer this morning. When asked to explain the difference between regeneration (as it applies to fiber optics) and amplification, he said that regeneration is basically reconstituting the signal (which can be achieved by simply running the signal through flip-flops). This, he further explained, gives the signal nice sharp edges again (as well as cleaning up the timing). So, again, I’m reading into this, but dull edges imply a bad slew rate (to me). How can attenuation of the signal result in a slew rate change? Light on, light off. How can there be a warm-up period? Please help.

Signed the criminally insane.

It’s a question of the signal-to-noise ratio. With regeneration the signal is taken in, converted to electrical signals, and then re-transmitted optically. That means the signal is (ideally) recovered 100%. Some setups also use forward error correction schemes that can fix lots of misread bits. The newly transmitted signal has next to no noise in it.
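To make the FEC idea concrete, here’s the most trivial possible example - a 3x repetition code with majority voting (real systems use far stronger codes, but the principle is the same):

```python
# Trivial 3x repetition code with majority voting -- the simplest possible FEC.
def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]   # send every bit 3 times

def decode(received):
    return [1 if sum(received[i:i + 3]) >= 2 else 0       # majority vote per group
            for i in range(0, len(received), 3)]

sent = encode([1, 0, 1])   # -> [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[3] = 1                # one bit misread in transit
print(decode(sent))        # -> [1, 0, 1]: the error is corrected
```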

Now optical amps add energy to everything - the ones, the zeros, and any noise riding along. Eventually you wind up with a smear of energy arriving at the receiver, which makes identifying the pulse edges (for pulse detection and clock recovery) extremely hard to do.
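A tiny simulation of that difference (made-up signal levels and noise, just to show the shape of the problem): an amplifier scales the noisy waveform as-is, while a regenerator re-decides each bit and retransmits it clean:

```python
import random

# Made-up example: 1-bits arrive around 0.2, 0-bits around 0.0, plus noise.
bits = [1, 0, 1, 1, 0, 0, 1, 0]
received = [0.2 * b + random.gauss(0, 0.02) for b in bits]

amplified   = [5 * x for x in received]                # amp: noise scaled up with the signal
regenerated = [1 if x > 0.1 else 0 for x in received]  # regen: re-decide each bit cleanly

print(amplified)     # still fuzzy values scattered around 0 and 1
print(regenerated)   # the original bits back, assuming the noise stayed below threshold
```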