Light and sound can both reflect off objects. When sound reflections last long enough, they’re perceived as reverberation, right? So what would be the equivalent(s) of reverberation when it comes to visible light, infrared or radar, either when a computer or brain is processing the data?
Is there something analogous to reverberation when people see an afterimage after looking at a bright light? Or when people get visual trails while high? I.e.: The information/electrochemical flow is reverberating through their visual cortex?
Are you asking about something that happens to waves, or something that happens to brains?
Reverb is basically a lot of echoes that are so densely packed in time that humans hear them as a kind of smear.
There are light echoes. Light echo - Wikipedia
I can’t really say as regards vision. There are different types of afterimage: exhaustion of the rods/cones in the eye, versus changes in the feedforward and feedback pathways in visual cortex, drug-induced synaesthesia, and exhaustion of neurons due to overstimulation for significant periods of time (resulting in trails).
In wireless telecommunications, multiple echoes are commonplace and problematic. The technical term is “multipath”: signals arrive at the receiver after following multiple paths, with different delays. The echoes interfere with each other, causing intersymbol interference (overlapping digital symbols corrupt each other), fluctuations in the signal level, and other problems. Complex modulation formats like OFDM (orthogonal frequency-division multiplexing) are used to try to overcome these problems. Those of you who remember broadcast TV will recall ghosting: you’d see slightly offset ghosts of the image, caused by reflections of the TV signal.
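Here’s a toy sketch of the multipath problem in Python. The symbol values, delay and attenuation are made-up numbers, just to show how a delayed echo smears each transmitted symbol into its neighbours:

```python
# Two-path channel: the receiver sees the direct signal plus a delayed,
# attenuated echo of it. The echo smears each symbol into the next one
# (intersymbol interference).
symbols = [1.0, -1.0, 1.0, 1.0, -1.0]   # transmitted symbol levels
delay = 1          # echo arrives one symbol period late
attenuation = 0.5  # echo is weaker than the direct path

received = []
for i in range(len(symbols)):
    direct = symbols[i]
    echo = attenuation * symbols[i - delay] if i >= delay else 0.0
    received.append(direct + echo)

print(received)  # -> [1.0, -0.5, 0.5, 1.5, -0.5]
```

Every received value after the first is corrupted by the tail of the previous symbol, which is exactly what made those TV ghosts.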
Another example is laser speckle, caused by the interference of multiple paths of light from a highly coherent light source (e.g. a laser).
I’m wondering how much some elements related to reflections, information systems and the mind may be the same underlying phenomenon.
That’s the one that seems to lend itself to the concepts of reflection or echo. For example, right now, I just looked out the window on a bright day for a few seconds, then I closed my eyes. I could still perceive a big bright rectangle in the darkness behind my eyelids. Is that the equivalent of a ghost image slowly fading away as it echoes through the feedforward/feedback pathways of the cortex?
I realize this is difficult but could you explain those solutions to the multipath problem like I’m a 10 year-old?
It took me some time to get it. Do I understand correctly that the reason for the speckle is constructive/destructive interference based on phase? I.e.: A laser beam is made up of many directionally aligned particle paths. When the beam hits something, those paths can vary in length relative to each other which changes how well the phases of the pathways line up. That’s what creates the constant variation (speckle) in amplitude all over the laser? I’m sure science teachers are pulling their hair out at how I phrased that but have I got it right?
How about when the original wave reflects from a surface and interacts with itself, creating interference patterns? This only happens when the path-length difference is less than the coherence length of the light, but it always seemed like a “ringing” phenomenon to me.
In optics, one example of this is what’s called Lloyd’s mirror. For natural light, the coherence length is only a few wavelengths, but lasers have much longer coherence lengths, and you can get lots of interference maxima.
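A quick numerical sketch of the Lloyd’s-mirror idea: the direct ray interferes with its reflection, and the reflection picks up a π phase flip at the mirror, so the fringe right at the mirror surface is dark. The wavelength and path differences below are illustrative numbers, not a real geometry:

```python
import math

# Two equal-amplitude waves interfering: intensity = 4 * cos^2(phase/2),
# in units where one beam alone has intensity 1. The reflected path
# contributes an extra pi phase shift (the phase flip on reflection).
wavelength = 500e-9  # 500 nm, green light (illustrative)

def fringe_intensity(path_difference):
    phase = 2 * math.pi * path_difference / wavelength + math.pi
    return 4 * math.cos(phase / 2) ** 2

print(fringe_intensity(0.0))             # ~0: dark fringe at the mirror
print(fringe_intensity(wavelength / 2))  # ~4: bright fringe
```

The interesting wrinkle, as noted above, is that this pattern only persists out to path differences shorter than the coherence length.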
You can get interference between a particle and its “reflection” from a “square” boundary. There’s a nice example of this in Schiff’s Quantum Electronics, but I can’t find it online. Basically the same thing as the light interfering with itself, only you’re looking at a wave packet of finite extent interfering with itself.
Different phenomena. The ghost image on a TV is caused by echoing signals reaching a receiver; the actual physical signal is repeated. What you see when you look away from a bright spot is an after-image caused by fatigue of the photoreceptor cells in your retina. There is no signal. It’s analogous to what happens if you turn off a CRT TV: the image continues to glow for a second or two because the phosphor coating on the tube is still bleeding off energy.
If a signal goes straight to a TV (we’re talking about old analog TVs here), the TV shows a nice, clear picture. If that signal hits a reflective surface, the reflection arrives at the TV a hair later than the original. The TV doesn’t realize it’s two copies of the same signal, it just sees one combined signal, and so the result is a visual echo on the screen. Think of standing in a room with a single light bulb. You see a shadow from that light. Now put a mirror in the room. The light from the bulb is also going to bounce off the mirror, and you will see two shadows: one caused by the bulb, and one caused by the image of the bulb that is reflected in the mirror. That extra shadow is like the ghost image on a TV.
In general, echo cancellation is a hard problem, but there are multiple ways to solve it. I’d need more time and space and a few diagrams to explain it to a 10-year-old. Modern-day speakerphones have built-in echo cancellation that works by listening to the first few seconds of speech, detecting the echoes, then creating a digital filter that cancels the echoes. OFDM, which I mentioned in my post, is a complicated way of encoding information. It effectively breaks the communication channel into, say, 1024 channels separated in the frequency domain. Each of those channels supports a bit rate that is 1024 times lower. At that data rate, the transmitted symbols last so long that short delayed echoes are unlikely to overlap substantially with the neighbouring symbols. It’s all done by sophisticated signal processing that was totally impractical, blue-sky stuff when it was invented many decades ago, but is now ubiquitous. It’s amazing what you can do when a million transistors cost pennies.
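The echo-canceller idea can be sketched in a few lines. This is a toy version: it assumes we already know the echo’s delay and gain (a real speakerphone has to estimate those adaptively from the first seconds of speech), and the signal values are made up:

```python
# Toy echo canceller. The channel adds a delayed, attenuated echo:
#   received[i] = clean[i] + gain * clean[i - delay]
# Knowing delay and gain, we can peel the echo back off recursively,
# using the samples we've already cleaned up.
delay, gain = 2, 0.6
clean = [1.0, 0.0, -1.0, 1.0, 1.0, -1.0]

# Channel: add the echo.
received = [clean[i] + (gain * clean[i - delay] if i >= delay else 0.0)
            for i in range(len(clean))]

# Canceller: rebuild the clean signal sample by sample.
recovered = []
for i, r in enumerate(received):
    echo = gain * recovered[i - delay] if i >= delay else 0.0
    recovered.append(r - echo)

print(recovered)  # matches the original clean signal (to rounding error)
```

The hard part in real life is the adaptive estimation of the delays and gains, not the subtraction itself.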
That sounds like leading edge tracking, an electronic warfare technique: Imgur: The magic of the Internet
When a DRFM jammer sends a modified copy of the victim radar’s signal, that modified copy cannot help but arrive at least a little late. So you can distinguish and resolve the two by taking just the very earliest segment of a return.
Aside from increased computational requirement, are there downsides to using that technique?
Is the first part a Fourier transform? The latter part, I’m not sure I’m grokking. You split the signal according to frequency (so far, I’m following) which has the effect of slowing down time or doing the temporal equivalent of zooming in 1024 times. The short delay between the initial signal and its multipath echo is now 1024 times longer which leaves it out of the processor’s temporal resolution cell. Is that about right?
It’s counter-intuitive because I would have expected that a higher bit rate is always better.
Presumably, the computer keeps a record of recently processed signals to summarily reject any signal it recognizes as a delayed echo? If not, how does the processor deal with the fact that there’s still a delayed echo coming that it should ignore?
What forms of signal processing are currently impractical, blue-sky stuff for lack of sufficient processing power and data bandwidth?
Yes, in its simplest form, the digital bit stream is Fourier-transformed before transmission. The receiver does an inverse Fourier transform to reconstruct the signal. But in reality it is much more complicated than that. For example, multipath and other problems will cause a variation in signal-to-noise ratio among the 1024 (for example) orthogonal channels. Thus, it makes sense to transmit a higher bit rate on the good channels and a lower bit rate on the noisy ones. This is just scratching the surface of the complexity of modern wireless systems, enabled by our amazing digital integrated-circuit technology.
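The transform round trip can be shown with a minimal sketch. Real systems use an FFT (plus tricks like a cyclic prefix); this naive DFT with made-up data just shows how each data value rides on its own subcarrier and comes back out at the receiver:

```python
import cmath

# Minimal OFDM round trip: put one data value on each of N subcarriers,
# inverse-transform to get the time-domain waveform, then transform at
# the receiver to split it back into subcarriers.
N = 8
data = [1, -1, 1, 1, -1, 1, -1, -1]  # one symbol per subcarrier

def idft(freq_bins):
    # Naive inverse DFT (real systems use an IFFT).
    return [sum(x * cmath.exp(2j * cmath.pi * k * n / N)
                for k, x in enumerate(freq_bins)) / N
            for n in range(N)]

def dft(samples):
    # Naive forward DFT: the receiver's side of the round trip.
    return [sum(x * cmath.exp(-2j * cmath.pi * k * n / N)
                for n, x in enumerate(samples))
            for k in range(N)]

waveform = idft(data)      # transmitted time-domain signal
recovered = dft(waveform)  # receiver recovers the subcarrier values
print([round(x.real) for x in recovered])  # -> [1, -1, 1, 1, -1, 1, -1, -1]
```

With a real channel in between, each subcarrier would be scaled by its own gain, which is where the per-channel bit-rate adaptation mentioned above comes in.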
There are two subjects here.
One is how to send and receive information using vibrations in some medium like light, radio, air or water. If it can be sent and received, it is a signal, and we can encode some meaning into it. Our brains are conditioned towards processing the data from our senses, and that is generally short range: we see light and hear sound, and our brains process them to tell us what is going on in our environment. One part of that is communicating with fellow human beings, and for that we started with sound and developed language. Later this was augmented by visual symbols and written language. In human affairs, there is much to be gained by sending messages over distance without messengers carrying written notes. This becomes especially useful when large numbers of men are deployed to fight a war over some territory. Messages from generals and admirals to their troops and ships are crucial to success. Bankers and news reporters also find it useful.
Consequently there has always been a demand for a technology for sending messages by signalling. Flags, semaphores and flashing lights were used. But the great advance came with electricity. Turning the electricity on and off on a copper wire in code is telegraphy. Later, when radio was invented, it was the same over even greater distances.
The capacity to send messages is limited by the technology to change something at one end that can be detected at the other end. The telephone and radio were intended to replicate the human use of sound by converting it from sound to electrical signals, transmitting it over a long distance, then converting it back into sound. This was the world of analogue communication, where the signal varies continuously in the same way speech does, but is converted to electrical signals or radio waves and converted back at the receiver.
Now there were problems with this. Copper wires can pick up electrical interference, or lose signal through bad connections or water. Radio also has a whole host of problems: reflections, interference, and loss of signal due to atmospheric effects like electrical storms. It is analogous to the problems humans can have making out a voice in a crowded room full of chatter, or in an echoing chamber. Light reflections on a summer’s day, when there are shiny windows about, can make sight difficult. Similar problems happen with wires and radio.
Technology to cut out interference took a great step forward when the silicon chip came along, and though microprocessors get all the attention, it is signal processors that made the Internet possible.
When electricity flows down a wire, you can change three characteristics of the electrical signal. If the signal is a wave, you can vary its amplitude (loudness), its frequency (pitch) and its phase, such that the level changes can be detected by a receiver. The same applies to radio signals. Signalling schemes are generally known as modulations.
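The phase knob is perhaps the least intuitive of the three, so here’s a small sketch of one phase-modulation scheme, QPSK, where each pair of bits picks one of four carrier phases. The bit-to-phase mapping here is one conventional choice, for illustration:

```python
import cmath

# QPSK sketch: two bits select one of four phases (45, 135, 225, 315
# degrees). Amplitude and frequency modulation vary the other two
# characteristics in the same spirit.
def qpsk_modulate(bits):
    symbols = []
    for i in range(0, len(bits), 2):
        index = bits[i] * 2 + bits[i + 1]          # 0..3 from the bit pair
        phase = cmath.pi / 4 + index * cmath.pi / 2
        symbols.append(cmath.exp(1j * phase))       # unit-amplitude carrier
    return symbols

def qpsk_demodulate(symbols):
    bits = []
    for s in symbols:
        # Recover the bit pair from the symbol's quadrant.
        angle = cmath.phase(s) % (2 * cmath.pi)
        index = round((angle - cmath.pi / 4) / (cmath.pi / 2)) % 4
        bits += [index // 2, index % 2]
    return bits

message = [0, 1, 1, 1, 0, 0, 1, 0]
print(qpsk_demodulate(qpsk_modulate(message)))  # -> [0, 1, 1, 1, 0, 0, 1, 0]
```

Real receivers have the much harder job of deciding the quadrant in the presence of noise, echoes and a drifting carrier reference.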
Now, with a signal-processing chip at each end, it is possible to implement many kinds of modulation and coding schemes for the data sent. Encoding involves turning audio or video into a digital format. You can either save this as a file or build the encoding process into a phone, TV or broadband router. The encoding standard can compress the data and error-check that the numbers transmitted are free from errors arising from the usual interference, reflections and so on. This is digital data communication. The processing capabilities of these chips are remarkable and have enabled the manic multimedia world we live in today. Sometimes these digital schemes struggle: some TV broadcasters try to squeeze too many channels into the available frequency space, and you can see this when your TV display starts breaking into blocks. But it is somewhat better than the old days of analogue TV modulation, with all its static and ghosting.
Your broadband router is trying to squeeze the maximum throughput from the copper wire or cable to your service provider. They do a magnificent job dealing with all the problems. Fibre-optic cables are much less prone to interference, and the range of frequencies light can carry is very large, much more than copper. Once this becomes the standard, all those fancy 8K TVs will make sense: they will get the signal they need. None of that VR-goggles nonsense. You will have a room like a Holodeck on Star Trek, and those plots in the Black Mirror TV series become a bit more plausible. Solving the problem of multimedia communication over distance will, no doubt, open a Pandora’s box. :dubious: