With an ear on each side of your head it’s easy to tell whether sound is coming from your left or your right, but how does the brain work out whether the sound is in front of you or behind you? It seems to me that a sound coming from “11 o’clock” should reach both ears proportionally the same as a sound coming from “7 o’clock” - the relative angles are the same.
And yet you can always tell whether a sound is coming from in front of you or behind you. How does that work? Do you unconsciously turn your head slightly to get more 3-D information? I’ve tried keeping my head totally still and listening to background sounds and I can still tell where they’re coming from.
I don’t know a lot of detail about how the brain interprets and locates sound but your ears are not symmetric front-to-back. The tonal qualities of sound arriving from behind you will be different from the same sound in front of you, due to the shape of your ears. You also unconsciously use cues from how sounds are reflected in your environment, plus what you know about your environment. Right now I am sitting facing a wall and if I hear the sound of a clear voice I know it isn’t coming from in front of me. ETA: A lot of research has been done about cognition and how the brain reaches conclusions about sensory input by combining what it knows about context.
Also, you can’t always pinpoint the sound. I find this is often true of high-pitched sine-wave type sounds. I’ve had a hell of a time identifying where electronic tones are coming from if I misplace my phone and it starts to beep, or something else acts up. The tone from such a device isn’t as rich as things like a human voice so it’s harder to pick up on the reflections, reverberations, and other cues that help you locate it.
I have a strong suspicion - and I’m sure there have been studies - that most of the time our hearing and vision and knowledge of our surroundings work together. Lots of times I’ve heard a noise somewhere in or around the house, and had to investigate to find out where it was coming from . . . and sometimes I’ve been very wrong about the location.
Our outer ears are angled slightly forward rather than sitting symmetrically on our heads, so we have evolved the ability to distinguish sounds in front of us from sounds behind us. Had to do with survival and the Bengal tiger and all.
That’s interesting. Isn’t the conventional wisdom that low-frequency sound is the hardest to locate? Hence people aren’t very concerned about where a subwoofer sits but take great care to point a tweeter correctly.
My partner is deaf in one ear and has extremely limited ability to pinpoint sound. He can get it a little by turning his head but often will just be completely wrong and have to rely on his eyes. It’s amazing how disabling this is in everyday life for him: I used to take the ability for granted, but I appreciate it now.
All good answers to the OP. It’s the brain that does most of the work, and it does a huge amount of processing to work out directionality. The finite speed of sound is useful here, as phase differences between the signals received by the ears give clues as to left/right orientation - this is why low frequencies are very omnidirectional (as Fuzzy Dunlop said); they have a long wavelength and hence the phase difference between the ears is much smaller. Several surround-sound systems have been created by taking a stereo signal and altering various signal phase angles to give a 3D effect, either with conventional stereo speaker pairs (not massively effective), stereo headphones (a bit better), or quad speakers fed from a processed stereo source. I briefly worked on a system of the latter type that an old friend developed, and he demoed it to Pink Floyd to much acclaim (around about the time that their stage at Earl’s Court, London, collapsed, if that helps date it).
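The phase-difference point can be sketched numerically. This is a toy calculation, not a model of the actual auditory system: the Woodworth spherical-head formula is a textbook approximation, and the head radius is an assumed round number.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, dry air at roughly 20 C
HEAD_RADIUS = 0.0875     # m, assumed typical adult head radius

def itd_seconds(azimuth_deg):
    """Woodworth's spherical-head approximation of the interaural time
    difference for a source at the given azimuth (0 = straight ahead,
    90 = directly to one side)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def interaural_phase_deg(azimuth_deg, freq_hz):
    """Phase difference between the ears, in degrees, for a pure tone."""
    return 360.0 * freq_hz * itd_seconds(azimuth_deg)

# For a source directly to one side the delay is about 0.66 ms; the
# phase cue shrinks as the frequency drops, because the wavelength grows:
for freq in (100, 500, 1500):
    print(freq, "Hz:", round(interaural_phase_deg(90, freq), 1), "degrees")
```

Running it shows the 100 Hz tone producing only a couple of dozen degrees of phase difference even at the worst-case angle, which is the "low frequencies are omnidirectional" effect in numbers.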
A lot is learned conditioning as well (as sez panache45), and unfamiliar sounds are much harder to locate. Here the folds in the pinnae help an awful lot (as noted by CookingWithGas) as the reflections from all the undulating bits perform a sort of pre-processing task. This is also one of the reasons that pure tones (tending towards monotone sinewaves) are difficult to pinpoint. Also, if I remember my aural cognition theory correctly (and there will be a degree of fuzziness, so please do check) there is a crossover between mechanisms in the ear at around 2-3 kHz, and tones at these frequencies are very hard to locate. Beeps and alarms in this range are particularly irritating.
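That crossover region can be illustrated with a back-of-the-envelope check: a phase comparison only pins down the delay while one period of the tone is longer than the largest delay the head can produce. The 0.66 ms figure below is an assumed, textbook-ish value for an adult head, so treat the exact crossover frequency as approximate.

```python
# Rough upper bound on the interaural time difference for an adult
# head (assumed illustrative value, about 0.66 ms).
MAX_ITD = 0.00066  # seconds

# A phase cue is only unambiguous while one period of the tone is
# longer than the largest possible delay; above that frequency the
# phase "wraps" and several source angles fit the same measurement.
for freq_hz in (500, 1000, 2000, 3000):
    period = 1.0 / freq_hz
    status = "unambiguous" if period > MAX_ITD else "ambiguous"
    print(f"{freq_hz} Hz: {status}")
```

With this value the cue breaks down somewhere around 1.5 kHz, which lines up loosely with the 2-3 kHz trouble zone mentioned above.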
Turning your head (noted by Dr Drake & naita) helps by giving crude but effective amplitude clues. Other animals (and Stan Laurel) can handily move their ears independently, which is even better.
To explain a bit more, there are two theorized mechanisms (the classic “duplex theory”) that your brain uses to pinpoint sound:
The time lag of the sound (as it travels the width of your head) differs depending on the location of the source. This cue works best at lower frequencies (below roughly 1500 Hz, say), where the wavelength is long enough that the phase difference between the ears is unambiguous.
For higher frequencies, the head itself shadows the sound, so the far ear receives a quieter signal. A sound is built up from many frequencies, and knowledge about the intensity differences at various frequencies (again, because the head sits between the two ears) can be compared with pre-learned information to figure out where the sound came from.
Note that we’ve only talked about two-dimensional location so far.
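The time-lag mechanism can be demonstrated with a toy two-channel signal and a brute-force cross-correlation, which is roughly how engineers estimate such delays in practice. The sample count and delay here are made-up illustrative values.

```python
import random

random.seed(0)
true_delay = 20    # samples by which the "right ear" lags the "left ear"
n = 1024

# A burst of broadband noise: rich in frequencies, so the lag is
# unambiguous (unlike a pure tone, whose phase repeats every cycle).
left = [random.gauss(0.0, 1.0) for _ in range(n)]
right = [0.0] * true_delay + left[:-true_delay]

def cross_correlation(lag):
    """Correlation of the right channel against the left, shifted by `lag`."""
    return sum(right[i + lag] * left[i] for i in range(n - lag))

# Search only plausible lags - a real head allows at most ~1 ms either way.
best_lag = max(range(0, 60), key=cross_correlation)
print(best_lag)    # recovers the 20-sample delay
```

The correlation peaks sharply at the true delay for noise; for a sinusoid it would peak once per cycle, which is exactly the ambiguity that makes pure tones hard to place.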
… and now that I’ve written all that, I’ve found the relevant Wikipedia page while looking for some IID curves to help explain.
In case you were interested, the localization error is about 10 degrees when the sound is directly to your right or left, and narrows to less than 4 degrees when the sound is directly in front of you! Clicks and broadband noise are easiest to pinpoint, followed by speech; pure tones, as previously mentioned, are much harder to find accurately.
One way to eliminate this asymmetry is to use close-fitting headphones or in-ear earbuds while listening to binaural recordings. There are several such recordings around, some of the more popular ones being that of a barber shop and one of a matchbox (I’ll come back to post links if I find them).
The really surprising thing I’ve noticed is that when these sounds move from one ear to the other, the perception is always that the object moved behind my head, never in front of it or through it.
I suppose there is some amount of visual confirmation involved there. When an object sounds like it could be at either 11 o’clock or 7 o’clock, and you can’t see anything at 11, it must be at 7 o’clock.
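That “two candidate positions” effect is exactly the front-back symmetry from the OP. A minimal sketch, assuming a simple far-field model in which the timing cue depends only on the sine of the azimuth (the ear spacing is an assumed round number):

```python
import math

EAR_SPACING = 0.175   # m, assumed distance between the ears
SPEED_OF_SOUND = 343.0  # m/s

def itd_ms(azimuth_deg):
    """Simple far-field model: the interaural time difference depends
    only on sin(azimuth), with 0 straight ahead and positive angles to
    the right. Negative values mean the left ear hears it first."""
    return 1000.0 * (EAR_SPACING / SPEED_OF_SOUND) * math.sin(math.radians(azimuth_deg))

# "11 o'clock" (30 degrees left of front) vs "7 o'clock" (its mirror
# image behind the interaural axis):
front = itd_ms(-30)
back = itd_ms(-150)
print(front, back)   # identical - the timing cue alone can't tell them apart
```

Both angles give the same delay, so the brain has to break the tie with the pinna filtering, head movements, and visual cues discussed above.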
Aside from the cognitive aspects of low-frequency sounds mentioned earlier, in the environment they bend around obstacles and are absorbed less readily. Bass just tends to fill up a room without distinct reverberation, while treble tones emphasize it. That makes it harder to hear the cues that would let you locate the source.
I would be interested in the recordings you describe–how does “binaural” differ from “stereo”? When I listen to music on headphones, I always have the subjective experience of hearing the music in front of me, and if I put the headphones on backwards, it still sounds in front of me. I attribute this to my brain knowing that the performance is usually in front of me.
A bad battery in a hidden smoke detector drove me nuts for over a week. It’s a single-pitch beep, and the detector was hiding in a box during a move. I think the brain does an enormous amount of interpreting of what a sound means and where such a sound should be located. So I vote mostly brains, and not so much ears.
The barber shop clip explains how binaural recording works; there’s always Wikipedia for more reading.
The barbershop works especially well because you’d expect the barber to be behind you in real life, but (at least for me) even the matchbox and several other clips I’ve tried always sound behind me, never in front.
err, I think we all agree that it is the brain that interprets the location of sound sources. I interpret the original question as asking what clues does the brain use to differentiate between right in front of you vs right behind you - or any other diametrically opposite positions.