Well I’m just witnessing one hell of a thunderstorm right now.
It got me thinking: I was always told that once you saw the sky “light up” with lightning, you could count the seconds until the thunder sounded and tell how far away the lightning was. 1 second = 1 mile.
Actually, the rule of thumb is 5 seconds = 1 mile, which isn’t far off. Sound travels in air at STP at 1,087 feet per second, and there are 5,280 feet in a mile, which works out to about 4.86 seconds per mile.
Not one second per mile but five seconds per mile.
Sound travels at 331.29 meters per second (at a set temperature, at a set air pressure, at sea level, yadda yadda), which is about 1,087 feet per second, or just over 1/5 of a mile per second.
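Quick back-of-the-envelope check of that arithmetic, as a rough Python sketch using the 1,087 ft/s dry-air figure quoted above:

```python
# Back-of-the-envelope check of the "5 seconds = 1 mile" rule,
# using the 1,087 ft/s dry-air figure quoted above.
FEET_PER_MILE = 5280
SPEED_OF_SOUND_FPS = 1087  # ft/s, dry air at 0 C

seconds_per_mile = FEET_PER_MILE / SPEED_OF_SOUND_FPS
miles_per_second = SPEED_OF_SOUND_FPS / FEET_PER_MILE

print(f"{seconds_per_mile:.2f} s per mile")   # ~4.86
print(f"{miles_per_second:.3f} miles per s")  # ~0.206, just over 1/5 mile per second
```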
No, the speed of sound is about 776 mph (1,240 kph for non-Americans), or about 1,130 feet per second (340 meters per second), so it’s a little under five seconds per mile.
BTW: Here’s a cool calculator for the Speed of Sound that lets you correct for temp and humidity. (I used 21 °C and 100% R.H. – because it was raining, according to the OP – to get my figures.)
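If that link ever dies, the temperature part is easy enough to approximate yourself. Here’s a rough dry-air-only sketch (the speed_of_sound_dry_air helper is just something I made up, and it ignores the humidity correction the calculator applies):

```python
import math

def speed_of_sound_dry_air(temp_c):
    """Approximate speed of sound in dry air, in m/s, at temp_c degrees Celsius.

    Uses c = 331.3 * sqrt(1 + T / 273.15). Humidity is ignored; it only
    adds a fraction of a percent at ordinary temperatures.
    """
    return 331.3 * math.sqrt(1 + temp_c / 273.15)

c = speed_of_sound_dry_air(21.0)                 # ~343.8 m/s
print(f"{c:.1f} m/s = {c * 3.28084:.0f} ft/s")   # ~1128 ft/s, close to the 1,130 above
```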
Oh yeah, I wanted to add that the “No” in my first post was in response to the OP, not to tomndebb or Q.E.D. I may be a stickler for grammar, but what’s a few fps or m/s among friends?
You should have taken the time to find a better source for your speed of sound. In 1986 it was recalibrated and found to be 331.29 meters per second (741.1 miles per hour) at 0 °C at an air pressure of 1013.25 millibars (14.7 lb/sq in) at sea level. (There was a lot of hoopla when an engineering or physics student rechecked the numbers and discovered that the 1942 calculation had been in error.)
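For anyone who wants to double-check those conversions, here’s a quick Python snippet (the variable names are just mine):

```python
# Double-checking the recalibrated figure against the other units in this thread.
M_PER_S = 331.29

mph = M_PER_S * 3600 / 1609.344   # meters per second -> miles per hour
fps = M_PER_S * 3.28084           # meters per second -> feet per second

print(f"{mph:.1f} mph")    # ~741.1
print(f"{fps:.0f} ft/s")   # ~1087
```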