Some years ago, I remember skimming this in a statistics class, but it’s escaped me.
Two variations...
Say you have something that happens at a defined but arbitrary rate (a number of times per minute or hour or day, for instance), and then you define a length of time. For example, a camera in a room takes a photo once every ten minutes. If I enter the room and spend three minutes before leaving, what is the probability my photo will be taken?
Variation the second: The same situation, but the event is not instantaneous. Instead, it begins, happens for a defined length of time, then stops. For example, a radio station has news and weather every hour on the hour, and it lasts five minutes. If I turn on the radio for 45 minutes, what are the chances I will hear A) any part of the news and weather, and B) ALL of the news and weather?
In your examples, the camera and news are not random events, but pre-determined fixed events. The random factor is when you enter the room or turn on the radio. In this case it’s a uniform distribution: the chance of this event is the same for any time period during the 10-min or 1-hour cycle.
In your first example, there’s a 3-minute window within each 10-minute cycle where your entering the room would cause a photo to be taken. The probability is 3/10.
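A quick Monte Carlo sketch of that uniform-window argument (assuming the camera fires at the end of each 10-minute cycle and you enter at a uniformly random moment): you are photographed exactly when your 3-minute stay reaches the end of the cycle, i.e. when you enter in the last 3 minutes.

```python
import random

random.seed(1)
TRIALS = 200_000

# Photos fire at t = 10, 20, 30, ... minutes. We enter at a uniform moment
# within one 10-minute cycle and stay 3 minutes; the photo catches us if our
# stay reaches the end of the cycle, i.e. if the entry time is >= 7.
hits = sum(random.uniform(0, 10) >= 7 for _ in range(TRIALS))
print(hits / TRIALS)  # should come out near 3/10
```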
In your second example, there’s a 50-minute window out of each hour where the radio would catch any part of the news (you turn the radio on any time between 45 minutes before the start of the news and the end of the news). So the probability is 50/60. To hear the whole news you have to turn on the radio within 45 minutes of the end of the news, so the probability is 45/60.
Interesting. For some reason I was making it more complex.
Ok, how about the typical TV or radio contest where “When you hear the codeword, call in!”?
To put it in problem form, let’s say a camera in a room will RANDOMLY take a picture sometime within the next 10 minutes. You enter the room for three minutes, then leave. What are the odds you’re on film?
And the radio news & weather example, only this time they don’t do it on the hour, but at a random time during the hour.
It depends: does it take exactly one picture within the next 10 minutes, only you don’t know exactly when? Or does it take pictures randomly, at an average rate of 1 per 10 minutes? If the former, it would be a uniform distribution, and pretty much the same question you posted in the OP. If the latter, that’s a Poisson distribution, with an expected number of events of 0.3. The probability of zero photos being taken is f(0; 0.3) = e^(-0.3) ≈ 0.74, so the probability of one or more pictures being taken is about 0.26.
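That Poisson answer can be sanity-checked by simulating the process directly (a sketch, assuming photos arrive as a Poisson process with mean rate 1 per 10 minutes, so the gaps between photos are exponential). For "at least one photo during your 3 minutes" only the first arrival matters:

```python
import math
import random

random.seed(2)
TRIALS = 200_000
RATE_PER_MIN = 1 / 10  # one photo per 10 minutes, on average
STAY = 3               # minutes spent in the room

# In a Poisson process, the wait until the first photo is exponentially
# distributed. We are caught on film iff that first photo arrives within
# our 3-minute stay.
caught = sum(random.expovariate(RATE_PER_MIN) <= STAY for _ in range(TRIALS))

print(caught / TRIALS)                       # simulated probability
print(1 - math.exp(-RATE_PER_MIN * STAY))    # closed form: 1 - e^(-0.3)
```

The closed form is 1 − e^(−0.3) ≈ 0.259, matching the 0.26 above.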
Again it depends: is the news happening exactly once during each hour, or is it completely random? Actually, it can’t be completely random; you presumably can’t start one broadcast and then start another one minute later. The problem needs to be defined a little better.
Yes, the Poisson distribution was what I was looking for. Thank you both.
Actually, it’s what I had remembered, but when I tried to put it in problem form, I got to thinking there was no way to figure it out just given the average rate of picture-taking or news-broadcasting or whatever, so I inadvertently made it a lot easier than I needed to. Which is odd, since the fact that you COULD figure it out just from the average rate was what was so interesting to me in the first place. Thanks again.
I think it is 40/60. If you turn the radio on 3 minutes before the end of the news, you miss the first 2 minutes. To hear all of it, you have to turn the radio on from 0 to 40 minutes before the beginning of the news.
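A short simulation backs this correction up (a sketch, assuming the news airs for the first 5 minutes of each hour and you turn on at a uniformly random minute, listening for 45 minutes straight). Measured relative to a news broadcast at minutes 0–5, the next broadcast is at 60–65:

```python
import random

random.seed(3)
TRIALS = 200_000

any_part = all_of = 0
for _ in range(TRIALS):
    t = random.uniform(0, 60)  # turn-on time; news airs at [0, 5] and [60, 65]
    start, end = t, t + 45     # our 45-minute listening window
    # "Any part": our window overlaps the current broadcast (turned on before
    # minute 5) or reaches into the next one (window extends past minute 60).
    if start < 5 or end > 60:
        any_part += 1
    # "All of it": the entire next broadcast [60, 65] fits inside our window,
    # which requires turning on between minutes 20 and 60 (a 40-minute span).
    if start <= 60 and end >= 65:
        all_of += 1

print(any_part / TRIALS)  # near 50/60
print(all_of / TRIALS)    # near 40/60
```

So "any part" stays at 50/60, but "all of it" is indeed 40/60, not 45/60.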