Calculating Speed

I’m having an argument with someone over a riddle.

The riddle is: If you go somewhere that is 60 miles away, and you travel the first 30 miles at 30mph, how fast do you have to go for the second half of the trip so that you will average 60mph for the entire trip?
Of course the trip can’t be accomplished at an average of 60mph, because if you want to travel 60 miles at 60mph, then the trip must take one hour. So if you go 30mph for 30 miles, the hour is already used up.
Someone is arguing that if you go 30mph for 30 miles and then 90mph for another 30 miles, then you have averaged 60mph. How can I show him he’s wrong?

I should have written: because if you want to travel 60 miles at an average of 60mph, then the trip must take one hour.

30 miles @ 30 miles per hour = 1 hour
30 miles @ 90 miles per hour = 20 minutes

60 miles in 1 hour 20 minutes takes longer than 60 miles in 1 hour.

His math is wrong. If you go 30 miles @ 30 MPH, that takes an hour. Then you go 30 more miles at 90 MPH, which takes 20 minutes. So you’ve travelled 60 miles in 1 1/3 hours, which is 45 MPH.

Saying that your average speed over a trip is x mph just means that, if you had driven a constant x mph for the same total time as the trip took, you would’ve covered the same total distance. Therefore, average speed is just total distance divided by total time. Travel 30 mph for 30 miles, and you’ve taken an hour. Travel 90 mph for 30 miles, and you’ve taken 20 minutes. 60 miles in 80 minutes is 45 mph.
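
If it helps to see that with actual numbers, here’s a quick throwaway Python sketch (mine, not anything anyone posted) that just divides total distance by total time for the two legs:

```python
# Average speed = total distance / total time, computed over both legs.
legs = [(30, 30), (30, 90)]  # (miles, mph) for each leg of the trip

total_distance = sum(miles for miles, mph in legs)    # 60 miles
total_time = sum(miles / mph for miles, mph in legs)  # 1 + 1/3 hours
print(total_distance / total_time)                    # 45.0, not 60
```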

So going 30mph for 30 miles and then going 90mph for 30 miles can’t possibly average 60mph, right?

Is there a simple formula I can use to prove this?

Thanks guys. :slight_smile:

Yes. It’s miles per hour, or more generically speaking, distance / time, which is the formula for speed.

Yes, do it as follows: miles traveled divided by hours spent traveling equals miles per hour (mph). You travel for an hour at 30 mph, so you’ve traveled 30 miles, since 30/1 = 30. You travel 30 miles at X mph, so you’ve spent 30/X hours traveling. In total, then, you’ve traveled 60 miles and spent 1 + 30/X hours traveling.

The miles per hour is then 60 divided by (1 + 30/X), and 60/(1 + 30/X) = 60X/(X + 30). Since 60X/X = 60, and 60X/(X + 30) is smaller than 60X/X (because the numerator is the same and the denominator is larger), the overall speed you’ve traveled has to be smaller than 60 mph. If your speed over the second 30 miles is very fast, the overall speed can be very close to 60 mph, but it can’t equal 60 mph. Even if you travel 1,000,000 mph over the second 30 miles, the overall speed is less than 60 mph. You can’t calculate the overall speed by averaging the speeds over the two halves of the journey.
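
To watch that 60X/(X + 30) behavior concretely, here’s a small sketch of my own (purely illustrative) that sweeps the second-leg speed X and prints the overall average; it creeps toward 60 mph but never gets there:

```python
# Overall average when the first 30 miles are at 30 mph and the next 30 at X mph:
# 60 / (1 + 30/X) = 60X / (X + 30)
for X in (60, 90, 1_000, 1_000_000):
    print(X, 60 * X / (X + 30))
# 60        40.0
# 90        45.0
# 1000      58.25...
# 1000000   59.998...
```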

The formula is miles per hour, or miles divided by hours.

30 miles at 30 MPH takes 1 hour and 30 miles at 90 MPH takes 20 minutes. Your total time is 80 minutes, or 80 divided by 60, which is 1 1/3 hours.

60 (miles) divided by 1 1/3 (hours) is 45.

Once you’ve passed that one-hour mark, you can never get to a 60 MPH average if you only have 60 miles to drive.

You have to get to 60 (miles) divided by 1 (hour), which is impossible because you only traveled 30 miles in that 1 hour. Even if you accelerate to the speed of light at the 1-hour point, it’s still going to take you 1 hour plus a split second to go the 60 miles, which would average out to roughly 59.9999973 MPH.

A more formal version (but essentially the same solution) is to split the trip into known and unknown intervals, t₁ and t₂, and then sum their durations. Time is distance over average speed for any interval; thus for the overall trip, T = 1 hour for D = 60 mi at an average speed S = 60 mph. For t₁, d₁ = 30 mi and s₁ = 30 mph. For t₂, d₂ = 30 mi and s₂ = ?. Setting the sum of the two intervals equal to the total time gives:

T = D/S = t₁ + t₂ = d₁/s₁ + d₂/s₂,

or,

1 hour = 60 mi / 60 mph = 30 mi / 30 mph + 30 mi / s₂

Since t₁ = 30 mi / 30 mph = 1 hour, we subtract this from both sides and are left with:

0 hours = 30 mi / s₂

Since no real value of s₂ can satisfy this equation, this demonstrates that the problem is impossible.
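
If you’d rather have a computer check that algebra, here’s a quick sketch using sympy (my own illustration, assuming sympy is available) that asks for a second-leg speed satisfying the equation above and finds none:

```python
# Look for a positive second-leg speed s2 (mph) such that
# 30/30 + 30/s2 = 60/60, i.e. the whole 60-mile trip averages 60 mph.
from sympy import Eq, Rational, solve, symbols

s2 = symbols("s2", positive=True)
equation = Eq(Rational(30, 30) + 30 / s2, Rational(60, 60))
print(solve(equation, s2))  # [] -- no such speed exists
```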

The usual error with this problem comes from trying to average the speeds directly, which is meaningless here, or from confusion over units.

Stranger

Yes, but what’s the third -gry word?

Well, it depends on what you’re averaging over. It’s implicit that you’re talking about the average speed over time, in which case everyone in this thread is right. But if you’re talking about the average speed over distance, well, then, the OP’s friend is obviously right.

By all means, show him where he went wrong in conflating the two averages, but try to also indicate just where the confusion stemmed from, and how he gave the right answer to “the wrong question”, rather than just dismissing him.

The problem is that an average speed *over* distance doesn’t really make sense, because speed is distance over time.

A cool trick I like to use for these three-term algebraic problems is to draw them as a triangle, with distance on top and rate and time on the bottom, and cover up the one you want to solve for:

… D
R | T

If you cover the D, you get R × T; cover the R to get D/T; or cover the T to get D/R.

There’s nothing inherently problematic in taking the average of any quantity over any other quantity on which it depends. Let f(x) be my speed when I have traveled distance x. Then my average speed over the distance interval from x_0 to x_1 is the integral of f(x) dx evaluated from x_0 to x_1 divided by (x_1 - x_0). (When we take this to be average speed over time, instead, then the fundamental theorem of calculus kicks in and we get the nice formula (total distance)/(total time))
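
For concreteness, here’s a little numerical sketch of that distinction (my own throwaway code, using the riddle’s numbers): speed as a function of odometer reading, averaged once with respect to distance and once with respect to time:

```python
# f(x) = speed (mph) when the odometer reads x miles: 30 on the first leg, 90 on the second.
def f(x):
    return 30.0 if x < 30 else 90.0

D = 60.0          # total distance, miles
n = 600_000       # number of slices for a crude numerical integral
dx = D / n
xs = [i * dx for i in range(n)]

avg_over_distance = sum(f(x) * dx for x in xs) / D   # (1/D) * integral of f(x) dx  ->  60 mph
total_time = sum(dx / f(x) for x in xs)              # hours spent on the trip      ->  4/3
avg_over_time = D / total_time                       # total distance / total time  ->  45 mph
print(avg_over_distance, avg_over_time)
```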

It’s as though the moving object had an odometer and a speedometer attached, and you asked the question “If I picked a random x and wanted to know what value y the speedometer read when the odometer hit x, what would the expected value of y be?”. Not senseless; even conceivably useful, just not as often so as the question “If I picked a random time t and wanted to know what value the speedometer read at that time, what would the expected value be?”

He was right in that the average of 30 and 90 is 60, but 30 and 90 are just two arbitrary numbers unrelated to speed.

You could also figure in the time it took to accelerate to 30 if he was stopped at the beginning of the hour. If he never exceeded 30 MPH in that first hour, he wouldn’t even average 30 MPH over the first hour.

The definition of “Speed” is distance/time.

Unrelated to speed how? They would be his speeds during the first and second legs of his journey.

We could also factor in the lunch break he took in the middle, if we want to just make up changes to the problem in order to complicate it.

Sure, taken as a limit. Instantaneous speed is the derivative of distance with respect to time. But what of it?

Santo, he doesn’t mean “over” in the sense of “divided by”. He means “over” in the sense of “with respect to”. That is, the average speed across/over/with_respect_to time is what one usually means by the unqualified phrase “average speed”. But, average speed with respect to distance is just as well-defined.

To average 60mph, you must make the entire 60 mile trip in 1 hour.
If you’ve already wasted an hour dawdling along at 30 miles per hour, you must make the remainder of the trip in zero time to get your average up to 60mph.
In a normal universe, that just means you have to go infinitely fast, but what with the possibility of time reversal as you exceed 300,000 km/sec, the answer in this universe might require a little fancy calculating.

Just to clarify what my point was: an average is always with respect to some probability distribution. If you use a distribution which is uniformly distributed over time (which is indeed the one implicit in the ordinary language wording), then, as everyone has been saying, you can’t make the average speed come out to 60 no matter what you do on the second leg. But if you use a distribution which is uniformly distributed over distance travelled (not explicitly ruled out by the wording of the OP), then the OP’s friend is right, travelling the second leg at 90 MPH makes the average speed come out to 60. And if you choose more bizarre distributions, you can get more bizarre results. It’s entirely a matter of what distribution you take the average with respect to.
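
Spelled out with the riddle’s numbers (just restating those two averages, nothing new):

average over time: (30 mph × 60 min + 90 mph × 20 min) / 80 min = 45 mph
average over distance: (30 mph × 30 mi + 90 mph × 30 mi) / 60 mi = 60 mph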

ETA: Yes, what Pasta said.