How is that useful?
You can go for 30 miles at 30 mph, then move at the speed of light for the remaining 30 miles. Since traveling at the speed of light causes time to stop moving from your point of view, the last 30 miles goes by in zero time. To an outside observer, it takes you a time of 30 miles/c to get there, which is nonzero. But if you’re looking out for Number One, you can ignore other frames of reference. So the total time to go 30 miles is 1 hour, and the total time to go 60 miles is also 1 hour.
The only problem with this is you have to be massless. This makes a lot of other things in life problematic.
How is average speed over time useful? Quantities don’t have intrinsic use; usefulness is determined by the purposes towards which you want to apply it. Like I said, because average speed over time comes out, by the second fundamental theorem of calculus, to the same thing as the simple formula (total distance)/(total time), it will more often be the case that there is some use we have in mind for it (specifically, multiplying this average by the total time in order to get the total distance, or such things). But there’s nothing intrinsically useless about average speed over distance just because it’s potentially more complex.
Perhaps there’s a helicopter tracking my car, with the unusual property that the helicopter maintains a height directly proportional to my speed at any given moment. (Why? I don’t know… why would I start out travelling 30 MPH and then switch to something else? A math/physics problem doesn’t usually come with much in the way of motivation, it just is). At the end of my trip, I’m curious about how much area was under the path of the helicopter. For this, we can take my average speed over distance, multiply it by my total distance, and then multiply that by the factor relating helicopter height to car speed. It’s rare that I’ll ever be asked to do such a calculation, but, then, it’s rare that I’ll ever be asked to evaluate 4x^5 + 9x - 7, yet no one would argue that fifth degree polynomials are inherently meaningless.
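Just to make the helicopter concrete, here's a quick Python sketch with invented numbers (the 30/90 split and the height constant are made up, since the post doesn't pin them down). It checks that integrating the helicopter's height over distance matches the shortcut of (average speed over distance) × (total distance) × (height factor):

```python
# Made-up scenario: car does 30 MPH for the first 30 miles, 90 MPH for
# the last 30; helicopter flies at K feet of altitude per MPH of car speed.
K = 2.0  # feet per MPH (invented constant)

def speed_at_mile(x):
    """Car speed (MPH) as a function of distance x along the road."""
    return 30.0 if x < 30 else 90.0

# Area under the helicopter's path: integrate height over distance.
dx = 0.01           # step size in miles
steps = 6000        # 60 miles / 0.01
area = sum(K * speed_at_mile(i * dx) * dx for i in range(steps))

# Same answer via the shortcut. The distance-average speed here is
# (30*30 + 90*30)/60 = 60 MPH.
avg_speed_over_distance = (30 * 30 + 90 * 30) / 60
shortcut_area = K * avg_speed_over_distance * 60
```

Both routes give the same area, which is the whole point of having the distance-average on hand.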
Well, if something is 120 miles away, and I have two hours to get there, I have to average 60 miles per hour (with respect to time, of course). My example is concrete, and a normal person experiences it every day. I can’t imagine any helicopter behaving that way, nor any person caring about the two-dimensional area under its path. Remember, we’re trying to describe this problem for a layperson, not for a mathematician.
ETA: I mean, come on… look at all the disclaimers you had to put in your problem statement: unusual, rare, I don’t know why.
99.9% of everything I’ve modeled has been second order. Displacement, velocity, acceleration. Once in a long while I’ll use jerk, getting it up to third order. An argument could be made that fifth degree polynomials aren’t useful to model most physical events.
A different example of average speed with respect to distance:
What’s the average traffic speed in the L.A. freeway system at 8:00 AM?
This can’t be a time average, because I’m asking a question about a single point in time. What I would be looking for is the average speed of traffic across all the miles of freeway. (You could imagine asking for the per car average as well, but that has a different interpretation.) What the above distance-based average tells you is: If I drop myself into the freeway system at random points at 8:00 AM, what’s the average speed I’ll experience?
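A toy version of that calculation (every mileage and speed figure here is invented, not real L.A. data), weighting each stretch by its length:

```python
# Invented 8:00 AM snapshot: (miles of freeway, traffic speed in MPH)
# for three stretches.
stretches = [(10, 20), (5, 60), (15, 40)]

total_miles = sum(miles for miles, _ in stretches)

# Distance-weighted average: each stretch counts in proportion to its
# length, i.e. the expected speed at a uniformly random point of road.
avg_over_distance = sum(miles * mph for miles, mph in stretches) / total_miles
```

With these numbers the slow 10-mile stretch drags the average well below the per-stretch arithmetic mean.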
Yes, but the normal person doesn’t care about the fact that they have to maintain an average of 60 miles per hour. They care about total distance and total time, to be sure. They care about the fact that the trip could be accomplished at a constant speed of the 60 MPH ratio, and they care about the fact that the trip could be accomplished at non-constant speeds in various ways. But that the non-constant speed trips are all such that the average speed with respect to time is 60 MPH… well, this fact is totally useless to the layman, since nothing in his driving experience gives him any direct connection to the average speed with respect to time, as such.
That is to say, the layman cares about the ratio (total distance)/(total time). He just doesn’t care about the fact that this happens to equal the average speed with respect to time; at most he cares about the fact that this is the appropriate speed to maintain in a constant-speed journey, but a constant-speed journey is precisely the case where it doesn’t matter what you take the average speed with respect to (time, distance, world population). Caring about the averages, qua averages, is left to the mathematicians.
Also, I like Pasta’s example.
Oh, actually, here I think is the most succinct ordinary language example of the usefulness of average speed over distance:
Q: You are placed at random somewhere on a road, and then a car is sent travelling along that road in some prescribed manner. What is the expected speed with which you get hit?
A: The average speed of the car with respect to distance. (Since random placement along a road is, in ordinary language, implicitly meant to be random with respect to distance)
I think this is different. I think the per car average at 8AM is what you’re actually talking about, because at precisely one point in time, a particular car is going to be at a particular spot, and you aren’t going to be taking an average of its speed at other distances. Same with Indistinguishable’s example; they’re both per car averages at a certain point in time.
How is my example a per car average? There’s only one car, and its speed varies. If you really do place a person at random along a road (in the usual distance-uniform way), and then have a car follow a prescribed speed-varying journey along the road, the expected speed with which the person gets hit will be the average speed over distance of the car’s prescribed journey.
For example, if you place me at random along the 60 mile road from here to grandma’s house, and then unleash a car along the road, having it travel 30 MPH for the first 30 miles and 90 MPH for the second 30 miles, then I will get hit with an expected speed of 60 MPH. (There’s probability 1/2 that I’m on the first half of the road getting hit at 30 MPH, and probability 1/2 that I’m on the second half of the road getting hit at 90 MPH.)
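For anyone who wants to check that, here's a quick Monte Carlo sketch in Python of the same 30/90 trip. Worth noting: the time-average of this very trip is only 45 MPH (60 miles in 1 + 1/3 hours), so the two averages genuinely come apart here:

```python
import random

random.seed(0)  # reproducible runs

# The 60-mile road to grandma's: the car does 30 MPH for the first
# 30 miles and 90 MPH for the second 30.
def speed_at(mile):
    return 30.0 if mile < 30 else 90.0

# Drop the pedestrian (or safety cone) at a uniformly random mile
# marker, then record the car's speed when it arrives there.
trials = 100_000
hits = [speed_at(random.uniform(0, 60)) for _ in range(trials)]
expected_hit_speed = sum(hits) / trials  # hovers near 60, not 45
```

Half the drops land on the slow half of the road and half on the fast half, so the sample mean settles at the distance-average of 60 MPH.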
Fair enough. It’s still a stretch, though, because you’d have to drop the person right in front of the car for them to get hit. I understand your point, though. It’s still much less useful than an average speed over time, for all intents and purposes. Pasta’s example, I believe, is still a per car average.
Well, my idea was that the person (or safety cone or whatever; it doesn’t need to be so grisly, but it’s more fun that way) would be placed on the road well ahead of time, with the car only starting its journey after the placement.
Hooray!
I like grisly! It’s just that a reasonable person would move out of the way when the car was bearing down on them. :smack:
As is usual in these problems, I am working in a world without friction, air resistance, or the instinct for self-preservation.
Not if the goal is to answer the question at the end of my previous post. The westbound 405 has something like 6 lanes, while the southbound 101 has in places only two. If we did the per car average, the 405 would count three times as much as the 101. That answers a different question than “What is my expected speed if you drop me onto a random stretch of freeway?” You could calculate either, of course, but the distance average is the one needed here.
Actually, more important than the number of lanes:
The faster the traffic, the more spaced out the cars are, so the fewer there will be per mile. So “per car” is even more different.
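To see how much the spacing effect matters, here's a toy calculation (every number invented) where a steady flow of cars means density on each stretch is flow divided by speed, so slow stretches pack in more cars:

```python
# Toy numbers: two 10-mile stretches, steady flow of 100 cars per hour
# past any point. Density (cars per mile) = flow / speed.
stretch_speeds = [20.0, 60.0]  # MPH on each stretch
flow = 100.0                   # cars per hour (made up)
miles_each = 10.0

cars_on_stretch = [flow / v * miles_each for v in stretch_speeds]

# Per-mile (distance) average vs. per-car average:
per_mile_avg = sum(stretch_speeds) / len(stretch_speeds)
per_car_avg = (sum(c * v for c, v in zip(cars_on_stretch, stretch_speeds))
               / sum(cars_on_stretch))
```

With these figures the distance average is 40 MPH but the per-car average is only 30, because the 20 MPH stretch holds three times as many cars per mile as the 60 MPH stretch.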
Gotcha.
The problem hinges on the definition of “average.”
It’s true that the arithmetic mean of the two numbers, 30 and 90, is 60, but that’s essentially irrelevant in this situation. What you need is the harmonic mean.
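For the record, the harmonic mean of 30 and 90 is 45, which is exactly the average speed of a trip with equal-distance legs. A quick sketch:

```python
# Harmonic mean: the right "average" for equal-distance legs, e.g.
# 30 miles at 30 MPH followed by 30 miles at 90 MPH.
def harmonic_mean(a, b):
    return 2.0 / (1.0 / a + 1.0 / b)

hm = harmonic_mean(30, 90)  # about 45

# Sanity check from first principles: 60 miles in 1 + 1/3 hours.
from_scratch = 60 / (30 / 30 + 30 / 90)
```

Which also shows why the original 30-then-60 version of the puzzle is impossible: the harmonic mean of 30 and x only approaches 60 as x goes to infinity.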
The arithmetic mean is unrelated to the average speed over the course of the entire journey because the two legs are not equal in time.
Because speed is distance/time, and the unit we’re using is miles per hour.
If you drive 30 MPH for an hour and 90 MPH for 20 minutes, you can’t just average 30 and 90, because the two speeds are held for different lengths of time.
The only way this works is if the time spent at each speed is the same.
If you drive 30 MPH for 1 hour and then drive 90 MPH for 1 hour, you cover 120 miles in 2 hours, which averages out to 60 MPH.
This is the case where (30 + 90)/2 averages out to 60, and the average speed really is 60, because each speed is held for 1 hour.
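Both cases side by side, with the numbers from the thread:

```python
# Equal times: 1 hour at 30 MPH, then 1 hour at 90 MPH.
equal_time_avg = (30 * 1 + 90 * 1) / (1 + 1)      # plain (30 + 90)/2

# Equal distances: 30 miles at 30 MPH, then 30 miles at 90 MPH.
equal_dist_avg = (30 + 30) / (30 / 30 + 30 / 90)  # harmonic-mean territory
```

Equal times give 60 MPH; equal distances give only 45, because the slow leg eats up three times as much of the clock.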