Yeah, but you have to exceed c, so that time runs backwards.
Apparently, it’s even easier to come up with an answer if you don’t understand them.
d = r × t: 1 mi = 15 mi/hr × t, so t = (1 mi ÷ 15 mi/hr) × 60 min/hr = 4.0 minutes.
It takes 4 minutes at 15 mph to go 1 mile. The above shows the use of units in arriving at the answer: just cancel the units in the numerator and denominator.
How fast must you drive the second mile to average 30 mph for the two-mile trip?
First find the time needed to average 30 mph on the overall 2-mile trip.
d = r × t: 2 mi = 30 mi/hr × t, so t = (2 mi ÷ 30 mi/hr) × 60 min/hr = 4.0 minutes.
Now we see that it takes only 4.0 minutes to make the whole two-mile trip at an average speed of 30 mph, but we used up all 4 minutes on the first mile. Therefore it is impossible to drive fast enough over the second mile for an overall average of 30 mph! QED
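A quick sanity check in Python, for anyone who'd rather see the time-budget arithmetic spelled out (the 15 and 30 mph figures are straight from the problem; everything else is just unit conversion):

```python
# Time budget for the 1-mile-at-15-mph puzzle.
uphill_miles = 1.0
total_miles = 2.0
uphill_mph = 15.0
target_avg_mph = 30.0

# Minutes allowed to average 30 mph over the whole 2 miles.
budget_min = total_miles * 60 / target_avg_mph   # 4.0 minutes
# Minutes already spent on the first mile at 15 mph.
spent_min = uphill_miles * 60 / uphill_mph       # 4.0 minutes
remaining_min = budget_min - spent_min           # nothing left for mile 2

print(budget_min, spent_min, remaining_min)      # 4.0 4.0 0.0
```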
I seriously doubt that Einstein couldn’t solve this by inspection, i.e. just by reading the question.
I dunno, I’m very tempted to go out and try this…
gets car, stopwatch, hill
Another algebraic approach:
1 mi. / 15 mph + 1 mi. / x mph = 2 mi. /30 mph
Cancelling out the miles (remember we're dividing by miles per hour, i.e. miles divided by hours, which has the net effect of dividing by miles while multiplying by hours) yields:
1 hour / 15 + 1 hour / x = 2 hour / 30 = 1 hour / 15
Subtracting 1 hour / 15 from each side yields:
1 hour / x = 0
To isolate x we multiply both sides by x and divide both sides by 0:
1 hour / 0 = x
Of course, division by zero is undefined, thus a solution is impossible.
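The same dead end shows up if you do the algebra numerically. A sketch in exact fractions (not from the thread, just a check of the steps above): the time budget minus the first leg's time is zero, so solving for x forces a division by zero.

```python
from fractions import Fraction

trip_hours = Fraction(2, 30)        # budget: 2 mi at 30 mph
first_leg_hours = Fraction(1, 15)   # 1 mi at 15 mph
second_leg_hours = trip_hours - first_leg_hours
print(second_leg_hours)             # 0 hours left for the second mile

# x mph = 1 mile / second_leg_hours blows up, as the algebra predicted:
try:
    x = Fraction(1) / second_leg_hours
except ZeroDivisionError:
    print("division by zero: no finite speed works")
```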
I am far and away not math savvy. But isn’t the problem just solving
(15 + x) / 2 = 30
x = 45 ???
Spelling and grammer (and math) subject to change without warning !!!
Kegg: the only problem with that is that it takes longer to cover the same distance at a slower speed. If you assume the hill is the same length going up as down, that makes two miles in total. Going two miles at 30 MPH takes 4 minutes, but it also takes 4 minutes to go 1 mile at 15 MPH. So once you've spent 4 minutes going at 15 MPH, there's no time left to finish the rest of the trip.
Doing it your way makes it:
1 × 60/15 = 4 minutes for the first mile (assuming there's a one there, shut up, sticklers)
1 × 60/45 ≈ 1.33 minutes for the second mile at 45 MPH
So the whole trip takes about 5.33 minutes over a distance of two miles. That's 2/(5.33/60) MPH, or 22.5 MPH on average, not 30.
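Plugging the 45 mph answer back in makes the shortfall concrete (a quick check, not part of the original posts):

```python
# Average speed if the second mile really is driven at 45 mph.
t_first = 1 / 15    # hours for mile 1 at 15 mph
t_second = 1 / 45   # hours for mile 2 at 45 mph
avg_mph = 2 / (t_first + t_second)
print(avg_mph)      # 22.5, not 30: you can't average the speeds directly
```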
Am I the only one who is not seeing an actual distance given in the problem? It states "You drive a car up a 1 mile hill", not "You drive a car 1 mile up a hill" or "You drive a car to the top of a 1 mile hill". There doesn't appear to be a definite distance, so it would seem that it is irrelevant to the solution and we should throw d = rt out the window.
That leaves you with Dan Turk’s original solution:
First leg of trip average speed = 15 mph
Second leg of trip average speed = x mph
Average speed for trip = 30 mph
(15+x)/2=30
x=45
However, this just looks too simple to me. Maybe the solution really is this simple and the formula d = rt was tossed in to throw people off. My guess is that the problem is worded poorly (or I'm reading it poorly), and the answer of an impossible solution given by several posters is correct.
It’s a trick question. You cannot average 30 MPH for the trip because you used up all the time on the first leg. You must travel faster than 15 MPH going UP the mountain to average 30 MPH for the whole trip.
It doesn't make any difference what the distance is: if you drive x miles at 15 mph, that in itself takes as long as it would to drive 2x miles at 2 × 15 = 30 mph. So it's impossible to attain an overall average of 30 mph, unless you drive downhill at infinite speed.
If the uphill and downhill distances are allowed to be different, there’s an infinite number of solutions depending on how much longer the downhill journey is than the uphill.
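Solving (d_up + d_down) / (d_up/15 + d_down/v) = 30 for v gives v = 30·d_down/(d_down − d_up), so any downhill leg longer than the uphill one admits a finite answer. A sketch with the distances as free parameters (the function name and defaults are mine, not from the thread):

```python
def downhill_speed(d_up, d_down, up_mph=15.0, avg_mph=30.0):
    """Speed needed on the downhill leg so the whole trip averages avg_mph.

    Derived from (d_up + d_down) / (d_up/up_mph + d_down/v) = avg_mph.
    Returns None when no finite speed works (here, whenever d_down <= d_up).
    """
    # Hours available for the downhill leg after the uphill leg is done.
    denom = (d_up + d_down) / avg_mph - d_up / up_mph
    if denom <= 0:
        return None
    return d_down / denom

print(downhill_speed(1, 1))  # None: the original puzzle is impossible
print(downhill_speed(1, 2))  # 60.0: a 2-mile downhill at 60 mph works
```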
:smack: I read the thread, knew it was wrong, and yet I still posted it.
My simple mind was so preoccupied with the wording of the problem that it fell right into Dahnlor’s flawed apple boxes. In my reasoning, I was just seeing the numbers rather than measures of distance traveled over a period of time.
But hey, even Einstein supposedly had problems with it.
I know, I’m still an idiot.
Why is everyone assuming this is a localized reality? It was never specified, and theoretically speaking this could really change the problem. Of course, if this is a basic causal deterministic problem, it has been solved. Idealism, on the other hand…
It really does only take a second to see it.
The distance makes no difference, so just say 30 miles: it must be done in 1 hr, and the first half is 15 miles @ 15 mph = 1 hr :dubious:
Einstein maybe started thinking about the speed of light halfway through this problem.
It might be easier to visualize this if I tell the problem the way my father presented it to me when I was a kid.
“You are going to drive a car a distance of 2 miles. This must be done in 2 minutes. If you drive the first mile at 30 mph, how fast must you drive the second mile?”
Of course I solved for an average speed of 60 mph and said 90. He then explained that 1 mile @ 30 mph = 2 minutes :smack:
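The father's version has exactly the same structure, just scaled: the first mile at 30 mph eats the entire 2-minute budget. The same check as before (my numbers, matching his wording):

```python
# Father's version: 2 miles in 2 minutes, first mile at 30 mph.
budget_min = 2.0                  # 2 miles in 2 minutes = 60 mph average
first_mile_min = 1 * 60 / 30      # minutes for 1 mile at 30 mph
print(budget_min - first_mile_min)  # 0.0: no time left for mile 2
```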