Why can't I calculate the average speed for the entire trip by using the average rate of the two trips?
We can only average the rates of two trips when the time spent on both trips is the same. In this problem, Clyde travels at an average speed of 90 mph for twice as long as he travels at 60 mph, so the overall average rate should be closer to 90 mph than to 60 mph.
To use an extreme example to illustrate this: Let's say I drove 100 mph for 10 hours and then 2 mph (in horrible traffic) for about 5 minutes. Is my average speed now (100 + 2)/2 = 51 mph?
No, of course not! My average speed is actually very close to 100 mph, because I was traveling at 100 mph for MUCH longer than I was at 2 mph. We need to take into account the amount of time spent at each speed.
Therefore, unless both times are the same, we need to add up the total distance and then divide by the total time to find the average rate.
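As a quick check, here is a small sketch (in Python, purely illustrative) of that "total distance over total time" calculation applied to the extreme example above:

```python
# Average speed = total distance / total time, not the mean of the two rates.
# Extreme example from above: 100 mph for 10 hours, then 2 mph for 5 minutes.
legs = [(100, 10), (2, 5 / 60)]  # (speed in mph, time in hours)

total_distance = sum(speed * time for speed, time in legs)  # miles
total_time = sum(time for _, time in legs)                  # hours

avg_speed = total_distance / total_time
print(round(avg_speed, 2))  # about 99.19 mph, nowhere near the naive (100 + 2)/2 = 51
```

Weighting each rate by its time pulls the average almost all the way to 100 mph, exactly as the reasoning above predicts.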