How far does a jogger run in 1.5 hours (5400 s) if his average speed is 2.22 m/s?
Reasoning
The average speed of the jogger is the average distance per second that he travels. Thus, the distance covered by the jogger is equal to the average distance per second (his average speed) multiplied by the number of seconds (the elapsed time) that he runs.
Solution
To find the distance run, we rewrite Equation 2.1 (Average speed = Distance / Elapsed time) as
Distance = (Average speed)(Elapsed time) = (2.22 m/s)(5400 s) = 12,000 m
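As a quick numerical check, here is a minimal Python sketch of the same calculation. The variable names are illustrative, not from the text; the exact product is 11,988 m, which rounds to 12,000 m (1.2 × 10^4 m) at the precision of the given data.

```python
# Distance at constant average speed, Equation 2.1 rearranged:
#   distance = average speed * elapsed time

average_speed = 2.22          # average speed in m/s
elapsed_time = 1.5 * 3600     # 1.5 h converted to seconds (5400 s)

distance = average_speed * elapsed_time  # distance in meters

print(f"Distance = {distance:.0f} m")    # prints: Distance = 11988 m
```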