This is a problem from one of the official practice tests I purchased from mba.com.
16. A hiker walked for two days. On the second day the hiker walked 2 hours longer and at an average speed 1 mile per hour faster than he walked on the first day. If during the two days he walked a total of 64 miles and spent a total of 18 hours walking, what was his average speed on the first day?
I know this is a rate problem, so the first thing I did was set up a chart containing the rate, time, and distance (RTD) for Day 1, Day 2, and the total for the two days. Using the relationship between the hours spent hiking, t + (t + 2) = 18, I solved for t and got t = 8. So now my question is, what step do I need to take next? Is the "average rate" for Day 1 the same thing as its rate? I just don't know what to do from here. Any help would be appreciated. Thanks.
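For anyone checking their setup on this one: the chart gives two linear relationships (total hours and total distance), and once t is known, the distance equation can be solved for the Day 1 rate. A minimal sketch of that arithmetic (my own, not from the original test):

```python
# Sanity check of the RTD setup.
# Day 1: time t, rate r.  Day 2: time t + 2, rate r + 1.
# Hours:    t + (t + 2) = 18
# Distance: t*r + (t + 2)*(r + 1) = 64

t = (18 - 2) / 2                    # time on day 1
# Expand the distance equation: r*(2t + 2) + (t + 2) = 64
r = (64 - (t + 2)) / (2 * t + 2)    # rate on day 1

print(t, r)  # -> 8.0 3.0
```

So the "average speed on the first day" asked for in the question is just the rate r from the Day 1 row of the chart.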