Hmm, this is interesting. If I could venture an informed guess:
When you use RRDs to store information, the RRD also controls the resolution of the time-series data it returns. For example, let's say you ask for the last year of data and the last day of data. The RRD will return the last year of data at one interval, and the last day of data at a different interval. The last day of data will come back at a much higher resolution and, by extension, will be more indicative of the actual data in the RRD. I think that is what is going on here: the data being used to generate the second graph is higher resolution, so it shows more datapoints, and those datapoints reveal a higher CPU load.
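To see why consolidation flattens spikes, here's a minimal sketch (plain Python, not rrdtool itself) of how an AVERAGE consolidation function rolls high-resolution samples into coarser buckets. The sample values and bucket size are made up for illustration:

```python
def consolidate(samples, bucket_size):
    """Average each bucket_size-long run of samples, roughly what an
    RRD's AVERAGE consolidation function does when building coarser RRAs."""
    return [
        sum(samples[i:i + bucket_size]) / bucket_size
        for i in range(0, len(samples), bucket_size)
    ]

# 60 one-minute CPU-load samples: mostly idle, with one 5-minute spike.
load = [0.2] * 30 + [4.0] * 5 + [0.2] * 25

print(max(load))                   # high-res "last day" view sees the 4.0 spike
print(max(consolidate(load, 30)))  # coarse "last year" view averages it down to ~0.83
```

The spike is still in the underlying data, but once 30 samples are averaged into one, the consolidated graph can never show it at its true height.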
So use the long-term graphs for general trending, and the short-term graphs for drilling down into actual incident information.