Look blackended, I'm not sure why this is so hard to grasp, but let me break it down with a simple one-second example.
Let's say we have a GTX Titan and a GTX 690: the Titan gets 30 fps and the 690 gets 60. These aren't real numbers, of course, but they keep things simple.
Example Titan (smooth): Within 1 second, 30 frames are flashed on the screen. One second is 1000 ms, so how fast a card runs comes down to how many frames it can display in those 1000 ms. Assuming perfect frame delivery, the Titan puts out 30 frames at steady 33.3 ms intervals. Take 1000, divide it by 33.3, and you get 30 fps. In other words, within one second the Titan delivered 30 frames, every one of them spaced 33.3 ms apart.
The output graph would look like this --------------------------------------------------------
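If you want that arithmetic spelled out, here's a quick Python sketch using nothing but the made-up numbers from the example above (nothing measured from real hardware):

```python
# Titan example: 30 frames in one second, all evenly spaced.
frame_intervals_ms = [33.3] * 30                  # every frame lands ~33.3 ms after the previous one
average_interval = sum(frame_intervals_ms) / len(frame_intervals_ms)
fps = 1000 / average_interval                     # 1000 ms per second / ms per frame
print(round(fps))                                 # ~30 fps, and every interval is identical -> smooth
```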
Example 690 (micro-stutter): Within 1 second, 60 frames are flashed on the screen. Same 1000 ms, but now assume poor frame delivery: the 690 puts out 60 frames at intervals that alternate between roughly 25 ms and 8.3 ms. The average interval is still about 16.7 ms, and 1000 divided by 16.7 gives 60 fps. So within one second the 690 delivered 60 frames, but the spacing between them kept jumping back and forth instead of staying even.
The output graph would look like this /\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
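And the same sketch for the 690 numbers, again using only the made-up intervals from above:

```python
# 690 example: 60 frames in one second, but delivered unevenly.
frame_intervals_ms = [25.0, 8.3] * 30             # intervals alternate instead of staying constant
average_interval = sum(frame_intervals_ms) / len(frame_intervals_ms)   # ~16.7 ms
fps = 1000 / average_interval
print(round(fps))                                 # ~60 fps on paper...
print(min(frame_intervals_ms), max(frame_intervals_ms))  # ...but spacing jumps between 8.3 and 25 ms -> stutter
```

Same second, more frames, yet the frame-to-frame timing is all over the place, which is why the higher fps number can still feel worse.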
I hope that helps.