I have a micro-stutter example for you... from my thread about being CPU limited, specifically on an E8400 @ 3.6GHz in Mass Effect:
I figured out something: I was calling them "jitters"... but actually they are textbook cases of "micro-stutter". I had heard about single-card micro-stutter before, and now I see it. Obviously the cause of this is not the GPU but the CPU, since the example I found (11, then 41.6, then 15 ms to draw a frame; in instantaneous FPS that is 90, 24, 66) happened at 720x480, where my GPU utilization sits in the low 20% while the CPU is at 100%. Unless there is a bug or something causing it. Who knows. I personally felt that the micro-stutter was actually lessened when I turned off frame smoothing; I will try it again with frame smoothing on, since it is supposed to combat exactly that.
As you can see, the time to draw a frame went from 11, to 41.6, to 15 ms (rounded to the nearest 0.1 ms), which is 90, 24, and 66 instantaneous FPS. The lowest overall FPS measured was 43, but that is because all the frames drawn in that second are averaged together.
Those were frames 606, 607, and 608 in that test.
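To make the arithmetic concrete, here is a minimal sketch (Python; not anything from the game or from FRAPS itself, and the ~17 ms frames around the spike are made up for illustration) of how per-frame times turn into instantaneous FPS, and why a once-per-second average can hide a single slow frame:

```python
# The 11 / 41.6 / 15 ms values are the ones from my log (frames 606-608);
# the surrounding ~17 ms frame times are hypothetical padding.
frame_times_ms = [17.0] * 20 + [11.0, 41.6, 15.0] + [17.0] * 20

# Instantaneous FPS for each frame: 1000 ms divided by that frame's time.
instant_fps = [1000.0 / t for t in frame_times_ms]
print([round(f, 1) for f in instant_fps[20:23]])  # -> [90.9, 24.0, 66.7]

# "Overall" FPS the way a counter reports it: frames rendered in the
# window divided by the window length in seconds. The 41.6 ms spike
# barely moves this number, which is why the average looks fine while
# the game still hitches.
window_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / window_s
print(round(avg_fps, 1))
```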
At that exact point I felt what I had been calling a "jitter"... but it is clearly a textbook case of micro-stutter.
While it is possible to play with, it is definitely not as fun; the game never felt smooth. I complained that my frame rate was in the 20s before I actually ran FRAPS and found out that I am averaging 52 FPS and never dropping below 40 (except if I am staring down the water fountain in the Citadel tower)... It just FEELS like it is in the 20s because it dips into them with micro-stutter, and that seriously lowered the quality of the game (Mass Effect) for me.
So upgrading my CPU is now my first priority (I am thinking a quad core @ 3.6GHz).