Like I said earlier, I do see the difference: there is a stark difference in fluidity of motion between 120 Hz and 60 Hz. But when I hear "blur," I think of actual blur, like what you experience on a roller coaster when you're moving so quickly you can't process your visual input fast enough and everything smears together. I've seen some gamers say as much here - that they see a similar blurring, albeit to a lesser degree than on a roller coaster, I'd guess, when gaming on slower monitors. Whether we call it 'seeing' or 'experiencing,' we're referring to the same thing: how you perceive the monitor in front of you.
Using the roller coaster example above, I'd suggest that some individuals experience that effect at a lower threshold of motion speed than others do, which is why this is all subjective.
There are two kinds of blurring: spatial and temporal. Spatial blurring is caused by things in the optical system that don't change with time: not wearing your glasses, an anti-glare coating, high-spatial-frequency content. Temporal blurring is caused by any sort of changing signal, whether that's something changing position or pixels changing state over time.
Your eye affects temporal blurring rather substantially, because of how it integrates input into an image. Your eye is not a camera: it has no discrete sampling interval; it samples continuously and updates the result continuously. Motion blur on a camera is caused by an object moving during the discrete exposure, so the same point on the object is laid over multiple pixels, hence the blurring. Motion blur for your eye is caused by anything at all that changes - the pixels themselves, what's being displayed on them, or the motion of your head. Your eye does the best it can to interpret the changing signal.
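To make the camera half of that concrete, here's a toy 1-D sketch (my own illustration, with made-up numbers): a bright point moving across a sensor during one discrete exposure deposits light on several pixels, and the smear width grows with exposure time - that's camera-style motion blur.

```python
import numpy as np

def expose(velocity_px_per_ms, exposure_ms, sensor_px=50, steps=1000):
    """Integrate a moving point source over one discrete exposure window."""
    image = np.zeros(sensor_px)
    for t in np.linspace(0.0, exposure_ms, steps):
        pos = int(velocity_px_per_ms * t)  # where the point is right now
        if pos < sensor_px:
            image[pos] += 1.0 / steps      # accumulate light at that pixel
    return image

short = expose(velocity_px_per_ms=1.0, exposure_ms=2.0)   # short exposure
long_ = expose(velocity_px_per_ms=1.0, exposure_ms=10.0)  # long exposure

# The smear width is roughly velocity * exposure time.
print(np.count_nonzero(short), "px smear vs", np.count_nonzero(long_), "px smear")
```

The eye has no such exposure window to shorten, which is why the fix has to happen on the display side instead.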
The big consequence for high refresh rates is that the continuous nature of the eye's temporal response shapes the tech we use to present images to it. If something changes, you see a blur - always. So when a pixel flips to a new state, your visual system is still integrating the previous state, and for a brief moment you perceive a mix of the two. Then comes a "dwell time" on the new state, where the eye is integrating just that one steady value, and then the pixel changes again and the process repeats.
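A rough sketch of that transition-plus-dwell idea (the refresh rate and response time constant here are illustrative assumptions, not measurements): a pixel switches from black (0.0) to white (1.0) with an exponential response, and an eye modeled as a plain continuous average over the frame perceives a mix of the two states rather than either one.

```python
import numpy as np

FRAME_MS = 1000.0 / 144.0   # one refresh interval at 144 Hz, ~6.94 ms
TAU_MS = 2.0                # assumed pixel response time constant

t = np.linspace(0.0, FRAME_MS, 10_000)
pixel = 1.0 - np.exp(-t / TAU_MS)   # pixel ramping from old to new state

# Model the eye as a plain average over the frame (continuous integration).
perceived = pixel.mean()
print(f"perceived level: {perceived:.2f} (neither old 0.0 nor new 1.0)")
```

The slower the ramp relative to the frame, the further the perceived value sits from the new state - that's the smearing in miniature.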
Blur reduction technology blanks the display during pixel transitions, so that only the dwell time carries any luminance, not the transition time. This has a pretty dramatic effect on our perception, because now we see a series of static, disconnected images rather than one image smeared from one state to the next. If you make the flicker high-frequency enough, we also lose the ability to track the luminance going up and down, and everything looks great (with a reduction in average luminance, since the display is off part of the time).
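The luminance trade-off falls out of simple arithmetic. A back-of-envelope sketch (all three numbers are illustrative assumptions, not specs of any real panel): with the backlight on only for the dwell portion of each frame, average luminance drops by the duty cycle.

```python
FRAME_MS = 1000.0 / 120.0   # assumed 120 Hz refresh, ~8.33 ms per frame
STROBE_MS = 2.0             # assumed backlight-on (persistence) window
PEAK_NITS = 400.0           # assumed panel peak luminance

duty = STROBE_MS / FRAME_MS          # fraction of each frame with light output
avg_nits = PEAK_NITS * duty          # perceived average brightness
print(f"duty cycle {duty:.0%}, average luminance ~{avg_nits:.0f} nits")
```

Shorter strobes mean less blur but also a proportionally dimmer image, which is why these modes usually ship with a brightness penalty.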
If our eyes didn't work the way they do, blur reduction tech wouldn't work the way it does. Interesting things to think about.
Temporal blurring is also an issue for high-refresh-rate IPS. IPS panels have slower pixel transition times, and if the transition takes up more of the refresh interval, you get more blurring because there's less dwell time. It's not that sharpness or quality is blurred per se; in effect you're blending one frame into the next because your eye doesn't get enough time to sit on a frame before the pixel starts switching to its next state.
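The dwell-time argument can be sketched in a few lines (the response times below are ballpark illustrative figures, not measurements of any particular panel): the slower the transition, the smaller the fraction of each frame spent showing a settled image.

```python
def dwell_fraction(refresh_hz: float, response_ms: float) -> float:
    """Fraction of each refresh interval spent on a settled (non-transitioning) pixel."""
    frame_ms = 1000.0 / refresh_hz
    return max(0.0, frame_ms - response_ms) / frame_ms

for panel, response in [("fast TN", 1.0), ("slower IPS", 5.0)]:
    print(f"{panel}: {dwell_fraction(240.0, response):.0%} of each frame is dwell time")
```

Note the degenerate case: at 240 Hz a frame is only ~4.2 ms, so an assumed 5 ms transition never finishes before the next refresh starts - the pixel spends the entire frame mid-transition.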