Are you talking about real displays that refresh at 120Hz and accept 120Hz input (often used for active shutter 3D), or fake "120Hz" displays that just do framerate conversion to 120Hz? That article is about the latter, a TV from 2008. I basically wouldn't trust anyone who tried to demonstrate the real-world effects of a feature using four-year-old TVs; at best it's not relevant to current models, and at worst it's flat-out wrong because it's outdated.
I wouldn't pay extra for a TV or monitor that just does framerate conversion to 120Hz; in fact, for gaming that processing adds extra frames of delay. It can look nice if you're bothered by the jerkiness of 24fps video, but it can also look ugly if badly implemented, and the extra smoothness can seem weird to people used to lower framerates. On the other hand, I WOULD pay extra for a monitor that can really display 120Hz and accept a 120Hz input signal, as that would genuinely improve gaming and significantly lower latency (since latency is a fixed number of frames, and at 120Hz each frame takes half as long).
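To make that latency point concrete, here's the arithmetic as a quick sketch. The 3-frame pipeline is a made-up but plausible figure, not a measured value for any particular display:

```python
def lag_ms(frames_of_delay, refresh_hz):
    """Input lag in milliseconds: frame count times the duration of one frame."""
    return frames_of_delay * 1000.0 / refresh_hz

# Same 3-frame pipeline, different refresh rates:
print(lag_ms(3, 60))   # 3 frames at 60Hz  -> 50.0 ms
print(lag_ms(3, 120))  # 3 frames at 120Hz -> 25.0 ms
```

Since the delay is counted in frames rather than milliseconds, doubling the refresh rate halves the wall-clock latency without changing anything else in the pipeline.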