Hi,
I'm in the market for a big flat-screen TV, and I saw all those 200/400/600Hz TVs from Samsung, LG, Sony, etc. I stood in the store in front of a wall of mounted TVs and stared at them to get a feel for this massive frame interpolation, and I clearly noticed that while the input (Blu-ray movies) was still 24 or 30 FPS, the picture moved a lot more smoothly on those sets than on the non-interpolating ones.
Due to budget constraints I'm going to go with a slightly older Samsung F6100/F6400; if anyone has advice against it, please let me know.
All this research into TV technology got me thinking: could the same frame interpolation be implemented in PC monitors for high-FPS gaming?
I know the technology isn't perfect and there are still some artifacts and stutter, but the same principle should work on PCs.
What do you think? It could act as a post-processing effect like MSAA/FSAA, except it would run on the monitor instead of on the GPU.
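To make the idea concrete, here's a minimal sketch of the crudest possible interpolation, a plain cross-fade between two frames. This isn't what the TV chips actually do (they estimate per-block motion vectors and warp pixels along them, which is where the artifacts come from), and the frame shapes here are just placeholders, but it shows where the synthetic in-between frames would come from:

```python
import numpy as np

def interpolate_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Cross-fade between two frames at fractional time t in [0, 1].

    Real motion interpolation estimates motion and warps pixels along it;
    this plain blend just averages the two frames, which ghosts on fast
    motion, but it illustrates the basic idea of a synthesized frame.
    """
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    return ((1.0 - t) * a + t * b).astype(np.uint8)

# Doubling 30 FPS to 60 FPS: emit one synthetic frame between each real pair.
frame_a = np.zeros((1080, 1920, 3), dtype=np.uint8)      # placeholder "frame N"
frame_b = np.full((1080, 1920, 3), 200, dtype=np.uint8)  # placeholder "frame N+1"
mid = interpolate_frame(frame_a, frame_b, 0.5)
```

Note the catch for gaming: to blend toward frame N+1 you have to wait for it, so the monitor would add at least one frame of input lag on top of the processing time.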
<conspiracy>
BTW... how do we know the GPU companies aren't already doing this at the GPU level?
</conspiracy>