Originally posted by: Nnyan
Ok so what's the best way to determine which panel type your LCD has?
Originally posted by: Nnyan
xtknight,
Again thank you! AT is very fortunate to have you as a resource.
Originally posted by: BassBomb
it takes a bit of time to get used to an LCD.... happened to me when i switched from CRT
@FireChicken: From what I've seen, the 120Hz LCDs have only shown up in TVs so far (32"+). I would love to see 20-24" monitors with this tech but have yet to find anyone even thinking of releasing them.
Originally posted by: Elcs
Originally posted by: BassBomb
it takes a bit of time to get used to an LCD.... happened to me when i switched from CRT
I had the same problem going from a 17" CRT to 17" LCD.
Brightness killed me, made my eyes hurt and made me feel unwell.
Within a week, I was fine and I could never go back to a CRT style setup.
@CP5670 thanks for the info, even if it is a bit of a bummer. I'm going to buy myself an LCD next month or so for my bday, so hopefully there will be some better options in the 20-24" range for me to pick from.
It's been said multiple times in this thread that an LCD can only display 60 fps. Am I missing what you are saying? Because I can clearly get more than 60fps if I don't have vsync on.
Since LCD monitors do not employ phosphors, refresh rate is not a concern. Basically, the transistors in the LCD remain open or closed as needed until the image changes. This can be a point of confusion for some consumers, however, since most graphics cards still "ask for" a refresh rate setting. This is due to the analog nature of existing graphic cards (see "Inputs" section) and their support for CRT displays. While refresh rates do not apply to LCD monitors, most LCDs are set up to accept any settings from 60Hz and above.
Originally posted by: CP5670
It's been said multiple times in this thread that an LCD can only display 60 fps. Am I missing what you are saying? Because I can clearly get more than 60fps if I don't have vsync on.
It doesn't matter if you're getting more than 60fps without vsync, since the video card will not fully output every frame. Parts of some frames get dropped to make room for the extra ones, which results in the tearing effects you get. One way to look at it is that the rate at which the video card sends out actual pixels depends only on the refresh rate, not the framerate.
Since LCD monitors do not employ phosphors, refresh rate is not a concern. Basically, the transistors in the LCD remain open or closed as needed until the image changes. This can be a point of confusion for some consumers, however, since most graphics cards still "ask for" a refresh rate setting. This is due to the analog nature of existing graphic cards (see "Inputs" section) and their support for CRT displays. While refresh rates do not apply to LCD monitors, most LCDs are set up to accept any settings from 60Hz and above.
This description is rather misleading. The monitor is still being fed input data at a specific rate irrespective of how it internally updates the pixels, so that will always be an upper limit on how fast the final image can be updated. Other restricting factors (like how fast the individual pixels can update, i.e. the response time) will sometimes come into play before you hit the limit imposed by a 60Hz refresh rate, but that certainly doesn't mean that the refresh rate concept does not apply to LCDs.
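To put the point above in concrete terms, here's a quick back-of-the-envelope sketch (my own illustration, not code from anyone in this thread): however many frames the GPU renders, the cable only carries one full scanout per refresh, so extra frames can only show up as partial slices within a scanout, i.e. tearing.

```python
# Rough model of what actually reaches the panel, assuming the simple
# case of one scanout per refresh interval (illustration only).

def frames_reaching_screen(render_fps, refresh_hz):
    """The display is scanned out refresh_hz times per second, so at
    most refresh_hz distinct full frames can ever arrive."""
    return min(render_fps, refresh_hz)

def frame_slices_per_scan(render_fps, refresh_hz):
    """With vsync off and render_fps > refresh_hz, each scanout
    stitches together pieces of several rendered frames -- the
    boundaries between them are the visible tear lines."""
    return max(render_fps / refresh_hz, 1.0)

# A 3000fps counter on a 60Hz panel still yields only 60 scanouts/sec,
# each made of ~50 partial frames:
print(frames_reaching_screen(3000, 60))   # 60
print(frame_slices_per_scan(3000, 60))    # 50.0
```

So a framerate counter reading 3000fps is measuring render rate, not what the monitor displays.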
Originally posted by: Cutthroat
Refresh rate and response time are not exactly the same thing. When the fps is greater than a CRT's refresh rate, tearing can occur (parts of the image are not aligned properly); when the fps is greater than an LCD's response time allows, you should see ghosting (leftover frames giving a ghosting effect). I cannot get my monitor to show any symptoms of ghosting no matter the framerate. I have an Acer AL2216W; the response time is advertised as 5ms, so by this logic my maximum framerate should be 200fps (1s/5ms = 200). That is an optimal situation, since the advertised response time of LCDs is not measured using a standard. Some manufacturers calculate the response time as the time it takes for a pixel to change from white to black, which is optimal but unlikely; others measure it from grey pixel back to grey pixel, which is more real world.
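The 1s/5ms arithmetic above can be written out as a tiny sketch (my own, using the advertised spec, which as noted isn't measured to any standard):

```python
# Upper bound on pixel transitions per second implied by an advertised
# response time. Real transition times vary by which colors are
# involved and by how the manufacturer measures, so this is a ceiling.

def max_transitions_per_sec(response_time_ms):
    return 1000.0 / response_time_ms

print(max_transitions_per_sec(5))   # 200.0, matching the 1s/5ms = 200 figure
```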
Anyway, I made some screenshots to show there is no ghosting, tearing, artifacts or anything else when I have a fps>60.
http://i208.photobucket.com/al...at_012/stalkermain.jpg
http://i208.photobucket.com/al...oat_012/stalkersky.jpg
http://i208.photobucket.com/al..._012/stalkerforest.jpg
http://i208.photobucket.com/al...at_012/stalkerdoor.jpg
http://i208.photobucket.com/al...tthroat_012/3dmark.jpg
I tried to post them at 1680x1050, but photobucket automatically resized them.
I also had one showing more than 3000 fps on one of the S.T.A.L.K.E.R. loading screens, but it was boring, mostly black screen. I think the screenshot of the main screen of S.T.A.L.K.E.R. proves my point well enough. But explain how it is even possible to achieve 1000fps with this logic at all, let alone without artifacts.
The point I'm arguing is that it is possible to achieve more than 60fps on an LCD, depending on its quality, without ghosting. Now, although I can get 100 fps without glitches, it was hard to find any scenes where I could get those fps; normally, running around the forest there in Stalker, I averaged about 50fps. I don't have any old games around to test that would really get a lot of fps. I still normally run everything with triple buffering and vsync on because it is smoother when there are large fluctuations in the fps. Besides, do you normally need to see more than 60 fps? That's extremely smooth for most people most of the time; even if you may see stuttering and lag in an intense scene in an fps, it is not 'flickering', and 99% of people will never notice it anyway. There is also no way to get around some stuttering in a new game. I think I would need to be running a pretty old or crappy game to achieve greater than 60fps all the time anyway. I put up with a few seconds of stuttering sometimes in a game so I can have the best graphics 99% of the time. I'm rambling now so...
Originally posted by: Cutthroat
I'll grant you the argument that Fraps is measuring my GPU output; that would explain why it shows 3000fps at all. But according to the specs for my monitor it has a 5ms response time, so that should mean the maximum number of times each pixel can be changed (not refreshed, because it never goes off, it only changes) is 200, or effectively 200fps, before I would even start to run into the issues you are talking about. As I said previously, no game that I play would ever get anywhere near 200fps while moving, so it's a non-issue. And I play games with vsync on anyway, and still can't see any 'flickering'.
This thread started about LCDs flickering due to low refresh rate. That is just not the case: LCDs do not 'flicker', because the pixels never go off, they just change color. As previously stated, refresh rate and response time are not the same. An LCD monitor would work fine if there were no such setting as refresh rate, but since graphics drivers ask for a refresh rate, it has to be there for compatibility. Try this test: override your LCD monitor to run at 30Hz. It doesn't flicker. Now try that with your CRT.
The important factor with an LCD is the response time; if the response time is too slow, you will definitely see the symptoms. Although the end result may be similar, the difference needs to be distinguished, because when you buy a new LCD monitor you don't care about the refresh rate, you care about the response time.
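Putting the two sides of this thread together: the frame rate you can cleanly perceive is bounded by both numbers, whichever is lower. A minimal sketch (my own, with hypothetical example values for the second panel):

```python
# Two independent ceilings on how fast the visible image can change:
# the scanout rate caps how often new frames arrive at the panel, and
# the response time caps how fast pixels can finish changing.

def effective_fps_ceiling(refresh_hz, response_time_ms):
    response_limit = 1000.0 / response_time_ms
    return min(float(refresh_hz), response_limit)

# 60Hz panel with a 5ms advertised response: refresh is the bottleneck.
print(effective_fps_ceiling(60, 5))    # 60.0
# Hypothetical 240Hz panel with an 8ms response: now the pixels lag.
print(effective_fps_ceiling(240, 8))   # 125.0
```

Which is to say both posters have a point: on a typical 60Hz panel the refresh rate is the binding limit, and the response time only becomes the bottleneck once it is slower than one refresh interval.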
Originally posted by: pcslookout
I have the same problem here. I have a 20 inch NEC 20WMGX2 and sometimes it's unbearable. I'm thinking about going back to my 19 inch NEC CRT if I can't fix this problem. It doesn't seem to happen when playing games, though.
I think the problem may be my eyes, though, because I have to look pretty close at the LCD to read the text. I am farsighted and the text is so small. I tried making it bigger, but it messes up webpages, windows, and everything.