That wasn't there when I started my response. It took me a while. I told you it was hard and I was frazzled, but I was right, wasn't I? It is more than 6mm.
who cares? It's an extra 2-4 mm on a 620mm wide device.
That's it, this monitor is garbage. 10mm, not 6mm? Kronvict might as well throw it away now.
It's covered in the TFTcentral review. Perhaps you should read it?
Hey, I just asked a simple question. It's claimed to be 6mm and it didn't look like it was.
Bezel width matters for multi monitor surround setups. That would be exactly why they exaggerated the spec. I never claimed it was junk. Nor did I actually ask either of you two. So, if you really think it's a worthless point then don't bother trying to defend it.
You wanted to know about the bezel width, and I pointed you towards the answer. You're welcome BTW.
Bezel is thin enough for me for eyefinity/surround after seeing pcper do it
Well, no joke. You don't use ULMB and G-Sync in the same situations, and you'd never need both at once; it just depends on what you're playing. Switching between them is a button press on the Swift IIRC, so it takes practically no effort. The short version is:
Low framerates = G-Sync. High framerates = ULMB. Between them they cover the entire gamut of gaming. ULMB doesn't benefit much at low framerates while G-Sync does; at high framerates it's the opposite: G-Sync becomes less beneficial and ULMB/LightBoost becomes tremendously beneficial.
ULMB and G-Sync are two different solutions for two different scenarios. G-Sync won't benefit you if you're playing Black Ops 2 at 300 fps, but you can bet ULMB will make a tremendous difference there. Likewise, if you're playing Crysis 3 at 1440p cranked to the max, you're not going to use ULMB, because it doesn't help much when your frames dip low a lot; G-Sync would help there. It just depends on the game, and like I said, switching between the two modes takes maybe two seconds if that.
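The rule of thumb above can be sketched as a tiny heuristic. Note this is purely illustrative: the `panel_hz` threshold is my own assumed cutoff, not anything NVIDIA specifies, and real users would also factor in how much the framerate fluctuates.

```python
# Illustrative sketch of the "low fps -> G-Sync, high fps -> ULMB" rule.
# The 144 Hz default is an assumption (the Swift's max refresh); the
# threshold itself is a made-up heuristic, not an official NVIDIA rule.

def pick_mode(avg_fps, panel_hz=144):
    """ULMB only pays off when fps stays at or near the panel's refresh;
    below that, G-Sync's variable refresh is the bigger win."""
    return "ULMB" if avg_fps >= panel_hz else "G-Sync"

print(pick_mode(300))  # old shooter at huge framerates -> ULMB
print(pick_mode(45))   # heavy modern game dipping low -> G-Sync
```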
Yeah, their demo of it looked rather nice compared to most triple monitor configs I've seen.
A question, though: how often do other monitors quote the plastic part only as the bezel, and not the edge-to-first-pixel distance? I imagine it would be rather common...
I actually prefer ULMB no matter what framerate. It does add motion clarity even when fps is lower. I guess it's not trivial to implement with G-Sync as the refresh rate varies.
So you don't mind / notice tearing? Do you use V-sync or other?
Thing is also, as people try to wrap their minds around the technical details, the simple conclusion that emerges is "G-Sync is better than V-sync," but we ought not forget that double and triple buffering are also better than plain V-sync. Instead of presenting unfinished (torn) buffers to the monitor, the graphics card rotates in a second buffer once it's rendered and presentable. At 120 Hz we are talking at most 8 ms out of sync with the animation, and 4 ms error on average. Maybe that's the kind of unevenness people can live with, provided the game supports double buffering and the monitor uses ULMB/LightBoost. I'm beginning to imagine just how this might be preferable to the ghostly smear of pixel transitions and the regular stutter of persistence.
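The 8 ms / 4 ms figures can be sanity-checked with quick arithmetic, assuming a finished frame can land anywhere within one refresh period: the worst-case offset between the animation and the displayed frame is one full 120 Hz period, and the average is half that.

```python
# Back-of-envelope check of the timing figures above, under the
# assumption that a frame completes at a uniformly random point
# within a single 120 Hz refresh period.

period_ms = 1000 / 120        # one refresh period at 120 Hz
print(round(period_ms, 1))     # worst-case error: ~8.3 ms
print(round(period_ms / 2, 1)) # average error: ~4.2 ms
```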
Double and triple buffering do not stop tearing; they just give the GPU a place to write to and store images. V-sync can't be done without double buffering, BTW, and modern games always have at least double buffering, with or without V-sync (I'm not sure they ever shipped without it). Triple buffering just lets the GPU write to a third buffer while waiting to show the image in the second buffer when V-sync is on. Without V-sync, either method can flip the image during a refresh, resulting in tearing.
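The point that buffering alone doesn't prevent tearing can be shown with a toy timing model. This is an idealized sketch (no driver queueing, instantaneous flips, scanout filling the whole refresh period), not how a real swap chain is implemented: a flip that lands mid-scanout splits the displayed image between two frames, while a V-synced flip waits for vblank and never tears.

```python
# Toy model: does a buffer flip land mid-scanout (tear) or at vblank?
# Assumptions: idealized 120 Hz scanout, instantaneous flips, and
# frame-finish times given directly; no real graphics API involved.

REFRESH_MS = 1000 / 120  # one 120 Hz scanout period (~8.33 ms)

def count_tears(frame_times_ms, vsync):
    """Count flips that land mid-scanout. With V-sync on, each flip is
    deferred to the next refresh boundary, so nothing can tear."""
    tears = 0
    for t in frame_times_ms:
        if vsync:
            continue                   # flip waits for vblank: no tear
        if t % REFRESH_MS > 0.01:      # flip lands mid-scanout
            tears += 1
    return tears

# GPU finishing a frame every 7 ms (~143 fps) against 120 Hz scanout:
frames = [i * 7.0 for i in range(1, 10)]
print(count_tears(frames, vsync=False))  # every flip here tears
print(count_tears(frames, vsync=True))   # 0
```

With V-sync off, double vs. triple buffering only changes where the GPU renders next, not when the flip happens, which is why either can tear.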
Except triple buffering also increases input lag, which G-Sync doesn't do.
Syncing the monitor refresh to the source is just fundamentally a better way of displaying video, period.
I really hope the early adopters give both a fair shake and report back, instead of immediately embarking on a vapid and soul destroying gaming binge.
I'll be getting one of these as soon as possible (I've got friends and family scouring their local Fry's Electronics), but I was planning on just using G-sync because I don't understand what ULMB does. I also can't follow what you guys are saying about double and triple buffering; I have no idea how that works. Isn't G-sync the whole point of the monitor? So I can get high FPS and no tearing? That's what I'm after.
Thanks for the correction, it's been a while since I read up on this stuff. http://www.anandtech.com/show/2794/2 My point remains: there is a better way to V-sync than the one typically discussed in the G-sync explanations, one that doesn't delay the rendering of frames in any way. Though to be fair, the Nvidia guy did bring it up on his first visit to PCPer.
Then you proceeded to belittle me for asking. So, forgive me if I'm not feeling particularly grateful.