Asus Rog Swift PG278Q

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
Oh. Of course you do. Major surprise.

Any competitive gamer who has experience with it will attest to it being excellent. It works. I assume that AMD won't be getting anything equivalent to motion blur reduction technology, since they've said nothing about it. G-Sync covers the low framerate area, LightBoost/ULMB covers high framerates... Anyway, if AMD doesn't get it, you won't get it on an official basis. (I assume, since NVIDIA poured their R&D dollars into it.) Maybe hacks will still work, I don't know.

Never mind. I did not read that correctly.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I think he was only objecting to how you basically described them as the same thing, since they aren't really the same feature, as you claimed.

ULMB is a feature included in G-Sync monitors, but it isn't necessarily a feature that will always be included. For instance, I doubt we'll see it included in the 4K G-Sync monitors, as we're currently limited to 60Hz at that resolution due to the bottleneck with DisplayPort (we would need DP 1.3 to get 4K120), unless they can get those monitors to run at 85-120Hz when driven at 1080p (which would be a pretty awesome feature).

From my understanding ULMB capability is contained within the G-Sync module itself, as one of its functions. Perhaps my use of the word "feature" was incorrect, as I did not mean to imply that they do the same thing, but rather that the hardware that enables them is part of the same module. So the complaint that ULMB or G-Sync lose value because you can't use them simultaneously is not entirely valid, since you can't get them separately at this point.

I am not certain that you can claim that ULMB won't always be included with G-Sync. I can't see a reason why after designing it into the module they would take the time to take it out again.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
On that note, on the whole IPS vs TN vs AMVA thing, here's hoping OLED takes off in the next 5 years. To this day there are significant issues preventing mass production of consumer-sized big screens and monitors; hopefully those get ironed out.

Basically, I find good points and bad points with all panels. IPS is great for productivity and work, but has issues with ghosting due to problems inherent to IPS electronics. It's the only panel type with 10-bit color (although 10-bit color does NOT work unless you have a Quadro or FirePro card), and 10-bit color is only useful for professional applications such as video/photo editing; games do not use 10-bit color. Good for work and slower paced games, though not the best at all games. I think the Swift is the best gaming monitor so far, based on reviews I've read, and it is a TN. Obviously TN panels have the viewing angle issue, and while the panel in the Swift is a very good one, that still doesn't fix off-center viewing angles. I'll also say that I think AMVA panels are excellent - they don't get much mention here, but every VA I've seen had black levels/contrast ratios that IPS and TN could only dream of achieving. Neither TN nor IPS has true blacks when viewed by the eye. VA, and especially plasma, excels at black levels/contrast ratios where TN/IPS fall flat on their face.

All about trade-offs. I can't believe it's 2014 and LCD technology hasn't progressed further; I want something new to come along. In the meantime, you have to pick your poison based on what you do with your monitor. But I really hope OLED takes off sometime in the next few years.
 

Mand

Senior member
Jan 13, 2014
664
0
0
On that note, on the whole IPS vs TN vs AMVA thing, here's hoping OLED takes off in the next 5 years. To this day there are significant issues preventing mass production of consumer-sized big screens and monitors; hopefully those get ironed out.

OLED beats them all. Fundamentally, LCDs suck, because they're throwing away half the light. Emissive patterning like OLED doesn't have that problem. Backlight bleed, black levels, color shifts, all of that nonsense goes away.

The problem with OLED is lifetimes. It's fine for a smartphone, but it's awful for a desktop display with a much longer expected lifetime. It's not a situation where it just fails outright; rather, there's a gradual decay in brightness as the display is used. Higher brightness causes it to decay faster, and the colors themselves don't have the same decay rates: blue decays fastest, then green, and red is the most stable. But that means you'll get luminance changes in your color channels over the lifetime of the device, which change based on what you're looking at, so play a game with a HUD for long enough (or leave the display on the Windows lock screen) and you'll start to preferentially decay certain pixels more than others, which results in a mess of an image.
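To make the differential-aging effect concrete, here's a toy Python sketch (the decay constants and duty cycles are invented for illustration; they are not real panel data):

# Toy model of differential OLED channel aging. Constants are assumptions
# for illustration only: blue decays fastest, red slowest, and a subpixel
# that spends more time lit (e.g. under a static HUD element) ages faster.
decay_per_1000h = {"red": 0.01, "green": 0.02, "blue": 0.05}

def remaining_luminance(channel, hours, duty_cycle):
    # duty_cycle: fraction of on-time this subpixel spends lit (0..1)
    rate = decay_per_1000h[channel]
    return (1 - rate) ** (hours * duty_cycle / 1000)

for ch in ("red", "green", "blue"):
    hud = remaining_luminance(ch, 10_000, duty_cycle=0.9)  # static HUD pixel
    bg = remaining_luminance(ch, 10_000, duty_cycle=0.3)   # typical pixel
    print(f"{ch}: HUD pixel at {hud:.2f}, background pixel at {bg:.2f}")

After 10,000 hours the blue subpixels under the HUD end up visibly dimmer than their neighbors, which is exactly the "mess of an image" described above.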

Once they get the remaining issues ironed out, OLED will be the clear winner. Patterning the emitters is just better than patterning an absorption layer that works off of polarization.

I can't believe it's 2014 and LCD technology hasn't progressed further; I want something new to come along.

Unfortunately, it's not just a matter of time. It's also a matter of physics. If there were a way to get liquid crystals to behave in a way that gave us IPS performance at TN speeds, we would already be doing it.
 

SoulWager

Member
Jan 23, 2013
155
0
71
OLED beats them all. Fundamentally, LCDs suck, because they're throwing away half the light. Emissive patterning like OLED doesn't have that problem. Backlight bleed, black levels, color shifts, all of that nonsense goes away.

The problem with OLED is lifetimes. It's fine for a smartphone, but it's awful for a desktop display with a much longer expected lifetime. It's not a situation where it just fails outright; rather, there's a gradual decay in brightness as the display is used. Higher brightness causes it to decay faster, and the colors themselves don't have the same decay rates: blue decays fastest, then green, and red is the most stable. But that means you'll get luminance changes in your color channels over the lifetime of the device, which change based on what you're looking at, so play a game with a HUD for long enough (or leave the display on the Windows lock screen) and you'll start to preferentially decay certain pixels more than others, which results in a mess of an image.

Once they get the remaining issues ironed out, OLED will be the clear winner. Patterning the emitters is just better than patterning an absorption layer that works off of polarization.



Unfortunately, it's not just a matter of time. It's also a matter of physics. If there were a way to get liquid crystals to behave in a way that gave us IPS performance at TN speeds, we would already be doing it.

I want to see more work going into QD LED displays.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
From my understanding ULMB capability is contained within the G-Sync module itself, as one of its functions. Perhaps my use of the word "feature" was incorrect, as I did not mean to imply that they do the same thing, but rather that the hardware that enables them is part of the same module. So the complaint that ULMB or G-Sync lose value because you can't use them simultaneously is not entirely valid, since you can't get them separately at this point.

I am not certain that you can claim that ULMB won't always be included with G-Sync. I can't see a reason why after designing it into the module they would take the time to take it out again.

Yeah, it's a little confusing.

We have a hardware module that is called "G-Sync", and it has two key features, G-Sync and ULMB. The two features are definitely not the same, but they are controlled by the same hardware module.

And my claim that we might see G-Sync without ULMB is based on the fact that we're supposed to see a 4K G-Sync monitor from Acer (and I think even ASUS). I can only guess that those monitors will be limited to 60Hz, and thus there would simply be no reason to make ULMB available from the G-Sync module. That is, unless we somehow get an interface capable of the bandwidth required to push 4K at ULMB-friendly refresh rates, and/or see the option to run a higher refresh rate at a lower resolution; 4K should be capable of perfect pixel doubling down to 1080p, and would then have plenty of bandwidth to drive the monitor well over 120Hz.
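To put rough numbers on the bandwidth argument (a back-of-the-envelope Python sketch; it ignores blanking intervals, which add several percent on top, and assumes 24-bit color):

# Uncompressed video bandwidth vs. DisplayPort payload capacity.
DP12_GBPS = 17.28  # DP 1.2 (HBR2) payload after 8b/10b encoding
DP13_GBPS = 25.92  # DP 1.3 (HBR3) payload after 8b/10b encoding

def video_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(video_gbps(3840, 2160, 60))   # ~11.9 Gbit/s: fits DP 1.2
print(video_gbps(3840, 2160, 120))  # ~23.9 Gbit/s: exceeds DP 1.2, needs DP 1.3
print(video_gbps(1920, 1080, 144))  # ~7.2 Gbit/s: trivially fits, so a
                                    # pixel-doubled 1080p high-refresh mode is plausible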
 

AdamK47

Lifer
Oct 9, 1999
15,322
2,928
126
On that note, on the whole IPS vs TN vs AMVA thing, here's hoping OLED takes off in the next 5 years. To this day there are significant issues preventing mass production of consumer-sized big screens and monitors; hopefully those get ironed out.

Basically, I find good points and bad points with all panels. IPS is great for productivity and work, but has issues with ghosting due to problems inherent to IPS electronics. It's the only panel type with 10-bit color (although 10-bit color does NOT work unless you have a Quadro or FirePro card), and 10-bit color is only useful for professional applications such as video/photo editing; games do not use 10-bit color. Good for work and slower paced games, though not the best at all games. I think the Swift is the best gaming monitor so far, based on reviews I've read, and it is a TN. Obviously TN panels have the viewing angle issue, and while the panel in the Swift is a very good one, that still doesn't fix off-center viewing angles. I'll also say that I think AMVA panels are excellent - they don't get much mention here, but every VA I've seen had black levels/contrast ratios that IPS and TN could only dream of achieving. Neither TN nor IPS has true blacks when viewed by the eye. VA, and especially plasma, excels at black levels/contrast ratios where TN/IPS fall flat on their face.

All about trade-offs. I can't believe it's 2014 and LCD technology hasn't progressed further; I want something new to come along. In the meantime, you have to pick your poison based on what you do with your monitor. But I really hope OLED takes off sometime in the next few years.

I love my new AMVA monitor. Bought a BenQ BL3200PT a week ago. The contrast is awesome!
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Indeed. I like AMVA a lot as well; I was surprised that the viewing angles are actually good, and (AM)VA in general excels with deep blacks and contrast ratios. It's very noticeable with movies especially; it's an awesome panel tech and a shame it doesn't get more attention. Not perfect (nothing is, with any panel tech), but very nice.

I'm not sure how VA fares in terms of fast response/ghosting, as I haven't gamed for an extended period on one... but by those other metrics they're pretty awesome. I like them a lot. Is that the 32-inch 4K panel you bought? If so, it looks like a great panel. I think 28 inches is way too small for 4K (given Windows' DPI problems), but 32 inches would be just about perfect for 4K, IMO.
 

AdamK47

Lifer
Oct 9, 1999
15,322
2,928
126
Nope, not 4K. 32" 2560x1440, which is the same PPI as a 24" 1920x1080. No squinting required.
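For anyone who wants to check the math, the pixel densities really are identical:

# PPI = diagonal resolution in pixels / diagonal size in inches
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(ppi(2560, 1440, 32))  # ~91.8 PPI
print(ppi(1920, 1080, 24))  # ~91.8 PPI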
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Indeed. I like AMVA a lot as well; I was surprised that the viewing angles are actually good, and (AM)VA in general excels with deep blacks and contrast ratios. It's very noticeable with movies especially; it's an awesome panel tech and a shame it doesn't get more attention. Not perfect (nothing is, with any panel tech), but very nice.

I'm not sure how VA fares in terms of fast response/ghosting, as I haven't gamed for an extended period on one... but by those other metrics they're pretty awesome. I like them a lot. Is that the 32-inch 4K panel you bought? If so, it looks like a great panel. I think 28 inches is way too small for 4K (given Windows' DPI problems), but 32 inches would be just about perfect for 4K, IMO.

I think a major reason it doesn't get more attention is that it just doesn't have as much of a place on the PC.

IPS for work, TN for play/cost, which leaves VA to excel at media viewing, which is almost always going to rank behind work/play on PC. I've thought about adding a 3rd monitor to my main rig specifically for playing video and going with VA for it, but at the end of the day if I'm really wanting a quality display for playing back video and I want to appreciate that quality, I'll take the time to sit down in front of my plasma. Otherwise I'm perfectly content to play back video on my IPS if I want to have something on while I'm playing on my TN.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I haven't seen anyone here say any panel is perfect; you need a re-read before you wind up qualifying for your label.

I've also seen Linus' review, and he said exactly what I've been saying: you can still tell it's TN and it's still not as good as IPS in the metrics IPS excels at, regardless of all the hype. It's good at what it's marketed on: response times, G-Sync, high refresh rate, and less blur as a result of those metrics, for those susceptible to it.

That's not what I heard in his review at all. He did note the viewing angles being a detriment if you're not viewing centered, which obviously all TN panels suffer from when not viewed dead on. But he stated specifically that 1) he is an IPS panel fan and has always preferred IPS (he has said this in the past); 2) the Swift didn't feel to him like a TN experience at all; in other words, he loved it, and it is significantly better than any TN panel he's ever seen (including the 4K TN panels); and 3) he called the monitor a "slam dunk" and easily the best gaming monitor ever, per his words. He also said the monitor is good enough for him to use as a legit daily driver. He has always used IPS panels as his daily driver, so apparently this was good enough that he considered going to the Swift full time for everything. He also made some comments on ULMB/G-Sync as well.

As I said before, it's all trade-offs. IPS isn't perfect and TN isn't perfect, but this is probably the closest thing yet to the best gaming monitor ever. I think IPS is best for productivity and professionals, and pretty good for slower paced games. IPS used to have the advantage of being the only panel type with high resolutions, but that is no longer the case, since AMVA and TN can have 1440p-4K panels as well now (and these high-res TN panels are significantly better than older ones). Previously, I liked IPS a lot for this very reason: higher than 1080p resolution, since I cannot stand 1080p. I still like IPS, but IPS isn't the only tech that allows high resolution now, so as always, buy based on your usage model. Working? IPS. Gaming? The Swift sounds really nice.

Anyway, hopefully we'll have more options in the coming months, I'm in a wait and see mode to see what other g-sync panels come.

His review is here:

https://www.youtube.com/watch?v=XdqTIfNv2DE&list=UUXuqSBlHAE6Xw-yeJA0Tunw
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
ULMB and G-Sync are the same feature...they're both on the G-Sync module. You can't get one without the other.

They are not the same feature. You can't use Gsync and ULMB at the same time. Whether they come together on the same monitor doesn't matter.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
From my understanding ULMB capability is contained within the G-Sync module itself, as one of its functions. Perhaps my use of the word "feature" was incorrect, as I did not mean to imply that they do the same thing, but rather that the hardware that enables them is part of the same module. So the complaint that ULMB or G-Sync lose value because you can't use them simultaneously is not entirely valid, since you can't get them separately at this point.

I am not certain that you can claim that ULMB won't always be included with G-Sync. I can't see a reason why after designing it into the module they would take the time to take it out again.

That you can't use them simultaneously is the issue. What isn't valid? Are you saying you wouldn't want G-Sync and ULMB at the same time? Of course you would. If I'm using G-Sync, the main feature people are touting these monitors for, then ULMB will be off. Unless you are running at or above the refresh rate, you will need to forgo ULMB, and that's going to be most of the time for most users.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Well, no joke. Duh, you don't use ULMB and G-Sync in the same situations. It's a button press on the Swift, IIRC. You would never need to use ULMB and G-Sync at the same time; it just depends on the situation. Not that it matters, because it takes practically no effort to switch between the two, but the short version is:

Low frame rates = G-Sync. High frame rates = ULMB. Together they cover the entire gamut of gaming. ULMB doesn't help much at low framerates while G-Sync does, and the opposite is true at high framerates, where G-Sync becomes less beneficial and ULMB/LightBoost becomes tremendously beneficial.

ULMB and G-Sync are two different solutions for two different scenarios. G-Sync won't benefit you if you're playing Black Ops 2 and your framerate is 300 fps, but you can bet that ULMB will make a tremendous difference there. It doesn't matter much, though; switching between the two modes is trivial and takes probably 2 seconds, depending on which game you're playing. Clearly, if you're playing Crysis 3 at 1440p cranked to the max, you're not going to use ULMB, because ULMB would not make a tremendous difference when your frames are going to dip low a lot, whereas G-Sync would help there. It just depends on the game, and like I said, switching between the two modes takes a second, if that.
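The rule of thumb above boils down to something like this (a sketch; the 120Hz threshold is my assumption based on the Swift's ULMB modes, not an official figure):

def pick_mode(min_fps, ulmb_refresh=120):
    # If the framerate reliably stays at or above the strobe rate,
    # blur reduction wins; otherwise let G-Sync absorb the variation.
    return "ULMB" if min_fps >= ulmb_refresh else "G-Sync"

print(pick_mode(min_fps=250))  # Black Ops 2 at ~300 fps -> ULMB
print(pick_mode(min_fps=35))   # Crysis 3 at 1440p maxed -> G-Sync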
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well, no joke. Duh, you don't use ULMB and G-Sync in the same situations. It's a button press on the Swift, IIRC. You would never need to use ULMB and G-Sync at the same time; it just depends on the situation. Not that it matters, because it takes practically no effort to switch between the two, but the short version is:

Low frame rates = G-Sync. High frame rates = ULMB. Together they cover the entire gamut of gaming. ULMB doesn't help much at low framerates while G-Sync does, and the opposite is true at high framerates, where G-Sync becomes less beneficial and ULMB/LightBoost becomes tremendously beneficial.

ULMB and G-Sync are two different solutions for two different scenarios. G-Sync won't benefit you if you're playing Black Ops 2 and your framerate is 300 fps, but you can bet that ULMB will make a tremendous difference there. It doesn't matter much, though; switching between the two modes is trivial and takes probably 2 seconds, depending on which game you're playing. Clearly, if you're playing Crysis 3 at 1440p cranked to the max, you're not going to use ULMB, because ULMB would not make a tremendous difference when your frames are going to dip low a lot, whereas G-Sync would help there. It just depends on the game, and like I said, switching between the two modes takes a second, if that.

Define "low framerates". 80Hz? 100Hz? At what framerate do we stop being concerned about blurring?

As for how easy or difficult it is to switch between them: how does that matter to what I said?

Since when is 300fps even a relevant framerate when discussing modern gaming? If you game @ 300fps, or anywhere above the refresh rate, you don't even need Gsync.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Personally, I would very much have liked G-Sync and ULMB together, because I do not want to be dealing with vsync stutter just so I can reduce the blur. ULMB only ends up useful in a small subset of games where your PC is so powerful that you can maintain a consistent 120/144 fps without any drops at all: things like CS and other high-speed, low-graphics twitch shooters where the frame rate is dependable. It might be beneficial in BF4, but the frame rate there is far too varied for it to be usable in such a game, which is where G-Sync is good.

Having both together would have been a killer combination, IMO; it would have clearly made G-Sync astounding. But I can also understand why NVIDIA chose not to solve that problem with G-Sync v1, because G-Sync in itself is worth having.
 

know of fence

Senior member
May 28, 2009
555
2
71
The scary thing is that even the best IPS monitors in the worst case take nearly 25ms to change a pixel. They can on average achieve 13ms, but they almost never reach the spec sheet and the manufacturers' claims of 5ms. Even at 60Hz, IPS screens aren't done changing by the time the next frame comes along, which is why it's not uncommon to have two frames' worth of information bleeding into the current one. We do see those after-images; we perceive the blur. It's well within the eye's ability to detect: everyone can see it, just not everyone cares. Blind trials on high-speed panels have repeatedly shown that people can absolutely tell the difference. It's not really that some people don't notice (everyone does); it's just that some people have never seen better than what they are using.

Without overdrive, your 25ms for IPS is pretty much dead on. With overdrive they can do better; here's an example for an Eizo PLS screen (eizo_ev2736w).


While IPS (PLS) may be slow, it is remarkably consistent, whereas transitions on a TN can last anywhere from 1ms to 10ms. I think that transition-time inconsistency can affect picture quality as well: on a TN, the faster dark transitions appear several ms before the slower bright transitions. Anyway, for depiction of motion there seems to be no substitute for the speed of TN, and even a fast TN panel (without OD) is barely fast enough for 120Hz ULMB.
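You can see the problem by comparing transition times against the refresh interval (simple arithmetic, using the figures quoted above):

def frames_spanned(response_ms, refresh_hz):
    # How many refresh intervals does one pixel transition occupy?
    return response_ms / (1000 / refresh_hz)

print(frames_spanned(25, 60))   # ~1.5: worst-case IPS is still transitioning
                                # when the next frame arrives
print(frames_spanned(13, 60))   # ~0.8: average IPS just squeaks by at 60Hz
print(frames_spanned(5, 120))   # ~0.6: a fast TN keeps up even at 120Hz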

Viewing angles and the resulting color shifts on a TN become bothersome as screen size increases or as the distance to the screen decreases, which is why a smaller screen may be preferable. At the end of the day I can still sit further away from the monitor, leaning back like the infamous South Park caricature, but then the tighter 2K pixel pitch is a bit of a waste...
 

Mand

Senior member
Jan 13, 2014
664
0
0
That you can't use them simultaneously is the issue. What isn't valid? Are you saying you wouldn't want G-Sync and ULMB at the same time? Of course you would. If I'm using G-Sync, the main feature people are touting these monitors for, then ULMB will be off. Unless you are running at or above the refresh rate, you will need to forgo ULMB, and that's going to be most of the time for most users.

There are technical reasons why they're incompatible: ULMB works at specific fixed frequencies, while G-Sync works by varying the frequency. Do we want them both? Sure. But complaining that we don't get what we want is about as valid as complaining that the G-Sync module doesn't create a Holodeck. Neither tech existed before this; stop complaining when we get new things.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Not to mention the GPU power required to run it. Good luck getting 144Hz at 7680x1440.
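Rough math on the pixel throughput (just resolution times refresh rate):

def gigapixels_per_second(w, h, hz):
    return w * h * hz / 1e9

print(gigapixels_per_second(7680, 1440, 144))  # ~1.59 Gpix/s for triple-wide surround
print(gigapixels_per_second(2560, 1440, 144))  # ~0.53 Gpix/s for a single panel

Three times the pixels per frame, and each frame still has to be rendered in under 7ms.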
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Conversely, this is where G-Sync would make the absolute biggest difference: in the lower framerate realms, such as 30-60 fps, where tearing is (generally) the most evident.

That's why I'm pretty excited about G-Sync at 4K, but what I'm not excited about is LCD technology and Windows' poor job of DPI scaling. LCD tech would let me get 4K, but then I'm stuck with MS' piss-poor DPI scaling; I still think 28 inches at 4K is too small (32 to 34 inches is about right for 4K); and LCD tech would prevent me from getting 144Hz on a 4K screen. Could I lower the resolution to get ULMB on a 4K panel? Nope, no dice; as everyone knows, you don't use non-native resolutions on LCD technology. Well, you *can*, but it will look horrible due to pixel doubling and scaling.

I still want better monitor tech. Here we are in 2014, and no matter what panel you get, it's a matter of which trade-offs you find acceptable. What I want is TN's (relatively) ghost-free movement and motion, high refresh rates, IPS viewing angles, and plasma contrast ratios, all on a single panel. But sadly we can't have this. Anyway, back to the main point: I think these are the areas where G-Sync could excel, surround and super-high resolutions, since those are the areas where you would most frequently get dips into low framerate territory.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
I'm curious as to how that is hooked up. Doesn't G-Sync need DP to work? I don't know of any NVIDIA cards that have 3x DP outputs, unless they are using tri-SLI and using the DP output of each card. Is it possible to do that with multi-GPU?

Edit - Looks like you can use the DP of each card, as shown here.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Why not? Multi-monitor gaming on Fermi required two or more cards because Fermi only supported two display engines.
 