Nvidia RTX 2080 Ti, 2080, 2070 (2070 review is now live!) information thread. Reviews and prices


Pandamonia

Senior member
Jun 13, 2013
433
49
91
I have seen/used, and in many cases own, these panels. I've used a lot of high-end monitors over the last two decades, and IPS should not be driven at those speeds: you ruin the colour reproduction, which is one of the main reasons to get an IPS in the first place. As I said, there's a reason reputable manufacturers don't typically drive IPS at those speeds; they know it's not appropriate.

I'm not denying that high refresh rate is a benefit to gamers, but right now it comes at a trade-off with colour accuracy. I'm sorry if you personally fell for a meme panel; you could have got a TN with equally inaccurate colours but saved yourself a load of cash, as good IPS panels are expensive.

I assume it's something like the ROG gamer meme panel with its super turbo button for the fast megahertz, pew pew lasers and go-faster stripe. 4ms g2g response time. So at 165hz your panel spends 2/3rds of every refresh at the wrong colour, so gg with that one. And that's just the average g2g; b2w response time will be worse, so in the worst case pixels are never at the right colour.
If I wanted perfect colours I'd go watch Blu-ray 4k on my OLED TV.

For PC gaming, 144hz G-Sync is the most important thing you can have. After that you can choose which panel tech you want.

I've owned both, and both are as good as it gets for gaming.

You do a lot of BS talking without telling us what monitor you use.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
Interesting turn of events in this thread.

A real gamer who likes competitive games and doesn't like meme graphic settings would rather play his multiplayer competitive games with a GTX 750 Ti at 1440p 60hz than with a GTX 1080 at 1440p 144hz because the image quality is bad on the latter, and he knows via research that our brains cannot appreciate that high refresh rate.

Somehow it's all connected to whether or not RTRT will be the real deal anytime soon.

FWIW, I upgraded from 144hz to 240hz and can clearly see and even feel (input lag when capping framerate) the difference. I understand not everyone is that attuned, but there are very few healthy people below senior citizen age who cannot easily appreciate 60hz->144hz (even if they can happily live without it, which is fair). But especially for a competitive multiplayer person who claims to achieve high ranks, it's a complete joke to argue 60hz is fine.
It's the mark of troll discussion.
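For reference, the frame-interval arithmetic behind the 60hz/144hz/240hz comparisons above is easy to check; here is a minimal sketch (the refresh rates are the ones discussed in the thread, and shrinking the interval between frames is one component of the input-lag difference being described):

```python
# Time between successive refreshes at the refresh rates discussed above.
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between successive refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz} Hz -> {frame_interval_ms(hz):.1f} ms per frame")
```

So 60hz means a new frame every ~16.7ms, 144hz every ~6.9ms, and 240hz every ~4.2ms, which is the gap people claim to feel.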
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Interesting turn of events in this thread.

A real gamer who likes competitive games and doesn't like meme graphic settings would rather play his multiplayer competitive games with a GTX 750 Ti at 1440p 60hz than with a GTX 1080 at 1440p 144hz because the image quality is bad on the latter, and he knows via research that our brains cannot appreciate that high refresh rate.

Somehow it's all connected to whether or not RTRT will be the real deal anytime soon.

FWIW, I upgraded from 144hz to 240hz and can clearly see and even feel (input lag when capping framerate) the difference. I understand not everyone is that attuned, but there are very few healthy people below senior citizen age who cannot easily appreciate 60hz->144hz (even if they can happily live without it, which is fair). But especially for a competitive multiplayer person who claims to achieve high ranks, it's a complete joke to argue 60hz is fine.

Yep. Going from 1080p 60hz to 1440p was a huge jump in immersion and overall gaming experience, and going from 1440p 60hz to 144hz was an equally huge jump. I occasionally play on 1080p 60hz, and it's painful to jump back after becoming accustomed to 1440p 144hz.

I look forward to ~2020, when I can game at 4k 144hz HDR with sustained high frame rates.
 

zinfamous

No Lifer
Jul 12, 2006
110,802
29,553
146
In lala land. The whole industry is moving toward RT, we have DXR, game engines, game developers, and RTX hardware. No one is going to kill off that much investment. Intel is also hopping on the RT bandwagon. AMD will be left in the dust if they don't come up with an RT solution now.

And NVIDIA is way ahead of AMD even when comparing 7nm to 12nm. Imagine what NVIDIA will do on 7nm; they will completely obliterate AMD unless AMD dumps GCN and starts doing something better.

Ask console land if they are moving to RT... you know, ~70% of the actual market for gaming.

Oh, never mind. You almost had a soapbox.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
It's the mark of troll discussion.
Tbh I only find your posts at fault.

You have zero comprehension of how the market works, zero comprehension of how the technology works, and it also sounds like you don't really understand the benefits, or lack thereof, of it all.

Take it from someone who has walked the walk. RTX is no different from 3D or HairWorks or PhysX or VR, or even SLI to a degree. It's not the quality that will make it fail; it's the adoption rate and the barrier to entry on cost versus benefit.

Nobody is going back to 1080p 60hz to use RTX. You need more than four 2080 Tis to get that number at 4k, and the maths don't add up.

It's a sales gimmick to cover a bad release due to 7nm delays.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,755
751
136
Tbh I only find your posts at fault.

You have zero comprehension of how the market works, zero comprehension of how the technology works, and it also sounds like you don't really understand the benefits, or lack thereof, of it all.

Take it from someone who has walked the walk. RTX is no different from 3D or HairWorks or PhysX or VR, or even SLI to a degree. It's not the quality that will make it fail; it's the adoption rate and the barrier to entry on cost versus benefit.

Nobody is going back to 1080p 60hz to use RTX. You need more than four 2080 Tis to get that number at 4k, and the maths don't add up.

It's a sales gimmick to cover a bad release due to 7nm delays.

I'd like to see the empirical evidence to back up your claims here. Also, HairWorks & PhysX (post-Nvidia integration) are both different to 3D, SLI and VR in the small way of not requiring often-expensive extra hardware.

There's a lot of snobbery in PC gaming/hardware: if it's not "1080p isn't a gaming res" or "if it's not 60hz+ it's crap", then it's "a xxx GPU is all you need, anything else is a ripoff", among countless other ideological fallacies.
 
Reactions: Muhammed

Pandamonia

Senior member
Jun 13, 2013
433
49
91
I'd like to see the empirical evidence to back up your claims here. Also, HairWorks & PhysX (post-Nvidia integration) are both different to 3D, SLI and VR in the small way of not requiring often-expensive extra hardware.

There's a lot of snobbery in PC gaming/hardware: if it's not "1080p isn't a gaming res" or "if it's not 60hz+ it's crap", then it's "a xxx GPU is all you need, anything else is a ripoff", among countless other ideological fallacies.
The evidence is that these technologies are dead.
 
Reactions: ub4ty

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
Do you have a link to a source where the color accuracy of an IPS panel would go down if you ran it at 120Hz vs at 60Hz? That seems quite counterintuitive to me.

No, it's just obvious when you understand how the technology works.

Pixels don't switch instantly; they take some time to change from the colour of the previous frame to that of the new one, and during this period they're displaying inaccurate colours. Eventually they reach the intended colour and stay there until the next frame refresh occurs, when the process starts again.

Visually, how that appears on the screen during motion is what we refer to as ghosting. On a contrasting desktop, if you drag a window around you see a blur-like trail behind it, because you're seeing those pixels swap from one colour to the next over some short period of time. That's visually very obvious to us, especially on panels with high pixel response times.

Now, that's actually happening across your whole monitor, wherever pixel colours change, which in games is more or less constant across most of the screen. Colour accuracy, in one sense of the word, is a ratio of how much time the monitor spends displaying an accurate colour vs an inaccurate one. If your pixel switches in 1ms, it has inaccurate colours for that 1ms; if your refresh rate is, say, 60hz, each frame is displayed for 1000/60 = 16.6ms, so the pixel has inaccurate colours for roughly 1/16th of the time.

If you run that same panel at 120hz, that's 1000/120 = 8.3ms, and suddenly its ratio of inaccurate to accurate is much worse: 1/8th of the time is spent at the wrong colour. If you keep pushing that to, say, a 4ms pixel response time at 165hz, which is 1000/165 = 6ms, you're talking about 2/3rds of each refresh spent at the wrong colour.
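The ratio argument above can be sketched in a few lines; this assumes a fixed g2g response time (the 4ms figure is the post's example, not a measured value), and simply divides it by the refresh interval:

```python
# Fraction of each refresh interval a pixel spends mid-transition,
# for an assumed fixed g2g response time.
def transition_fraction(response_ms: float, refresh_hz: float) -> float:
    """Fraction of one refresh interval spent transitioning between colours."""
    frame_ms = 1000.0 / refresh_hz
    return min(response_ms / frame_ms, 1.0)  # capped: can't exceed the whole frame

for hz in (60, 120, 165):
    print(f"{hz} Hz: {transition_fraction(4.0, hz):.0%} of each frame in transition")
```

At 165hz a 4ms transition covers about two-thirds of the 6ms frame, which is where the 2/3rds figure comes from; at 60hz the same panel spends only about a quarter of each frame transitioning.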

That's the reason we've not historically had high refresh rate IPS monitors: IPS technology has always had relatively poor pixel response times, something like 4x worse than TN over the years. The technology to drive panels at high refresh rates has been around for decades; we've had 120hz+ TN panels for a long time, but historically never IPS, until basically recently. That's not an accident; it's because panel manufacturers understand that image quality suffers when you push slow panels too fast. As pixel response comes down we can make faster panels, and that's great. The moment they have a 1ms IPS panel at 120hz or above, I can guarantee you I'll be the first person in line to get one. I like high refresh rates like many other gamers; I'm just not prepared to suffer a soupy, inconsistent mess to get it.

I don't know why you are going on about colours; the colours look great on the high refresh rate IPS monitors, significantly better than on TN screens.

The best response I can give is probably in a context you'll be familiar with: many gamers, especially console gamers, will argue that 30fps is enough, or that 60fps is enough, and that you don't need 120hz or 144hz or 165hz or even 240hz because no one can tell the difference. Yet you clearly can, and so can I, which means they're either ignorant (which is typically the case) or, in some circumstances, their vision isn't very good; everyone's vision is impaired to differing degrees. I see your argument as really no different from theirs; it's just about colour rather than speed.

You can't just push panels at arbitrary speeds with no trade-off; as the pixel response time and refresh interval approach one another, the colour accuracy decreases, and that's a sliding scale. What you prefer is a personal trade-off, not just because our perceptions differ at a biological level but also because we weight things based on what we care about. This is why competitive gamers who care about response time will sacrifice colour to get it by buying TN panels, and designers who need maximum colour accuracy for their work will only use high-gamut IPS displays and are happy to do that at 60hz. I'm not denying your personal preference or saying mine is better; all I'm saying is that this fundamental trade-off exists, and that by getting a high refresh IPS panel you're getting speed at the cost of accuracy. That might be a preferable trade-off for you, and that's great, but to deny it's fundamentally a trade-off is just mathematically false.

And beating a TN panel in colour accuracy is not hard once you know how their 8-bit colour works: it's fake, actually 6-bit colour with a hack that uses averaging (2-bit dithering). So yeah, beating that isn't hard; they're literally the worst panel type for colour. You could stab yourself in the eye with a pencil and you'd get more vibrant colours.
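The "6-bit + 2-bit dithering" trick described above (frame rate control, or FRC) can be sketched as follows. This is illustrative only: real FRC also dithers spatially, and the frame count and rounding here are assumptions, not a panel vendor's actual algorithm.

```python
# Sketch of FRC: a 6-bit panel approximates an 8-bit level by cycling
# between adjacent 6-bit levels over successive frames so the
# time-average lands on (or near) the target.
def frc_frames(level_8bit: int, n_frames: int = 4) -> list:
    """6-bit levels whose average over n_frames approximates level_8bit / 4."""
    target = level_8bit / 4.0                   # ideal value on the 0..63 scale
    base = int(target)                          # nearest lower 6-bit level
    extra = round((target - base) * n_frames)   # frames shown one step higher
    return [min(base + 1, 63)] * extra + [base] * (n_frames - extra)

frames = frc_frames(130)                        # 8-bit 130 -> 6-bit target 32.5
print(frames, sum(frames) / len(frames))
```

An 8-bit level of 130 maps to 32.5 on the 6-bit scale, so the panel alternates between 32 and 33; the eye averages the flicker into an in-between shade.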
 
Reactions: psolord and ub4ty

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
If I wanted perfect colours I'd go watch Blu-ray 4k on my OLED TV.

For PC gaming, 144hz G-Sync is the most important thing you can have. After that you can choose which panel tech you want.

I've owned both, and both are as good as it gets for gaming.

You do a lot of BS talking without telling us what monitor you use.

You're just foisting your personal choice on others; it's a subjective matter of what you prefer, be that speed, colour, viewing angles, etc.

What have I said that's BS? Be specific.

My main displays are in my sig. I currently own (I've owned way more, but most were sold):
BenQ 32" 3840x2160 IPS 60hz 4ms
BenQ 24" 1920x1080 TN 120hz 1ms (for Nvidia stereoscopic)
Dell 30" 2560x1600 IPS 60hz 12ms
Epson 1920x1080 60hz 3LCD Projector
Iiyama 454 VMP 2048x1536 CRT (kept for nostalgia's sake)
2 Zenbooks (Asus ultrabooks), one 1080p and one 4k, both IPS 13"
OLED screen in my Rift and whatever was in the Rift DK2, I think another OLED

For work in our IT office
4x 24" variants of the old 30" Dell monitor I personally own, from a long time ago; they're 1920x1200 IPS 60hz. Now my mains are 2x LG 27" 4k IPS 60hz, and we have 2x 21:9 curved ultrawides, 3440x1440, both 120hz 4ms IPS. In general I've dealt with buying monitors for our business for more than a decade now, so we've had everything from crap old 5:4 TN LCDs at 1280x1024 all the way up to the dual 1920x1200 Dell IPS screens everyone uses, and a raft of things in between, as well as various Mac options (most recently 5k screens and glass-fronted IPS ones) and a brand new 4k LCD projector, which is the nicest and most expensive bit of kit I've ever seen. I've been in IT as a profession for more than 15 years; I've dealt with just about every panel under the sun and have a good eye for it. The only thing I've never seen first hand is a 240hz LCD panel, although I think my old CRT would do that at a low res like 640x480, if memory serves.
 
Reactions: ub4ty

Pandamonia

Senior member
Jun 13, 2013
433
49
91
You're just foisting your personal choice on others; it's a subjective matter of what you prefer, be that speed, colour, viewing angles, etc.

What have I said that's BS? Be specific.

My main displays are in my sig. I currently own (I've owned way more, but most were sold):
BenQ 32" 3840x2160 IPS 60hz 4ms
BenQ 24" 1920x1080 TN 120hz 1ms (for Nvidia stereoscopic)
Dell 30" 2560x1600 IPS 60hz 12ms
Epson 1920x1080 60hz 3LCD Projector
Iiyama 454 VMP 2048x1536 CRT (kept for nostalgia's sake)
2 Zenbooks (Asus ultrabooks), one 1080p and one 4k, both IPS 13"
OLED screen in my Rift and whatever was in the Rift DK2, I think another OLED

For work in our IT office
4x 24" variants of the old 30" Dell monitor I personally own, from a long time ago; they're 1920x1200 IPS 60hz. Now my mains are 2x LG 27" 4k IPS 60hz, and we have 2x 21:9 curved ultrawides, 3440x1440, both 120hz 4ms IPS. In general I've dealt with buying monitors for our business for more than a decade now, so we've had everything from crap old 5:4 TN LCDs at 1280x1024 all the way up to the dual 1920x1200 Dell IPS screens everyone uses, and a raft of things in between, as well as various Mac options (most recently 5k screens and glass-fronted IPS ones) and a brand new 4k LCD projector, which is the nicest and most expensive bit of kit I've ever seen. I've been in IT as a profession for more than 15 years; I've dealt with just about every panel under the sun and have a good eye for it. The only thing I've never seen first hand is a 240hz LCD panel, although I think my old CRT would do that at a low res like 640x480, if memory serves.
As I suspected, there isn't a decent gaming monitor to be seen in that list.

Until I went G-Sync 1440p 144hz, in both TN and IPS, I didn't know what good looked like. I'm also guessing the even newer models will be better.
 
Reactions: ozzy702

zinfamous

No Lifer
Jul 12, 2006
110,802
29,553
146
Interesting times here in PC tech forums, when an argument of the day is that going back to 10 year-old resolutions is pretty much OK, and maintaining that enthusiast goal that has been a hard slog--and is now here--is suddenly snobbery.

The reason people have long stuck to PC is because there is access to better performance, better resolutions, more eye candy, and it was always understood and accepted to come at a higher cost. It's simply why this forum exists. Obviously there is always some lively discussion to be had regarding the pricing tiers within that greater-expected quality range that will continue on forever; but it baffles me that there is a vocal contingent that suddenly finds themselves defending an obvious multi-generational step back in performance, if only to appreciate a barely-noticeable piece of tech (which currently still doesn't exist, btw), and at historically unfathomable and thoroughly unreasonable cost. ...It's completely nuts.

But let me be clear: If you want to jump ahead into that realm, that's certainly great and all. But it's a rather objective matter around here--at least I used to think--that dropping back several generations in expected quality/performance, to appreciate something that actually doesn't exist, at an extortionate price, is wholly irrational. Enjoy it and all that, but don't pretend that there is any rational thought behind that decision, heh.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Interesting times here in PC tech forums, when an argument of the day is that going back to 10 year-old resolutions is pretty much OK, and maintaining that enthusiast goal that has been a hard slog--and is now here--is suddenly snobbery.

The times aren't interesting or unique in any historical or contextual sense. The times are such that computing has gone mainstream, and with that comes mindless consumerism, elitism, and snobbery, as opposed to well-thought-out, critiqued, and sensible consumption of general hardware with a bigger focus on the uniqueness of the experience, and god forbid actually producing something on a platform as opposed to mindless consumption of an endless sea of offerings.

I think it's clear who is in the newfound mainstream consumerist camp. Their logic is that if you don't have the highest-profit-margin tier of hardware then you're having a deplorable experience. What true enthusiast speaks like this? If 10-year-old resolutions got 99% of the job done, then there isn't much to discuss when a company markets memes to people to keep record profits rolling in. The consumerist disagrees; the consumerist has no understanding of what's really going on beyond a refresh rate. The well-studied enthusiast sees the BS behind the marketing slides and troubles themselves to page through sites like AnandTech and others to get the real scoop. In my case, I purchased a range of GPUs and monitors to evaluate it myself. It was a gimmick; I'm pissed I even wasted gas going to pick it up and return it.

The hard slog is when the discussion is about how much money a person can blow on an experience vs a more enthusiast-trained discussion about the underlying technology and how it works: the take-my-money-now crowd vs the well-trained enthusiast who stays critical even when they can clearly afford the highest tier. Instead of arguing from a basis of fact and logic, we get ad hominem attacks and elitist snobbery, as if even a burger flipper can't afford a $1,000 GPU in today's economy or doesn't have an RGB AIO rig with a 1080 Ti: people who define themselves and their hobby by how much money they can blow vs what they actually do with it.

So, if you're talking about that snobbery and slog, then you're 100% right. I've seen communities come and go when this mentality sets in. It becomes an upvote echo chamber of hollow populist opinion, while the real enthusiasts having sound, detailed, critical conversations get flogged and go elsewhere.

The reason people have long stuck to PC is because there is access to better performance, better resolutions, more eye candy, and it was always understood and accepted to come at a higher cost.
The reason true enthusiasts stick with PCs is the doors they open. In the past, when you were OC'ing a 300MHz processor by 33MHz/40MHz at a time, the magnitude of the gains was clear. When things went from CRT to LCD, and from 640x480 to 1024x768 and on upwards, the gains were quite palpable and a no-brainer. When you went from single core to dual, it was a doubling that mattered when the CPU was maxed out and you multi-tasked. From dual to quad, from quad to six cores, from six to eight. Now the consumer can go all the way up to 32 cores via AMD.

At this juncture, with everything available, it comes down to value, need, and sensibility. All of the crazy resolutions and refresh rates are at your fingertips. You can buy an $8,000 GPU if you so choose, a multi-thousand-dollar 32-core processor, 128GB of RAM, you name it. PC building is mainstream and take-my-money consumption is rife. Why? Because companies still need to prod and push the consumer toward higher-profit-margin products now that all general hardware is great. With everything being capable, there always needs to be that next tier. It's at this point that any true enthusiast stops, takes a deep look at what's going on, and specs their hobby sensibly with a keen eye. The higher cost has significant diminishing returns now.

While hardware is pushing boundaries, software and uses have stagnated. There aren't even any released games yet that support the new flagship Nvidia card's features, and a super-small minority, less than 1%, are clogging the forums with their endless praise of it. This is consumerism, not enthusiasm. From the wall-to-wall e-celebs cramming products down people's throats, to seizure-inducing RGB, to "I just put it under water..." followed by "oh chit, I just roasted 2k worth of hardware with water leaks". This is what happens when mainstream consumerism takes over.

It's simply why this forum exists. Obviously there is always some lively discussion to be had regarding the pricing tiers within that greater-expected quality range that will continue on forever; but it baffles me that there is a vocal contingent that suddenly finds themselves defending an obvious multi-generational step back in performance, if only to appreciate a barely-noticeable piece of tech (which currently still doesn't exist, btw), and at historically unfathomable and thoroughly unreasonable cost. ...It's completely nuts.

But let me be clear: If you want to jump ahead into that realm, that's certainly great and all. But it's a rather objective matter around here--at least I used to think--that dropping back several generations in expected quality/performance, to appreciate something that actually doesn't exist, at an extortionate price, is wholly irrational. Enjoy it and all that, but don't pretend that there is any rational thought behind that decision, heh.

The most hilarious aspect is watching the gymnastics someone goes through to justify themselves, and watching them seize up when you drill down into specs and things they obviously had no clue about, while they still maintain they have a grounded view, even going to lengths to contradict data and facts. Per Steam data, less than 1-2% of gamers run the higher-end GPUs; they are a loud minority, something companies try to amplify to push others into diminishing-returns hardware.

Computing performance progresses rapidly; hardware released today will be obsolete tomorrow. Any informed, well-seasoned PC enthusiast waits a year or two after a hardware release, when the bugs are worked out, drivers settle, and prices come down to reality, before they buy. Not until things went super-ridiculous mainstream consumerist could you sell a ridiculously priced GPU on PRE-ORDER, without reviews or even game/driver support.

Things jumped the shark with this product launch, and I'm glad to have gotten a chance to see, yet again, what happens when something goes mainstream: it becomes an elitist snob fest and the quality goes down.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
You have zero comprehension of how the market works. Zero comprehension of how the technology works and also sounds like you don't really understand the benefits or lack there of it all.
And you have zero comprehension of how graphics advance over time, and of how new tech gets introduced, adopted, and evolved over time.
The evidence is that these technologies are dead.
PhysX is not dead, HairWorks is not dead; VR requires extra hardware and thus has slow adoption.
You really are speaking BS and generalizing. Ray tracing is being adopted by everyone; even consoles will have it, mark my words.
 
Last edited:

MrTeal

Diamond Member
Dec 7, 2003
3,584
1,743
136
No, it's just obvious when you understand how the technology works.

Pixels don't switch instantly; they take some time to change from the colour of the previous frame to that of the new one, and during this period they're displaying inaccurate colours. Eventually they reach the intended colour and stay there until the next frame refresh occurs, when the process starts again.

Visually, how that appears on the screen during motion is what we refer to as ghosting. On a contrasting desktop, if you drag a window around you see a blur-like trail behind it, because you're seeing those pixels swap from one colour to the next over some short period of time. That's visually very obvious to us, especially on panels with high pixel response times.

Now, that's actually happening across your whole monitor, wherever pixel colours change, which in games is more or less constant across most of the screen. Colour accuracy, in one sense of the word, is a ratio of how much time the monitor spends displaying an accurate colour vs an inaccurate one. If your pixel switches in 1ms, it has inaccurate colours for that 1ms; if your refresh rate is, say, 60hz, each frame is displayed for 1000/60 = 16.6ms, so the pixel has inaccurate colours for roughly 1/16th of the time.

If you run that same panel at 120hz, that's 1000/120 = 8.3ms, and suddenly its ratio of inaccurate to accurate is much worse: 1/8th of the time is spent at the wrong colour. If you keep pushing that to, say, a 4ms pixel response time at 165hz, which is 1000/165 = 6ms, you're talking about 2/3rds of each refresh spent at the wrong colour.

That's the reason we've not historically had high refresh rate IPS monitors: IPS technology has always had relatively poor pixel response times, something like 4x worse than TN over the years. The technology to drive panels at high refresh rates has been around for decades; we've had 120hz+ TN panels for a long time, but historically never IPS, until basically recently. That's not an accident; it's because panel manufacturers understand that image quality suffers when you push slow panels too fast. As pixel response comes down we can make faster panels, and that's great. The moment they have a 1ms IPS panel at 120hz or above, I can guarantee you I'll be the first person in line to get one. I like high refresh rates like many other gamers; I'm just not prepared to suffer a soupy, inconsistent mess to get it.
I get what you're saying: over the whole frame time, lower refresh rates would on average produce a more accurate colour relative to the input frame. I just don't see why that would be a criterion someone would care about in a fast-moving game. Even assuming the transition takes the full g2g time, the faster refresh rate should produce more accurate colours relative to the scene. Static images shouldn't matter, but take an extreme example: a test object, a fast-moving 2x2-pixel black square, moving from one side of the monitor to the other on a white background at 1 pixel per ms. Say it takes 4ms to go to black, and another 4ms back to white, and let's pick 100Hz and 200Hz to make the numbers easier.

At t=0, the four pixels on the far left of the monitor turn on, and at t=4ms they're fully black. At this point both refresh rates are the same, with the squares at full black. Now, at t=5ms on the 200Hz monitor, the first square will start to fade out and another square 5 pixels over will start to turn black; on the 100Hz monitor, the square at the left side of the screen will still be fully black. At t=9ms, the 200Hz monitor has a fully black square starting 5 pixels from the edge, and the 100Hz one still has it at the edge; the leftmost square on the 200Hz monitor has finished fading out. At t=10ms, the 100Hz monitor starts fading out the left-edge square and turning on one 10 pixels in, while the 200Hz monitor fades out the square at position 5 and starts turning on the same one at position 10.

You will definitely get ghosting, but the colour is better relative to the actual scene. Sure, at the instant of t=10ms, the square on the left edge of the 100Hz screen is nice and black, and has been for a while, as is proper for its input signal, but the colour is wrong: it shouldn't be black, since that square's been gone for 8ms. At 200Hz the pixels might not stay at a colour as long as at 100Hz, but that's because they're rendering the scene more accurately.

Likewise, if you're just looking at one pixel that's a stable red at t=0 when the refresh happens: if the colour changes to orange at t=8ms, both refresh rates behave the same, starting the change to orange at t=10ms. But if the switch to orange happened at t=4ms, the 200Hz monitor would start changing to the correct colour at t=5ms instead of t=10ms, and would be fully settled long before the 100Hz one even started switching.
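The single-pixel case above can be written as a toy model. It uses the post's assumed numbers (4ms g2g response, scene colour change at t=4ms) and assumes the monitor picks up a change at the first refresh at or after it happens:

```python
import math

# Toy model: when does a pixel fully settle to a new scene colour?
def settle_time_ms(change_ms: float, refresh_hz: float, g2g_ms: float = 4.0) -> float:
    """Time at which the pixel has fully settled to the new colour."""
    frame_ms = 1000.0 / refresh_hz
    # The first refresh at or after the scene change picks up the new colour.
    next_refresh = math.ceil(change_ms / frame_ms) * frame_ms
    return next_refresh + g2g_ms

print(settle_time_ms(4.0, 100))  # 100 Hz: starts at t=10 ms, settles at t=14 ms
print(settle_time_ms(4.0, 200))  # 200 Hz: starts at t=5 ms, settles at t=9 ms
```

With a change at t=4ms, the 200Hz monitor is fully settled at t=9ms, before the 100Hz one even starts switching at t=10ms; with a change at t=8ms, both land on the t=10ms refresh and behave identically, matching the two cases described above.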

If you were comparing two monitors with different response times like a fast TN vs an IPS I could get on board with that, but outside of overclocking a panel faster than it's capable of I don't see how running a monitor a lot slower than it's capable of produces more accurate colours relative to the actual scene, which is what really matters.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
And you have zero comprehension of how graphics advance over time, and of how new tech gets introduced, adopted, and evolved over time.

PhysX is not dead, HairWorks is not dead; VR requires extra hardware and thus has slow adoption.
You really are speaking BS and generalizing. Ray tracing is being adopted by everyone; even consoles will have it, mark my words.

I put as much stock in your predictions as a fortune cookie.

Those technologies are dead. They work only on one vendor so are missing from nearly all games. They pop up once in a blue moon when nvidia pays for some exclusive features. SLI has been killed by greenlight games with no drivers.

RT has one big problem. After 7nm there is no more die shrinks for a very long time and even then they are saying the benefits are not scaling any more. Yields are more important and so is cost.

This makes console RT I'd say almost impossible. Especially in its current form. It might get dumbed down to some low level where the overhead is small but the benefit non existent.

No technology had ever taken hold which ultimately increases cost and reduces frames. People need better visuals and more frames to buy in and for it to stick and reduce in cost every year.

I'm betting that soon a new version of sli is the only way to push performance forward where nvidia will hope to sell you two cards again in some magical revival to keep frames moving forward.

I don't know what cave you live in but Mooreslaw is basically dead and this is what had been keeping performance alive. This has been the case in cpu for about 5 years and that's why they sell us 5 year old designs with more cores.

We have one more die shrink left before redacted hits the fan, by the looks of it. So good luck to RT, because that will be the first thing on the cutting-room floor.

Profanity is not allowed in the tech areas.

AT Mod Usandthem
 
Last edited by a moderator:

Muhammed

Senior member
Jul 8, 2009
453
199
116
This makes console RT, I'd say, almost impossible, especially in its current form. It might get dumbed down to some low level where the overhead is small but the benefit non-existent.
Just you wait, I hope you eat your words when you see consoles with ray tracing.

They pop up once in a blue moon when Nvidia pays for some exclusive features.
Yeah once in a blue moon like dozens and dozens of games, have you seen the GameWorks list of games lately?
I don't know what cave you live in, but Moore's law is basically dead, and it is what has been keeping performance alive. This has been the case in CPUs for about 5 years, which is why they sell us 5-year-old designs with more cores.
Riiiiight, I have seen this BS repeated a hundred times in the past 5 years. It's simply crap.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
Just you wait, I hope you eat your words when you see consoles with ray tracing.


Yeah once in a blue moon like dozens and dozens of games, have you seen the GameWorks list of games lately?

Riiiiight, I have seen this BS repeated a hundred times in the past 5 years. It's simply crap.
You have no evidence of anything you say.

There is plenty of evidence for what I claim: namely, that we are stuck on Skylake cores until 2020 at the earliest, AMD has given up on high-end GPUs, and Nvidia is charging £1500 for a Strix Ti.

If you are so clever, explain how they will get RT into a £350 4K console when a £1500 GPU can't do it at 1080p.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
You have no evidence of anything you say.
When the next gen of consoles is announced, you are going to feel royally disappointed. Go ahead right now and bookmark this post; I sure will, just to enjoy the look on your face.
There is plenty of evidence for what I claim: namely, that we are stuck on Skylake cores until 2020 at the earliest, AMD has given up on high-end GPUs, and Nvidia is charging £1500 for a Strix Ti.
What kind of pathetic evidence is that? Your gut feelings?!
If you are so clever, explain how they will get RT into a £350 4K console when a £1500 GPU can't do it at 1080p.
Consoles don't need to run games at ultra settings, they just do medium and upscale. And this is going to be the continuing trend next gen too.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
When the next gen of consoles is announced, you are going to feel royally disappointed. Go ahead right now and bookmark this post; I sure will, just to enjoy the look on your face.

What kind of pathetic evidence is that? Your gut feelings?!

Consoles don't need to run games at ultra settings, they just do medium and upscale. And this is going to be the continuing trend next gen too.
Haha you have zero cred now.

Next gen consoles. Hahaha.

£350 console with RT hahaha

Good luck.
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
When the next gen of consoles is announced, you are going to feel royally disappointed. Go ahead right now and bookmark this post; I sure will, just to enjoy the look on your face.

What kind of pathetic evidence is that? Your gut feelings?!

Consoles don't need to run games at ultra settings, they just do medium and upscale. And this is going to be the continuing trend next gen too.

Haha you have zero cred now.

Next gen consoles. Hahaha.

£350 console with RT hahaha

Good luck.

This thread is for the discussion of the RTX cards, and not for consoles.

The last several pages have been taken over by both of you arguing and insulting each other. If it does not stop immediately, there will be consequences.

AT Mod Usandthem
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The evidence is that these technologies are dead.

3D is dead, but VR is not. It actually works, and pretty well. What happened is merely that the initial hype phase has passed, making people say it's dead.

See, most people who are not working in a specific field can only think in groups. So when there's hype surrounding something, they say it's the best thing since... (insert whatever you think is the best). But when the hype train passes, they immediately claim it's dead. It doesn't help that the developers of the technology are usually themselves the source of the hype, created to drum up interest for the product.

When the hype and disillusionment pass, we can expect the real hard work to happen in this space. But it'll be slow and painful, and that'll bore the masses.

There's a lot of snobbery in PC Gaming/Hardware, if it's not "1080p isn't a gaming res" or "if it's not 60hz+ it's crap" then it's "a xxx GPU is all you need, anything else is a ripoff etc" among countless other ideological fallacies.

This isn't just in PCs, and Gaming Hardware. This is apparent in all areas where premium products exist.

Yep. Going from 1080p 60 Hz to 1440p was a huge jump in immersion and overall gaming experience. Going from 1440p 60 Hz to 144 Hz was an equally huge jump.

I can tell you that when I saw someone load GTA V on a 1440p monitor, it was quite amazing. The problem is that if I had one, I likely wouldn't see the benefits I expected in real use.

I use an NEC 1970VX LCD I bought off Craigslist for CA$40. It was a serious downgrade from the 1080p monitor I had, but I wanted to save money. And even though the native resolution is only 1280x1024, the refresh rate is a decent 75 Hz. After about 2 weeks, I didn't notice it anymore.

The great thing about our bodies is that they can adjust to their surroundings. If you are willing to change your otherwise fixed beliefs about things, it's amazing what you can get used to. Unfortunately, it's also downgrades that you tend to notice the most. After moving to SSDs, it feels like the world without SSDs has slowed to a crawl, rather than your system feeling faster.
 

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
And you have zero comprehension of how graphics advance over time, and how new tech gets introduced, adopted, and evolved over time.

You're being far, far too optimistic. As it has been pointed out, only the 2080 Ti even offers acceptable (and barely at that) levels of RT performance. Look back at previous generations and it's pretty easy to see how slow the uptake is on the high-end. The 980 Ti makes a pretty good point of comparison because it came out a little over 3 years ago and has about equal performance to a 1070.

If you check the Steam hardware survey, as of June 2018 (the 980 Ti came out in June 2015) only 8.5% of users had something at 980 Ti level or better. It's even worse in this case, since the 980 Ti launched at $650, whereas the 2080 Ti is about twice that price. I have no idea to what extent that slows adoption even more, but it's pretty obvious that it isn't going to speed it up.

I think it's cool that Nvidia is trying this out, but you're just deluding yourself if you think it's going to be mainstream and widely accessible in the next 5 years. Even 10 years is probably being generous. I remember when people thought the Nintendo 64 would be capable of real-time ray tracing, so this pipe dream of soon-to-be-ubiquitous ray tracing is hardly new.
 
Reactions: psolord

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
Ah yes, the N64. If it wasn't for the PS3 and Sony's lies (showing Motorstorm "gameplay" at a supposed PS3 render quality that wouldn't actually be available for another ten years), the N64 would have been the biggest disappointment of them all. Babylon 5 quality graphics my heiny, lol.

Lies, lies everywhere. And that from the pinnacle industry of human beings. lol
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
This thread is for the discussion of the RTX cards, and not for consoles.

The last several pages have been taken over by both of you arguing and insulting each other. If it does not stop immediately, there will be consequences.

AT Mod Usandthem

Ah yes, the N64. If it wasn't for the PS3 and Sony's lies (showing Motorstorm "gameplay" at a supposed PS3 render quality that wouldn't actually be available for another ten years), the N64 would have been the biggest disappointment of them all. Babylon 5 quality graphics my heiny, lol.

Lies, lies everywhere. And that from the pinnacle industry of human beings. lol

Hello, is there anybody in there?

Please keep the discussion in this thread on the RTX cards only.

AT Mod Usandthem
 