Nvidia RTX 2080 Ti, 2080 (2070 review is now live!) information thread. Reviews and prices


Muhammed

Senior member
Jul 8, 2009
453
199
116
The pro cards are currently being detailed at AMD's Horizon event. They're shipping this quarter.
Dude, 7nm is a bust. Vega 20 on 7nm with a pathetic 1800 MHz clock has a TDP of 300W, same as Volta on 12nm, and Volta is double its size and has tensor cores! Wake up.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
You don't game, we get it. Your claims about "I run xyz on my 750 Ti just fine" are laughable. It's like my buddy who's been playing Team Fortress almost exclusively, once or twice a month, on a GTX 460 for the better part of a decade. He has no reason for an upgrade; good on him, and good for you.

Yes, 7nm and PCIe 4.0 will be awesome, and 2020/2021 should be great for gamers. I really look forward to AMD having options for the friends and family I build machines for. I'll have a blast gaming at 1440p 144 Hz with maxed detail until then, and will move to 4K, 144 Hz, HDR in 2020/2021.

I'm a smart consumer in most of my life. I don't drive a car I can't afford, live in an expensive home, go on lavish trips, etc. I do spend good $$$ regularly on computer hardware because at the end of the day, I can always sell it, and it's really not that expensive in the grand scheme of things. Do I wish stuff were cheaper? Sure, but I also like being on the forefront of gaming hardware and playing the newest games without compromise. To each their own.
I've been gaming since DOS Duke Nukem and DOOM.
Animal Tycoon on Macintosh.
Atari, Sega Genesis, Sega CD, Nintendo, N64, PS, PS2, PS3, Xbox.
Quake, Counter-Strike, CS: Source, CS:GO, SC, SC2, DOTA, WoW, you name it.

I have gamed all my life, way before it went mainstream and consumer-driven, so I do indeed get it, and what I get is that you don't need more than you need. I play competitively, so I don't ramp my graphics settings to max, and I don't need to in order to enjoy a game. I don't play bloated populist titles; I play competitive multiplayer, and I rank in some of the top leagues in various multiplayer games.

Somewhere between less than 1% and less than 3% of gamers are above a GTX 1080; most are on 1050 Tis or less. You are an unnoticed minority if you're above such specs. No one makes broadly adopted games that require beefy GPUs. Most people I know who play on high-spec rigs honestly aren't long-standing gaming enthusiasts; a good number center more on consoles. The fewest of all are gaming on high-cost GPUs and high-res monitors. The Steam data speaks for itself.

I do have 1080s. I don't use them for gaming because I don't need to. No one really does, which is why a statistically insignificant portion of gamers own such hardware. Only new-age gamers think being a gamer is defined by what GPU they're running.

Oh, and by the way, that 750 Ti runs on a 1440p display. 60 Hz, of course, because higher-refresh displays are also a meme that the broad-market gamer doesn't own.

Gaming is not some Cadillac-and-wine affair exclusive to the 1%; it's actually the opposite. 95% of real gamers are on 1050 Tis or less, and they game boatloads more than those with premium hardware, which really isn't necessary to enjoy oneself.

Given that I have about 6 1080s at the moment, it clearly has nothing to do w/ affordability.
I don't waste my money no matter what the hobby/expense is.
I'm a smart consumer even in my 'hobbies'.

Ever since computing and gaming went mainstream, people have, as always, become obsessed with keeping up with the Joneses rather than the true spirit of the endeavor. $200 meme-tier Cherry-switch keyboards from the '80s, because a marketing campaign convinced people it's better to go back in time to inferior technology and have clacking sounds echoing throughout their home... enough RGB lights to induce a seizure, and monitors that cost more than a 65" OLED TV. This doesn't make you a gamer. It makes you a consumer.

I'm an avid gamer and I game on a 750 Ti at 1440p because, quite frankly, I don't need anything more at the moment. When that changes, I'll do my research and invest in a better GPU that I will use for ~5 years (the sane cycle of computer hardware). The majority of avid gamers are just like me. The broad majority of people aren't excited or dumb enough to spend $600+ on a GPU even if they have the money for it... which many do (surprise, you're not special). Namely because these cards quite frankly aren't worth it and are a rip-off.

Other companies are learning they can't keep fleecing their consumers with overpriced nonsense:
https://www.bloomberg.com/news/arti...on-boost-for-budget-iphone-xr-nikkei-jo4al082

Just because I have money doesn't mean I'm going to let a company market nonsense to me to convince me to give it away. I decide what is a sane price for a product and features. If a company doesn't meet my standards, they don't get my money. It's as simple as that.

Gamers come in all flavors. The majority, who log the most hours, aren't consumerist, as the data shows.
RTX is a cash-grab beta platform. Someone has to subsidize the market; I just ensure, as a smart consumer, that it isn't me.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
I am the 1% with the 1440p 165 Hz G-Sync monitor and the 1080 Ti.

As the 1%, I'm saying this 2080 Ti is not purchasable. I am the target market, and I will not upgrade to this substandard beta product with beta features and super-premium pricing.
Based 1%'er.
Pascal launched and sustained on pretty solid value.

The RTX line is devoid of value.
2016 Pascal + 5 years = 2021.
More than enough time to enjoy Pascal to its fullest while they refine RTX, the node moves to 7nm, PCIe 4.0 arrives, and the price comes back down to reality.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Blah blah blah, I game at 15 fps at 1440p with the lowest settings possible and it's fantastic. I've used both a 750 Ti and a 1050 Ti at 1440p, and they are essentially not serviceable in any remotely modern game, even at 60 Hz, which is why I called you out on your absurd claim that they are. If you're playing CS:GO, then yeah, they're great.

I agree with you that the RTX cards have extremely poor value, and by the time games are optimized for them the 3000 series will be out and much more capable of actually implementing the RTX feature set; I've never claimed any different. Hopefully by that time AMD has solid offerings so the consumer can have a choice again.

I just recommended a 1070 Ti for my brother because it was on sale, came with Monster Hunter, and he wants to pick up a 1440p 144 Hz monitor, so a 1070 Ti will get him down the road until 2020 when better options are available. The 2060 will be the same speed at the same price and won't be released until 2019, so I see no sense in waiting.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
The market for this kit, before prices nearly doubled, was less than 3%, and 1% in some cases. I'm sure it has shrunk even further at these prices.


When the prices come down to earth.
When it is on a 7nm process.
When it isn't a blistering inferno.
When it is on PCIe 4.0.
When the drivers and software are actually written.
When games and software support it.
When the RT compute complex is beyond beta and gets at least one revision...

At that point is when a smart consumer will entertain a purchase... not a moment sooner.



Anyone with a brain bigger than their wallet and their need to spend money knows when something is not good value.
Paradigm shifts in CPUs and GPUs have occurred in the past. Usually they allow for price cuts... not a doubling of the price.
I think anyone with a brain knows this is a stop-gap rip-off and will remain on Pascal until this changes.
No one is really questioning dedicated RT... They're questioning the ridiculous price for this prototype card.
In all honesty, it would only be a sound purchase if the price were cut. Not even at the same pricing as last gen would this be good value.

And yes, as I type this, AMD has just announced that their next-gen GPUs are going to be on PCIe 4.0 and will have an Infinity Fabric link to the CPU. So much for being a glorified beta tester at a premium.

There seem to be a number of companies that went full retard recently. They're going to pay for it as well. Apple just announced production cuts on their new flagship phone. Reviewers are saying their new products are not worth purchasing. At some point you hit a wall when you keep extorting your customers. Intel learned this. Apple is in the process of learning. Nvidia will learn as well. The last wave of consumers who get fleeced before prices readjust to reality gets fleeced the hardest, and that's the people buying RTX Nvidia cards. Someone had to fall on the knife... Congrats.

Most of the price increase comes from the change in marketing where they've essentially bumped each card up a tier; they really should have called the Ti the Titan, the RTX 2080 the Ti, and so on. Once you account for that, the prices are only slightly inflated, and not outrageous for the size of GPU you're getting. I don't think Nvidia are especially price gouging here. And of course there is a range of RTX cards; if you're on a budget then there are budget models.

Paradigm shifts are not there to cut costs, although they can lead to lower costs in some instances; paradigm shifts happen when you run into a wall with the old paradigm, at which point you need to shift. If Nvidia had thrown all the die space at FP32 ops and added, say, another third more power, that would be complete overkill; the cards already easily dominate 4K, and hardly anyone runs 4K to start with. And we're getting diminishing returns with rasterization: we've faked about as much as we can in terms of lighting and reflection/refraction, so the next significant jump in image quality is going to be RT.

For the average gamer running 1080p, the current lineup offers a product that gives them a visible fidelity increase for their money; had Nvidia just gone with huge amounts of extra rasterization power, anyone running 1080p comfortably right now would have zero reason to upgrade to that card. Nvidia are appealing to their core consumer base, the masses who run 1080p, and offering them a reason to upgrade. It's not ideal, because there's a period to wait while game developers implement the features, but that's always the case with new tech.

If Nvidia had dedicated the entire GPU to traditional rendering, all we'd be reading on these forums instead is how it's a pointless upgrade for almost everyone; the only people who would have won are the tiny number who want 4K at 120+ fps.

Nvidia aren't fleecing anyone; there's really no reason to believe their margins on these cards are astronomical. Most of the cost comes from the very large GPU die, and what you're actually getting for your buck is still decent value. The problem is one of perspective: people do not appreciate just how expensive it is to ray trace. Small GPUs would not have cut it, so it was go big or go home. Nvidia did the most rational thing given the circumstances: they offered their customers a product that gives them a meaningful upgrade path. If you think AMD are going to do RT-capable cards and those cards are going to be some huge cost reduction over current cards, I wouldn't hold my breath. RT is hard no matter who does it; if you want to join the club offering a meaningful visual fidelity increase, the jump to RT is an expensive one, just in terms of the sheer number of transistors you have to throw at the problem.
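
To put rough numbers on "expensive": the sketch below is a back-of-the-envelope illustration only. The one-primary-plus-two-secondary rays per pixel and the 60 fps target are assumptions for illustration, not figures from this thread; the ~10 Gigarays/s figure is Nvidia's own marketing number for the 2080 Ti.

Code:
// Illustrative back-of-envelope only: rays per second needed for real-time
// ray tracing at common resolutions, assuming 1 primary ray plus 2
// secondary rays (e.g. one shadow ray, one reflection ray) per pixel.
// Real renderers vary widely; these are rough assumptions, not measurements.
#include <cstdio>

int main() {
    const struct { const char* name; long long w, h; } modes[] = {
        {"1080p", 1920, 1080},
        {"1440p", 2560, 1440},
        {"4K",    3840, 2160},
    };
    const double fps = 60.0;            // target frame rate (assumption)
    const double rays_per_pixel = 3.0;  // 1 primary + 2 secondary (assumption)

    for (const auto& m : modes) {
        const double rays_per_sec = m.w * m.h * rays_per_pixel * fps;
        std::printf("%-6s: %.2f Gigarays/s\n", m.name, rays_per_sec / 1e9);
    }
    // Prints roughly 0.37, 0.66 and 1.49 Gigarays/s. Even this modest budget,
    // before any extra bounces, is a sizable fraction of the ~10 Gigarays/s
    // Nvidia quotes for the 2080 Ti, which is why RT eats so much die area.
    return 0;
}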

AMD can announce their roadmap for next-gen GPUs on PCIe 4.0, but so what? What actual upgrade path does that offer the average gamer? Running 1080p at 500 fps? Why would anyone buy into that?

Nvidia are selling GPUs and making money right now, and AMD have nothing competitive to offer, and probably won't until late 2019 or early 2020 given the roadmap plans released so far. By that time Nvidia will have a refresh out to compete; all AMD are doing by waiting is handing the market to Nvidia.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
Nobody running 1080p is buying a 2080 Ti.

What planet are you on?
 

maddogmcgee

Senior member
Apr 20, 2015
385
310
136
Blah blah blah, I game at 15 fps at 1440p with the lowest settings possible and it's fantastic. I've used both a 750 Ti and a 1050 Ti at 1440p, and they are essentially not serviceable in any remotely modern game, even at 60 Hz, which is why I called you out on your absurd claim that they are. If you're playing CS:GO, then yeah, they're great.

I agree with you that the RTX cards have extremely poor value, and by the time games are optimized for them the 3000 series will be out and much more capable of actually implementing the RTX feature set; I've never claimed any different. Hopefully by that time AMD has solid offerings so the consumer can have a choice again.

I just recommended a 1070 Ti for my brother because it was on sale, came with Monster Hunter, and he wants to pick up a 1440p 144 Hz monitor, so a 1070 Ti will get him down the road until 2020 when better options are available. The 2060 will be the same speed at the same price and won't be released until 2019, so I see no sense in waiting.

The main games I am playing at the moment are StarCraft 2, MTG Arena, Thronebreaker, and Hearts of Iron 4. All of these would play great on a 1050 Ti. MTG Arena went to beta a few weeks ago, Thronebreaker just came out, and Hearts of Iron 4 had an expansion a few months ago. I just had a look at the top sellers on Steam (https://store.steampowered.com/sale/2018_so_far_top_sellers/): a 1050 Ti would be overkill for many of them. Games like Divinity 2, Rocket League, Stellaris, and Path of Exile would run fine on way less.

Please don't tell me I need a 2080 Ti to play modern games, or that I am not a gamer, just because you like to max settings on Battlefield 27.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
For what is meant to be a tech forum, people are very stuck in their ways. Like PrincessFrosty says, just pushing more of the same isn't the way forward for gaming graphics. If you look at [H] reviews, the takeaway is "I can run everything at high with card X", or I can spend hundreds of dollars more to go to ultra for honestly hardly any graphical improvement. It's just not worth it; the gains are getting smaller and smaller.

What Nvidia has done is announce a way past that, and the hardware is brilliant. Sure, we can't afford it yet and the games aren't out yet, but you can't change the whole landscape without a painful transition period. The cost is to be expected too (the chips are huge, too big really), and it's going to be expensive for early adopters, but that's the case in most markets, not just GPUs.

For the rest of us, we just wait for the 3 series at 7nm with second-gen ray tracing, after a load of games have been released. The cards will be cheaper, the games will be available, and we win, but only because the 2 series paved the way.
 

coercitiv

Diamond Member
Jan 24, 2014
6,383
12,798
136
Don't you guys have phones?

On a more serious note, it looks like we're entering an age of complete disconnect between business leaders and consumers. The opportunities will be golden.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I never said they would. Nvidia are offering an entire range of RTX cards across a very wide range of price points...
There's no sign yet of NV offering anything cheaper than $500 in the RTX line.
And it looks like anything less than a 2070 wouldn't be able to do RT very well in any case.
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
I think the 2 series will get cheaper, but the 3 series will be about the same price as now; I don't think those will come down at all. $1,200-$1,300 and they sold out, meaning people are willing to pay that price. If I were running the business, I wouldn't lower prices either.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
The only thing that will drive down price is competition and innovation. People can complain about prices when AMD or some other business offers an alternative that provides the same kind of fidelity and leap in technology for cheaper.

AMD's leap to hybrid RT won't be any cheaper; RT is extremely computationally expensive, and it needs a lot of dedicated die space to do even conservative amounts of it. There's no reason to believe that AMD have a magic fix for this in the pipeline.
 

amenx

Diamond Member
Dec 17, 2004
4,005
2,275
136
I think the 2 series will get cheaper, but the 3 series will be about the same price as now; I don't think those will come down at all. $1,200-$1,300 and they sold out, meaning people are willing to pay that price. If I were running the business, I wouldn't lower prices either.
I think Nvidia anticipated much lower demand due to the high RTX prices and thus produced much lower quantities, which is the main reason it's out of stock. Even those who RMA'd defective cards are unable to get replacements from Nvidia in a timely manner.

I don't think Nvidia expected a hugely successful product launch that would reel in tons of buyers, but I don't believe that was the purpose. The purpose, IMO, is just to get the ball rolling for game developers to get on board the ray tracing train, so when 7nm launches there will be existing RT titles and a far better-developed product to take advantage of them.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I think the 2 series will get cheaper, but the 3 series will be about the same price as now; I don't think those will come down at all. $1,200-$1,300 and they sold out, meaning people are willing to pay that price. If I were running the business, I wouldn't lower prices either.
We'll have to see; obviously Nvidia are in it to make as much money as they can, but to do that you've got to sell to the $200-300 crowd too.

How is the hardware brilliant before anyone even knows how well it performs in the tasks it's trying to sell itself on?
I think we've seen enough to know it works, but I agree that exactly how well remains to be seen over the next year or two.
 

SMOGZINN

Lifer
Jun 17, 2005
14,218
4,446
136
AMD's leap to hybrid RT won't be any cheaper; RT is extremely computationally expensive, and it needs a lot of dedicated die space to do even conservative amounts of it. There's no reason to believe that AMD have a magic fix for this in the pipeline.

I don't know if AMD is going to take the leap to RT. AMD has an opportunity here to hurt Nvidia by simply making a fast card without RT at a much lower cost. If they do that, they will be the easy winner this generation. Then Nvidia will have to respond, and that response will probably be cheaper 2000-series cards without RT, which would effectively kill ray tracing altogether.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
In la-la land. The whole industry is moving toward RT: we have DXR, game engines, game developers, and RTX hardware. No one is going to kill off that much investment. Intel is also hopping on the RT bandwagon. AMD will be left in the dust if they don't come up with an RT solution now.

And NVIDIA is way ahead of AMD even when comparing 7nm to 12nm. Imagine what NVIDIA will do on 7nm; they will completely obliterate AMD unless AMD dumps GCN and starts doing something better.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
RT requires large-die GPU designs, and I doubt AMD or Intel will want to dedicate so much die area to what is essentially just bling.

FPS sells GPUs, not bling. And if Nvidia are the only one in the game, the market will decide that it won't pay for this bling, as the resources required are disproportionate to the benefit for the cost.

If you don't believe me, then ask yourself why nobody is flying Concorde anymore.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
If you don't believe me, then ask yourself why nobody is flying Concorde anymore.
Irrelevant comparison; this is GPUs, not thousand-ton flying aircraft.
RT requires large-die GPU designs, and I doubt AMD or Intel will want to dedicate so much die area to what is essentially just bling.
Intel will gladly do it; they already tried with Larrabee, and they will do it again. NVIDIA is making a killing in the pro visualization segment with RTX, and Intel will want to compete.

DXR forces AMD and Intel to come up with ray tracing solutions; if they want their GPUs to be competitive in ray-traced games, they will have to compete with NVIDIA there. You can't have a competitor providing vastly superior image quality in games and expect people to buy your hardware; that will always be a thorn in your side, especially when NVIDIA blasts footage of ray-traced games all over the web, building mindshare. Graphics sell. This will strengthen NVIDIA's position immeasurably.
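
For context, DXR is a vendor-neutral part of Direct3D 12, so any GPU vendor can expose it and any game can probe for it at runtime. Below is a minimal sketch of the standard capability check (Windows-only; assumes a Windows 10 SDK recent enough to include DXR and linking against d3d12.lib; this is an illustration, not code from anyone in this thread).

Code:
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

// Minimal sketch: ask Direct3D 12 whether DXR (DirectX Raytracing) is
// available on the default adapter. Error handling kept to a minimum.
int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // OPTIONS5 carries the RaytracingTier capability field.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR supported (tier 1.0 or better).");
    } else {
        std::puts("DXR not supported on this device/driver.");
    }
    return 0;
}

If AMD or Intel ship DXR-capable hardware and drivers, the same check passes on their cards too, which is what makes the API a forcing function rather than an NVIDIA-only feature.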

Also, NVIDIA is ahead of them both in machine learning; DirectML will be part of DXR soon, forcing GPU makers to come up with solutions to upscale games using AI.

Again, the whole industry is moving toward AI scaling and ray tracing. Those who won't compete will be left in the dust; this is a T&L moment in the graphics timeline. You either catch up or go home, the way 3dfx did.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
Of course it's relevant. You just can't comprehend it.

Just because something is better doesn't mean people are willing to pay for it.

We are hitting the Moore's Law wall right now, and die shrinks are harder than ever. We may have one or two shrinks left, which isn't enough to give us RT at 4K. 99% of the market is mid-range. Game developers don't spend money on 1% of the market; they focus on volume, since they sell licenses. Consoles will never do RT, as they are priced below even mid-range GPUs, let alone whole PCs.
 

jpiniero

Lifer
Oct 1, 2010
14,825
5,442
136
For games, it's 100% dependent on whether the next-gen consoles have RT hardware. If they do, it obviously means that AMD at least will have something.
 

SMOGZINN

Lifer
Jun 17, 2005
14,218
4,446
136
In la-la land. The whole industry is moving toward RT: we have DXR, game engines, game developers, and RTX hardware. No one is going to kill off that much investment. Intel is also hopping on the RT bandwagon. AMD will be left in the dust if they don't come up with an RT solution now.

And NVIDIA is way ahead of AMD even when comparing 7nm to 12nm. Imagine what NVIDIA will do on 7nm; they will completely obliterate AMD unless AMD dumps GCN and starts doing something better.

I completely disagree. If AMD can come out with a card that can even match the 2070 in traditional gaming at $350 or less, they will completely dominate the market for this entire generation. That seems within their grasp right now. IMHO, AMD would be stupid to put RT cores on their GPUs until the market is already there, which means the market may never materialize.

RT is the stereoscopic 3D of this generation. It sounds like a good idea, but ultimately almost no one will use it, because almost no one will have the hardware for it. With almost no hardware out there to make use of it, developers will not put much effort into it, so it will not get any 'killer apps', and it will die away. It is simply too expensive to get any adoption, and without adoption there is no reason to support it; without support, people will not see any reason to spend extra for it.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
RT is the stereoscopic 3D of this generation. It sounds like a good idea, but ultimately almost no one will use it, because almost no one will have the hardware for it. With almost no hardware out there to make use of it, developers will not put much effort into it, so it will not get any 'killer apps', and it will die away. It is simply too expensive to get any adoption, and without adoption there is no reason to support it; without support, people will not see any reason to spend extra for it.
Say that about every graphics feature ever, in every DX release; features take time to reach wider hardware.
I completely disagree. If AMD can come out with a card that can even match the 2070 in traditional gaming at $350 or less, they will completely dominate the market for this entire generation.
If they can, do you think NVIDIA will care? They will simply price the 2070 at $400 and dominate AMD once again, just like the 1060 vs the 580. NVIDIA will have the image quality and performance advantage; AMD will simply collapse.

NVIDIA chose the moment to introduce RT deliberately; they know AMD can't capitalize on it because they are so far behind. AMD can't even make a 7nm GPU that can challenge Volta on 12nm.

If AMD doesn't develop RT hardware, do you think they will compete? NVIDIA will hammer their RTX advantage to the end of time and get so far ahead of AMD that any hope of catching them will be crushed. NVIDIA has something like 11 RTX games RIGHT NOW, when hardware is so scarce! How do you think they will do once RTX is widespread? Dozens, if not more. AMD simply can't afford to lose on performance, features, and image quality all at the same time. That would be stupid.
 