Does the RTX series create an opening for AMD?

GodisanAtheist

Diamond Member
Nov 16, 2006
7,064
7,490
136
I've long maintained that AMD really missed a golden opportunity during the HD 4xxx/5xxx/6xxx era to build a top-notch brand. They pursued the small-die strategy and built incredibly competitive GPUs at a fraction of NV's die sizes, but they never scaled up to NV-sized dies (which presumably would have crushed NV in the performance category) and never secured a reputation as a premium brand. Instead they went after the so-called sweet spot and cemented their reputation as "the other GPU maker".

NV arguably flipped this strategy on AMD from the GTX 6xx series onward and executed on it nearly flawlessly... until the RTX series. RTX dies are huge and expensive, and they push a new technology with questionable short-term returns. Sounds similar to the GT200 series situation.

Granted, we don't have exact performance figures for Turing, but this potentially leaves an opening for AMD: even a moderate refinement of their architecture, unburdened by a bunch of RT/Tensor cores, could provide better performance in the "launch day titles" and win back the performance-at-all-costs crowd (the "whales" and big spenders) in the consumer GPU space.

My understanding is that the GCN arch in Vega is actually reasonably competent at ray tracing tasks (naturally it won't perform as well as dedicated hardware, but better than Pascal), so AMD could still tout compatibility with DXR while focusing performance increases on standard rasterized workloads.
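
(For what it's worth, here's a minimal, purely illustrative sketch of why that's plausible. The heart of ray tracing is the standard Möller–Trumbore ray/triangle intersection test, which is just ordinary float math that any compute-capable GPU, GCN included, can run in shaders; dedicated RT cores mainly do BVH traversal and this test in fixed-function hardware, just much faster. Written in Python for readability, not as any vendor's actual implementation.)

```python
# Illustrative only: the Möller–Trumbore ray/triangle intersection test.
# Ray tracing at its core is this kind of arithmetic run billions of times;
# generic compute shaders can do it, RT cores just do it (plus BVH traversal)
# in fixed-function hardware, far faster.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_triangle_hit(origin, direction, v0, v1, v2, eps=1e-7):
    """Return the hit distance t along the ray, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    det = dot(e1, h)
    if abs(det) < eps:          # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = sub(origin, v0)
    u = inv_det * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = inv_det * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = inv_det * dot(e2, q)
    return t if t > eps else None

# A ray fired from z = -1 straight at a triangle in the z = 0 plane hits at t = 1.
print(ray_triangle_hit((0, 0, -1), (0, 0, 1),
                       (-1, -1, 0), (1, -1, 0), (0, 1, 0)))  # 1.0
```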

In short: undo the mistake they made with the small-die strategy during the TeraScale architecture years.

What are your thoughts on how AMD should move forward from here knowing what we know about the Turing shake-up?
 

gdansk

Platinum Member
Feb 8, 2011
2,492
3,394
136
What we need is the 7nm equivalent of the 4870. It was much cheaper and offered good-enough performance. But it seems AMD's focus is on Rome and other Zen-related projects. Unlike Intel, which was happy selling small chips at high prices, Nvidia is still building massive chips (at high prices). There's little possibility of AMD designing a GPU to compete with full-size Turing; it would be too expensive given their limited R&D budget. But a mid-range GPU lacking tensor cores and ray tracing hardware could be competitive. That GPU would also be a great match for future consoles.

I don't think going big die will ever work for them while they have lower performance per watt. Nvidia can go even bigger: they have more money and co-develop custom nodes like 12nm FFN. Finally, Nvidia can gate off unused RTX and Tensor cores. If AMD becomes competitive with Navi, Nvidia can* give their GPUs a higher boost clock for rasterization-only workloads.

* in fact they may have already done this on the RTX series
 
Last edited:

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
It depends on whether Nvidia actually leaves an opening, and whether their claim of "50% higher per-core performance" holds up in real-world testing. Judging by the theoretical numbers, the new leading part shouldn't gain more than ~25% over the current leading part in an optimistic scenario. If Nvidia's claims don't come to pass, they could have a less-than-impressive showing, or in the worst case a dud on their hands just like the Fermi microarchitecture, since they haven't shown any real numbers backed by testing so far. Even then they'd be safe for now, since no competitor appears to be in sight in the near future ...
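
(To put rough numbers on that ~25% ceiling: using only the published spec-sheet core counts and boost clocks, an optimistic shortcut since real clocks and scaling vary, the theoretical FP32 throughput gain of the 2080 Ti over the 1080 Ti lands around 19% at the reference boost clock and around 25% at the Founders Edition clock. A quick illustrative calculation:)

```python
# Back-of-the-envelope FP32 throughput from public spec-sheet numbers.
# Illustrative only: real-world boost clocks and game scaling differ.

def fp32_tflops(cuda_cores, boost_clock_mhz):
    # 2 FLOPs per core per clock (fused multiply-add)
    return cuda_cores * 2 * boost_clock_mhz * 1e6 / 1e12

gtx_1080_ti   = fp32_tflops(3584, 1582)   # ~11.3 TFLOPS
rtx_2080_ti   = fp32_tflops(4352, 1545)   # ~13.4 TFLOPS (reference boost)
rtx_2080_ti_fe = fp32_tflops(4352, 1635)  # ~14.2 TFLOPS (Founders Edition boost)

print(f"reference boost: +{rtx_2080_ti / gtx_1080_ti - 1:.0%}")     # ~+19%
print(f"FE boost:        +{rtx_2080_ti_fe / gtx_1080_ti - 1:.0%}")  # ~+25%
```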

Nvidia's timing, to say the least, is impeccable, so they're pretty well set up. If their competition does finally gain traction by reverse engineering Nvidia's old efforts and applying them to a new, oversized microarchitecture, it could all go unrewarded, since Nvidia can potentially backpedal on its ray tracing and deep learning efforts and go back to a leaner design just in time ...

For years to come, benchmarks will probably still be judged by current standards (I don't think many reviewers will pick Assetto Corsa over Forza in their benchmark suite solely because it supports RTX) rather than by Nvidia's ray tracing performance standards. So focusing on what's already out there, along with what the majority of games will use in the near future, is arguably a wiser decision for AMD than chasing ray tracing, which probably won't see mass adoption in many AAA games within the current industry cycle ...

It's difficult enough for reviewers to include DX12 games in their benchmark suites as it is, so how are they going to make comparisons with competing video cards that don't yet have hardware acceleration for ray tracing? If we take hardware-accelerated PhysX as an example, it never got industry-wide support, so reviewers never used it as a standard to judge other vendors' hardware ...

The way I see it, as of now it's on Nvidia to sell ray tracing as a new niche feature like PhysX rather than as a definitive performance advantage, since most games, and especially many AAA games for that matter, won't integrate the feature for months or even years to come ...

It all hinges on what Nvidia's competitors do if the Turing microarchitecture is a total fumble ... (the data will surface soon enough on whether Turing is a massive regression in perf/die area; if it is, it's up to Nvidia's competitors whether they turn it into a field day for themselves)
 
Reactions: GodisanAtheist

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I have written before in another thread that there is a potential opening here. But a lot of stars would have to align. Too many, IMO.

AMD is simply too far behind, and despite all the complaining, NVidia is not really standing still on performance in regular games with RTX.

I will be totally shocked if AMD releases a gaming GPU that catches the 2080 Ti in non-ray-traced games. I don't think there is more than a 1% chance of that happening.

The 2060/2050 likely won't have RT cores, so they will be small-die, pure conventional-performance parts. So no opportunity there.

That leaves 2080/2070. AMD might in theory catch 2080/2070 if the planets align, but they probably won't undercut NVidia on price, just like they didn't with Vega.

So there is opportunity, but likely nothing will come of it.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
AMD will feel the pressure to implement hardware acceleration for ray tracing like NVIDIA; many developers are getting excited about this tech, and it is being implemented in all of the major game engines. If AMD doesn't take the step now, they will be outflanked by NVIDIA for the foreseeable future. AMD simply can't afford to fall behind on ray tracing the same way they did on tessellation/geometry, CUDA, proper compute, or even AI. They are going to have to step up or be left in the dust.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
AMD is as good as dead in the discrete gaming GPU market. GP106 has 13x the installed user base on Steam compared to Polaris 10.

13x the user base in the price category that is most popular throughout the world, which, even if you discount gaming laptops, is still staggering.

That's as good as dead in my books, which is to be expected, since AMD hasn't been both price and performance competitive in a long time, across different markets in the world, even without the mining craze.
 

amenx

Diamond Member
Dec 17, 2004
4,008
2,279
136
AMD is as good as dead in the discrete gaming GPU market. GP106 has 13x the installed user base on Steam compared to Polaris 10.

13x the user base in the price category that is most popular throughout the world, which, even if you discount gaming laptops, is still staggering.

That's as good as dead in my books, which is to be expected, since AMD hasn't been both price and performance competitive in a long time, across different markets in the world, even without the mining craze.
It seems that developing high performance GPU hardware is far more challenging than CPUs. I guess little chance they can pull off a Ryzen equivalent with GPUs, eh?
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
They've almost got the revenue to think about starting to do that now, but it'd take quite some time, and NV is hugely more of a moving target than Intel's CPUs. Maybe they'll try; maybe they won't.

They're enormously behind right now. They never really got close to catching up with Pascal, and now it's another generation on.
 

TheELF

Diamond Member
Dec 22, 2012
3,993
744
126
The way I see it, as of now it's on Nvidia to sell ray tracing as a new niche feature like PhysX rather than as a definitive performance advantage, since most games, and especially many AAA games for that matter, won't integrate the feature for months or even years to come ...
Screw games, will it make actual GPGPU ray tracing faster? If it accelerates Blender rendering even more, then it should sell like crazy.
 

TheELF

Diamond Member
Dec 22, 2012
3,993
744
126
It seems that developing high performance GPU hardware is far more challenging than CPUs. I guess little chance they can pull off a Ryzen equivalent with GPUs, eh?
That's what all their cards already are; that's why they become more competitive at higher resolutions and/or in DX12. They need the extra parallelism to get fully utilized, just like Ryzen.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Something else to add: a lot of the RTX silicon is reused for AI things, like that DLSS. If that works as well as their publicity suggests (obviously in question!), then having the tensor cores is a huge performance advantage.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
I'd say it does create an opening. Since this technique is only viable on top-end Nvidia cards, many games will not use it, especially console-focused ones. When it's not being used, the extra hardware is just a waste of die space.

Maybe AMD can make a smaller GPU that performs similarly to the 2070 (without this semi-raytracing stuff) and sell it cheaper with healthy margins. After all, Polaris performs similarly to the 1060 and isn't all that much bigger (and in the 580 it actually outperforms the 1060, at the price of increased power consumption).
 
Reactions: CuriousMike

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The latest reports are that the 20 series has 50% more efficient cores than the 10 series, which was already way ahead of GCN in power/performance/die size. Radeon doesn't have the ray tracing hardware, so it can't ray trace any better than a 10 series (i.e. far too slow to use in games). The mid-range Nvidia chips won't have the RTX branding, so they probably don't spend die space on ray tracing and won't be big dies. It is highly unlikely AMD will manage to release 7nm significantly before Nvidia. It is highly unlikely AMD will be able to catch up, as AMD under-invested in GPU tech for years and has just been falling further behind, and Nvidia is showing no sign of slowing down.

The reality is the 10 series already pretty well destroyed AMD on the gaming front, but AMD was saved by mining, as GCN is good at that; look at Steam and there are hardly any recent AMD GPUs being used for gaming. The 20 series is just going to widen the gap with new features and more efficient cores, while AMD has no new architecture due out for several years.

So basically they have no chance in standalone cards. What they do have is their CPUs, so anywhere you can use an APU is where AMD can shine.
 
Reactions: french toast

jpiniero

Lifer
Oct 1, 2010
14,841
5,456
136
Vega 64 should be very competitive with the 2070. They may hate the margins if they have to sell it at the fake MSRP of $399/$499 but they could do it.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Vega 64 should be very competitive with the 2070. They may hate the margins if they have to sell it at the fake MSRP of $399/$499 but they could do it.
2070 is 175 watts reference / 185 watts FE. Far less than a Vega 64 card.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
So long as Nvidia stands firm on performance-per-$ stagnation, AMD has an opening. If PP$ doesn't increase, then consumers with hardline budgets can literally never upgrade.

The problem is NV executes so well. Yes, PP$ stagnated at the 2070 and above. But NV knows AMD will likely only be able to compete at 2060 level (look at RX 480 vs 1060). The 2060 will likely not have RTX, so it is cheaper to make and NV can increase PP$. As odd as it seems right now, we could have a $600 2070 and a $300 2060 with a huge gap in between. NV has essentially separated the 2070 and 2060 like Intel used to separate quad-core and hex-core. Just speculating about their strategy, anyway.

So AMD needs to be able to compete on raw performance with the 2070 through 2080 Ti to really capitalize. If they follow the Polaris strategy, NV will be ready.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
It seems that developing high performance GPU hardware is far more challenging than CPUs. I guess little chance they can pull off a Ryzen equivalent with GPUs, eh?

It's a little different.

With CPUs, the easy-to-extract perf/clock gains have already been picked; that's why we have this whole push for more cores nowadays. Intel kept it at 4 cores for a long time. While they have the perf/clock advantage, the core count staying the same meant AMD could exploit that. Also, Intel's cores are only about 10% better anyway, because single-thread performance is very hard to improve. Setting aside whether 8-plus cores will ever be mainstream, it simply looks better to offer more cores, and AMD did what Intel didn't/couldn't do.

With GPUs, I don't see much headroom for AMD. While Nvidia is devoting roughly a third of the die to Tensor and ray tracing hardware, and AMD could try to deliver better performance by spending that area on regular shaders, the problem is that Nvidia has a big perf/watt advantage. Modern chips are essentially power bound, so a perf/watt deficit can't be overcome with larger dies or faster clocks.
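
(A toy, made-up-numbers illustration of that point: at a fixed board power budget, deliverable performance is roughly perf/W times watts, so a vendor with worse perf/W can't buy it back by adding shaders, because the extra units just force clocks down until the power fits again.)

```python
# Toy model with made-up numbers: why a perf/watt deficit can't be fixed
# with a bigger die at a fixed power budget. "perf" is an abstract unit.

POWER_BUDGET_W = 300            # assumed high-end board power limit

eff_a = 1.00                    # vendor A: baseline perf per watt (assumed)
eff_b = 0.70                    # vendor B: 30% worse perf per watt (assumed)

# At the same power limit, achievable performance scales with efficiency:
perf_a = eff_a * POWER_BUDGET_W             # 300 units
perf_b = eff_b * POWER_BUDGET_W             # ~210 units

# Vendor B adds 40% more shaders. In this simplified model both performance
# and power scale with (units x clock), so clocks must drop by ~1/1.4 to stay
# inside the same power budget, and the net gain is zero.
more_units = 1.40
clock_scale = 1 / more_units
perf_b_bigger_die = perf_b * more_units * clock_scale   # still ~210 units

print(perf_a, perf_b, perf_b_bigger_die)    # 300.0, ~210, ~210
```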

So there's that slim opportunity. AMD needs to take advantage of it before RT and Tensor cores take root and become a necessity to compete. Unfortunately, it looks like it's at least a few years away until AMD can offer a much different architecture. The talk that Vega was sacrificed to get Navi a design win isn't encouraging either. That means even the next-gen Navi may be disappointing, as it's console-first rather than PC-first. The potential 4-5 years before post-Navi may mean RT and Tensor cores become yet another barrier for AMD.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
As odd as it seems right now, we could have a $600 2070 and a $300 2060 with a huge gap in between. NV has essentially separated the 2070 and 2060 like Intel used to separate quad-core and hex-core. Just speculating about their strategy, anyway.

I have been wondering about that huge gap as well. What if AdoredTV got it partially right, and the 2070 is not a partially disabled 2080 but a smaller die (is there anything definitive on this?), leaving room for a cut-down 7GB 2070 to fall in that gap?

So a $500-$600 2070 8GB, and a $400 2070 7GB (no Founders Edition on the 2070 7GB, just a cut-down model for AIBs). Then the $300 2060, which will be very close in conventional performance to the 2070 7GB but lack the RT/Tensor features.

Just wildly speculating here, but that gap is just too large, and a GTX 2060 realistically can't be sold in volume at $400 USD, unless mining takes off again.
 

Head1985

Golden Member
Jul 8, 2014
1,866
699
136
Navi is still GCN. If AMD stays on GCN they will never catch NV. GCN has been here since 2011. They needed to end GCN after the Radeon 290X and bring something new and way better, but Navi in 2019 will still be GCN...
8 freaking years on one architecture.
It's like Nvidia never moving away from Fermi, because GCN is Fermi's counterpart.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
What's in it for AMD when most end users just want AMD to be competitive so they can get a better deal on an Nvidia offering?

To me it seems like a waste of valuable resources when the end result is the above.
 
Reactions: Leadbox

Guru

Senior member
May 5, 2017
830
361
106
AMD already said that there won't be any new desktop GPU products in 2018, so we are looking at a 2019 release schedule. AMD will have 7nm Vega 20 graphics cards, and this is going to be a refined Vega (not just a simple cut-down) for the professional markets.

I think they are going to push Vega really hard into compute and AI, and probably refine it much more in that direction to be a professional compute card rather than a graphics card.

AMD's desktop products are most likely going to be Navi, which comes out in 2019; at the earliest I'm seeing late Q2 2019.

Could plans change, since they won't have anything to compete with Nvidia's 2000 series for close to a year? Sure. They might do a stopgap where they just bring Vega to 7nm and bump the clock speeds to the max (imagine a Vega at 1800 MHz that consumes less power), ship it in Q1 2019, and push the Navi release back to Q3 2019.

Personally I don't see them changing their strategy. If they could, they would likely speed up production of Navi, but 7nm quantities are limited and will go first to Vega 20 and their EPYC CPU range.

In terms of what they should do: they should definitely do two designs as they've done, a small and a big design, but strip all the redacted from the graphics card, all of the compute, AI, all of the "features" and stuff, and do a pure, hardcore, 100% gaming-centric graphics card at 700mm2: a Radeon RX 9900 with 8160 cores fed by high-speed HBM2 at 900GB/s, released at $1201 to compete with the RTX 2080 Ti.

Profanity is not allowed in the tech areas.

AT Mod Usandthem
 
Last edited by a moderator:

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
What's in it for AMD when most end users just want AMD to be competitive so they can get a better deal on an Nvidia offering?

To me it seems like a waste of valuable resources when the end result is the above.

Sad but true. AMD could build a 2070/2080 competitor with no Ray Tracing HW and sell for $100 less than NVidia. NVidia would drop their price by $50 and everyone would still pay the $50 extra to get the NVidia card.
 
Reactions: Magee_MC

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Sad but true. AMD could build a 2070/2080 competitor with no Ray Tracing HW and sell for $100 less than NVidia. NVidia would drop their price by $50 and everyone would still pay the $50 extra to get the NVidia card.
Very easy to pull #s out of wherever to fabricate an argument. You are the master.
 

MBrown

Diamond Member
Jul 5, 2001
5,724
35
91
I don't understand all of this doom and gloom talk about AMD GPUs. From all of the benchmarks I have seen (as far as gaming performance is concerned), the Vega 56 and Vega 64 are not that far off the Nvidia 1070/1070 Ti and 1080. Even if the expectation was that the Vega 64 would compete with the 1080 Ti and it turned out to be closer to 1080 performance, the Vega 64 still does just fine in games.

Vega is not as efficient, runs hotter, etc., but I still don't understand this bandwagoning hate on AMD GPUs, especially if you are just looking at game performance. The Vega 64 may not be as fast as a 1080 in most games, but that doesn't mean it can't run games at good performance levels. They just need to lower the price of Vega; price is the biggest issue for Vega, IMO. As for AMD's future competitor to RTX, if they price it competitively, it should be fine.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I have been wondering about that huge gap as well. What if AdoredTV got it partially right, and the 2070 is not a partially disabled 2080 but a smaller die (is there anything definitive on this?), leaving room for a cut-down 7GB 2070 to fall in that gap?

So a $500-$600 2070 8GB, and a $400 2070 7GB (no Founders Edition on the 2070 7GB, just a cut-down model for AIBs). Then the $300 2060, which will be very close in conventional performance to the 2070 7GB but lack the RT/Tensor features.

Just wildly speculating here, but that gap is just too large, and a GTX 2060 realistically can't be sold in volume at $400 USD, unless mining takes off again.
2080ti - Titan level $999
2080 - 1080ti level $699
2070 - 1080 level $499
2060 - 1070 level $399
2050 - 1060 level $299

Perhaps?
 