Question: 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed-up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for; just interested in the forum members' thoughts.
 

raghu78

Diamond Member
Aug 23, 2012
Given Nvidia's recent GPU history, I wonder if GA104 failed to deliver the expected performance, and the 3080/3090 cards were released earlier because of this. This is the first time since 2013 that the x80 card is not an x04 die - the 980 was GM204 in 2014, the 1080 was GP104 in 2016 and the 2080 was TU104 in 2018. The 780 was GK110 in 2013 (technically, the 680 was also GK104). In addition, everything seems over-the-top with the cards (cooling and TDP).

This is obviously pending reviews, but I wonder if the 3070 just failed to deliver the required performance for an 80-class card (i.e. faster than the previous gen's Ti). According to Nvidia it's just on par with the 2080 Ti, while even the 2080 was around 10% faster than the 1080 Ti (and it was also released alongside the 2080 Ti).

One of the key factors in Nvidia moving the x80 GPU to the Gx104 die was the lack of competition from AMD. Starting with Maxwell, we could clearly see that AMD did not have the resources to do a full GPU stack refresh on a regular cadence. With RDNA2, AMD will refresh their entire GPU stack from top to bottom with the use of 4 dies - Navi 21, 22, 23, 24. The other factor was that AMD was badly behind on perf/watt and perf/sq mm. With the information we have on the Xbox Series X, it's fair to say AMD have addressed both these issues in RDNA2. I think Nvidia have priced the RTX 3080 quite aggressively and have been forced to push it to 320W because they have an idea of what their competition is going to look like. What remains to be seen is the final performance of AMD's RDNA2 GPUs.
 

sxr7171

Diamond Member
Jun 21, 2002
Right, which makes its 10GB frame buffer that much harder to accept. But with, say, 20GB at $850, it would really make the 3090 look outlandish ¯\_(ツ)_/¯

I’ve been using the 12GB Nvidia parts for 3 generations. No game I played ever needed more than maybe 6-8GB. Sometimes I would use DSR to try 8K (a slideshow) and it would get to 10-11GB maybe.
 

sxr7171

Diamond Member
Jun 21, 2002
Again, that is in one game, which is currently the most RTX-intensive game available; look at the fine print. It is also comparing a 3080 against an essentially overclocked 2080, so that will also tilt the scale in favor of the 3080 in terms of perf/W.

If you watch the Digital Foundry video, you get more like a 20-25% perf/W improvement, compared to Nvidia's media slide.
Yeah that’s what it looks like realistically. All that tensor stuff might not even be used a lot of the time.

If AMD beats decisively on raw rasterization I know what I will buy.
 

jpiniero

Lifer
Oct 1, 2010
Given Nvidia's recent GPU history, I wonder if GA104 failed to deliver the expected performance, and the 3080/3090 cards were released earlier because of this. This is the first time since 2013 that the x80 card is not an x04 die - the 980 was GM204 in 2014, the 1080 was GP104 in 2016 and the 2080 was TU104 in 2018. The 780 was GK110 in 2013 (technically, the 680 was also GK104). In addition, everything seems over-the-top with the cards (cooling and TDP).

If anything, I suspect the 3080/3090 would have been released earlier had SS7 panned out.

The original rumor had the 3080 using GA103 (3840? cores).
 

FatherMurphy

Senior member
Mar 27, 2014
One of the key factors in Nvidia moving the x80 GPU to the Gx104 die was the lack of competition from AMD. Starting with Maxwell, we could clearly see that AMD did not have the resources to do a full GPU stack refresh on a regular cadence. With RDNA2, AMD will refresh their entire GPU stack from top to bottom with the use of 4 dies - Navi 21, 22, 23, 24. The other factor was that AMD was badly behind on perf/watt and perf/sq mm. With the information we have on the Xbox Series X, it's fair to say AMD have addressed both these issues in RDNA2. I think Nvidia have priced the RTX 3080 quite aggressively and have been forced to push it to 320W because they have an idea of what their competition is going to look like. What remains to be seen is the final performance of AMD's RDNA2 GPUs.

The new consoles played a role too. It was important for Nvidia to have competitively priced cards (esp. the $500 3070) to ensure that PC gaming momentum continues (a market that is roughly 80% Nvidia on the discrete side).
 

moonbogg

Lifer
Jan 8, 2011
^ Makes sense. If that 3080 were any more expensive, I would have just walked away at that point. That's just me though, or perhaps there are others who are like-minded as well. I now have concerns about availability of that 3080 though. It's going to sell out mad crazy fast.
 

Head1985

Golden Member
Jul 8, 2014
Given Nvidia's recent GPU history, I wonder if GA104 failed to deliver the expected performance, and the 3080/3090 cards were released earlier because of this. This is the first time since 2013 that the x80 card is not an x04 die - the 980 was GM204 in 2014, the 1080 was GP104 in 2016 and the 2080 was TU104 in 2018. The 780 was GK110 in 2013 (technically, the 680 was also GK104). In addition, everything seems over-the-top with the cards (cooling and TDP).

This is obviously pending reviews, but I wonder if the 3070 just failed to deliver the required performance for an 80-class card (i.e. faster than the previous gen's Ti). According to Nvidia it's just on par with the 2080 Ti, while even the 2080 was around 10% faster than the 1080 Ti (and it was also released alongside the 2080 Ti).




Samsung is 8nm; they're technically not making any 7nm cards.
I don't think the 3070 is even the full die. The full die should have 3584 SPs (the full 102 die has +50% more SPs than the full 104 die, and the full 102 die has 5376 SPs), and they only put old GDDR6 on it. They're keeping the full die for a refresh/3070 Ti with GDDR6X. There will be a huge gap between the 3070 and 3080, like 40%, so they can easily release a 3070 Ti with the full die and GDDR6X, with +20% performance over the 3070, to fill the gap.
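A quick back-of-the-envelope check of the shader-count arithmetic above; every figure here is a rumored number quoted in this post or in the reply below, not a confirmed spec:

```python
# Sketch only: sanity-checking the rumored shader counts quoted above.
# 5376 (full GA102), the "+50%" relationship, and 3072 are the thread's
# assumptions, not confirmed specs.

full_ga102_sp = 5376
full_ga104_sp = full_ga102_sp / 1.5      # GA102 is claimed to have +50% more SPs than GA104
print(full_ga104_sp)                     # 3584.0, matching the 3584 SP figure in the post

rtx_3070_sp = 3072                       # cut-down GA104 count suggested in the reply below
extra = full_ga104_sp / rtx_3070_sp - 1
print(f"{extra:.0%}")                    # ~17% more SPs for a hypothetical full-die 3070 Ti
```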
 
Reactions: Gideon and ozzy702

jpiniero

Lifer
Oct 1, 2010
I don't think the 3070 is even the full die. The full die should have 3584 SPs, and they only put old GDDR6 on it. They're keeping the full die for a refresh/3070 Ti with GDDR6X. There will be a huge gap between the 3070 and 3080, like 40%, so they can easily release a 3070 Ti with the full die and GDDR6X, with +20% performance over the 3070, to fill the gap.

It's "3072" (same as TU104)
 

n0x1ous

Platinum Member
Sep 9, 2010
^ Makes sense. If that 3080 were any more expensive, I would have just walked away at that point. That's just me though, or perhaps there are others who are like-minded as well. I now have concerns about availability of that 3080 though. It's going to sell out mad crazy fast.

Hopefully Samsung capacity will help restocks come quicker than we are used to. I was hoping for preorders today. I hate fighting the unwinnable battle against bots at launch.
 

CP5670

Diamond Member
Jun 24, 2004
I’ve been using the 12GB Nvidia parts for 3 generations. No game I played ever needed more than maybe 6-8GB. Sometimes I would use DSR to try 8K (a slideshow) and it would get to 10-11GB maybe.

I've seen games use right up to 10GB at 4K on my 1080 Ti. I remember seeing it regularly in Deus Ex: Mankind Divided, and it's probably more common with more recent games. I would say 8GB is not enough on a high-end card now (like the 2080), but 10GB may be okay. It may be that the slightly smaller memory size is outweighed by the rest of the card even in memory-limited scenarios.
 

A///

Diamond Member
Feb 24, 2017
The 3070 is a really good deal. If the 3060 sells for around $275-350, then I think they can close the gap, especially if it performs midway between a 2070 and a 2070 Super. AMD will have to do some funky town magic to get sales, especially when it's likely their product won't be better than 80% as good.


Honest, hand on heart, I haven't been excited for a GPU launch in at least 8 years. This is amazing. If AMD releases anything as good and forces NV to change up their pricing, that sweet 3080 or a later rumored 20GB model from AIBs may make its sweet tootsie roll way into my build, where it'll be treated with love, humanity and respect. If they show NV's hand, it may see a nice price cut. I'll go with a conservative $80. $620 isn't all that bad!

Shhhh, JHH doesn't want you revealing the RTX 3070 Super so early.
It's hard to tell if you're serious, but I hope you're aware of the cultural signifi... insignificance of that lettering.
 

sze5003

Lifer
Aug 18, 2012
I'm having confused feelings about the 3080. I think I like it. I wish it had more VRAM, but I'm guessing 10GB might be fine for 3440x1440. Even I can admit when I'm being too damn picky. It looks like a great card at $700 and should lay waste to anything thrown at it for quite a long time. I do suspect an impressive refresh may be in the works, or a 3080 Ti, but who cares. Waiting another year doesn't sound like fun, but playing around with a new card does sound like fun. I don't think I'll be able to resist taking a hammer to the "buy" button and smashing it into the ground on release day. Jacket man delivered.
I've been thinking the same thing. Shortly after I got my 1080 Ti, it was you who convinced me to go 3440x1440, and now, with working from home lasting a while, probably into the beginning of 2021, I couldn't be happier with my ultrawide.

I've kept this 1080 Ti for a while. I would really like the 3090, but that whale is probably not fitting in my case, and I don't plan on getting a bigger case or motherboard, hence I'm hesitant to order one and then open it to find out it won't fit.

Perhaps the 3080 will be good enough, but my plan is to run an HP Reverb G2 on it, so the extra VRAM of the 3090 could help.
 

DeathReborn

Platinum Member
Oct 11, 2005
I've seen games use right up to 10GB at 4K on my 1080 Ti. I remember seeing it regularly in Deus Ex: Mankind Divided, and it's probably more common with more recent games. I would say 8GB is not enough on a high-end card now (like the 2080), but 10GB may be okay. It may be that the slightly smaller memory size is outweighed by the rest of the card even in memory-limited scenarios.

Actual used VRAM or just "reserved" VRAM? You can't trust GPU-Z etc. to report the actual VRAM required; some games just request it all but only use some of it.
 
Reactions: Campy and ozzy702

Saylick

Diamond Member
Sep 10, 2012
It's hard to tell if you're serious, but I hope you're aware of the cultural signifi... insignificance of that lettering.
If they made a fully unlocked RTX 3070 Super with the full-fat GA104 die, complete with the 'S' design, and offered a $90 discount exclusively to 90s kids, e.g. me, then I would buy it on the spot, right then and there. It would make for some great marketing!
 

moonbogg

Lifer
Jan 8, 2011
Actual used VRAM or just "reserved" VRAM? You can't trust GPU-Z etc. to report the actual VRAM required; some games just request it all but only use some of it.

That's def true. It's hard to know how much is really needed. I think monitoring frametimes is a good way to know. Stuttering and hitching are the usual issues, if memory serves me right. I haven't had that issue since my GTX 570s, though.
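As a rough illustration of the frametime approach described above, here is a minimal sketch that flags hitches as frames taking far longer than the recent median; the 3x threshold and the synthetic frametime data are illustrative assumptions, not anything from the thread:

```python
# Sketch: flag frametime "hitches" (possible VRAM-overflow stutter) as frames
# that take much longer than the rolling median of recent frames.
from statistics import median

# Synthetic example data: mostly 60 fps (16.7 ms) with two big spikes.
frametimes_ms = [16.7] * 200 + [95.0] + [16.7] * 200 + [80.0] + [16.7] * 100

def find_hitches(frametimes, window=60, factor=3.0):
    """Return (frame index, frametime) pairs that exceed factor * rolling median."""
    hitches = []
    for i, ft in enumerate(frametimes):
        recent = frametimes[max(0, i - window):i] or [ft]
        if ft > factor * median(recent):
            hitches.append((i, ft))
    return hitches

for idx, ft in find_hitches(frametimes_ms):
    print(f"hitch at frame {idx}: {ft:.1f} ms")
```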
 
Reactions: ozzy702

USER8000

Golden Member
Jun 23, 2012
You guys remember Jensen by the end of last year?

Still possible, though, that GA106 and the smaller dies will be on TSMC N7. But for now that looks weird to me, since GA102 and GA104 are on Samsung's 8nm...

GA100 is TSMC.
 

Karnak

Senior member
Jan 5, 2017
Still 2:1 for Samsung atm. Doesn't make any sense to me though regarding GA102 and GA104, because the better the product, the better the process should be, IMO.

Or rumors about no (more) capacity for NV are true.
 