Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 4K60, ideally at least 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, just interested in the forum members' thoughts.
 

maddie

Diamond Member
Jul 18, 2010
4,790
4,774
136
The current modular PSUs typically have 1x 8-pin at the PSU going to 2x 8-pin PCIe at the video card. This Nvidia cable has 2x 8-pin at the PSU, suggesting it's comparable to 4x 8-pin PCIe. Some PSUs even support the latter, but not 2x 8-pin for the CPU at the same time.
Yep, this 12-pin connector is going to be delivering more than 2x 8-pin PCIe. Let's hope it isn't close to the max possible from 4x 8-pin PCIe, but the massive cooler is an indicator of serious power consumption.
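For reference, a minimal sketch of the spec-limit math behind the "comparable to 4x 8-pin PCIe" comparison. These are the official PCIe power-delivery limits; actual cards can and do pull past them out of spec:

```python
# Spec-limit board power for a card with n 8-pin PCIe connectors.
PCIE_8PIN_W = 150  # PCIe spec limit per 8-pin connector
PCIE_SLOT_W = 75   # PCIe spec limit through the slot itself

def board_power_ceiling(n_8pin: int) -> int:
    """Max in-spec board power with n 8-pin PCIe connectors."""
    return n_8pin * PCIE_8PIN_W + PCIE_SLOT_W

print(board_power_ceiling(2))  # 375 W - a typical 2x 8-pin high-end card
print(board_power_ceiling(4))  # 675 W - what a 4x 8-pin equivalent would allow
```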
 

FaaR

Golden Member
Dec 28, 2007
1,056
412
136
the massive cooler is an indicator of serious power consumption.
I've been wondering about that myself: whether the massive cooler really is for dealing with monster power consumption, or instead for handling a somewhat-higher-than-usual high-end power draw at or near whisper-quiet noise levels.

It's an interesting situation; I can think of a bunch of reasons to argue for either case...
 
Reactions: Gideon

maddie

Diamond Member
Jul 18, 2010
4,790
4,774
136
I've been wondering about that myself: whether the massive cooler really is for dealing with monster power consumption, or instead for handling a somewhat-higher-than-usual high-end power draw at or near whisper-quiet noise levels.

It's an interesting situation; I can think of a bunch of reasons to argue for either case...
By itself, yes, the argument for low noise can be made, but it is not as strong when you also realize that the 12-pin connector can deliver up to 600W.

Maybe it's an attempt to raise the oil price.
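As a rough sanity check on that 600W figure, here is a hypothetical per-pin calculation; the ~8.5A terminal rating is my own assumption (Micro-Fit-class high-current contacts), not a confirmed spec for this connector:

```python
# Hypothetical ceiling for a 12-pin connector: 6x 12V pins plus 6 grounds.
PINS_12V = 6
AMPS_PER_PIN = 8.5  # assumed high-current terminal rating (unconfirmed)
VOLTS = 12.0

max_watts = PINS_12V * AMPS_PER_PIN * VOLTS
print(f"{max_watts:.0f} W")  # ~612 W, in line with the 'up to 600W' claim
```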
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,077
7,507
136
It would be hard to imagine people who would be shopping for this card wouldn't have a modular PSU.

- Modular PSU cables are not interchangeable, however, as different PSUs can have different pin-outs on the PSU side of things. That would potentially mean you would need different modular cable adapters for all the different PSU OEMs.

Just as an example: you cannot take cables for, say, a Seasonic PSU and plug them into a Corsair or EVGA PSU. Hell, you cannot even take a cable designed for one unit from a manufacturer and use it on another unit from the same manufacturer with any guarantee that you won't end up with a dead component at best or an electrical fire at worst.

I'm almost more curious to see how the logistics of this are going to work than I am about the cables themselves.

All that said, I would love a new standard that brought down the number of connectors/cables to the motherboard and GPU and shrunk the connectors as well. I disassembled and reassembled my rig today for cleaning, and while it's old hat for me now, I still get a twinge of frustration having to drag that 4+4-pin CPU power cable over my RAM and cooler, across the motherboard, and plug in the connectors blind, because there is just no room for my hand and eyes to work together on that one...
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
All that said, I would love a new standard that brought down the number of connectors/cables to the motherboard and GPU and shrunk the connectors as well. I disassembled and reassembled my rig today for cleaning, and while it's old hat for me now, I still get a twinge of frustration having to drag that 4+4-pin CPU power cable over my RAM and cooler, across the motherboard, and plug in the connectors blind, because there is just no room for my hand and eyes to work together on that one...

That's basically what ATX12VO does: ONE connector to the motherboard, and one to the PSU. And the power supply only outputs 12 VDC, no 3.3 V or 5 V; the motherboard handles those conversions for things like USB and such.
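As a quick summary of that split (my simplified reading of the public ATX12VO spec, not an exhaustive list), here is where each rail is generated under the two standards:

```python
# Where each system rail is generated: classic multi-rail ATX vs. ATX12VO.
CLASSIC_ATX = {"12V": "PSU", "5V": "PSU", "3.3V": "PSU", "5Vsb (standby)": "PSU"}
ATX_12VO = {
    "12V": "PSU",
    "5V": "motherboard DC-DC",    # e.g. for USB and SATA devices
    "3.3V": "motherboard DC-DC",
    "12Vsb (standby)": "PSU",     # 12VO swaps 5Vsb for a 12V standby rail
}

for name, rails in (("Classic ATX", CLASSIC_ATX), ("ATX12VO", ATX_12VO)):
    print(name)
    for rail, source in rails.items():
        print(f"  {rail}: {source}")
```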
 
Reactions: GodisanAtheist

Veradun

Senior member
Jul 29, 2016
564
780
136
Well it would suck if you’re after an FE card and you don’t have a modular PSU.

At this point, personally, I would only be interested in an FE card depending on the outcome of the cooling solution. Other than that, I don't really want to spend $100 extra as an FE tax.
I would expect them to bundle a 2x 8-pin to 1x 12-pin adapter with the FE. Not doing so would be foolish.
 

blckgrffn

Diamond Member
May 1, 2003
9,212
3,221
136
www.teamjuchems.com
After reading this thread last week, then spending the weekend fishing and thinking of many things, I came back to this thread to ask:

Why? Why is nvidia doing this? Why now?

I've been encouraged that some others asked that question too.

Typically these mammoths age terribly. All this power consumption puts a lot of load on a lot of components, and how many of these will die when PSUs that are a couple of years old get hit with an actual load nearing what they are spec'd for (man, a top-of-the-line Intel CPU and a 3090 could legit draw 1kW?) and lose the magic smoke? These will suck to have in any kind of smallish room if they are hitting their power numbers under gaming loads; a 290X could heat my theater room up with the door closed during a gaming session, and this is way past that... dorm rooms, offices, etc. are going to feel it.
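That 1kW worry is easy to sanity-check with rough numbers. All the component draws below are my own ballpark assumptions, not measurements, but they land uncomfortably close to what a lot of installed PSUs are spec'd for:

```python
# Ballpark peak draw for a hypothetical top-end 2020 build.
draw_watts = {
    "CPU (top Intel, power limits lifted)": 250,
    "GPU (rumored 3090-class board power)": 350,
    "Motherboard / RAM / storage / fans": 100,
}

total = sum(draw_watts.values())
print(total)            # ~700 W at the components
print(total / 0.90)     # ~780 W at the wall, assuming ~90% PSU efficiency
```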

This seems like AMD's cards from the past, delivered by Nvidia in 2020. Fury X, Vega 64: solid architectures in their own right, but clocked and volted to high heaven and paired with exotic memory... they haven't aged very well at all. The Fury and Vega 56 were likely the better long-term values, if there really were any. But we know at least some of the "why" behind AMD doing that: to take a shot at the halo.

It seems pretty obvious that the 3080 is going to be a halo product by itself and is supremely likely to hold the single card performance crown AND still chew up north of 300W.

I will readily admit that I am strongly averse to spending more than ~$500, because I feel that is where the value proposition changes so drastically, and I only "need" 2K performance.

Also, because I expect ~$500 is going to net me an entire PS5. Maybe $600?

I have really respected Nvidia's right to make money on their efficient, well-mannered, and solid video cards. But this???

My mind is boggled.
 

Glo.

Diamond Member
Apr 25, 2015
5,768
4,693
136
It seems pretty obvious that the 3080 is going to be a halo product by itself and is supremely likely to hold the single card performance crown AND still chew up north of 300W.
Yes, and no.

No. It won't hold the performance crown.
Yes. It will eat 300W of power for breakfast.
 

blckgrffn

Diamond Member
May 1, 2003
9,212
3,221
136
www.teamjuchems.com
Yes, and no.

No. It won't hold the performance crown.
Yes. It will eat 300W of power for breakfast.

Really? Is the AMD 300W card really going to be more efficient?

@Stuka87

I get that - I meant that a 3080 seems likely to be powerful enough to be extremely competitive as the best gaming card you can buy at launch.

A 3080 Ti seems like it would make more sense in four to six months, as needed, than as a $1,600 hail-mary product with a special power connector required (one that can't be fed by 2x 8-pin power cables, so you need a new PSU too - which I know isn't that big of a deal for the hundreds of people willing to go to any end of the earth to get this to work) that shows up *before* it is obvious it's needed.

I mean, it will exist but how many times are we going to hear "Well, you could get a big navi or a 3080 double black ultra edition with AIO *or* you can put your big boy pants on and spend 2x the money on a 3090". That seems unlikely in this forum.

If you have to define an entirely new price point, require specialized PSU support and deal with exhausting an unprecedented amount of heat, have you really taken the "performance" halo? Or did you just define a new performance category (super double crazy) that the competition just didn't realize existed before?

I won't dip into car analogies (ugh) but plenty come to mind in this case.

Finally, is this targeted at the 200Hz, 4K, full-ultra-settings crowd? Does that exist and also have money to burn? I am not kidding when I say this segment seems like it's only hundreds of people large.
 

blckgrffn

Diamond Member
May 1, 2003
9,212
3,221
136
www.teamjuchems.com
What 300W AMD card?

There is none in the upcoming AMD lineup.

Well, to be fair, AMD can always choose to crank the heat and clocks up and make it a burning furnace, just like Nvidia did, but at the moment there are no plans to exceed 275W TBP.

Fair enough. I'll try again

The ~275W RDNA 2 GPU is really going to be convincingly faster than the 300W "ish" 3080?

I mean, that sounds great, but based on most of the competitive history of the two companies it sounds like extremely wishful thinking.

Then again, as I read more about the 3090 I couldn't believe what I was seeing either so... yeah. Can't wait to see what happens over the next few weeks in GPU land.
 
Reactions: ozzy702

Glo.

Diamond Member
Apr 25, 2015
5,768
4,693
136
Fair enough. I'll try again

The ~275W RDNA 2 GPU is really going to be convincingly faster than the 300W "ish" 3080?

I mean, that sounds great, but based on most of the competitive history of the two companies it sounds like extremely wishful thinking.

Then again, as I read more about the 3090 I couldn't believe what I was seeing either so... yeah. Can't wait to see what happens over the next few weeks in GPU land.
Uhh... why would it not be faster than RTX 3080?

Let me ask you, and everybody here one simple question.

If ALL of Nvidia rumors turned out to be true, why is it so hard to believe that for once AMD has better GPU architecture, and products than their direct competitor?

Why would AMD rumors be incorrect, if ALL of Nvidia leaks and rumors turned out to be true?
 
Reactions: Krteq

blckgrffn

Diamond Member
May 1, 2003
9,212
3,221
136
www.teamjuchems.com
Uhh... why would it not be faster than RTX 3080?

Let me ask you, and everybody here one simple question.

If ALL of Nvidia rumors turned out to be true, why is it so hard to believe that for once AMD has better GPU architecture, and products than their direct competitor?

Why would AMD rumors be incorrect, if ALL of Nvidia leaks and rumors turned out to be true?

Well, AMD is getting what, a half-node push, at best? Nvidia is getting a substantial (but maybe not so great?) full-step improvement...

Comparing the 2070 Super with the 5700 XT was largely a push, and there was still the 2080 and 2080 Ti above that...

I guess I am just dubious that a 3080 eating ~85 more watts than a 2080 is going to be slower than an evolutionary touch-up on RDNA. A straight die shrink of the 2080 Ti would seem a more prudent thing to profit from.

Again, if that turns out to be the case, great. RDNA v1 seemed really raw (like a launch-now-to-get-console-dev-kits-into-existence type of product), and maybe there is that much efficiency left to be unlocked. That'd be cool.
 

Glo.

Diamond Member
Apr 25, 2015
5,768
4,693
136
Well, AMD is getting what, a half-node push, at best? Nvidia is getting a substantial (but maybe not so great?) full-step improvement...

Comparing the 2070 Super with the 5700 XT was largely a push, and there was still the 2080 and 2080 Ti above that...

I guess I am just dubious that a 3080 eating ~85 more watts than a 2080 is going to be slower than an evolutionary touch-up on RDNA. A straight die shrink of the 2080 Ti would seem a more prudent thing to profit from.

Again, if that turns out to be the case, great. RDNA v1 seemed really raw (like a launch-now-to-get-console-dev-kits-into-existence type of product), and maybe there is that much efficiency left to be unlocked. That'd be cool.
RDNA2 is a Maxwell-like step for AMD.


P.S. Have you paid attention to Xbox Series X performance? It has 52 CUs clocked at 1.8 GHz that consume between 130 and 140W of power.

Scale that up to a full dGPU and you get 60 CUs at 2.2 GHz using around 225W of power for the whole board.
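For what it's worth, here is a naive sketch of that scaling: linear in CU count and roughly cubic in clock (dynamic power goes as f·V², and voltage tends to track frequency). The model and the exponent are my assumptions, not AMD's numbers, and it lands above 225W, so the claim implicitly banks on RDNA2 perf/watt gains on top of the raw scaling:

```python
# Naive GPU power scaling: linear in CU count, ~cubic in clock
# (dynamic power ~ f * V^2, with V rising roughly linearly with f).
def scale_power(p0_watts, cu0, f0_ghz, cu1, f1_ghz):
    return p0_watts * (cu1 / cu0) * (f1_ghz / f0_ghz) ** 3

# Series X GPU figures from the post: 52 CUs @ 1.8 GHz, ~135 W (midpoint).
print(round(scale_power(135, 52, 1.8, 60, 2.2)))  # ~284 W by this naive model
```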
 

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
Uhh... why would it not be faster than RTX 3080?

Let me ask you, and everybody here one simple question.

If ALL of Nvidia rumors turned out to be true, why is it so hard to believe that for once AMD has better GPU architecture, and products than their direct competitor?

Why would AMD rumors be incorrect, if ALL of Nvidia leaks and rumors turned out to be true?
AMD has often had better hardware based on technical specs, and yet Nvidia has almost always been faster. Carmack used to puzzle about this problem.
Pretty certain that big Navi will be more power efficient, since it is using the much better TSMC 7N node (and AMD said they plan to improve perf/watt with RDNA2).
Anyway, we can do an apples to apples comparison when both GPUs are on the market.

Edit:typo
 
Last edited:
Reactions: Martimus and xpea

Glo.

Diamond Member
Apr 25, 2015
5,768
4,693
136
AMD has often had better hardware based on technical specs, and yet Nvidia has almost always been faster. Carmack used to puzzle about this problem.
Pretty certain that big Navi will be more power efficient, since it is using the much better TSMC 7N node (and AMD said they plan to improve perf/watt with RDNA2).
Anyway, we can do an apples to apples comparison when both GPUs are on the market.
Has Turing been faster, ALU for ALU, than RDNA1?

No, it has not. So I do not see why the next-gen Nvidia and AMD GPUs would show a different picture.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Uhh... why would it not be faster than RTX 3080?

Let me ask you, and everybody here one simple question.

If ALL of Nvidia rumors turned out to be true, why is it so hard to believe that for once AMD has better GPU architecture, and products than their direct competitor?

Why would AMD rumors be incorrect, if ALL of Nvidia leaks and rumors turned out to be true?

Because AMD always over-promises and under-delivers. I do not see a single indication of why Nvidia should fall back in efficiency, given the much larger node advance it is getting coming from Turing.
And then, of course, you have not the slightest idea which rumors are true and which are not until the product is released.

Do I need to remind you of the "no consumer Ampere card this year" rumor you were spreading around not long ago? Or maybe your prediction of the price and performance of the new Nvidia line-up from like a week ago?
 

Glo.

Diamond Member
Apr 25, 2015
5,768
4,693
136
Because AMD always over-promises and under-delivers. I do not see a single indication of why Nvidia should fall back in efficiency, given the much larger node advance it is getting coming from Turing.
And then, of course, you have not the slightest idea which rumors are true and which are not until the product is released.

Do I need to remind you of the "no consumer Ampere card this year" rumor you were spreading around not long ago? Or maybe your prediction of the price and performance of the new Nvidia line-up from like a week ago?
I also stated at the beginning of this year that the next-gen gaming cards are on the 8nm process: turned out to be true. I also stated they will not be as efficient as people expect them to be: turned out to be true, also. I also told you why Nvidia would go to the 8nm process instead of TSMC 7nm for gaming cards: which also turned out to be true.

You, alongside a few other people in this very thread, did not believe any of it. Who has the last laugh now, hmm?

As for my prediction about performance: neither confirmed nor disproven yet.

Secondly. Interesting. So AMD under-delivered on the RDNA1 promises?

How about the Zen 1 promises? Did they under-deliver and over-promise, or over-deliver and under-promise? What about the Zen 2 and Renoir promises? The people who worked on the physical designs of those products were moved into the RTG group and worked directly on the RDNA2 physical designs.

What about Xbox Series X performance and efficiency? So far it paints a very good picture for AMD.

You may reject reality as much as you like, but that won't make it any less real.

Edit: straight from one of the first pages of this very thread:
Ampere is an HPC chip that replaces GV100.

Don't expect anything consumer from Nvidia before at least late 2020, and realistically 2021.

Edit no. 2: I cackled when I read this:
Samsung EUV 7nm. This process will be more efficient than even TSMC 7nm.
I'm guessing the 3080 will be at least 20% faster than a 2080 Ti.
Save my posts if you like.

This did not age well.
 
Last edited:

Gideon

Golden Member
Nov 27, 2007
1,723
3,976
136
Because AMD always over-promises and under-delivers. I do not see a single indication of why Nvidia should fall back in efficiency, given the much larger node advance it is getting coming from Turing.
And then, of course, you have not the slightest idea which rumors are true and which are not until the product is released.

Do I need to remind you of the "no consumer Ampere card this year" rumor you were spreading around not long ago? Or maybe your prediction of the price and performance of the new Nvidia line-up from like a week ago?

I'll post it in the RDNA2 thread to avoid derailing this one any further, but I agree that AMD's hype trains always go overboard and it's best to be sceptical (and hopefully be pleasantly surprised later on).

Here's a link to my take:
 
Last edited:
Reactions: GodisanAtheist