Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much gain is Samsung's 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing around at cheaper prices while offering 'beefed-up RTX' options at the top?)
Will the top card be capable of more than 4K60, ideally at least 90 fps?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

JasonLD

Senior member
Aug 22, 2017
Do the math and you will see the Series X is delivering 12 TF at 140-150 W, compared to the Radeon 5700 XT's 9 TF at 225 W. That's 2x the perf/watt.

That perf/watt figure can be very misleading, considering the Series X GPU is running right at its perf/watt sweet spot while the 5700 XT is pushed a bit out of it in order to be competitive against its Nvidia counterparts.
Who knows whether adding just 100 MHz to the Series X GPU would jack the power figure above 200 W?
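For reference, the "2x" in the quoted post is just TF divided by watts. A quick sketch in Python; the 145 W midpoint and all other inputs are the thread's estimates, not measured figures:

```python
# Back-of-the-envelope check of the "2x perf/watt" claim above.
series_x_tf, series_x_w = 12.0, 145.0   # assumed midpoint of the 140-150 W estimate
rx5700xt_tf, rx5700xt_w = 9.0, 225.0    # thread's figure for the 5700 XT

series_x_eff = series_x_tf / series_x_w    # ~0.083 TF/W
rx5700xt_eff = rx5700xt_tf / rx5700xt_w    # ~0.040 TF/W
print(f"perf/watt ratio: {series_x_eff / rx5700xt_eff:.2f}x")  # ~2.07x
```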
 

AtenRa

Lifer
Feb 2, 2009
That perf/watt figure can be very misleading, considering the Series X GPU is running right at its perf/watt sweet spot while the 5700 XT is pushed a bit out of it in order to be competitive against its Nvidia counterparts.
Who knows whether adding just 100 MHz to the Series X GPU would jack the power figure above 200 W?

I don't believe you will save 70-80 W by lowering the RX 5700 XT by 100 MHz.
 

JasonLD

Senior member
Aug 22, 2017
I don't believe you will save 70-80 W by lowering the RX 5700 XT by 100 MHz.

Probably not by dropping the clock speed alone, but undervolting might. Perf/watt just heavily favors a higher CU count at lower clocks over a lower CU count at higher clocks. The perf/watt improvement wouldn't look as impressive if the PS5 GPU were compared against the 5700 XT.
 

AtenRa

Lifer
Feb 2, 2009
Perf/watt just heavily favors a higher CU count at lower clocks over a lower CU count at higher clocks.

That's not always the case; it's not simply black and white. You can get way higher perf/watt from small GPUs at high clocks than from bigger GPUs at lower clocks.
Just remember: measuring perf/watt without context is meaningless for gaming GPUs.

For example, in the latest TechPowerUp review, the ASUS TUF RX 5600 XT has the highest perf/watt at 4K resolution, topping even the RTX 2080 Ti.
But that is a useless metric, because the RX 5600 XT doesn't have the power for 4K gaming.
If you instead take the RX 5600 XT's performance at 1080p as the context and compare its perf/watt against the same performance at that resolution, you get a meaningful result: it shows the RX 5600 XT has ~10% higher perf/watt at the same performance as its competitor, the RTX 2060.
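A hypothetical sketch of that iso-performance comparison; the fps and power numbers below are made up for illustration and are not TechPowerUp's measurements:

```python
# Perf/watt is only meaningful between cards delivering the same performance.
cards = {
    "RX 5600 XT":  (100.0, 150.0),  # (1080p fps, board power in W) -- hypothetical
    "RTX 2060":    (100.0, 165.0),  # same performance tier at 1080p
    "RTX 2080 Ti": (170.0, 270.0),  # a different performance tier entirely
}

for name, (fps, watts) in cards.items():
    print(f"{name:12s}: {fps / watts:.3f} fps/W")

# Only the 5600 XT vs RTX 2060 comparison says something about efficiency:
# same fps, so the ~10% fps/W gap is a genuine statement. The 2080 Ti's
# number is not comparable because it delivers different performance.
```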

The perf/watt improvement wouldn't look as impressive if the PS5 GPU were compared against the 5700 XT.

Why not??? Do you know something we don't?
 

Gideon

Golden Member
Nov 27, 2007
The 5700 is a very interesting comparison. It has the same CU count and memory bus width as the PS5 at 180 W, and yet is clocked nearly 600 MHz lower (game clock, if Cerny is to be believed). The only unknown is TDP, but I really doubt it's more than the Xbox's.
 

JasonLD

Senior member
Aug 22, 2017
That's not always the case; it's not simply black and white. You can get way higher perf/watt from small GPUs at high clocks than from bigger GPUs at lower clocks.
Just remember: measuring perf/watt without context is meaningless for gaming GPUs.

For example, in the latest TechPowerUp review, the ASUS TUF RX 5600 XT has the highest perf/watt at 4K resolution, topping even the RTX 2080 Ti.
But that is a useless metric, because the RX 5600 XT doesn't have the power for 4K gaming.
If you instead take the RX 5600 XT's performance at 1080p as the context and compare its perf/watt against the same performance at that resolution, you get a meaningful result: it shows the RX 5600 XT has ~10% higher perf/watt at the same performance as its competitor, the RTX 2060.

Oh, I meant that it is easier to skew perf/watt in favor of bigger GPUs with lower clocks vs smaller GPUs with higher clocks. Perf/watt can be all over the place with gaming GPUs, since product segmentation is ultimately based on performance, not just perf/watt.

Why not??? Do you know something we don't?

I expect the PS5 and Series X will be very close in terms of TDP. The PS5 seems to have gone a bit outside its comfort zone in order to close the performance gap with the Series X.
 

raghu78

Diamond Member
Aug 23, 2012
That perf/watt figure can be very misleading, considering the Series X GPU is running right at its perf/watt sweet spot while the 5700 XT is pushed a bit out of it in order to be competitive against its Nvidia counterparts.
Who knows whether adding just 100 MHz to the Series X GPU would jack the power figure above 200 W?

Even if you take mobile GPUs like the RX 5500M (22 CUs, 1408 SPs, 1448 MHz game clock, 4.08 TF, 85 W), the Xbox Series X is 1.7x the perf/watt of mobile RDNA. That's a fair comparison, as you can consider the Series X a mobile GPU optimized for the sweet spot of the V/F curve.

Sony's Mark Cerny, when asked whether the PS5 GPU will actually be able to run at 2.23 GHz most of the time, stated that reducing the GPU frequency by 10% cuts power by around 27%.


Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
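For what it's worth, Cerny's figure is consistent with the textbook dynamic-power approximation, and the 1.7x mobile-RDNA ratio above checks out arithmetically. A sketch; the ~145 W Series X GPU power is the thread's estimate, not a spec:

```python
# P ~ C * V^2 * f; if voltage is assumed to scale roughly with frequency,
# power goes as f^3, which reproduces Cerny's "10% frequency, 27% power".
f_scale = 0.90                # 10% frequency reduction
p_scale = f_scale ** 3        # voltage assumed to track frequency
print(f"power scale: {p_scale:.3f}")  # 0.729 -> ~27% power saved

# Same back-of-envelope check on the RX 5500M comparison above,
# using the thread's numbers.
series_x_eff = 12.0 / 145.0   # TF/W, assumed Series X GPU power
rx5500m_eff  = 4.08 / 85.0    # TF/W, RX 5500M
print(f"perf/watt ratio: {series_x_eff / rx5500m_eff:.2f}x")  # ~1.72x, i.e. ~1.7x
```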
 
Reactions: Saylick

badb0y

Diamond Member
Feb 22, 2010
Why exactly? I thought a big reason for going with Samsung was ample wafer supply. I actually expect the opposite this time: if AMD is a true competitor this round, then it's in Nvidia's best interest to flood the market with cards.
Corona + high demand. I think these cards will be hard to get for at least a few months.
 

JasonLD

Senior member
Aug 22, 2017
Even if you take mobile GPUs like the RX 5500M (22 CUs, 1408 SPs, 1448 MHz game clock, 4.08 TF, 85 W), the Xbox Series X is 1.7x the perf/watt of mobile RDNA. That's a fair comparison, as you can consider the Series X a mobile GPU optimized for the sweet spot of the V/F curve.

Also, I wouldn't rely solely on TF numbers and perf/watt to estimate possible performance figures for top-of-the-line Navi. The TFLOPS numbers from RDNA2 might not translate into actual performance in traditional rasterization.
 
Reactions: NTMBK

raghu78

Diamond Member
Aug 23, 2012
Also, I wouldn't rely solely on TF numbers and perf/watt to estimate possible performance figures for top-of-the-line Navi. The TFLOPS numbers from RDNA2 might not translate into actual performance in traditional rasterization.



This is what AMD has said about RDNA2 (see the attached slide). We will see how RDNA2 vs Ampere plays out; until then, I disagree with you.
 

Attachments

  • AMD RDNA2 Efficiency improvements.png
Reactions: Saylick

DiogoDX

Senior member
Oct 11, 2012
We don't know the Xbox's power draw, and people are already extrapolating Big Navi's power from it. The only thing we know is that it has a 315 W PSU; actual power consumption could be as high as 250 W. Just because the One X peaks at 175-180 W doesn't mean the Series X will be the same.
 
Reactions: FaaR

ozzy702

Golden Member
Nov 1, 2011
Can't wait for the 3000 series to come out so I can replace the temporary AMD 580 I'm using (I sold my 2080 Ti), which keeps black-screening and freezing up my system. Hopefully there's good competition and more reasonable pricing; we'll see.

Seems like every AMD GPU launch is hype, hype, hype and then poor execution. Hopefully more $$$ is being thrown into that division and AMD regains some of its former glory in the space.
 

raghu78

Diamond Member
Aug 23, 2012
We don't know the Xbox's power draw, and people are already extrapolating Big Navi's power from it. The only thing we know is that it has a 315 W PSU; actual power consumption could be as high as 250 W. Just because the One X peaks at 175-180 W doesn't mean the Series X will be the same.




First, if you had seen the Series X teardown and done some basic analysis, you would come to a similar conclusion. Dual-board design. Dual PSU rails: +12 V at 21.5 A and +12 V at 5 A. The mainboard, with the SoC, 16 GB of GDDR6, the onboard 1 TB NVMe, the external NVMe connector and HDMI 2.1, connects to the +12 V 21.5 A rail. The daughter I/O board connects to the other, +12 V 5 A rail.

90% PSU efficiency and an assumed 90% load (worst case) gives 206 W.
8C/16T at 3.66 GHz draws 54 W, going by Renoir measurements.
4 W for the onboard 1 TB NVMe + 4 W for the external NVMe + 2 W for HDMI.

That leaves roughly 150 W for the GPU + 16 GB GDDR6 (an asymmetric setup with 10 GB at 560 GB/s and 6 GB at 336 GB/s). The GPU on its own is drawing roughly 120 W.
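The budget arithmetic above, spelled out; every input is an estimate from this post, not a measurement, and replies below dispute applying PSU efficiency to a rated DC output:

```python
rail_w     = 12.0 * 21.5          # 258 W rated on the main +12 V rail
soc_rail_w = rail_w * 0.9 * 0.9   # 90% load x 90% PSU efficiency ~ 209 W

cpu_w  = 54.0        # 8C/16T @ 3.66 GHz, per Renoir measurements
nvme_w = 4.0 + 4.0   # onboard + external NVMe
hdmi_w = 2.0

gpu_mem_w = soc_rail_w - cpu_w - nvme_w - hdmi_w
print(f"GPU + GDDR6 budget: ~{gpu_mem_w:.0f} W")  # ~145 W, i.e. the "roughly 150 W" above
```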
 

MrTeal

Diamond Member
Dec 7, 2003
First, if you had seen the Series X teardown and done some basic analysis, you would come to a similar conclusion. Dual-board design. Dual PSU rails: +12 V at 21.5 A and +12 V at 5 A. The mainboard, with the SoC, 16 GB of GDDR6, the onboard 1 TB NVMe, the external NVMe connector and HDMI 2.1, connects to the +12 V 21.5 A rail. The daughter I/O board connects to the other, +12 V 5 A rail.

90% PSU efficiency and an assumed 90% load (worst case) gives 206 W.
8C/16T at 3.66 GHz draws 54 W, going by Renoir measurements.
4 W for the onboard 1 TB NVMe + 4 W for the external NVMe + 2 W for HDMI.

That leaves roughly 150 W for the GPU + 16 GB GDDR6 (an asymmetric setup with 10 GB at 560 GB/s and 6 GB at 336 GB/s). The GPU on its own is drawing roughly 120 W.
If the output of the PSU is rated at 21.5 A on that rail, it doesn't matter what the PSU efficiency is; there's a rated 258 W available to the motherboard.
 
Reactions: Campy

Glo.

Diamond Member
Apr 25, 2015
Here we go again... After all the hype, the NV-killer guerrilla marketing, Glo's raving (I do miss RussianSensation), Nvidia is going to announce and launch their cards before AMD has even gotten a finalized design down (for all we know).

Execution machines, those guys. It will be even sadder if they really are 100% on the Samsung 8/10nm train for their consumer parts: working with a brand-new partner for large dies and still getting the stack out the door before we even hear a peep from AMD.
If you have an inferior product, it is very important to release it faster than your competition.
 
Reactions: Mopetar and raghu78

Konan

Senior member
Jul 28, 2017
If you have an inferior product, it is very important to release it faster than your competition.

First movers attract eager buyers; second movers usually launch once demand begins to grow, trying to build off the success of the first with an inferior product.
In this case, with Nvidia we're already seeing smart marketing with the '21 days for 21 years' approach. A three-week build-up to an announcement followed by a launch has all been planned out, with product segmentation over 30/60/90-day plans. No fear. The market is most anxious for a product right now, and that is when you launch. Funny enough, it usually happens around this time: back to school leads to Thanksgiving, which leads to Black Friday, and so on all the way to the holiday season. This year there are consoles to deal with, so better to get out in front of all that.

Bet you AMD wishes they were launching right now. But no: their board design was late, drivers are late, the launch is late, and there are no AIB partners. That smells of poor planning and execution. That said, the advantage they have is pricing, but the later they leave it, the more damage is done.
 
Reactions: DXDiag

raghu78

Diamond Member
Aug 23, 2012
If the output of the PSU is rated at 21.5 A on that rail, it doesn't matter what the PSU efficiency is; there's a rated 258 W available to the motherboard.

Even if we go with that 258 W number, the max power draw on the Series X SoC is expected to be around 70-75% of the maximum available power. These PSUs are never loaded up to their rated limit, to account for degradation in rated wattage over the expected lifespan; these consoles are expected to last 7-10 years.


The Xbox One X drew a max of 180 W (172 W in Gears of War) from a 245 W rated PSU. I would say 220 W is a worst-case scenario for the Series X mainboard; more realistically 200-210 W.
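A derating sketch along those lines; the utilization ceiling comes from the One X data point above, and the 85% case is an assumed pessimistic bound:

```python
one_x_util = 180.0 / 245.0    # observed peak / rated PSU ~ 73%
print(f"One X utilization: {one_x_util:.0%}")

rail_w = 12.0 * 21.5          # 258 W rated on the Series X main rail
for util in (0.70, 0.75, 0.85):
    print(f"{util:.0%} utilization -> {rail_w * util:.0f} W")
# 70-75% gives ~181-194 W; even a pessimistic 85% lands near the
# 220 W worst case suggested above.
```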
 

FaaR

Golden Member
Dec 28, 2007
Why not??? Do you know something we don't?
Sony is clocking the hell out of their GPU, right up to the bleeding edge of reliability, according to Cerny. That's going to bump power draw significantly; I'm really curious how much it'll end up being. The PS5 is a very large console from what we've seen; it could be they needed to engineer a big, sophisticated cooling system to keep the noise level down while still dissipating significant wattage. Sort of like the launch PS3 units.

90% PSU efficiency and an assumed 90% load (worst case) gives 206 W.
PSU (in)efficiency goes on top of rated output. It doesn't subtract from it...
 

Glo.

Diamond Member
Apr 25, 2015
Bet you AMD wishes they were launching right now. But no: their board design was late, drivers are late, the launch is late, and there are no AIB partners. That smells of poor planning and execution. That said, the advantage they have is pricing, but the later they leave it, the more damage is done.
Are you sure about that?

Then why is Nvidia RUSHING the launch of its next-gen GPUs? There is ONE reason why we are seeing these GPUs so early.

Ask yourself: why is Nvidia rushing this launch if AMD has inferior products? Do they have inferior products?
 

Konan

Senior member
Jul 28, 2017
Are you sure about that?

Then why is Nvidia RUSHING the launch of its next-gen GPUs? There is ONE reason why we see those GPUs so early.

Ask yourself: why is Nvidia rushing this launch if AMD has inferior products? Do they have inferior products?

I don't think they are as worried about AMD as you believe. You're making some massive assumptions to fit your narrative.

I believe they will not have inferior products. I think they will be competitive and have a great-looking stack. It will be progress over their last iteration, but it won't be perfect, and that's OK.

I do think they will have inferior software. I also think that if they could have chosen to launch with AIB partners, they would have.

Both companies are dealing with global issues across their lines of business. The sooner the better.

I don't believe for a second that the '21 days, 21st year' timing is a coincidence. That stuff is planned out months in advance (once the product is cooked).
 