Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 107

Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

MrTeal

Diamond Member
Dec 7, 2003
3,587
1,748
136
It's funny.

Everybody talks about Nvidia's price/performance/efficiency as the value offering, but nobody is willing to even discuss the obvious alternative: picking the competition this time around if they provide better value.

Maybe, after all, people really wanted AMD to be competitive only to be able to buy Nvidia GPUs cheaper?
It's the Ampere thread; this isn't the place to discuss AMD's positioning, especially given there's nothing substantial out there as to what that will be.

You're also barking up the wrong tree. My daily driver GPUs since it became a two horse race have been 9600 XT, HD2400, 6850, 7950, 290, 2x290, 4x290, 480, and 1080 Ti. There's not a lot of green there.
 

Glo.

Diamond Member
Apr 25, 2015
5,765
4,670
136
It's the Ampere thread; this isn't the place to discuss AMD's positioning, especially given there's nothing substantial out there as to what that will be.

You're also barking up the wrong tree. My daily driver GPUs since it became a two horse race have been 9600 XT, HD2400, 6850, 7950, 290, 2x290, 4x290, 480, and 1080 Ti. There's not a lot of green there.
I was not accusing you here.

I just find it weird that we are on the 107th page of the Ampere thread, with unreal hype about their GPU lineup, while we know a lot about RDNA2 from the consoles and the tech that has been unveiled, and yet only twenty-something pages have been written about it.

We know way more about RDNA2 than about Ampere, officially, and yet the one that gets talked about constantly is Nvidia.

Isn't it interesting?

So once again. Maybe people really wanted AMD to be competitive with Nvidia, only to buy Nvidia GPUs cheaper?
 
Reactions: spursindonesia

MrTeal

Diamond Member
Dec 7, 2003
3,587
1,748
136
I was not accusing you here.

I just find it weird that we are on the 107th page of the Ampere thread, with unreal hype about their GPU lineup, while we know a lot about RDNA2 from the consoles and the tech that has been unveiled, and yet only twenty-something pages have been written about it.

We know way more about RDNA2 than about Ampere, officially, and yet the one that gets talked about constantly is Nvidia.

Isn't it interesting?

So once again. Maybe people really wanted AMD to be competitive with Nvidia, only to buy Nvidia GPUs cheaper?
That's definitely something that happens, though I'm not sure it's the reason the Ampere thread is more popular. I would disagree that we know more about RDNA2 than about Ampere. While consumer Ampere might differ, a lot of information was released with the launch of A100. We have some architectural information from the consoles, but that is also not very complete, and the semi-custom parts could be quite different from what we see in discrete GPUs.

More than that though, there are actual GPUs to talk about in here. This thread has basically doubled in the last month with the steady stream of leaks on the dies and SM counts, the 3080 heatsink leak, the 12-pin cable leak, all the board shots, the massive 3090 cooler, etc. There's a lot of info out there on the Ampere side to actually talk about; there's not nearly as much about what the 6xxx series launch will look like, or even a solid date for when it'll happen.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
So once again. Maybe people really wanted AMD to be competitive with Nvidia, only to buy Nvidia GPUs cheaper?

Very likely scenario. Besides, there is not much wrong with cheaper Nvidia GPUs in the bigger picture.
But between people wanting AMD to be competitive and AMD actually being competitive there is typically a big gap.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
You missed the Xbox Series X power numbers as claimed by Microsoft: 52 CUs at 1825 MHz for 12.15 TF at the same power consumption as the Xbox One GPU. The GPU in itself is <=100W. With 16GB of GDDR6 memory it's still roughly 130-135W. RDNA2 brings the best physical design methodologies used by the Zen CPU core team. The numbers are there for everyone except the most stubborn Nvidia supporters to see.

Look at the rumoured power numbers for GA104 with 2 SKUs - RTX 3070 Ti (3072 CC) , RTX 3070 (2944 CC)

The GPU and memory will need about the same amount of power as the Xbox One X, around ~160W or so. The CPU is downclocked from the desktop part and will be capped at 35W. Microsoft is even shipping the same 310W power supply, which would be just overkill.

An SoC is more efficient than a discrete GPU card. Compare the RX 580 to the XSX and it shows where an equivalent card could stand.
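As a side note, the 12.15 TF figure quoted above follows directly from the usual CU arithmetic; a quick sanity check in Python (CU count and clock as claimed for the Series X; the 64 shaders per CU and 2 FLOPs per clock are the standard RDNA values, not something stated in this thread):

```python
# Sanity-check the quoted Xbox Series X compute figure.
# RDNA CUs have 64 stream processors, each doing 2 FLOPs/cycle (FMA).
cus = 52
shaders_per_cu = 64
flops_per_clock = 2          # a fused multiply-add counts as 2 FLOPs
clock_hz = 1.825e9           # 1825 MHz

tflops = cus * shaders_per_cu * flops_per_clock * clock_hz / 1e12
print(f"{tflops:.2f} TF")    # 12.15 TF, matching Microsoft's figure
```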
 

Konan

Senior member
Jul 28, 2017
360
291
106

That guy is just Moore's Law is Dead's lackey.

Keeps going on about the cooler, saying Nvidia is struggling with it? Hot and loud? How does he know? He doesn't know how effective it is; no one does yet.
Says the die is massive - how does he know? There isn't a comparison.
Talks about poor yields - he has no clue there either.


The narrative is damning and slating, and it hasn't changed at all while promoting the competition.
He focuses on anything possibly negative he can say about Ampere. I can see why you posted it.
 

Krteq

Senior member
May 22, 2015
993
672
136
You missed the Xbox Series X power numbers as claimed by Microsoft: 52 CUs at 1825 MHz for 12.15 TF at the same power consumption as the Xbox One GPU. The GPU in itself is <=100W. With 16GB of GDDR6 memory it's still roughly 130-135W. RDNA2 brings the best physical design methodologies used by the Zen CPU core team. The numbers are there for everyone except the most stubborn Nvidia supporters to see.

Look at the rumoured power numbers for GA104 with 2 SKUs - RTX 3070 Ti (3072 CC) , RTX 3070 (2944 CC)



250W for roughly 12.3 TF based on a 2 GHz boost clock.

The writing is on the wall. Samsung 8nm is no match for TSMC N7P. Nvidia underestimated AMD and bet on the wrong foundry and process. Ampere is going to be Nvidia's worst perf/watt generation since Fermi. The 21-day hype on the GeForce Twitter account and the videos about their cooler design are just damage control.
Hmm, just curious, was there any speculation/leak about an estimated MSRP for the 3070 and 3070 Ti?
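Taking the quoted numbers at face value, the implied perf/watt gap is easy to put side by side. Both inputs are thread estimates, not measurements: the ~130W XSX GPU-plus-memory guess from earlier in the page, and the rumoured 250W / 12.3 TF GA104 figure quoted above.

```python
# Perf/watt implied by the numbers quoted above.
# Both inputs are thread estimates, not measured figures.
xsx_tf, xsx_w = 12.15, 130       # XSX GPU + GDDR6 power estimate
ga104_tf, ga104_w = 12.3, 250    # rumoured GA104 at a 2 GHz boost

xsx_eff = xsx_tf / xsx_w * 1000        # ~93 GFLOPS/W
ga104_eff = ga104_tf / ga104_w * 1000  # ~49 GFLOPS/W
print(f"XSX:   {xsx_eff:.0f} GFLOPS/W")
print(f"GA104: {ga104_eff:.0f} GFLOPS/W")
```

Of course, paper TFLOPS are not delivered game performance, so this says nothing about how the cards will actually compare in reviews.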
 

uzzi38

Platinum Member
Oct 16, 2019
2,703
6,405
146
The GPU and memory will need the same amount of power like the XOX. That is around ~160W or so. The CPU is downclocked from the desktop and will be maxed out at 35W. Microsoft is even shipping the same 310W power supply which would be just overkill.

A SoC is more efficient than a discrete GPU card. Compare RX580 to the XSX and it shows where a equal card could be stand.
The CPU needs to sustain 3.6GHz across 16 threads even in AVX2 workloads. That's a guaranteed minimum of 55W, potentially 60W or more depending on the voltages set (remember, consoles have lower yield tolerances than desktop products; they'll overvolt the chips to increase yields).

Even without accounting for that, it's still an easy 55-60W.
 


Glo.

Diamond Member
Apr 25, 2015
5,765
4,670
136
That guy is just Moore's Law is Dead's lackey.

Keeps going on about the cooler, saying Nvidia is struggling with it? Hot and loud? How does he know? He doesn't know how effective it is; no one does yet.
Says the die is massive - how does he know? There isn't a comparison.
Talks about poor yields - he has no clue there either.


The narrative is damning and slating, and it hasn't changed at all while promoting the competition.
He focuses on anything possibly negative he can say about Ampere. I can see why you posted it.
I can see why you are butthurt.

The die is massive: it's 627 mm². It's hot because over 350W of power will not run cool. Is it loud? We don't know, but most likely yes.

Poor yields - I agree, we don't know.

It's always the same: when somebody posts information that we do not like, we immediately jump to shooting the messenger, not the message he brings.
 

Konan

Senior member
Jul 28, 2017
360
291
106
I can see why you are butthurt.

The die is massive: it's 627 mm². It's hot because over 350W of power will not run cool. Is it loud? We don't know, but most likely yes.

Poor yields - I agree, we don't know.

It's always the same: when somebody posts information that we do not like, we immediately jump to shooting the messenger, not the message he brings.

The 2080 Ti die size is 754 mm².

The 3090 is a big top-of-the-stack card. More power, smaller die, more memory, etc.
You have no idea if the cooler will do its job or what the ambient noise will be.
Not An Apple Fan has spouted BS and clickbait for months and months. He actively asks people for Patreon $$$$ so he can build a new house. GTFO.


I can see why you are butthurt.

Please enlighten me???
 
Reactions: DXDiag

Glo.

Diamond Member
Apr 25, 2015
5,765
4,670
136


The 2080 Ti die size is 754 mm².

The 3090 is a big top-of-the-stack card. More power, smaller die, more memory, etc.
You have no idea if the cooler will do its job or what the ambient noise will be.
Not An Apple Fan has spouted BS and clickbait for months and months. He actively asks people for Patreon $$$$ so he can build a new house. GTFO.
You do realize that a GPU putting out 275W needs a different cooling capacity than one putting out, for example, 350-375W?

You really, REALLY believe it won't affect the acoustics? Yes, he doesn't know if it will be loud.

But he might be right in his speculation on this front.

So why are you throwing this tantrum?
 

Hitman928

Diamond Member
Apr 15, 2012
5,622
8,847
136
That's not the Ampere cooler airflow, unless every leaked picture has had the rear fan reversed.

That's what I thought too, but based upon that video and looking at the leaked pictures again, the video may be correct and the second fan is actually pulling air up towards the top of the case. This would make way more sense than pushing it down. Looking at the pictures, I just assumed they used the same fan/hookup on both sides, which would mean the second fan was pushing air through the heatsink; but they are probably using a different fan, or reversed the polarity, meaning it's spinning in the opposite direction from what you would expect based on the first fan and is actually pulling air through the heatsink. I'm usually more an advocate of pushing air through tightly spaced fins than pulling, but we'll see what the end result is in the reviews.
 

Konan

Senior member
Jul 28, 2017
360
291
106
You do realize that a GPU putting out 275W needs a different cooling capacity than one putting out, for example, 350-375W?

You really, REALLY believe it won't affect the acoustics? Yes, he doesn't know if it will be loud.

But he might be right in his speculation on this front.
You do realize GPUs aren't operating at maximum power draw 24/7 in normal use?
Of course fans spin up and acoustics rise when playing games, for example. My point is that neither you, nor me, nor that nob knows what either the performance or the acoustics will be.

So why are you throwing this tantrum?

There is no tantrum. I’m just pragmatic.
Why are you trying to bait again?
 
Reactions: DXDiag

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
The GPU and memory will need about the same amount of power as the Xbox One X, around ~160W or so. The CPU is downclocked from the desktop part and will be capped at 35W. Microsoft is even shipping the same 310W power supply, which would be just overkill.

An SoC is more efficient than a discrete GPU card. Compare the RX 580 to the XSX and it shows where an equivalent card could stand.

The Xbox One X GPU with 12GB of GDDR5 (6.8 Gbps) drew roughly 135W. The 8 Jaguar cores at 2.3 GHz were drawing roughly 25W. The entire system power draw was 172W running Gears of War 4, as measured by AnandTech. BTW, the power supply was 245W.

"Clearly gaming on older Xbox One games is not much of a chore for the Xbox One X, since the power draw is only about 50 W over idle. But, when gaming with an Xbox One X Enhanced title, such as Gears of War 4, the power draw jumps significantly to 172 W as the peak observed. This is quite a jump over the original console, and makes the cooling system, which is barely audible even under these loads, even more impressive. Compared to a high-end gaming PC though, the power draw is quite a bit less. "

"In the case of the Xbox One X, Microsoft has outfitted it with a 245-Watt universal voltage PSU, and the company claims it is the most efficient ever put into an Xbox."

Renoir at 3.66 GHz draws 55W. The Xbox Series X is likely to draw around 200W for the entire system. It's obvious for anyone to see that RDNA2 is vastly more efficient than Ampere.
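Putting the thread's component estimates together, the ~200W system figure is easy to reproduce. A rough sketch: the CPU and GPU numbers are the forum estimates above, while the GDDR6 and miscellaneous allowances are my own guesses, not official Microsoft figures.

```python
# Rough Xbox Series X system power budget, built from the component
# estimates discussed in this thread (forum guesses, not official data).
budget_w = {
    "CPU (8-core Zen 2 @ ~3.6 GHz, Renoir-like)": 55,
    "GPU (52 CU RDNA2 @ 1825 MHz)": 100,
    "16GB GDDR6": 30,               # assumed share of the 130-135W GPU+memory estimate
    "SSD, fans, board, misc.": 20,  # my own rough allowance
}
total_w = sum(budget_w.values())
print(f"Estimated system draw: ~{total_w} W")  # ~205 W
```

That lands right around the ~200W figure, and comfortably inside the 310W power supply with margin to spare.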
 

Glo.

Diamond Member
Apr 25, 2015
5,765
4,670
136
You do realize GPUs aren't operating at maximum power draw 24/7 in normal use?
Of course fans spin up and acoustics rise when playing games, for example. My point is that neither you, nor me, nor that nob knows what either the performance or the acoustics will be.
If you had a problem with him saying that it will be hot and loud, as a 350-375W chip under load would be, why are you suddenly changing the optics on this?

There is no tantrum. I’m just pragmatic.
Why are you trying to bait again?
Let me quote you, yourself, just a few posts up.
The narrative is damning and slating, and it hasn't changed at all while promoting the competition.
He focuses on anything possibly negative he can say about Ampere. I can see why you posted it.
Now let me ask you this.

Who baited first?
 
Reactions: spursindonesia

xpea

Senior member
Feb 14, 2014
449
150
116
So once again. Maybe people really wanted AMD to be competitive with Nvidia, only to buy Nvidia GPUs cheaper?
Why not look at the cold reality?
Last quarter's discrete GPU market research from JPR shows Nvidia owning 80% market share, with only 20% going to AMD. A 1-to-4 ratio. We are close to seeing AMD become totally irrelevant for gamers...
Ask yourself why, and then you will lower your expectations for RDNA2.

PS hint: it's not only about hardware
 

Glo.

Diamond Member
Apr 25, 2015
5,765
4,670
136
Why not look at the cold reality?
Last quarter's discrete GPU market research from JPR shows Nvidia owning 80% market share, with only 20% going to AMD. A 1-to-4 ratio. We are close to seeing AMD become totally irrelevant for gamers...
Ask yourself why, and then you will lower your expectations for RDNA2.

PS hint: it's not only about hardware
Obviously.

It's only the users, themselves.
 
Reactions: spursindonesia

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
That's what I thought too, but based upon that video and looking at the leaked pictures again, the video may be correct and the second fan is actually pulling air up towards the top of the case. This would make way more sense than pushing it down. Looking at the pictures, I just assumed they used the same fan/hookup on both sides, which would mean the second fan was pushing air through the heatsink; but they are probably using a different fan, or reversed the polarity, meaning it's spinning in the opposite direction from what you would expect based on the first fan and is actually pulling air through the heatsink. I'm usually more an advocate of pushing air through tightly spaced fins than pulling, but we'll see what the end result is in the reviews.
My reason was the pictures of the fan on the back. Look again at the angles of the blades. Blades are basically airfoils angled for aerodynamic efficiency. These push air into the card, not pull it out. I've attached a picture.

As you can see, the fan will spin in a clockwise direction based on the airfoil shape, thus the airflow will be into the card, same as the other fan.

Edit: If the fan was spinning anti-clockwise to pull air out and still looked like this, I would expect roughly a 50%+ loss of efficiency and airflow.

 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
5,622
8,847
136
My reason was the pictures of the fan on the back. Look again at the angles of the blades. Blades are basically airfoils angled for aerodynamic efficiency. These push air into the card, not pull it out. I've attached a picture.

As you can see, the fan will spin in a clockwise direction based on the airfoil shape, thus the airflow will be into the card, same as the other fan.

Edit: If the fan was spinning anti-clockwise to pull air out and still looked like this, I would expect roughly a 50%+ loss of efficiency and airflow.

View attachment 28659

It's hard to judge completely without seeing the "front" side of that fan, but it does seem to be a non-optimal design for pull, especially with fins spaced so closely together. It's also possible that this is just a prototype and the end product will have a different fan.
 