Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Sure, if you need to bring a negative example, which by the way is highly zoomed and totally ignores the rest of the comparison video, where you see greater detail with DLSS 2.0 and minimal ringing/sharpening artifacts, I would call this a very selective argument.
On the bright side, not many of the GPU buying crowd will have this selective view.

You just posted about how there is no loss in quality. Somebody posts loss of quality, and you say people should ignore that? That "super zoomed in" shot is actually not that zoomed in. The text on the wall is huge and the issue can be seen when you are standing quite a ways from it (in game). Text issues are all over the place when DLSS is on.

EDIT: Fixed typo, forgot to add the word 'not' before 'that zoomed in'.
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
You getting your F5 trigger ready for tomorrow, friend?

Multiple things about the launch are discouraging. I'm curious why these cards require such an enormous TDP. Are they pushing them that hard just to hit the performance they need against the competition? I have concerns about the Samsung process being inefficient. Based on some leaked slides (rumors), the rasterization improvement isn't very impressive, especially for the 3080. The 3090 only looks impressive with RTX on, and I have no desire to play Minecraft all day to justify an $1,800 marked-up card, lol. So no, my F5 finger is hibernating until the competition shows their cards or until Nvidia offers a real upgrade in the same price bracket as the 1080 Ti. They are clearly trying to force people into a higher price bracket to get that upgrade. I'm just not going to play that game. If that's how it will be, then for the first, well, second time (Turing), I'm not interested.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
So even if the reconstructed image has more subpixel details than the native image, there would be an IQ degradation based on the argument that the developer did not want you to see these subpixel details? OK... a very strange interpretation of IQ... but whatever floats your boat...

Besides, it is not an "interpretation that nVidia gives" but an interpretation by an algorithm that is trained on very high-resolution, detailed images; it is not that the algorithm randomly adds details, it draws on its knowledge of very high-resolution content.
The algorithm 'knows' better? More subpixel details? It's all imaginary. I suppose that artistic forgeries are better than the original also? Tell that to the artist.

I might get banned if I express what I really think.

Many years ago, as a hobby, I was exploring a 'sort of compression' that allowed the use of AI to produce images with very little data. Sort of like telling someone to draw an image from a detailed description: a white plane with 2 engines on the wings, 40 windows per side, a T-tail, etc., etc. Each attempt produced a different image. It was roughly based on how human storytelling works and what we imagine internally.
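(A toy, runnable sketch of that "compression by description" idea, purely illustrative: the "description" here is deliberately trivial, just image size and mean colour rather than a real caption, and a real version would swap in a captioning model and a text-to-image model.)

```python
# Toy sketch of "compression by description": store a tiny description instead
# of pixels, then have a generator "draw" something that matches it.
import numpy as np

def compress(image: np.ndarray) -> dict:
    """Encode an image as a few bytes of description rather than its pixels."""
    h, w, _ = image.shape
    return {"size": (h, w), "mean_rgb": image.reshape(-1, 3).mean(axis=0).tolist()}

def decompress(desc: dict) -> np.ndarray:
    """'Draw' an image matching the description; the detail is invented."""
    h, w = desc["size"]
    return np.tile(np.array(desc["mean_rgb"]).astype(np.uint8), (h, w, 1))

original = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
reconstruction = decompress(compress(original))  # plausible, not pixel-identical
```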
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
You just posted about how there is no loss in quality. Somebody posts loss of quality, and you say people should ignore that? That "super zoomed in" shot is actually that zoomed in. The text on the wall is huge and the issue can be seen when you are standing quite a ways from it (in game). Text issues are all over the place when DLSS is on.
You did pick 1080p to 4K, which is a huge upscale with a correspondingly huge performance uplift.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
View attachment 28994
View attachment 28992

DLSS 2.0 is full of high-contrast sharpening. It might give the appearance, to eyes that have not spent hours working in Photoshop, that it is sharper than the original, but... The best way I can explain it is if you took a photo and posted it on Instagram, but before you did, you went to the settings menu and cranked up the HDR and the contrast. There is also a pretty weird thing around edges that become jagged; it's a type of AA that makes it look like the edges were stitched together with pluses (+x+x+x+x+).
For me, it's not worth the trade-off, but to each their own, I guess!

Yeah, I see the same. People who do not work with Photoshop, etc. can't see the obvious problems. The sharpening is on edges, which makes it more appealing to the eye (USM in PS). These so-called reviewers are not proficient with image manipulation and cannot see this.
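(For reference, a minimal unsharp-mask sketch of the kind of edge-contrast boost being described above, using numpy/scipy; this only illustrates "USM"-style sharpening, not NVIDIA's actual DLSS pipeline.)

```python
# Minimal unsharp-mask ("USM") sketch: sharpen by adding back the difference
# between the image and a blurred copy. Pushing `amount` too far produces the
# halo/ringing on high-contrast edges (e.g. text) discussed above.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img: np.ndarray, radius: float = 2.0, amount: float = 1.0) -> np.ndarray:
    """img is an (H, W, 3) uint8 array; returns the sharpened uint8 image."""
    img_f = img.astype(np.float32)
    # Blur spatially only, not across the colour channel axis.
    blurred = gaussian_filter(img_f, sigma=(radius, radius, 0))
    sharpened = img_f + amount * (img_f - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)
```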
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,877
3,228
126
Gainward released leaks...

RTX 3090 Phoenix
  • CUDA cores: 5248
  • Clock speed: 1695 MHz (Boost)
  • Memory: 24GB GDDR6X
  • Memory clock: 9750 MHz
  • Bandwidth: 936 GB/s
  • PCIe: Gen 4
  • Max power consumption: 350W
  • Output: HDMI 2.1, DisplayPort 1.4a
RTX 3080 Phoenix
  • CUDA cores: 4352
  • Clock speed: 1710 MHz (boost)
  • Memory: 10GB GDDR6X
  • Memory clock: 9500 MHz
  • Bandwidth: 760 GB/s
  • PCIe: Gen 4
  • Max power consumption: 320W
  • Output: HDMI 2.1, DisplayPort 1.4a

I'm seriously LMFAOing at that 3090 GPU mem. 24GB on a single card, with 350W...
I'm also wondering when we'll start seeing more DP 1.4a monitors... we are given so many DP ports, yet almost none of them can do HDR over DP unless it's 1.4.
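(As a sanity check, the usual bandwidth arithmetic does match those leaked figures; note the 384-bit/320-bit bus widths are not in the leak and are assumed here by back-solving from the bandwidth numbers.)

```python
# Bandwidth = per-pin data rate (Gb/s) * bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# GDDR6X at the quoted 9750 / 9500 MHz corresponds to 19.5 / 19 Gbps per pin.
print(bandwidth_gb_s(19.5, 384))  # 936.0 GB/s -> matches the 3090 figure
print(bandwidth_gb_s(19.0, 320))  # 760.0 GB/s -> matches the 3080 figure
```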
 

Saylick

Diamond Member
Sep 10, 2012
3,385
7,151
136
I'm still surprised at the mere 30W TDP difference between the two SKUs even though they are clocked about the same, yet the 3090 has 20% more cores and more memory chips. I feel like the real TDP of the 3090 has to be closer to 375W...
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,877
3,228
126
I'm still surprised at the mere 30W TDP difference between the two SKUs even though they are clocked about the same, yet the 3090 has 20% more cores and more memory chips. I feel like the real TDP of the 3090 has to be closer to 375W...

Yeah, that's why I was saying LMFAO... I also do not believe the 3090 can be 350W with more than double the RAM and almost 1,000 more CUDA cores.
 

CakeMonster

Golden Member
Nov 22, 2012
1,428
535
136
HDMI 2.1 with all its features had better be supported on all upcoming monitors, because without DP 2.0 support, HDMI will have to do for the long-awaited better monitor specs. I'd love to see above 120Hz on 4K-8K monitors soon. The DP 2.0 spec has been finished for a long while; I wonder why they couldn't put it on this gen...
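(Rough numbers behind that point, for uncompressed RGB and ignoring blanking overhead; DSC changes the math for DP 1.4.)

```python
# Raw pixel data rate for a given mode (10-bit RGB = 30 bits per pixel).
def video_gbps(width: int, height: int, refresh_hz: int, bpp: int = 30) -> float:
    return width * height * refresh_hz * bpp / 1e9

print(video_gbps(3840, 2160, 144))  # ~35.8 Gbps for 4K 144Hz 10-bit
# Approximate effective link rates: DP 1.4a (HBR3) ~25.9 Gbps,
# HDMI 2.1 (FRL) ~42.7 Gbps, DP 2.0 (UHBR20) ~77.4 Gbps --
# so uncompressed 4K above 120Hz needs HDMI 2.1 or DP 2.0.
```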
 

Saylick

Diamond Member
Sep 10, 2012
3,385
7,151
136
Because there might be AIB/OEM models with trash tier bin silicon.
Seems like a whole lotta variability to me if there's trash-tier silicon that barely musters the advertised boost of ~1700 MHz, and then there's the possibility that you get silicon that boosts above 2000 MHz out of the box without an OC. I now see why AMD started to advertise a Game Clock, as it gives the consumer a clearer idea of what clocks to roughly expect in typical use.
 
Reactions: Gideon

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Gainward released leaks...

RTX 3090 Phoenix
  • CUDA cores: 5248
  • Clock speed: 1695 MHz (Boost)
  • Memory: 24GB GDDR6X
  • Memory clock: 9750 MHz
  • Bandwidth: 936 GB/s
  • PCIe: Gen 4
  • Max power consumption: 350W
  • Output: HDMI 2.1, DisplayPort 1.4a
RTX 3080 Phoenix
  • CUDA cores: 4352
  • Clock speed: 1710 MHz (boost)
  • Memory: 10GB GDDR6X
  • Memory clock: 9500 MHz
  • Bandwidth: 760 GB/s
  • PCIe: Gen 4
  • Max power consumption: 320W
  • Output: HDMI 2.1, DisplayPort 1.4a

I'm seriously LMFAOing at that 3090 GPU mem. 24GB on a single card, with 350W...
I'm also wondering when we'll start seeing more DP 1.4a monitors... we are given so many DP ports, yet almost none of them can do HDR over DP unless it's 1.4.

Then the RTX 3070 has 8GB.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I'm still surprised at the mere 30W TDP difference between the two SKUs even though they are clocked about the same, yet the 3090 has 20% more cores and more memory chips. I feel like the real TDP of the 3090 has to be closer to 375W...

nVidia is known for not listing total power consumption correctly; there is a difference between TDP and TBP. nVidia typically quotes TDP, which is the power consumed by the GPU itself, excluding memory and board losses. The 2080 Ti, for instance, consistently uses 30-40W more than nVidia claims while gaming: TPU measured the Founders Edition (250W TDP) at 273W average in gaming and 289W at peak.

If nVidia is claiming 350W TDP for the 3090, people should expect it to use close to 400W.
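(A back-of-the-envelope extrapolation of those TPU numbers, assuming the 3090 overshoots its 350W rating by roughly the same proportion the 2080 Ti FE overshot its 250W rating.)

```python
# Extrapolate the 2080 Ti FE overshoot (TPU's measurements quoted above)
# onto the 3090's rated 350W.
rated_2080ti, avg_2080ti, peak_2080ti = 250, 273, 289

avg_ratio = avg_2080ti / rated_2080ti    # ~1.09
peak_ratio = peak_2080ti / rated_2080ti  # ~1.16

rated_3090 = 350
print(round(rated_3090 * avg_ratio))   # ~382 W average gaming
print(round(rated_3090 * peak_ratio))  # ~405 W peak gaming
```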
 
Last edited:

DXDiag

Member
Nov 12, 2017
165
121
116
You just posted about how there is no loss in quality.
DLSS 2.0 is full of high-contrast sharpening. It might give the appearance, to eyes that have not spent hours working in Photoshop, that it is sharper than the original, but... The best way I can explain it is if you took a photo and posted it on Instagram, but before you did, you went to the settings menu and cranked up the HDR and the contrast. There is also a pretty weird thing around edges that become jagged; it's a type of AA that makes it look like the edges were stitched together with pluses (+x+x+x+x+).

I can post hundreds of pictures where TAA exhibits massive loss of detail compared to DLSS 2. Simply put, both make you lose a bit of detail; it's just that TAA makes you lose a lot more, especially in motion, whereas most of the comparisons posted focus on still images.
 

CP5670

Diamond Member
Jun 24, 2004
5,526
604
126
Im seriously LMAFOing at that 3090 GPU Mem. 24GB on a single card, with 350W....
Im also wondering when we'll start seeing more DP1.4a monitors.... we are given so many DP ports, yet almost none of them can do HDR over DP unless its 1.4.

On the other hand, 10GB for the 3080 is a little disappointing. That would actually be a downgrade for 1080 Ti owners (11GB down to 10GB), even though it's probably not noticeable in practice.
 

Gideon

Golden Member
Nov 27, 2007
1,712
3,931
136
On the other hand, 10GB for the 3080 is a little disappointing. That would actually be a downgrade for 1080 Ti owners (11GB down to 10GB), even though it's probably not noticeable in practice.
Rumor is there will be 20GB cards, just not at launch. I definitely wouldn't want to go lower than 12GB (16GB if possible), with the new consoles and their ultra-fast SSD streaming (not yet available on PC) coming up. Hopefully the 3070 also gets a 16GB option, or Big Navi, if it's competitive.
 
Reactions: psolord

kurosaki

Senior member
Feb 7, 2019
258
250
86
TAA is a lossy technique with a typical "smearing" effect, especially around edges and high-contrast areas. DLSS tries to take a low-res picture, upscale it, and bump up the local contrast in the same areas. It's like gaming through a bad Instagram filter. Some may not notice; good for you. Others can't unsee what they have seen, and for them image-distorting techniques like these won't be turned on.
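(A minimal sketch of the exponential history blend at the heart of most TAA implementations, not any specific engine's code; it shows why imperfectly reprojected history smears moving edges.)

```python
# Core of a typical TAA resolve: blend a little of the current jittered frame
# into the accumulated (reprojected) history. With a small alpha, most of the
# output is history, which is where the smearing/ghosting on motion comes from
# whenever that history can't be reprojected perfectly.
import numpy as np

def taa_resolve(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    return (1.0 - alpha) * history + alpha * current

# Toy example: a bright edge that appeared this frame only reaches ~10% of its
# final intensity after one resolve and needs several frames to fully sharpen.
history = np.zeros((4, 4), dtype=np.float32)
current = np.ones((4, 4), dtype=np.float32)
print(taa_resolve(history, current))  # 0.1 everywhere after one frame
```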
 