Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation

Page 62

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
The 4080, in both iterations at their prices, looks abysmal for performance. They delivered huge on the 4090 and priced accordingly. The 4080 12GB is for sure going to be slower than the 3090 Ti/3090 in quite a few games at 4K. Not sure what they're thinking there price-wise. I get the 4090, and will probably wind up getting one even at that price once they are in stock and not scalped, it is so fast. But the 4080 looks awful at those prices.

4090 also has quite a few disabled units, leaving lots of room for a 4090ti. Will have to see if they are putting aside chips with further disabled units for a 4080ti of some sort. This launch feels to me like one of those where nvidia ends up adjusting prices downwards in a few months. If those scalping POS end up holding the bag on 4090s, will be a good indicator that prices may end up going down. 4090 scalpers are testing the waters, there is no mining market this time around. How many gamers are up for spending $3K on a scalped gaming GPU, I don't know. Hopefully they get screwed over hard and there is no market for those rip off scalped prices.
We should consider this as an experiment into just how powerful the halo effect is. We'll know for sure this generation, I expect.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
This launch feels to me like one of those where nvidia ends up adjusting prices downwards in a few months. If those scalping POS end up holding the bag on 4090s, will be a good indicator that prices may end up going down. 4090 scalpers are testing the waters, there is no mining market this time around. How many gamers are up for spending $3K on a scalped gaming GPU, I don't know. Hopefully they get screwed over hard and there is no market for those rip off scalped prices.

I don't expect downward pressure on the 4090, but more on the 4080 12GB/16GB.
 
Reactions: Tlh97 and gdansk

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
I don't expect downward pressure on the 4090, but more on the 4080 12GB/16GB.

I agree, the 4090 will stay where it is, the performance is there in spades. It's the 4080 12/16 that look off price-wise. Where would a 4080ti slot in? $1400? That would make the 4080 16 look even worse if a hypothetical 4080ti is using a further cut-down 4090 die.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It's the 4080 12/16 that look off price-wise. Where would a 4080ti slot in? $1400? That would make the 4080 16 look even worse if a hypothetical 4080ti is using a further cut-down 4090 die.

The 4080 12GB looks off because it should be the 4070.

Not sure there ever will be a 4080Ti. Not much of a purpose for it. Unless nVidia just ends up with lots of poor 4090 dies, which seems unlikely.
 
Reactions: Mopetar

StinkyPinky

Diamond Member
Jul 6, 2002
6,829
875
126
I actually think the 4090 is a reasonable deal. It is extremely good performance, a huge leap from the 3090 and it's their premium flagship.

For me it's the lower tiers where the pricing sucks.
 
Reactions: Kaluan and DooKey

Saylick

Diamond Member
Sep 10, 2012
3,385
7,151
136
I actually think the 4090 is a reasonable deal. It is extremely good performance, a huge leap from the 3090 and it's their premium flagship.

For me it's the lower tiers where the pricing sucks.
4090 should be $1000 max. Out with the old, in with the new. That's how it used to be. But now Nvidia thinks that they deserve more, even after raking in all the cryptocash.
The 4090 looks like a good deal partly because the 3090 was such a bad deal. In hindsight it was genius marketing, because the real volume seller last gen for Nvidia was the 3080 at $699, which they were able to do because they got a very good price on Samsung wafers. The 3090 in comparison was not even 20% faster for more than twice the price: $1499 for under 20% more performance, 24 GB of RAM, and a 9001% bigger e-peen. It never should have been that expensive to begin with, but the 3080 was compromised just enough with its 10 GB of RAM that, for those who wanted the best, there was only one option: ponying up $1500.

In reality, the 3090 should have been a $1200 card tops. Basically in-line with the price of the previous generation flagship, the 2080 Ti, with a $100 premium added on.

But now that the market is used to the price of an xx90-class card being $1500, the 4090 all of a sudden looks like a good deal because it offers so much more performance at only, yet again, a $100 premium. Had the 3090 launched at $1199, the $1599 launch price of the 4090 would've looked like an even bigger jump.
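
Just to put rough numbers on that argument: the performance multipliers below are my own ballpark guesses at relative 4K performance (prices are launch MSRPs), so treat it as a sketch rather than measured data.

Code:
# Rough perf-per-dollar sanity check at launch MSRPs.
# The performance multipliers are ballpark assumptions, not benchmark results.
cards = {
    "3080 10GB": (699,  1.00),   # baseline
    "3090":      (1499, 1.15),   # "not even 20% faster" than the 3080
    "4090":      (1599, 2.00),   # very roughly 2x a 3080 at 4K (assumption)
}

base_price, base_perf = cards["3080 10GB"]
for name, (price, perf) in cards.items():
    rel_value = (perf / price) / (base_perf / base_price)
    print(f"{name:10s} ${price:5d}  {perf:.2f}x perf  {rel_value:.2f}x the 3080's perf/$")

By that yardstick the 3090 was roughly half the value of the 3080, and the 4090 only looks good because it gets compared to the 3090 rather than the 3080.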
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
The 4080 12GB looks off because it should be the 4070.

Not sure there ever will be a 4080Ti. Not much of a purpose for it. Unless nVidia just ends up with lots of poor 4090 dies, which seems unlikely.

I think they almost have to name something a 4080 Ti. They have so many chips at the 4080 level and above, and they will be doing multiple bins of each chip. I don't think there was ever a case where NVidia didn't do multiple bins/configs of a GPU die, usually 3 or more. But they really did a lot of dies this time, which makes it harder to find names/slots for them all.

The only question is whether the 4080 Ti name gets attached to a bottom-bin AD102 or a maxed-out AD103.

So for the AD102, expect a 4090 Ti as the max-enabled part, and I'd expect some kind of cut-down version; maybe they call it a 4090 20GB with slightly fewer units, maybe even a 12GB model.

For the AD103, possibly the maxed-out chip will be the 4080 Ti. Still lots of room between this and the 4090s. Maybe just two configs, because there isn't much room below, and they already used the 4080 name below.

For the AD104, the name is more of a problem because this should have been at most a 4070 Ti. So they could definitely slot a slightly cut-down version in as the 4070 Ti.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
In reality, the 3090 should have been a $1200 card tops. Basically in-line with the price of the previous generation flagship, the 2080 Ti, with a $100 premium added on.

Ignore the naming.

The 3090 occupies the slot of the former Titan RTX which was the 24GB version of the 2080 Ti (same chip), and that cost $2500, making the 3090 something of a "bargain" in comparison as well.

For that end of the market, both the 3090 and 4090 are not bad relative to what came before; these are all the maxed-out 102 chips with 24GB, and each one was a large improvement in perf/$ over the previous generation.

It's always the lower end buyers who get questionable perf/$ increases.
 

Saylick

Diamond Member
Sep 10, 2012
3,385
7,151
136
Computerbase re-ran some benchmarks using the latest GeForce drivers, which nets Ampere some decent gains in CPU-limited situations (e.g. CPU-bottlenecked games and games in general at lower resolutions). Note that the 4090 does not benefit because it was tested on a non-public driver which looks to have the optimizations in place. So now reviewers either re-test their suite with the latest driver, or, what's more likely, the launch reviews largely stay the same and the 4090 looks better in relation to the 3090 and 3090 Ti than it really is. Can't say I'm surprised that Nvidia did this... It's underhanded, but it has always been about the optics for them.
 

Timorous

Golden Member
Oct 27, 2008
1,727
3,152
136
Ignore the naming.

The 3090 occupies the slot of the former Titan RTX which was the 24GB version of the 2080 Ti (same chip), and that cost $2500, making the 3090 something of a "bargain" in comparison as well.

For that end of the market, both the 3090 and 4090 are not bad relative to what came before; these are all the maxed-out 102 chips with 24GB, and each one was a large improvement in perf/$ over the previous generation.

It's always the lower end buyers who get questionable perf/$ increases.

It did that without having Titan drivers, so it was a lot slower in certain workloads.
 
Reactions: Tlh97

Mopetar

Diamond Member
Jan 31, 2011
8,005
6,449
136
HWUB analyzes DLSS 3 frame interpolation. He has a similarly skeptical view to my own.

Tim notes: between the visual artifacts and increased latency, it's really hard to come up with a case where you would use this.
You need to run at a very high base framerate (120 FPS) before you really remove much of DLSS 3's penalties, at which point you need it less, and if your frame rate exceeds your monitor's refresh rate, you will get tearing.

He also says we really shouldn't consider this "real performance".

IMO, it's a mediocre "solution" in search of a problem:

Honestly this makes DLSS3 look absolutely pointless. Unlike DLSS2 which was useful for older mid-range cards to help give a boost and keep them relevant for longer, this is only really useful for the top-end card on today's games.

The worse your starting FPS is, the less useful DLSS3 is so it's not going to look good on a 4060 or with games released 5 years from now that are more taxing and have lower frame rates. Also if you already have a really good frame rate where you won't notice the fake frames and any visual artifacts, you don't really get much more out of the extra frames it adds.
 

Saylick

Diamond Member
Sep 10, 2012
3,385
7,151
136
Honestly this makes DLSS3 look absolutely pointless. Unlike DLSS2 which was useful for older mid-range cards to help give a boost and keep them relevant for longer, this is only really useful for the top-end card on today's games.

The worse your starting FPS is, the less useful DLSS3 is so it's not going to look good on a 4060 or with games released 5 years from now that are more taxing and have lower frame rates. Also if you already have a really good frame rate where you won't notice the fake frames and any visual artifacts, you don't really get much more out of the extra frames it adds.
Yeah, the "sweet spot" for DLSS 3 Frame Generation is the case where you have a 4K120 Hz monitor but can only get around 60 FPS with DLSS 2 and turning on Frame Generation gets you to your monitor's max refresh rate but not above it where tearing happens, or worse you turn on V-sync and your latency takes a hit. This "sweet spot" isn't very large, predictably.
 

Timorous

Golden Member
Oct 27, 2008
1,727
3,152
136
Yeah, the "sweet spot" for DLSS 3 Frame Generation is the case where you have a 4K120 Hz monitor but can only get around 60 FPS with DLSS 2 and turning on Frame Generation gets you to your monitor's max refresh rate but not above it where tearing happens, or worse you turn on V-sync and your latency takes a hit. This "sweet spot" isn't very large, predictably.

That is not what Tim said. Tim said it was if you have a 240+ Hz monitor and can already hit 120+ fps; then turning it on gives you smoother visuals and the latency cost is not terrible.

I like how Tim pointed out that when we say card A performs better than card B, referring to FPS, we are also implicitly stating that latency for card A is better. DLSS 3 does not give that dual benefit, so saying 240 fps with DLSS 3 performs better than 120 fps with DLSS 2 is not entirely accurate, since they give you different things.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
Honestly this makes DLSS3 look absolutely pointless. Unlike DLSS2 which was useful for older mid-range cards to help give a boost and keep them relevant for longer, this is only really useful for the top-end card on today's games.

I agree, there is a very narrow slice of usability gained. In no way should this be considered a real frame rate. You need many *'s for all the limitations.

Yeah, the "sweet spot" for DLSS 3 Frame Generation is the case where you have a 4K120 Hz monitor but can only get around 60 FPS with DLSS 2 and turning on Frame Generation gets you to your monitor's max refresh rate but not above it where tearing happens, or worse you turn on V-sync and your latency takes a hit. This "sweet spot" isn't very large, predictably.

According to HWUB, the sweet spot was around a 120 FPS base frame rate and a 240+ Hz monitor, and even then not the best idea for competitive gaming.

Around a 60 FPS or lower base frame rate, the artifacts and latency were more noticeable.

Of course, if you already have a 120 FPS base frame rate, how much is some extra visual smoothing going to be needed?
 
Reactions: Tlh97 and Mopetar

Saylick

Diamond Member
Sep 10, 2012
3,385
7,151
136
According to HWUB, the sweet spot was around a 120 FPS base frame rate and a 240+ Hz monitor, and even then not the best idea for competitive gaming.

Around a 60 FPS or lower base frame rate, the artifacts and latency were more noticeable.

Of course, if you already have a 120 FPS base frame rate, how much is some extra visual smoothing going to be needed?
That's why I'm saying the sweet spot is 60 fps, because that's my personal lower bound of fps needed to get acceptable input lag. If the sweet spot is a base of 120 fps, is it really a sweet spot if the extra visual smoothing afforded by Frame Generation is not really needed?
 

Mopetar

Diamond Member
Jan 31, 2011
8,005
6,449
136
Why not just turn down the settings at that point? What use is playing on Ultra with DLSS 3 just to get better frame rates, when half your frames are potentially going to look worse than even Medium, instead of just turning down the settings to get the better FPS?

Some of the artifacting isn't that noticeable if it's not what your eyes are focused on, but the flickering text is beyond annoying, so I hope they can get that fixed.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
That's why I'm saying the sweet spot is 60 fps, because that's my personal lower bound of fps needed to get acceptable input lag. If the sweet spot is a base of 120 fps, is it really a sweet spot if the extra visual smoothing afforded by Frame Generation is not really needed?

Except at 60 FPS, when you turn on frame interpolation, you are getting lag more like 40 FPS...

Which is one of the reasons he states that you need a higher base frame rate.
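
Back-of-the-envelope on that, with the caveat that the model is my own guess: I'm assuming interpolating between frames N and N+1 means the previous rendered frame gets held back roughly half an extra base frame before it can be shown (Nvidia hasn't published the exact pipeline, and the forced Reflex claws some of it back).

Code:
# Rough latency model for frame generation. Assumption: presenting an
# interpolated frame means holding the previous rendered frame for about
# half an extra base frame (the 0.5 is a guess, not a measured number).
def felt_fps(base_fps, extra_frames=0.5):
    """Frame rate whose normal latency roughly matches latency with FG on."""
    base_ms = 1000.0 / base_fps
    return 1000.0 / (base_ms * (1 + extra_frames))

for base in (30, 60, 120):
    print(f"base {base} fps -> latency feels more like ~{felt_fps(base):.0f} fps")

# base 30 fps -> latency feels more like ~20 fps
# base 60 fps -> latency feels more like ~40 fps
# base 120 fps -> latency feels more like ~80 fps

Which is also why the hit matters less the higher the base frame rate already is.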
 

MrTeal

Diamond Member
Dec 7, 2003
3,584
1,743
136
I agree, there is a very narrow slice of usability gained. In no way should this be considered a real frame rate. You need many *'s for all the limitations.



According to HWUB, the sweet spot was around a 120 FPS base frame rate and a 240+ Hz monitor, and even then not the best idea for competitive gaming.

Around a 60 FPS or lower base frame rate, the artifacts and latency were more noticeable.

Of course, if you already have a 120 FPS base frame rate, how much is some extra visual smoothing going to be needed?
Looks like Tim and Steve are going back on the "No Review Samples" list.
 

coercitiv

Diamond Member
Jan 24, 2014
6,393
12,825
136
Yeah, the "sweet spot" for DLSS 3 Frame Generation is the case where you have a 4K120 Hz monitor but can only get around 60 FPS with DLSS 2 and turning on Frame Generation gets you to your monitor's max refresh rate but not above it where tearing happens
That's why I'm saying the sweet spot is 60 fps because that's my personal lower bound of fps needed to get an acceptable input lag.
If you're running 60 FPS w/ DLSS, enabling FG will bring you up to ~95+ FPS with the latency of a game running around 45 FPS. You better be running a quickhack build.

 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
Unlaunching The 12GB 4080

The RTX 4080 12GB is a fantastic graphics card, but it’s not named right. Having two GPUs with the 4080 designation is confusing.

So, we’re pressing the “unlaunch” button on the 4080 12GB. The RTX 4080 16GB is amazing and on track to delight gamers everywhere on November 16th.

If the lines around the block and enthusiasm for the 4090 is any indication, the reception for the 4080 will be awesome.


Ehmm .... what?
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136

Ehmm .... what?



I thought that was someone's idea of a joke, until I checked the source.

I guess that is NVidia responding to criticism, sort of. I guess they'll relaunch it as the 4070 Ti.

But if they don't also drop the price, the criticism will only redouble...

So will the first run of 4070 Ti cards have an official NVidia "70 Ti" sticker covering the last part of the number?
 