4070 reviews thread


SMOGZINN

Lifer
Jun 17, 2005
14,221
4,452
136
Here we are, 22 years after the first 4K monitors were sold, and we still don't have a mainstream card that can handle them. We are now firmly in territory where I can buy multiple high-end 4K monitors for less than the video card to run them.
 

Kocicak

Golden Member
Jan 17, 2019
1,070
1,129
136
The 4070 seems like a great, efficient card with (only) 12 GB of RAM.

A lot of complaints about this card would be eliminated if they had a version of it with 16 GB.

What is not so great is the lack of competition from AMD; they still have just the old 6000 series in this segment. What are they doing???
 

psolord

Platinum Member
Sep 16, 2009
2,089
1,234
136
That won't stop it from choking at 4K, and eventually just as quickly at 1440p, once more games come out that push the VRAM buffer.

It will be fine for a while, just like the 970 was, and then it won't be. It won't choke as fast as today's 8GB cards (the 960 2GB of their day) will, but it won't last like the 980 Ti (today's 4090) did either.

Duuude, fewer words regarding the GTX 970, OK?

My GTX 970 still runs The Epic Fail of Us at the right settings, and this run isn't even on the better version that came later. (Non-monetized channel, just for fun, for stupid responses like this one xD)

Yes yes, it's not perfect, BUT it still runs this PS5 remake better than the PS4s ever did, let alone all the Xboxes combined, if you get my drift...
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,302
1,216
136
It kind of makes me think AMD and Nvidia are colluding on the supply and pricing of GPUs. These types of arrangements do not end well.
 

In2Photos

Golden Member
Mar 21, 2007
1,991
2,020
136
Cool. A lot of people who want 4070-level performance will buy this new card instead of the old and power-hungry RX 6000 cards, and that's just because they have nothing to buy from AMD.
As I said before, I tuned my 6800 XT to use less than 200W and still get stock performance, while having 33% more VRAM, and I still paid less than the $600 MSRP of the 4070. Yes, it would be nice if AMD had new cards to compete, but the fact that last-gen cards still compete says quite a bit about this release from Nvidia.
 

jpiniero

Lifer
Oct 1, 2010
15,161
5,695
136
Cool. A lot of people who want 4070-level performance will buy this new card instead of the old and power-hungry RX 6000 cards, and that's just because they have nothing to buy from AMD.

The 6950 XT should be 10-15% faster in raster than the 4070, and you get more VRAM. I doubt AMD is happy about selling the 6950 XT at $599 or so, but that's what they are doing.
 

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
The 6950 XT should be 10-15% faster in raster than the 4070, and you get more VRAM. I doubt AMD is happy about selling the 6950 XT at $599 or so, but that's what they are doing.

I don't know why they would be that mad, though. It's got essentially the same BoM as the 6800 XT, and that launched with a $650 MSRP, which certainly had some profit margin baked in. Given the time, I would expect yields to be solid. It still commands a small premium over the 6800 XT as well. I am guessing AMD is "fine" with ~$600-$700 for the 6950 XT in the way that my wife is "fine" with me.

No doubt a similar-performing 7800 XT would be more efficient out of the box and have a better margin (or maybe we would doubt it?), but it's not like they are totally MIA from the segment. Nvidia has pushed the pricing models up high enough that you can imagine some at AMD wondering why they would pay to develop a new $600 part at all when what they have is "fine" and they continue to "win" at prices below $600, where surely many of the retail-channel sales are being made. By winning I mean their products are already highly competitive on all fronts, and they are probably wooing as many GPU-curious buyers as they can.

Sure, it's no EPYC or even a FireGL chip, but it's not like TSMC 7nm capacity is the hot stuff they want to be using for those products anyway.

With regards to the 4070, the real sin is that the RDNA 2 parts are old, not that they are too slow or really too power hungry with even a modicum of tuning. I am certain the power-consumption angle is mostly lip service now, given how expansive the power budgets for CPUs and GPUs have been in the last few years; anyone who really cares about power consumption is going to be limiting frame rates, etc., to keep their power bill in check.

What I would like to see (for grins) is a power-usage graph at, say, a capped 90 fps at 1440p high settings on these cards. Everyone would argue about what those settings should be, I get it, but running uncapped FPS and measuring power usage is silly given how trivial it is to trim it down. I suppose it's like every test being run at "Ultra" settings when we know there is likely something in that "Ultra" that kneecaps frame rates and is nearly imperceptible in game.
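If anyone wants to roll their own version of that graph, here's a minimal sketch of the logging half, assuming an Nvidia card with nvidia-smi on the PATH (the 1-second interval and the filename are arbitrary choices, and it only reads the first GPU):

```python
# Minimal GPU power logger: polls nvidia-smi once a second and writes a CSV.
# Assumes an Nvidia card with the nvidia-smi utility installed and on PATH.
import csv
import subprocess
import time

def read_power_watts() -> float:
    """Query current board power draw in watts via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])  # first GPU only

with open("power_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "power_w"])
    start = time.time()
    while True:  # Ctrl+C to stop
        writer.writerow([round(time.time() - start, 1), read_power_watts()])
        f.flush()
        time.sleep(1.0)
```

Start it in the background, set the 90 fps cap in the driver, play for a few minutes, then plot the CSV.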
 

Golgatha

Lifer
Jul 18, 2003
12,239
643
126
I don't know why they would be that mad, though. It's got essentially the same BoM as the 6800 XT, and that launched with a $650 MSRP, which certainly had some profit margin baked in. Given the time, I would expect yields to be solid. It still commands a small premium over the 6800 XT as well. I am guessing AMD is "fine" with ~$600-$700 for the 6950 XT in the way that my wife is "fine" with me.

No doubt a similar-performing 7800 XT would be more efficient out of the box and have a better margin (or maybe we would doubt it?), but it's not like they are totally MIA from the segment. Nvidia has pushed the pricing models up high enough that you can imagine some at AMD wondering why they would pay to develop a new $600 part at all when what they have is "fine" and they continue to "win" at prices below $600, where surely many of the retail-channel sales are being made. By winning I mean their products are already highly competitive on all fronts, and they are probably wooing as many GPU-curious buyers as they can.

Sure, it's no EPYC or even a FireGL chip, but it's not like TSMC 7nm capacity is the hot stuff they want to be using for those products anyway.

With regards to the 4070, the real sin is that the RDNA 2 parts are old, not that they are too slow or really too power hungry with even a modicum of tuning. I am certain the power-consumption angle is mostly lip service now, given how expansive the power budgets for CPUs and GPUs have been in the last few years; anyone who really cares about power consumption is going to be limiting frame rates, etc., to keep their power bill in check.

What I would like to see (for grins) is a power-usage graph at, say, a capped 90 fps at 1440p high settings on these cards. Everyone would argue about what those settings should be, I get it, but running uncapped FPS and measuring power usage is silly given how trivial it is to trim it down. I suppose it's like every test being run at "Ultra" settings when we know there is likely something in that "Ultra" that kneecaps frame rates and is nearly imperceptible in game.

Anecdotal thing here, but I was running a 3DMark stress test in a window so I could monitor temps, etc., in another window. I guess it locked me to 165 FPS, likely due to a windowed-versus-fullscreen setting in the drivers. Anyway, my system was getting 165 FPS with 27% GPU utilization on my 4090, and doing it all within about a 200W power budget.
 
Last edited:

MrTeal

Diamond Member
Dec 7, 2003
3,609
1,806
136
Huh? The 6950 XT pulls 300W at stock without any power-limit tuning. Yes, that's 50% more than the 4070 for only 15% more performance, but if 100W is the difference in tripping breakers, you need to revisit your electrical.

Now, if you want to say that you're concerned about the cost to the environment or to your wallet, fair enough. Where I am, if I average 2 hours of gaming a day, I'd pay about $10 more in electricity a year, which isn't a tonne, but it's not zero.
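For anyone who wants to plug in their own numbers, the arithmetic behind that estimate is trivial (the $/kWh rate below is an assumed placeholder; substitute your local rate):

```python
# Back-of-envelope yearly cost of an extra 100 W of GPU draw.
# The electricity rate is an assumption; swap in your own.
extra_watts = 100        # 6950 XT vs 4070 at stock, per the post above
hours_per_day = 2        # average gaming time assumed above
rate_per_kwh = 0.14      # assumed rate in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{extra_kwh_per_year:.0f} kWh/year")              # ~73 kWh
print(f"${extra_kwh_per_year * rate_per_kwh:.2f}/year")  # ~$10.22
```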
 
Reactions: Tlh97 and Elfear

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
Anecdotal thing here, but I was running a 3DMark stress test in a window so I could monitor temps, etc., in another window. I guess it locked me to 165 FPS, likely due to a windowed-versus-fullscreen setting in the drivers. Anyway, I was getting 165 FPS with 27% GPU utilization on my 4090, and doing it all within about a 200W power budget.

Nice - even anecdotal, that is the kind of thing I am talking about. With Radeon Chill, my 6800 is constantly flirting with ~100W (I'll have to check it again) while I game at my preferred settings.

If you were happy with "just" 165 fps, then you are running effortlessly (given the extra ~200W of potential power usage?) inside the max power budget of the 4070. And if the 4070 has to run in full bonkers mode to get 165 fps and use the same 200W, what kind of efficiency increase are we even seeing?

So I guess I am saying, normalized for some performance level, what is the power usage?
 

Golgatha

Lifer
Jul 18, 2003
12,239
643
126
Nice - even anecdotal, that is the kind of thing I am talking about. With Radeon Chill, my 6800 is constantly flirting with ~100W (I'll have to check it again) while I game at my preferred settings.

If you were happy with "just" 165 fps, then you are running effortlessly (given the extra ~200W of potential power usage?) inside the max power budget of the 4070. And if the 4070 has to run in full bonkers mode to get 165 fps and use the same 200W, what kind of efficiency increase are we even seeing?

So I guess I am saying, normalized for some performance level, what is the power usage?
If you shoot for settings where both cards can run a game at 165 FPS, then the power budgets would be nearly identical, but % utilization for the 4070 would be around 75% (a rough estimate) compared to 27% on my 4090 in this 3DMark stress-test example. They're the same architecture, and the only advantage the 4090 gives is the ability to run at settings the 4070 isn't capable of, but it will use more power overall when doing so.
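To put the "normalized for some performance level" question in concrete terms, the metric is just joules per frame (watts divided by fps). A trivial sketch using the figures from this exchange; note the 4070 line is the rough estimate above, not a measurement:

```python
# Energy per frame as a normalized efficiency metric: watts / fps = J/frame.
# The 4090 figure is the measured anecdote above; the 4070 figure is the
# rough estimate from this post, not a measurement.
cards = {
    "RTX 4090 @ 27% util":            {"fps": 165, "watts": 200},
    "RTX 4070 @ ~75% util (estimate)": {"fps": 165, "watts": 200},
}

for name, d in cards.items():
    print(f"{name}: {d['watts'] / d['fps']:.2f} J/frame")
```

If the capped numbers really do come out equal, the J/frame figure makes that explicit; feeding in measured values as each card approaches its limit is exactly the comparison being asked for.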
 
Last edited:
Reactions: blckgrffn

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
If you shoot for settings where both cards can run a game at 165 FPS, then the power budgets would be nearly identical, but % utilization for the 4070 would be around 75% (a rough estimate) compared to 27% on my 4090 in this 3DMark stress-test example. They're the same architecture, and the only advantage the 4090 gives is the ability to run at settings the 4070 isn't capable of, but it will use more power overall when doing so.

Yeah, I am just wondering if the scaling is really that perfect, or if the cut-down L2 and other decisions might in fact make it scale worse, especially as it (the 4070) nears its performance limit.

Right at the limit is where I observe these parts being at their worst, especially given how aggressively they have been tuned.

Regardless, I appreciate you sharing.

All this talk of efficiency seems focused on the bleeding edge of performance for a given card.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,558
24,418
146
If you shoot for settings where both cards can run a game at 165 FPS, then the power budgets would be identical. They're the same architecture, and the only advantage the 4090 gives is the ability to run at settings the 4070 isn't capable of, but it will use more power overall when doing so.
I have done this with an RX 6400 and an RX 6600 XT. The 6600 XT can run the same game, same settings, same frame cap, and use LESS power than the 6400.

It is interesting to read the discussion begin to focus on power consumption when evaluating the 4070 vs. the 6950 XT at the same price. It is the same battle Intel users fight in the CPU forum: reviews show the chip being a power hog, and users explain that with some simple tweaking they get the performance they are after without anywhere near the crazy power draw reviewers clamor about.

Anyone who has been in this hobby a hot second knows it is standard practice to undervolt or otherwise power-limit AMD GPUs immediately upon install. Or, as is popular with many of us now, use Chill. You could get similar performance to the 4070 running full out, and probably draw no more than 50W more, with the previous AMD flagship. That's a guess of course, but someone will test it. The size of the 4070 is the big advantage in the matchup. I could throw one in my NR200, but my 6800 is way too big.

One last note on reviews. We know everything is run full out because it's a race, and the winners often literally get more cash, just like in any other racing. I view it as what I can expect as the card ages and I have to pull out all the stops to get acceptable performance. However, going back to Polaris and Vega, I haven't had an AMD card that wouldn't do better than stock performance while undervolted or otherwise power-limited. So the worst-case scenario of review power figures is just that: worst case.
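To show how mechanical the power-limiting part is, here's a minimal Linux-only sketch using the in-kernel amdgpu driver's hwmon interface. Treat it as illustrative: it assumes root privileges, the hwmon node names vary by system and generation (e.g. power1_input instead of power1_average on newer parts), and on Windows you'd just use the tuning page in Radeon Software instead.

```python
# Sketch: read and cap an AMD GPU's board power on Linux via amdgpu sysfs.
# Assumes the in-kernel amdgpu driver and root access; paths and node
# names vary by system and GPU generation.
import glob

def hwmon_path(card: int = 0) -> str:
    """Find the hwmon directory for a given DRM card (location varies)."""
    matches = glob.glob(f"/sys/class/drm/card{card}/device/hwmon/hwmon*")
    if not matches:
        raise FileNotFoundError("no amdgpu hwmon node found")
    return matches[0]

def read_microwatts(path: str, node: str) -> int:
    with open(f"{path}/{node}") as f:
        return int(f.read())

hw = hwmon_path(0)
print("current draw:", read_microwatts(hw, "power1_average") / 1e6, "W")
print("current cap: ", read_microwatts(hw, "power1_cap") / 1e6, "W")

# Cap the card at 200 W (hwmon values are in microwatts; requires root):
with open(f"{hw}/power1_cap", "w") as f:
    f.write(str(200 * 1_000_000))
```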
 

Mopetar

Diamond Member
Jan 31, 2011
8,099
6,725
136
The problem is that the 4070 is using the third-largest die this generation, when normally it uses the second-largest die. Most people wouldn't notice since it's still the 104 die, but this time the xx80 GPU is using AD103, and historically the xx70 GPU was just a more cut-down version of that same die.

In any previous generation this would have been an xx60 card. Over time the xx70 branding has really slid down the stack. If you go back far enough, the xx70 cards were the second-best Nvidia card you could buy, but now they've been shoved so far down the stack that they're about as close to the bottom as they are to the top.
 

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
The problem is that the 4070 is using the third-largest die this generation, when normally it uses the second-largest die. Most people wouldn't notice since it's still the 104 die, but this time the xx80 GPU is using AD103, and historically the xx70 GPU was just a more cut-down version of that same die.

In any previous generation this would have been an xx60 card. Over time the xx70 branding has really slid down the stack. If you go back far enough, the xx70 cards were the second-best Nvidia card you could buy, but now they've been shoved so far down the stack that they're about as close to the bottom as they are to the top.

Just marketing mind games. The numbers mean nothing, really, except what Nvidia wants them to mean.

The marketing department figured out the "70" number was really valuable and had already launched the "90" number, so now it's time to milk it. Think of the shareholder value.
 
Reactions: Saylick

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,558
24,418
146
The problem is that the 4070 is using the third-largest die this generation, when normally it uses the second-largest die. Most people wouldn't notice since it's still the 104 die, but this time the xx80 GPU is using AD103, and historically the xx70 GPU was just a more cut-down version of that same die.

In any previous generation this would have been an xx60 card. Over time the xx70 branding has really slid down the stack. If you go back far enough, the xx70 cards were the second-best Nvidia card you could buy, but now they've been shoved so far down the stack that they're about as close to the bottom as they are to the top.
Cool explanation, thanks for taking the time.

I equate this to Nvidia learning the same marketing tricks as movie theaters. It's price discrimination: the larger the size you buy, the more value you get. Pick the smaller one and you get ripped off the most.
 

In2Photos

Golden Member
Mar 21, 2007
1,991
2,020
136
The problem is that the 4070 is using the third-largest die this generation, when normally it uses the second-largest die. Most people wouldn't notice since it's still the 104 die, but this time the xx80 GPU is using AD103, and historically the xx70 GPU was just a more cut-down version of that same die.

In any previous generation this would have been an xx60 card. Over time the xx70 branding has really slid down the stack. If you go back far enough, the xx70 cards were the second-best Nvidia card you could buy, but now they've been shoved so far down the stack that they're about as close to the bottom as they are to the top.
Jensen tries to tell us that Moore's Law is dead, but it really seems like Nvidia has just shifted the dies in the stack. If this were sold as a 60-series card, it would have previous-gen 80-series performance, just like before. On top of that, they raised the price point. So it's really a 60-series card for $600. Yikes.
 

Saylick

Diamond Member
Sep 10, 2012
3,510
7,766
136
Just marketing mind games. The numbers mean nothing, really, except what Nvidia wants them to mean.

The marketing department figured out the "70" number was really valuable and had already launched the "90" number, so now it's time to milk it. Think of the shareholder value.
The problem is that the 4070 is using the third-largest die this generation, when normally it uses the second-largest die. Most people wouldn't notice since it's still the 104 die, but this time the xx80 GPU is using AD103, and historically the xx70 GPU was just a more cut-down version of that same die.

In any previous generation this would have been an xx60 card. Over time the xx70 branding has really slid down the stack. If you go back far enough, the xx70 cards were the second-best Nvidia card you could buy, but now they've been shoved so far down the stack that they're about as close to the bottom as they are to the top.
Yep: the top dog is now a 90-class card where it used to be an 80-class card, so the 70-class card is basically what used to be a 60-class card. A 60-class die should sit roughly in the middle of the stack, at prices affordable and accessible to most people. $600 is not exactly affordable and accessible to most.

The worst part is that while it matches an RTX 3080 in raster, the only leg Nvidia can stand on to prop up its high price tag is software, chiefly DLSS 3. The increase from 10 GB to 12 GB is not worth much when we really should be getting 16 GB at the $600 price point.

However, I think we all know the truth: if Nvidia offered the following, it would crush their potential future sales, because such a card would satisfy the vast majority of their user base for many years.
- RTX 3080 Ti performance (solid for 1440p gaming)
- DLSS 3
- 16 GB VRAM
- $499 MSRP (I'd actually be okay with $599 if the above were at least true)

A true 70-class card should have been the above, where it almost matches the previous flagship for ~$500.

Historically, we've had the following:
GTX 1070 8 GB = GTX Titan X 12 GB for ~$400
RTX 2070 Super 8 GB = GTX 1080 Ti 11 GB for $499
RTX 3070 8 GB = RTX 2080 Ti 11 GB for $499

And now?
RTX 4070 12 GB = RTX 3080 10 GB for $599.
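Lay those pairings out as data and the slide is obvious. A quick sketch using only the numbers above (treating the 1070's launch price as the ~$400 quoted):

```python
# The 70-class history above as data: MSRP per GB of VRAM, plus the
# previous flagship each card matched. All figures are the ones quoted
# in this post (the GTX 1070's launch price is approximated as $400).
cards = [
    ("GTX 1070",       400, 8,  "GTX Titan X 12 GB"),
    ("RTX 2070 Super", 499, 8,  "GTX 1080 Ti 11 GB"),
    ("RTX 3070",       499, 8,  "RTX 2080 Ti 11 GB"),
    ("RTX 4070",       599, 12, "RTX 3080 10 GB"),
]

for name, msrp, vram_gb, matched in cards:
    print(f"{name}: ${msrp}, {vram_gb} GB "
          f"(${msrp / vram_gb:.0f}/GB), matched {matched}")
```

The $/GB figure ends up right back where the 1070 started, while the price of the "matches last gen's flagship" tier climbed 50%.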

I think it's especially telling that they've restricted their 16 GB cards to the overpriced $1200 RTX 4080, which gives them PLENTY of room for planned obsolescence down the road. There's literally no Ada offering with 16 GB until you cross the $1000 mark.

It would be entirely unsurprising to me if, when Nvidia launches Blackwell, we get a 16 GB card at $700 in the form of an RTX 5070 that matches the RTX 4080 in performance, and then we hear JHH say the line: "The more you buy, the more you save."
 
Last edited: