4070 reviews thread


jpiniero

Lifer
Oct 1, 2010
15,161
5,695
136
The problem is that the 4070 is using the third largest die this generation, when normally it uses the second largest die. Most people wouldn't notice since it's still the 104 die, but this time the xx80 GPU is using AD-103, and historically the xx70 GPU was just a more cut-down version of that same die.

The x103 die has usually existed on paper, I think; nVidia just never used it until Ampere.
 

MrTeal

Diamond Member
Dec 7, 2003
3,609
1,807
136
I imagine part of the pain in designing these mid-tier cards is that wide memory buses are expensive both in die size and power consumption. The memory controller doesn't scale nearly as well as logic, so it probably makes sense to shrink it as much as possible. It just forces some weird memory choices.

I'm really not even sure that 192-bit/12GB is unreasonable for a 4070. The pain in the stack is that the 4070 Ti should probably be a 16GB cut-down AD103, with the 4070/4060 Ti running 12GB. 16GB would be better, but it's probably not nearly as bad as the 8GB was on the 3070 Ti or will be on the 4060 Ti. With the 4060 probably being around 3060 Ti performance, it and the 4050 would probably be fine with 128-bit/8GB as long as it does come in as a ~$350 card.
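For a rough sense of why the bus width boxes in the capacity options, here's a quick sketch, assuming the usual 32-bit GDDR6/GDDR6X devices at 1 GB or 2 GB each (clamshell doubles the device count per channel):

```python
# Rough sketch of how bus width constrains VRAM capacity, assuming 32-bit
# GDDR6/GDDR6X devices at 1 GB or 2 GB each; clamshell mounts two devices
# per channel and doubles the total.
def vram_options(bus_width_bits: int) -> list[int]:
    channels = bus_width_bits // 32          # one device per 32-bit channel
    densities_gb = (1, 2)
    normal = [channels * d for d in densities_gb]
    clamshell = [2 * n for n in normal]
    return sorted(set(normal + clamshell))

for bus in (128, 192, 256):
    print(f"{bus}-bit -> {vram_options(bus)} GB")
# 128-bit -> [4, 8, 16] GB, 192-bit -> [6, 12, 24] GB, 256-bit -> [8, 16, 32] GB
```

So once Nvidia settles on a 192-bit bus, 12GB or 24GB are really the only sane options without clamshell.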
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,558
24,418
146
Nvidia or AMD launch 8GB cards for over $300 and they will get demolished in reviews, and by gamers. There are already $350 cards with 12GB and 16GB. If you have to reduce settings of the new shiny when you don't on last gen, how do you spin that?
 

Rigg

Senior member
May 6, 2020
540
1,273
136
The size of the 4070 is the big advantage in the match up. I could throw one in my NR200 but my 6800 is way too big.
I'm confused by this comment. An NR200 can handle triple slot 330 mm cards. There are a bunch of 6800/6800xt/6900xt cards that will fit in that case. A reference card would fit with room to spare. I've only ever built in an NR200 knock-off, so maybe I'm missing something. That being said, it certainly could be an issue for a lot of cases. The general trend toward massive GPU coolers is annoying as hell. I had to go sans middle fan in my Meshify C to get my 6900xt Asus TUF card to fit.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,558
24,418
146
I'm confused by this comment. An NR200 can handle triple slot 330 mm cards. There are a bunch of 6800/6800xt/6900xt cards that will fit in that case. A reference card would fit with room to spare. I've only ever built in an NR200 knock-off, so maybe I'm missing something. That being said, it certainly could be an issue for a lot of cases. The general trend toward massive GPU coolers is annoying as hell. I had to go sans middle fan in my Meshify C to get my 6900xt Asus TUF card to fit.
Fair question. I think I could just squeak it in with the cutout so it sits almost against the front panel. It would also have to rely on sucking in air from the bottom of the case, since it will be right next to it. However, I'd have to pull the top fans, top-mount my AIO, and get a different PSU. My current EVGA 650W Gold is long, and the cables jut out and interfere with long cards. It was $50 or so in For Sale/For Trade; couldn't pass it up.

Dual fan 6600XT and 3060 Ti are drop-ins the way it's set up. My main PC is a Phanteks full tower.

Hence, I now amend my comment to reflect that, while the card could technically fit, and I mean JUST fit, my XFX QICK is way too big for my setup.

EDIT: Dammit @Rigg. 😝 Now I know what this weekend's project is. I have a gold 550W SFX that is shorter and the cables don't interfere. I'll top mount the rad and see if I can cram it in there. That or I'll pull the AIO and use a Wraith Spire on the 5600X to simplify things.
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
Nvidia or AMD launch 8GB cards for over $300 and they will get demolished in reviews, and by gamers. There are already $350 cards with 12GB and 16GB. If you have to reduce settings of the new shiny when you don't on last gen, how do you spin that?

But... DLSS.

That's how.

Oh yeah, and your army of influencers. Use them too.
 

Rigg

Senior member
May 6, 2020
540
1,273
136
Fair question. I think I could just squeak it in with the cutout so it sits almost against the front panel. It would also have to rely on sucking in air from the bottom of the case, since it will be right next to it. However, I'd have to pull the top fans, top-mount my AIO, and get a different PSU. My current EVGA 650W Gold is long, and the cables jut out and interfere with long cards. It was $50 or so in For Sale/For Trade; couldn't pass it up.

Dual fan 6600XT and 3060 Ti are drop-ins the way it's set up. My main PC is a Phanteks full tower.

Hence, I now amend my comment to reflect that, while the card could technically fit, and I mean JUST fit, my XFX QICK is way too big for my setup.

EDIT: Dammit @Rigg. 😝 Now I know what this weekend's project is. I have a gold 550W SFX that is shorter and the cables don't interfere. I'll top mount the rad and see if I can cram it in there. That or I'll pull the AIO and use a Wraith Spire on the 5600X to simplify things.
I just squeaked a Peerless Assassin 120 in the Sama clone, but a Spire is probably just fine for the 5600X, especially if it's one of the good ones with the copper slug.

I think you make a good general point about the size though. The 6800/6900 reference is 267 mm long while the Zotac Twin Edge 4070 is 226 mm. That's a lot of GPU power in a small card.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,558
24,418
146
I just squeaked a Peerless Assassin 120 in the Sama clone, but a Spire is probably just fine for the 5600X, especially if it's one of the good ones with the copper slug.

I think you make a good general point about the size though. The 6800/6900 reference is 267 mm long while the Zotac Twin Edge 4070 is 226 mm. That's a lot of GPU power in a small card.
Yup, I could toss a dual fan 4070 in the NR200 and have a powerful, cool and quiet gamer.

BTW, Cooler Master says I can't top-mount the AIO, so that would mean modding = nope. Air cooling it is then. It also states max card length is 330mm and my card is 340mm. But imma do it anyways. I'll PM you a pic if I get it in there. That's what she said.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,558
24,418
146
But... DLSS.

That's how.

Oh yeah, and your army of influencers. Use them too.
DLSS and FSR look really bad when starting from native 1080. That ain't gonna fly. The reality distortion field is failing. Those trying to damage control and tell everyone 8GB is fine for 1080 are losing the fight everywhere I look. They are trying, but VRAM is rapidly dominating mindshare in a way DLSS and RT have never done.
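For context, here's a rough sketch of the internal render resolutions behind those upscaling modes at a 1080p output target, using the commonly cited per-axis scale factors (approximations, not official figures):

```python
# Internal render resolutions behind upscaling at a 1080p output target.
# Per-axis scale factors are the commonly cited quality presets; treat them
# as approximations rather than official numbers.
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_res(out_w: int, out_h: int) -> dict:
    return {name: (round(out_w * s), round(out_h * s)) for name, s in PRESETS.items()}

print(internal_res(1920, 1080))
# {'Quality': (1281, 720), 'Balanced': (1114, 626), 'Performance': (960, 540)}
```

Even Quality mode at 1080p is upscaling from roughly 720p internally, which is why it falls apart at that output resolution.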

4070 is going to age poorly because of 12GB. Most of us here agree on that. Releasing 8GB for over $300? I will be like this -

 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,558
24,418
146
PS5 has 13-14 GB given to games. 12 really should be fine as long as DirectStorage delivers.
Good point. I should have qualified it as aging poorly for a $600+ card.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,485
2,363
136
PS5 has 13-14 GB given to games. 12 really should be fine as long as DirectStorage delivers.
That's a poor bet going forward. PC ports typically have higher, more demanding visuals which use more VRAM, and you're also assuming DirectStorage will deliver. $600 is a lot of money to bet that it all works out. I wouldn't make that bet. That's just planned obsolescence on Nvidia's part.
 

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
That's a poor bet going forward. PC ports typically have higher, more demanding visuals which use more VRAM, and you're also assuming DirectStorage will deliver. $600 is a lot of money to bet that it all works out. I wouldn't make that bet. That's just planned obsolescence on Nvidia's part.
Two x70 generations in a row where the GPU itself will still be fine when the VRAM makes the card less desirable.

I still think the 3070 is the worst offender. Oh well.
 

Kocicak

Golden Member
Jan 17, 2019
1,070
1,129
136
Today I got the cheapest sort of 4070: the three-fan Gigabyte model, which has an aluminium backplate. I believe the cheapest three-fan MSI has just a plastic cover. I think this may be the best cheap 4070 available.


Here are some numbers after a few hours of playing: the max power draw seems to be set to 200 W, and the maximal boost frequency was 2850 MHz.



BTW the case is poorly ventilated, which might have influenced the numbers somehow; I think I should tie the front intake fans to GPU temperature...
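If anyone wants to log the same numbers themselves, a minimal sketch using standard nvidia-smi query fields (assuming nvidia-smi is installed and on PATH) would be something like:

```python
# Minimal logger for the numbers quoted above (power draw, graphics clock,
# GPU temperature) using standard nvidia-smi query fields. Assumes nvidia-smi
# is on PATH; adjust the sampling interval to taste.
import subprocess
import time

FIELDS = "power.draw,clocks.gr,temperature.gpu"

def sample() -> str:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

while True:
    print(time.strftime("%H:%M:%S"), sample())  # e.g. "199.8 W, 2745 MHz, 71"
    time.sleep(5)
```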

I quite like this GPU. Before that I tested a 4070 Ti, which is too powerful for me, and I also felt stupid for having such an expensive card with just 12 GB of RAM. I feel less stupid now with a plain 4070, which I got for nearly 30% less money than what I paid for the Ti model.

BTW the upscaling and frame generation seem to work fine; I like them.
 

blckgrffn

Diamond Member
May 1, 2003
9,290
3,435
136
www.teamjuchems.com
Today I got the cheapest sort of 4070: the three-fan Gigabyte model, which has an aluminium backplate. I believe the cheapest three-fan MSI has just a plastic cover. I think this may be the best cheap 4070 available.


Here are some numbers after a few hours of playing: the max power draw seems to be set to 200 W, and the maximal boost frequency was 2850 MHz.
(snipped)
BTW the case is poorly ventilated, which might have influenced the numbers somehow; I think I should tie the front intake fans to GPU temperature...

I quite like this GPU. Before that I tested a 4070 Ti, which is too powerful for me, and I also felt stupid for having such an expensive card with just 12 GB of RAM. I feel less stupid now with a plain 4070, which I got for nearly 30% less money than what I paid for the Ti model.

BTW the upscaling and frame generation seem to work fine; I like them.
That's the winning mindset, I think. We'll look back in a couple years and the difference in GPU power between the 4070 and 4070 Ti will be a wash, but in the meantime you put hundreds of dollars back in your pocket. Winning!

What are the titles you are playing and what did you upgrade from?
 
Feb 4, 2009
35,245
16,713
136
A little different right now as PC sales have crashed as we navigate high inflation and massive interest rate hikes aimed at taming that inflation. Plus worries that the inflation battle will lead to recession. It's a significant damper on a lot of discretionary spending.

So probably not crushing demand like some previous releases, but still a large success that will quickly rise up the Steam HW survey, and it will probably kill 6800XT sales.

So some really good 6800 XT prices might be coming for value buyers.
I've been speculating for years, maybe a decade: will video cards ultimately become like discrete audio, where basic onboard audio chips work perfectly fine for what the overwhelming majority of us do?
Are we approaching the point where the cards are all close to maxed-out power and new generations are only marginal improvements?
 

Kocicak

Golden Member
Jan 17, 2019
1,070
1,129
136
That's the winning mindset, I think. We'll look back in a couple years and the difference in GPU power between the 4070 and 4070 Ti will be a wash, but in the meantime you put hundreds of dollars back in your pocket. Winning!

What are the titles you are playing and what did you upgrade from?
I am a special case; I got into gaming just recently. Now I have just three titles in my Steam account: Hogwarts Legacy, Forza Horizon 5 and F1 22. I am thinking about MS Flight Sim.

So I did not upgrade from anything. I have a small 24 inch FHD gaming monitor now and a normal 32 inch 60 Hz 4K monitor; I hope this 4070 will be able to run the gaming monitor fine.
 
Reactions: blckgrffn

Heartbreaker

Diamond Member
Apr 3, 2006
4,334
5,451
136
I've been speculating for years, maybe a decade: will video cards ultimately become like discrete audio, where basic onboard audio chips work perfectly fine for what the overwhelming majority of us do?
Are we approaching the point where the cards are all close to maxed-out power and new generations are only marginal improvements?

So you mean Integrated Graphics takes over a wider swath making dGPU much more niche?

My speculation is that integrated graphics gets a serious competitive boost when Windows on ARM gets more pervasive, and games start including ARM binaries.

That opens up the field for anyone to make a Windows SoC/APU.

Then you can have AMD, NVidia, Intel, Qualcomm, Samsung, (and many more) all competing to make the best Windows SoC/APU, and those competitive forces are bound to drive the integrated GPU performance upward.

Then I could really see discrete GPUs become much more niche.
 
Feb 4, 2009
35,245
16,713
136
So you mean Integrated Graphics takes over a wider swath making dGPU much more niche?

My speculation is that integrated graphics gets a serious competitive boost when Windows on ARM gets more pervasive, and games start including ARM binaries.

That opens up the field for anyone to make a Windows SoC/APU.

Then you can have AMD, NVidia, Intel, Qualcomm, Samsung, (and many more) all competing to make the best Windows SoC/APU, and those competitive forces are bound to drive the integrated GPU performance upward.

Then I could really see discrete GPUs become much more niche.
More or less. Basically nearly all the low-hanging fruit is gone, and now an excessive amount of design or power or whatever needs to be put into the discrete cards to make them relevant. That's also why a good swath of Nvidia cards are gimped with low amounts of memory. It's easier to support the upper end that way and it keeps margins healthy. Just sucks for us; however, at some point it will change.
I am totally spitballing and I am certainly not an expert in the field.
 

Mopetar

Diamond Member
Jan 31, 2011
8,099
6,725
136
The x103 die has usually existed on paper I think, nVidia just never used it until Ampere.

It doesn't matter what they call the dies. It's the relative position in the stack. In recent prior history, the xx80 GPU was almost always made on the 102 die, which was typically the top consumer die. But if you go back prior to Pascal, the top die didn't use the 102 designation. Instead it was GM-200 for Maxwell, or GK-100/GF-100 (or GK-110/GF-110 for the second generation) for Kepler/Fermi.

The only other time an xx70 GPU was on the third die was Turing, and that is probably one of the worst-regarded generations of NVidia cards in a long time. And when you look back to Fermi and Kepler, when NVidia really started to solidify their modern naming scheme, the x70 GPU was on the top die, in some cases having close to 90% of the computational power of the top-of-the-line GPU being sold. Now it's barely 1/3 of the top-end GPU, and when NVidia releases a 4090 Ti it might drop below that.
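To put rough numbers on that, a back-of-envelope comparison using commonly listed CUDA core counts (quoted from memory, so double-check the specs):

```python
# Back-of-envelope: share of the top consumer card's shader count held by the
# x70 part across a few generations. Core counts are commonly listed specs
# from memory; verify before quoting.
PAIRS = {
    "GTX 570 vs GTX 580":   (480, 512),     # Fermi: both cut from the same top die
    "RTX 3070 vs RTX 3090": (5888, 10496),  # Ampere
    "RTX 4070 vs RTX 4090": (5888, 16384),  # Ada
}

for name, (x70, top) in PAIRS.items():
    print(f"{name}: {x70 / top:.0%} of the top card's CUDA cores")
# GTX 570 vs GTX 580: 94%, RTX 3070 vs RTX 3090: 56%, RTX 4070 vs RTX 4090: 36%
```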

This is an xx70 card in name only. It's a pale imitation of what xx70 once meant and the worst xx70 card, relative to the top card, ever released. Though it's something that's been diluted over time, I think they've ruined a proud heritage by calling this thing a 4070. They may as well rename it from Lovelace to Loveless.
 