4070 reviews thread

Page 9 - AnandTech community

Heartbreaker

Diamond Member
Apr 3, 2006
4,263
5,260
136
That story was a bit confusing as to whether NVidia wanted to cancel orders or delay them. Either way, with AI at full hype, they shouldn't have any problem finding a use for the N4 wafers.

All such stories are most likely made up nonsense. They are also never verifiable, and as such great fodder for the clickbaiters to build stories around.

In reality all of these contracts will have extensive contingency sections. Slowdowns will be handled as specified in the contracts.

Everyone likes to dream that a company will be forced to produce more than they need, leading to a big oversupply, and steep discounts. But that is a pipe dream.

The mining boom and bust presented the biggest chance for oversupply discounts. This happened twice recently with a nearly perfect storm: when transitioning from the NVidia 1000 to 2000 series, and again recently from 3000 to 4000.

You had production running at max trying to fill insatiable mining+gaming demand, and then that bubble just popped and demand evaporated. Both times it was also at the end of a product cycle, just before release of the new generation. This is the absolute worst timing for a company, as they can't just sit back and wait for product to clear; they have to clear it fast to make way for the new generation.

So with that perfect storm we did get some discounts. I remember some nice 1080 Ti pricing, but nothing like a fire sale. I remember people waiting because they were assuming the oversupply would force much steeper discounts, but 1080 Ti supply was essentially gone in about a month, with no steeper discounts.

This time the discounts look steeper on the 3080 Ti and above, but those were mostly highly inflated mining prices brought back to reality, and again they cleared quickly. People were talking about NVidia's oversupply long after the cards were gone, and they offered no discounts at all on lower-end parts, which likely had lower supply and more time to draw down.

IMO, that's about as good as you can expect on inventory oversupply discounts, and both times they were rather limited.

Now, in the current situation, there is ZERO pressure on the 4000 series. It's new, and NVidia has all the time in the world to adjust production/inventory levels. There is no 5000 series coming anytime soon. They have multiple options to use wafer capacity, redirecting it to the hottest products when one backs up in the channel.

In theory AMD could come out with a super-value product to blunt NVidia sales, but I see little chance of that. They could come out with an RX 7800 with 16GB and charge $100 less, but that's a double hit to AMD's margin. Recognize that AMD's current price for the 6800 XT, on previous-gen clearance, is only $20 less than the 4070's MSRP. I think it will be painful enough to start the 16GB 7800 XT at $600, let alone $500...
 
Reactions: PJVol

coercitiv

Diamond Member
Jan 24, 2014
6,403
12,864
136
Pat Gelsinger visited Taiwan twice just to make sure TSMC will follow said contracts to the letter.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
Here's the latest at videocardz.com: an article listing the models of all next-gen GPUs that are at or below MSRP. There is a selection for everything but the 4070, which is just listed as "Too many to list". Must be a hot seller!

Also, possibly the stupidest argument for NVIDIA's VRAM shenanigans yet:

"I think Nvidia 12gb VRAM = 14GB AMD VRAM cuz compression is more advanced so its more efficient...". To be fair, the username is THC, so they may not have the most sound of mind at the moment.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,263
5,260
136
Here's the latest at videocardz.com: an article listing the models of all next-gen GPUs that are at or below MSRP. There is a selection for everything but the 4070, which is just listed as "Too many to list". Must be a hot seller!

Also, possibly the stupidest argument for NVIDIA's VRAM shenanigans yet:

"I think Nvidia 12gb VRAM = 14GB AMD VRAM cuz compression is more advanced so its more efficient...". To be fair, the username is THC, so they may not have the most sound of mind at the moment.

So the 4000 series is selling for MSRP. How is this an issue worth mentioning?

Why are you reading, let alone repeating, dumb user comments at a rumor site? What's next, dumb comments from WCCFTech users?
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
All such stories are most likely made up nonsense. They are also never verifiable, and as such great fodder for the clickbaiters to build stories around.

In reality all of these contracts will have extensive contingency sections. Slowdowns will be handled as specified in the contracts.

Everyone likes to dream that a company will be forced to produce more than they need, leading to a big oversupply, and steep discounts. But that is a pipe dream.

The mining boom and bust presented the biggest chance for oversupply discounts. This happened twice recently with a nearly perfect storm: when transitioning from the NVidia 1000 to 2000 series, and again recently from 3000 to 4000.

You had production running at max trying to fill insatiable mining+gaming demand, and then that bubble just popped and demand evaporated. Both times it was also at the end of a product cycle, just before release of the new generation. This is the absolute worst timing for a company, as they can't just sit back and wait for product to clear; they have to clear it fast to make way for the new generation.

So with that perfect storm we did get some discounts. I remember some nice 1080 Ti pricing, but nothing like a fire sale. I remember people waiting because they were assuming the oversupply would force much steeper discounts, but 1080 Ti supply was essentially gone in about a month, with no steeper discounts.

This time the discounts look steeper on the 3080 Ti and above, but those were mostly highly inflated mining prices brought back to reality, and again they cleared quickly. People were talking about NVidia's oversupply long after the cards were gone, and they offered no discounts at all on lower-end parts, which likely had lower supply and more time to draw down.

IMO, that's about as good as you can expect on inventory oversupply discounts, and both times they were rather limited.

Now, in the current situation, there is ZERO pressure on the 4000 series. It's new, and NVidia has all the time in the world to adjust production/inventory levels. There is no 5000 series coming anytime soon. They have multiple options to use wafer capacity, redirecting it to the hottest products when one backs up in the channel.

In theory AMD could come out with a super-value product to blunt NVidia sales, but I see little chance of that. They could come out with an RX 7800 with 16GB and charge $100 less, but that's a double hit to AMD's margin. Recognize that AMD's current price for the 6800 XT, on previous-gen clearance, is only $20 less than the 4070's MSRP. I think it will be painful enough to start the 16GB 7800 XT at $600, let alone $500...
I think, therefore it is?
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
So the 4000 series is selling for MSRP. How is this an issue worth mentioning?

Why are you reading, let alone repeating, dumb user comments at a rumor site? What's next, dumb comments from WCCFTech users?

I'd rather nothing sell at or near MSRP. If people buy at these prices, NVIDIA will know they can get away with it (AMD too).

I read comments to get a view of what the market is thinking. I'm seeing a lot of first time AMD GPU purchasers for example. Gives you an idea of what the future might hold.
 
Feb 4, 2009
34,703
15,951
136
Is Intel starting to look better yet?
Seems they are our best hope. Drivers appear to have improved substantially. Now we just need a new release with better “stuff” inside.
 
Reactions: Thunder 57

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
So you liked Mining prices better?

Selling at MSRP was kind of the norm before Mining.

Of course not. And selling at MSRP was fine back when MSRPs weren't a bunch of crap that jumped by a huge amount in one generation, for cards that didn't increase in performance anywhere near like they used to for the past... seems like forever.
 
Reactions: KompuKare

Rigg

Senior member
May 6, 2020
475
1,004
136
Maybe it's not as flawed as you may think. I actually made a multicriteria decision-making model with the criteria being performance (raster and RT), performance per price, RAM, and energy efficiency. I evaluated the average of 4K and 1440p performance.

I was shocked how often the 4090 came out on top with different combinations of criteria weights, even if you did not value RT performance strongly. Frankly, if I had a use for such high performance and could justify paying such a price for it, it would be a no-brainer; the 4090 is an exceptional product - it is the benchmark of today. Before, I thought the 4090 was a silly overpriced product; now I think differently.

By the way, the RX 7900 XTX also came out on top very often, and was often on par with or just slightly behind the 4090.

The 4070 Ti often scored as badly as the RX 6000 cards thanks to its low RAM. I believe the 4070 would pull ahead of them a bit thanks to its better perf per price.

I could add the 4070 to the evaluation and update the prices... Would anybody be interested in seeing the model?
I can see how things might change if you factor in RAM and power efficiency. The post I replied to didn't mention either of those things. I question how to properly weigh RAM though. Once you are over 16GB I'm not sure it matters that much at least for gaming workloads.
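The model described above (min-max normalize each criterion, then take a weighted sum) can be sketched as below. This is not the poster's actual model or data; the card numbers are rough illustrative placeholders and the weights are arbitrary, chosen only to show the mechanics:

```python
# Multicriteria GPU scoring sketch: min-max normalize each criterion to
# [0, 1], then combine with a weighted sum. All criteria are expressed
# "higher is better" (perf per $ and perf per watt are ratios for that reason).
cards = {
    # Illustrative placeholder numbers, NOT benchmark data.
    # raster / rt: relative fps; perf_price: fps per $; vram: GB; perf_watt: fps per W
    "RTX 4090":    {"raster": 100, "rt": 100, "perf_price": 0.062, "vram": 24, "perf_watt": 0.24},
    "RX 7900 XTX": {"raster": 85,  "rt": 62,  "perf_price": 0.087, "vram": 24, "perf_watt": 0.24},
    "RTX 4070 Ti": {"raster": 63,  "rt": 60,  "perf_price": 0.079, "vram": 12, "perf_watt": 0.22},
}
weights = {"raster": 0.3, "rt": 0.2, "perf_price": 0.25, "vram": 0.15, "perf_watt": 0.1}

def score(cards, weights):
    crits = list(weights)
    lo = {c: min(v[c] for v in cards.values()) for c in crits}
    hi = {c: max(v[c] for v in cards.values()) for c in crits}
    def norm(c, x):
        # Min-max normalize; if a criterion is constant across cards, treat as 1.0
        return 1.0 if hi[c] == lo[c] else (x - lo[c]) / (hi[c] - lo[c])
    return {name: sum(weights[c] * norm(c, v[c]) for c in crits)
            for name, v in cards.items()}

print(score(cards, weights))
```

With weights summing to 1, every score lands in [0, 1], which makes it easy to re-run the ranking under different weight combinations the way the poster describes.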
 

Mopetar

Diamond Member
Jan 31, 2011
8,024
6,479
136
Looking at that reminds me of the outrage when the 2000 series launched. Significant price increase for moderate performance increase. Yet that is tame compared to what they are doing this time.

Honestly I think the 2000 series wasn't any better, and was probably worse. The 2070 launched at $600 and was also using the third-largest die (TU106), so if anything this is a repeat of history.

At least DLSS isn't half-baked and the 4070 can actually handle RT without cratering the frame rate. $600 is still too much though, which was why the 3070 dropped back to $500.

The thing that both have in common is that AMD was pretty much absent from the market. I guess they've cut prices on RDNA2 cards to fill the price brackets, but that's not quite the same as having new product to compete with.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
Honestly I think the 2000 series wasn't any better, and was probably worse. The 2070 launched at $600 and was also using the third-largest die (TU106), so if anything this is a repeat of history.

At least DLSS isn't half-baked and the 4070 can actually handle RT without cratering the frame rate. $600 is still too much though, which was why the 3070 dropped back to $500.

The thing that both have in common is that AMD was pretty much absent from the market. I guess they've cut prices on RDNA2 cards to fill the price brackets, but that's not quite the same as having new product to compete with.

At least the 2000 series had new features to help justify the price. Being new though they were basically useless and now those cards are too slow for RT. I think some of us were hoping it would be a one time "tax" if you will.

Looking at that image again, it's worse than it implies. It used to be that the 70-series card would be faster and cheaper than the 80-series cards, which were the top tier. Now we have the 90 series as well, so lately the 70 series has only been faster than (or this time tied with) the second-highest tier. This is really a 4060 Ti at best. Makes you wonder what the real one will be like.
 
Reactions: KompuKare

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
A bit OT:

Did you guys see AnandTech wasn't even sampled a 4070? Probably none of the new cards. I guess AMD and NVIDIA figure if AnandTech isn't going to do GPU reviews, they won't send them any. I don't understand Ryan's excuse about it taking too much time. The main site is a shadow of its former self. Even CPU reviews seem half-assed.

Go back and read a 1080 or 7970 review and you get more detail on the arch than you could ever ask for with analysis and benchmarks. Nobody could touch the quality of the reviews. I think the site is just under poor leadership. Not sure what changed with Ryan, but I get why Ian left for greener pastures.
 
Reactions: KompuKare and ZGR

VirtualLarry

No Lifer
Aug 25, 2001
56,453
10,120
126
Did you guys see Anandtech wasn't even sampled a 4070?
Even RandomGamingInHD and NotAnAppleFan (I think?) got sampled. (I could be wrong about the last one; he's not overly pro-NV either.)

But with the number of YT creators in the tech space getting sampled (carpet-bombed? LOL), that's not a positive sign.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
Even RandomGamingInHD and NotAnAppleFan (I think?) got sampled. (I could be wrong about the last one; he's not overly pro-NV either.)

But with the number of YT creators in the tech space getting sampled (carpet-bombed? LOL), that's not a positive sign.

The actual quote is even worse than I remembered. Really sounds like incompetence, or just a lack of caring at this point. I guess covering the ROG phones was more important.

Ryan Smith - Thursday, April 13, 2023 - link

We were not briefed by NVIDIA on the GeForce RTX 4070.

On a quieter news day I would have picked it up anyhow. But I'm short on time this week, so there hasn't been an opportunity to work on it yet.
 
Reactions: ZGR

Kocicak

Senior member
Jan 17, 2019
982
974
136
I have been testing a new 2560x1440 monitor. The 4070 in Hogwarts Legacy, all ultra incl. RT, DLSS Quality, frame generation on, results in great image quality; most of the time I see more than 80 fps in the most difficult scenes, often more than 100. I am really happy with it. BTW, the game crashes occasionally with RT on; I wonder if they will ever fix this game.

What is funny is that even at this 1708x961 rendering resolution, I saw the game using as much as 10.3 GB of the card's RAM. So this card probably cannot handle natively rendering 2560x1440 with the highest settings in one of the most demanding games of today, but it manages to deliver a nice experience with those upscaling and frame generation technologies. I am not sure what to think about it.

I should test dropping the quality to high; I am not sure the ultra settings are even visible at this low rendering resolution.
 
Reactions: Ranulf
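The odd 1708x961 figure in the post above is just DLSS Quality mode's standard ~2/3-per-axis scale factor applied to 2560x1440. A quick sketch of the per-mode internal render resolutions, using NVIDIA's published DLSS 2 preset factors (a game's reported numbers may differ by a pixel or so due to rounding):

```python
# DLSS 2 preset scale factors, applied per axis: the game renders at a
# lower internal resolution and upscales to the output resolution.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, render_resolution(2560, 1440, mode))
# Quality at 1440p works out to roughly 1707x960, matching the ~1708x961 the game reports.
```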

Mopetar

Diamond Member
Jan 31, 2011
8,024
6,479
136
At least the 2000 series had new features to help justify the price. Being new though they were basically useless and now those cards are too slow for RT. I think some of us were hoping it would be a one time "tax" if you will.

The 4070 has new features as well though. Granted, I think they're even more worthless than what Turing brought to the table, but that's my own personal opinion and not worth any more than that.

The biggest reason I consider the 2070 worse is that it was a price jump from $370 to $600, whereas here it's $500 to $600. The 2070 had the same VRAM as the 1070, whereas the 4070 at least bumped that up. The 4070 has roughly 50% more raw computational power than the 3070. The 2070 was only ~15% above the 1070. If it weren't for the vastly higher memory bandwidth Turing would have been a complete joke in non-RT benchmarks.

Turing is the worst architecture from Nvidia in at least two decades. They've mostly had great architectures, but Turing sticks out as a disappointing, overpriced lemon, and NVidia not supporting it with the newest RT features only means it's stayed bad.
 
Reactions: KompuKare
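The raw-compute comparison in the post above can be sanity-checked with back-of-envelope peak FP32 numbers (2 FLOPs per shader per clock via FMA), using each card's reference shader count and nominal boost clock. Actual in-game clocks run higher, so treat these as rough ratios, not throughput claims:

```python
# Peak FP32 throughput in TFLOPS: 2 FLOPs (one FMA) per shader per clock.
def tflops(shaders, boost_mhz):
    return 2 * shaders * boost_mhz / 1e6

gpus = {  # reference-spec shader counts and nominal boost clocks (MHz)
    "GTX 1070": (1920, 1683),
    "RTX 2070": (2304, 1620),
    "RTX 3070": (5888, 1725),
    "RTX 4070": (5888, 2475),
}
peak = {name: tflops(*spec) for name, spec in gpus.items()}
print({k: round(v, 1) for k, v in peak.items()})
print("2070 vs 1070:", round(peak["RTX 2070"] / peak["GTX 1070"], 2))  # ~1.16
print("4070 vs 3070:", round(peak["RTX 4070"] / peak["RTX 3070"], 2))  # ~1.43
```

On paper that gives the 2070 roughly 15% more raw compute than the 1070, and the 4070 roughly 43% more than the 3070 (same shader count, much higher clocks), which lines up with the ~15% and "roughly 50%" figures in the post.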

In2Photos

Golden Member
Mar 21, 2007
1,688
1,699
136
I have been testing a new 2560x1440 monitor. The 4070 in Hogwarts Legacy, all ultra incl. RT, DLSS Quality, frame generation on, results in great image quality; most of the time I see more than 80 fps in the most difficult scenes, often more than 100. I am really happy with it. BTW, the game crashes occasionally with RT on; I wonder if they will ever fix this game.

What is funny is that even at this 1708x961 rendering resolution, I saw the game using as much as 10.3 GB of the card's RAM. So this card probably cannot handle natively rendering 2560x1440 with the highest settings in one of the most demanding games of today, but it manages to deliver a nice experience with those upscaling and frame generation technologies. I am not sure what to think about it.

I should test dropping the quality to high; I am not sure the ultra settings are even visible at this low rendering resolution.
Let me pour you another glass of Kool-Aid!
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,108
136
The 4070 has new features as well though. Granted, I think they're even more worthless than what Turing brought to the table, but that's my own personal opinion and not worth any more than that.

The biggest reason I consider the 2070 worse is that it was a price jump from $370 to $600, whereas here it's $500 to $600. The 2070 had the same VRAM as the 1070, whereas the 4070 at least bumped that up. The 4070 has roughly 50% more raw computational power than the 3070. The 2070 was only ~15% above the 1070. If it weren't for the vastly higher memory bandwidth Turing would have been a complete joke in non-RT benchmarks.

Turing is the worst architecture from Nvidia in at least two decades. They've mostly had great architectures, but Turing sticks out as a disappointing, overpriced lemon, and NVidia not supporting it with the newest RT features only means it's stayed bad.

Fermi and the GeForce FX were way worse. You are right, though, about the pricing being awful and the performance increase lackluster.
 
Reactions: KompuKare

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
Here's the latest at videocards.com. An article stating the models of all next gen GPU's that are at or below MSRP. There is a selection for everything but the 4070 which is just listed as "Too many to list". Must be a hot seller!

Also, possibly the stupidest argument for NVIDIA's VRAM shenanigans yet:

"I think Nvidia 12gb VRAM = 14GB AMD VRAM cuz compression is more advanced so its more efficient...". To be fair, the username is THC, so they may not have the most sound of mind at the moment.

12GB isn't only = AMD 14GB. It's actually 24GB, because of DLSS4.

I swear to god, we will see that on the 5000 series. They will perforate every other pixel in the textures and then fill them back in with AI. You heard it here first. I want royalties for the idea, Mr. Leather Jacket, you hear?
 