Jaskalas
Lifer
- Jun 23, 2004
BTW going from 40 to 100 fps feels like MAGIC.
As someone showed, DLSS 3, aka Frame Gen, only has to perform rasterization on 1/8th of the pixels displayed. Or something like that.
I'm sure it does. Just remember that magic tricks are only an illusion.
LMAO, just like any motion interpolating TV made in the last 15+ years. And with TVs there's no nVidia tax or vendor lock.
That would be the case with frame generation and DLSS Performance, which renders just a quarter of the image.
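The arithmetic behind those two posts lines up; a minimal sketch, assuming a 4K output, DLSS Performance (half resolution per axis), and frame generation interpolating every other displayed frame:

```python
# Back-of-the-envelope math for the "1/8th of the pixels" claim.
# Assumptions: 4K output, DLSS Performance (0.5x render scale per axis),
# frame generation producing one interpolated frame per rendered frame.

out_w, out_h = 3840, 2160      # displayed resolution
render_scale = 0.5             # DLSS Performance renders at half resolution per axis

rendered_px = (out_w * render_scale) * (out_h * render_scale)   # 1920 x 1080
displayed_px = out_w * out_h

upscale_fraction = rendered_px / displayed_px    # 0.25 -> "a quarter of the image"
framegen_fraction = upscale_fraction / 2         # half the displayed frames are generated

print(f"rasterized with DLSS Performance: {upscale_fraction:.3f} of displayed pixels")
print(f"rasterized with frame generation on top: {framegen_fraction:.3f} (~1/8)")
```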
Should be slightly above but more or less.
It's interesting to see how that goes... Because an overclocked 3060 Ti can be quite close to a stock 3070. So pointless... 249 € max or it is a huge flop (it's a flop already with 8 GB VRAM).
Turning off motion flow is the first thing I have done with every new TV I have bought in the last 10+ years, and now I'm supposed to turn it back on because Nvidia says so?
If my video games get the "soap opera effect" I'm gonna be pretty angry. I want high framerate with native rendering. Nothing more and nothing less.
Yeah I agree. The "soap opera effect" is terrible. It's the first setting I change every time I set up a new TV as well. While I think DLSS and frame generation are pretty good features, I don't want to see them being used as a crutch for poorly optimized games and/or as a selling point for new overpriced GPUs. Reviewers need to be damn clear about the distinction between FPS numbers using frame generation and fully rendered FPS numbers.
Yup, they are useful for poor people who cannot afford a 4090 for fully rendered gameplay.
Lucky that the 4070 is not overpriced at all; after inflation correction it costs the same as the 2070 and 3070, and it performs as well as a 3080 with much lower power draw.
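For what it's worth, the inflation claim roughly checks out against the launch MSRPs ($499 for the 2070 and 3070, $599 for the 4070); a minimal sketch, where the cumulative CPI multipliers are approximate assumptions rather than exact figures:

```python
# Rough inflation adjustment of past x70 launch MSRPs into 2023 dollars.
# The CPI multipliers below are approximate assumptions, not official figures.

launch_msrp = {"RTX 2070 (Oct 2018)": 499, "RTX 3070 (Oct 2020)": 499}
cpi_to_2023 = {"RTX 2070 (Oct 2018)": 1.21, "RTX 3070 (Oct 2020)": 1.17}

for card, msrp in launch_msrp.items():
    adjusted = msrp * cpi_to_2023[card]
    print(f"{card}: ${msrp} launch ~= ${adjusted:.0f} in 2023 dollars")

print("RTX 4070 (Apr 2023): $599 launch")  # in the same ballpark as the adjusted prices
```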
Lucky that the 4070 places at the top of the efficiency and performance-per-dollar charts using just fully rendered frames. Even the most dishonest reviewer has no reason to fix anything.
Not quite as good as the scrub version of the 3080, much less the good 12GB version.
The 12GB 3080 had something like a minimum MSRP of $900, and you can get a 4070 Ti for less than that, so it really doesn't belong in this conversation.
As far as the 3080 10GB being better, it squeaks out a minuscule lead, and only at 4K.
It's also older tech, was $100 more, has 2GB less VRAM, no AV1 encoder, no DLSS 3, and consumes 50%+ more power.
I get that the 4070 isn't exciting to many people, but it's NOT worse than the 3080.
The 3080 12GB was available at $700 from multiple retailers for a month or two, if I remember right. But I was comparing the scrub version of the card, which was 5% faster in that test suite from this month anyway, so I don't see why you have your panties in a bunch. I was specifically talking about performance, quoting a post talking about performance. Not power draw, DLSS 3, the AV1 encoder, etc.
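Taking the two numbers in that exchange together (a roughly 5% lead for the 3080 10GB at 4K versus 50%+ higher power draw) gives a rough performance-per-watt picture; a minimal sketch, assuming board-power ratings of about 320 W for the 3080 10GB and 200 W for the 4070:

```python
# Rough perf-per-watt comparison using the figures quoted in the thread.
# The 5% lead at 4K comes from the posts above; the 320 W / 200 W board-power
# ratings are assumed TGP values, not measured draw.

perf = {"RTX 3080 10GB": 1.05, "RTX 4070": 1.00}    # relative 4K performance
power_w = {"RTX 3080 10GB": 320, "RTX 4070": 200}   # assumed board power

perf_per_watt = {card: perf[card] / power_w[card] for card in perf}
advantage = perf_per_watt["RTX 4070"] / perf_per_watt["RTX 3080 10GB"]
print(f"4070 perf/W advantage: {advantage:.2f}x")   # ~1.52x with these assumptions
```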
That's a completely irrelevant and undetectable difference.
Might as well call the 3060 Ti and 3070 the same GPU if "undetectable when gaming" is the standard. Fact is, the 3080 10GB is 5% faster at 4K in that test suite of 20+ games. Why would I care more about lower resolutions, where CPU bounds start creeping in? I wouldn't evaluate the power of gaming CPUs by how they run 1440p; I'd want to see 720p results so I could eliminate the GPU as a factor as much as possible. It's not like it's a stretch testing a 3080 and a 4070 at 4K. It's not like I'm trying to compare a 3050 to a 2060 at 4K or something.
Yes, you want to focus on one irrelevant difference, and ignore all the ways that the 4070 is better, including price, power, features...
I focused on one of the two specific things the post I quoted mentioned. You're the one bringing irrelevant data in. There is no issue with the power draw part; the issue was saying they performed the same when they don't. I know you love to constantly muddy up the waters, you have been doing it nonstop in this thread. I really hope you have a grip of Nvidia stock you're trying to protect the share price of, because otherwise it's pretty pathetic the way you constantly white knight for a corporation.
Only when on clearance. Before clearance, a big deal was made of EVGA ones being available for MSRP, and the cheapest one was still over $900.
I specifically remember there being a few different models of 3080 12GB for $700 because I kept emailing my brother the deals I'd come across over a few weeks when he was looking for a 3080 after his 1080 Ti died. All I said was they were $700 for a month or two.
If you emailed it you have records and don't need to go by memory...
Maybe you're right, it might have been the $720 to $730 deals I was remembering.
Pretty sure you are remembering the high end RTX 3000 clearance because these were NOT $700 cards.
The base 10GB cards were the $700 cards.
Ah no. The 3070 is 10%, 13%, 14% faster than the 3060 Ti. That's a small difference.
But it's significantly higher than your -2%, 0%, 5% insignificant difference.
Here we go: $687.50
Some unicorn double coupon deal doesn't make it the MSRP.
Your EVGA card link above is in my MSRP list from last April, and they were excited about it being offered at MSRP:
RTX 3080 12GB XC3 Ultra — $980
That's the MSRP.
God, you argue in such bad faith.
Who cares about MSRP anymore? What are the actual cards selling for? You point out the "double coupon" listing while conveniently ignoring the $720-730 ones, after you claimed they were $900+ cards.
Did a GPU review begin by telling me I should change my job to make the card seem cheaper?
4070 vs 4070ti, is it worth the extra $200?
24% better performance for 33% more money.
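At the launch MSRPs ($599 for the 4070 and $799 for the 4070 Ti, which is where the extra $200 and the "33% more money" come from), that works out to slightly worse performance per dollar for the Ti; a quick sketch using the 24% figure from the post above:

```python
# Perf-per-dollar comparison of the 4070 and 4070 Ti at launch MSRPs,
# using the 24% performance delta quoted above.

price = {"RTX 4070": 599, "RTX 4070 Ti": 799}
perf = {"RTX 4070": 1.00, "RTX 4070 Ti": 1.24}

extra_cost = price["RTX 4070 Ti"] / price["RTX 4070"] - 1
perf_per_dollar = {card: perf[card] / price[card] for card in price}
relative = perf_per_dollar["RTX 4070 Ti"] / perf_per_dollar["RTX 4070"]

print(f"extra cost: {extra_cost:.0%}")                 # ~33%
print(f"Ti perf per dollar vs 4070: {relative:.2f}x")  # ~0.93x, i.e. ~7% worse
```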