Nvidia RTX 2080 Ti, 2080, 2070 (2070 review is now live!) information thread. Reviews and prices

Page 34 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
While I agree that's how it typically works, I haven't read any particular language indicating that it's an either/or this time. GDDR5 was rated at 8Gbps and 1.5V; GDDR6 is rated at 14Gbps and 1.35V. That's where Samsung is getting its figures from. GDDR5X increased the data rate to 11Gbps at 1.5V, so it slots somewhere in the upper middle of the curve.

I believe most of the savings are from moving to a smaller node.

No argument there; it's a cost balance, same as GDDR6. Otherwise everything would be using GDDR6 or HBM2.
GDDR5X is 1.35V also.
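For context on how those per-pin rates translate into card-level bandwidth, here's a quick sketch. The 256-bit bus width is an assumption for illustration; actual products vary:

```python
# Card-level memory bandwidth: per-pin data rate (Gbps) x bus width (bits) / 8.
# The 256-bit bus is an assumed example width, not a claim about any specific card.
def mem_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Return aggregate memory bandwidth in GB/s."""
    return gbps_per_pin * bus_width_bits / 8

gddr5  = mem_bandwidth_gbs(8.0, 256)   # GDDR5  at  8 Gbps -> 256.0 GB/s
gddr5x = mem_bandwidth_gbs(11.0, 256)  # GDDR5X at 11 Gbps -> 352.0 GB/s
gddr6  = mem_bandwidth_gbs(14.0, 256)  # GDDR6  at 14 Gbps -> 448.0 GB/s
print(gddr5, gddr5x, gddr6)
```

On the same bus width, the jump from 8Gbps GDDR5 to 14Gbps GDDR6 works out to a 75% bandwidth increase, with GDDR5X sitting in between as described.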
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I always question manufacturer claims. It's in their best interest to show only the good side of the product. Running a business is hard enough already, why would you talk about the bad stuff?

It used to be that a new process gave a 30-40% increase in frequency, or a 50% decrease in power. Then that became 20% faster, or 30% lower power.

Now they still claim the 20% faster / 30% lower power, but you need to add the fine print that says "only at certain frequency ranges, and after significant circuit optimizations". With Pascal, you might remember Nvidia put a lot of work into optimizing its circuits for higher frequency; moving from 28nm to 16nm wasn't enough on its own. More work, and less gain, is what happens now.

Of course, we're still getting nice increases for memory. However, miracles don't happen there either.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
The market for ray tracing is non-existent since no games really have it, and nobody wants to play at 1080p. So that argument holds no ground.

Well, the Steam hardware survey has 1080p as the most common screen resolution at 62% of users, and 4K at 1.3%; like I said before, we're talking about quite small markets here. Ray tracing is one of those things where you have to create the demand in the first place. It's part of innovating, and it's painful and awkward to get started: devs won't build features into games that gamers can't use, and hardware vendors don't want to make hardware for which there are no games. So either you stifle innovation and we never get anything new again, or the hardware vendors take a leap of faith, build something, and go through the awkward phase where few games support the tech (and generally support it badly) until it eventually becomes ubiquitous. I think I've mentioned this before in this thread; it's happened many times throughout PC graphics history with all sorts of things, like tessellation and Shader Model 2.0.

You don't have to buy into the tech, but what I would say is that these awkward phases always have existed and always will, all Nvidia can really do is work as closely with dev studios as possible and they're already doing that.

For the vast majority of gamers, who are running 1080p at 60Hz, it's more attractive to have a video card that offers a graphical fidelity increase rather than tons of extra power which they simply don't need.

I wonder about the veracity of this statement, and whether there is a disconnect between the tech community and the broader gaming market. I've noticed, even years ago when high refresh monitors first started appearing, that at least in this circle the sentiment seemed to lean more towards higher resolutions than higher refresh rates. But I do question whether that actually applies to the broader PC gaming market.

I'm aware of market research data from Digitimes which seems to indicate that the growth rate in annual sales of high refresh monitors has doubled in each of the last two years. Manufacturers in general are now heavily targeting this segment; if you look at their new releases, they are predominantly high refresh options. Based on Steam survey data, 1080p isn't just the highest monitor market share, it's also growing orders of magnitude faster than other resolutions (4K and 1440p both went down slightly in the last survey, even). Anecdotally, in other communities I interact with outside of tech circles, there seems to be much higher interest in high refresh relative to high resolution.

In general I've always had the feeling that tech forums lean more towards high resolution over high refresh compared to the broader PC gaming market. I actually wonder, if a high-refresh-vs-high-resolution survey were run in this sub-forum, the CPU sub-forum, and the gaming sub-forum, whether there would be a significant difference in the results.

I'd be willing to bet that a large portion of high refresh monitors are going to competitive gamers, typically the people who see the most benefit from high refresh displays because they need every edge they can get. And the amount of competitive gaming is on the rise with esports and streaming and whatnot. It's also something that's been limited by the bandwidth of the connectors, and the push for high resolutions like 4K has enabled the bandwidth we need for high refresh rates. The original 4K monitors and TVs used multiple video connectors, or ran at 30Hz, because there simply wasn't enough bandwidth available. As connector standards increased bandwidth for 4K, smaller resolutions like 1080p could be run at higher refresh rates. Notice that it wasn't a push for faster refresh rates itself that made those connectors available; that only happened when 4K came along. We've had the technology with TN panels to make 240Hz monitors for a long time now: the 240Hz panels we see today are 1ms panels, and we've had 1ms panels for ages.
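The connector-bandwidth point checks out arithmetically. A rough sketch of raw, uncompressed link bandwidth (ignoring blanking intervals and link encoding overhead, which real connectors also have to carry):

```python
# Raw, uncompressed video bandwidth: width x height x refresh x bits per pixel.
# Ignores blanking intervals and encoding overhead, so real links need somewhat more.
def link_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Return required raw bandwidth in Gbps."""
    return width * height * hz * bpp / 1e9

uhd60  = link_gbps(3840, 2160, 60)   # 4K at 60 Hz
fhd240 = link_gbps(1920, 1080, 240)  # 1080p at 240 Hz
# 4K has exactly 4x the pixels of 1080p, and 240 Hz is 4x 60 Hz, so the two
# modes need identical raw bandwidth (~11.94 Gbps each).
print(uhd60, fhd240)
```

So any connector sized for 4K60 can, in raw-bandwidth terms, also carry 1080p at 240Hz, which matches the point above.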

Most online tech/gaming communities are for tech heads, and most of us have extremely high-end PCs because that's what we're into; if you averaged the specs of the people who post on these forums, it would be way above average. For some perspective, the target frame rate of consoles, hardware aimed at a casual audience, is 30fps for almost all AAA games, because they know that consumer market prefers prettier graphics over a smoother experience; only in games where graphics don't really matter do they target 60fps.

That also rings somewhat true of people who like good-quality monitors, because moving faster than 60Hz basically forces you onto a TN panel, and many people don't want to move from true 10-bit colour to 6-bit + 2-bit dithering because it looks awful; due to TN viewing angles you're also limited in display size to 27" at a push. The reason I guessed that competitive gamers make up a lot of the people using these monitors is that they're generally function over form: they'll take a hit in colour accuracy if it means a faster panel that lets them compete better, the same way they'll turn down graphics settings for faster frame rates. I think it's also generally true that the competitive games are the simpler ones, like CS:GO and MOBAs, which run on very fast engines. You can get 240fps out of CS easily; you cannot get that out of many other games, simply because we don't have CPUs that can run modern AAA games that fast. They seem to top out around 140-160fps when not GPU limited on mainstream high-end CPUs, and as always with CPU bottlenecking you're kind of stuffed, because most performance settings in games offload work from the GPU, not the CPU.

The broad gaming market doesn't want super expensive stuff either.

Their overall volume is going to decrease with RTX, but it'll be more than made up for by the price increase.

7nm stuff isn't going to be cheaper either. Maybe they'll keep the price the same and actually increase perf/$ that gen.

Probably true. Jumping from 12/14nm down to 7nm is a big leap as well; it's nearly half the size, which means something like 3-4x more transistors on the chip? Crazy.
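The 3-4x figure is roughly the ideal geometric scaling. A quick back-of-the-envelope, treating the marketing node names as if they were literal feature sizes (which they aren't exactly):

```python
# Ideal area scaling if node names were literal linear feature sizes:
# halving the linear dimension quarters the area per transistor,
# i.e. 4x as many transistors in the same die area.
ideal_density_gain = (14 / 7) ** 2
print(ideal_density_gain)  # -> 4.0
```

Real density gains come in below this ideal, since node names stopped tracking actual transistor dimensions some time ago.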
 
Last edited:

Head1985

Golden Member
Jul 8, 2014
1,866
699
136
Early review over at [H] of a 2070 they got their hands on, an MSI AIB card. The conclusion is 15 to 20% faster than the GTX 1080, but it stresses that this MSI is clocked among the highest out of the box. Still decent:

https://www.hardocp.com/article/2018/10/14/msi_geforce_rtx_2070_gaming_z_performance_review/
Ok, let's look at the 2070 vs the 1080 at 2560x1440:
Tomb Raider: 16% faster
Far Cry 5: 4% faster
Kingdom Come: 13% faster
Wolfenstein II: 16% faster
Mass Effect: 10% faster
Gears of War: 18% faster
Deus Ex: 12% faster
BF1: 7% faster

Average: 12% faster. For $600 it's a fail.
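For what it's worth, the quoted per-game numbers do average out to 12%. A quick check using the figures from the post above:

```python
# Per-game 2070-vs-1080 gains quoted above, in percent.
gains = {
    "Tomb Raider": 16, "Far Cry 5": 4, "Kingdom Come": 13,
    "Wolfenstein II": 16, "Mass Effect": 10, "Gears of War": 18,
    "Deus Ex": 12, "BF1": 7,
}
avg = sum(gains.values()) / len(gains)  # 96 / 8
print(avg)  # -> 12.0
```

Note this is an unweighted mean across eight titles, so one outlier (Far Cry 5 at 4%) pulls it down noticeably.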
 
Last edited:
Reactions: psolord

Head1985

Golden Member
Jul 8, 2014
1,866
699
136
They will need to have the same bandwidth. The 2070 has something like 40% more bandwidth and it's 12% faster, so...
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
For anyone that pre-ordered a ti from Best Buy check your email, mine finally shipped today.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Ok, let's look at the 2070 vs the 1080 at 2560x1440:
....
Average: 12% faster. For $600 it's a fail.
12% faster today, but you know that gap will grow, as you have:

- DLSS, so you get better upscaling and/or AA than the 1080.
- RTX, which may or may not be of use; chances are you'll be able to enable the lowest level of global illumination.
- In addition, Turing is significantly faster in Vulkan, so as more games switch to Vulkan it's likely to build a larger lead. History teaches us that newer architectures tend to pull away from older ones over time, and that's pretty likely to be the case here.

Also, you've gone with Founders Edition prices, not the headline $500 price, so chances are you'll see sub-$600 cards when the initial rush blows over. I certainly wouldn't consider a 1080 over a 2070 any longer.
 

lixlax

Member
Nov 6, 2014
184
158
116
For anyone that pre-ordered a ti from Best Buy check your email, mine finally shipped today.
Almost a month after launch? For the price these things cost I would expect nothing less than instant day-one delivery on a gold platter.
Basically, people gave out interest-free loans for 1-1.5 months.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Almost a month after launch? For the price these things cost I would expect nothing less than instant day-one delivery on a gold platter.
Basically, people gave out interest-free loans for 1-1.5 months.

Yeah, pretty much. To be fair, the Best Buy page said they wouldn't ship until October, so that was right in line with expectations. I was more annoyed by my Nvidia store purchase, since I expected it to ship on launch day, but it didn't arrive until last week, after multiple delays.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
Also you've gone for founders prices, not the headline $500 price, so chances are you'll get sub $600 cards when the initial rush blows over.
Not likely for a long time. What's your interpretation of "initial rush"? 2080s have been for sale since day one and are still easily available, but the prices haven't changed at all. They still want to sell their Pascals for full price as well. Will the 2070 sell more because it's cheaper? Probably a bit, yes, but the value is still very questionable for RTX, especially as we drop below the 2080, since there's zero quantifiable data on the performance of the RTX components. Again, it's just gambling that the RTX tax will be of any benefit beyond some DLSS of questionable quality with zero current implementations.

The 2070 just reinforces that consumers should either wait for some titles that actually test these features, or go to the used market for a Pascal that will be either much cheaper (a 1080 for $300+) or faster and still cheaper (a 1080 Ti), depending on their current hardware (I bought a 1080 Xtreme for $350 after the RTX launch, upgrading from my 970). Consumers targeting the 2070 are obviously not willing to pay $800+ for a GPU, and are thus likely on a 1060 or 980 or lower right now. Anyone with a 1070 or higher wouldn't be considering a 2070, especially with no RTX feature data.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
It takes several months for the initial spike in demand to work through.

Are those comparison prices second hand? Not really quite the same!
 

Elfear

Diamond Member
May 30, 2004
7,115
690
126
If I'm going to spend $500, I might as well get the latest tech and the performance I need.
Would I need a 2080? No, that's way too much card.

You missed his point. If price points had stayed the same or only slightly higher with Turing, your $500-600 would easily have gotten you an RTX 2080. Because prices went to the moon, you can only afford a 2070.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
Are those comparison prices second hand? Not really quite the same!
Yes, I specified the used market; that's the whole point. The argument I was making had nothing to do with comparing launch prices or MSRPs; it's about what consumers can buy with their money now. It's up to people to use their brains and decide whether they care about a new product vs. a used one, and that hasn't changed. The fact remains that the used market is a big factor, where $200 can be saved on a Pascal for the price-conscious market (those targeting a 2070/2060).

Waiting for this initial "rush" (I don't see any significant demand for 2080s...) to be over also reinforces my first point: by the time prices may or may not be lower, there might be some actual games with RTX features to determine the value.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
With second hand stuff you’re losing things that a lot of people will think are worth the extra money.

Honestly, 1/3rd or so off isn't a remotely generous discount for second-hand ownership. Look at a ton of other things you can buy.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
With second hand stuff you’re losing things that a lot of people will think are worth the extra money.

Honestly, 1/3rd or so off isn't a remotely generous discount for second-hand ownership. Look at a ton of other things you can buy.
That's very subjective, and a separate discussion, maybe even its own dedicated thread if anyone cares. Hence my comment that people can make up their own minds about the benefits and drawbacks of used GPUs.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
The GTX 1080 used by [H]ardOCP is the MSI Gaming X.
It is a fast 1080, but there are quite a few 1080s faster than it. If the RTX 2070 Gaming Z is indeed one of the higher-clocked 2070s, the margin compared to a Zotac AMP! Extreme or Gigabyte AORUS Extreme 1080 would be lower than the one shown in the review.
 

MrTeal

Diamond Member
Dec 7, 2003
3,586
1,746
136
With second hand stuff you’re losing things that a lot of people will think are worth the extra money.

Honestly, 1/3rd or so off isn’t a remotely generous discount for second hand ownership. See a ton of other things you can buy.
Entirely depends on the thing. Some items, like CPUs, lose almost no value second-hand. Buy a mattress today and sell it tomorrow, still in the plastic, and you'll get half what you paid if you're lucky. Video cards don't depreciate much relative to a new unit of the same model, because they don't generally wear out before they become obsolete, and saving 10-20% (plus any taxes that may apply) is often worth the possible loss of warranty.
 

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
This is how Nvidia makes the RTX cards' performance delta look better.

 

Tweak155

Lifer
Sep 23, 2003
11,448
262
126
This is how Nvidia makes the RTX cards' performance delta look better.

Definitely looks nefarious... though it would only be 100% conclusive if we could see whether RTX performance dropped as well, and I don't think RTX will run on 399.24, will it? Maybe there's an earlier 4xx driver that retained performance which the comparison could be made against.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Reviewers are not going to make the RTX cards look better, are they?

So what's the point?

These sorts of stunts simply don't work any more.

They last about 30 seconds when they hit the real world.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
The problem with the 20 series is the price, with the 2070 at $600. Right now you can get the 1070 for under $300; if you look for deals you can get the 1080 for around $400 (I have seen it under $400), and the 1080 Ti for around $600 (I have seen it under $600).
 