It was on Reddit. Posted a link a few messages up.
The link you posted was to videocardz.
It was on Reddit. Posted a link a few messages up.
NDA on the slides... yawn. Videocardz is click farming.
https://www.reddit.com/r/nvidia/comments/99fg7b/nda_ends_in_15_mins/?sort=new
The link was from Reddit.
Cheaper and lacking lots of features of the 2070.
One would hope, given that the 1080 Ti is cheaper.
That chart might as well be measuring KekRays; it doesn't declare exactly what is being measured. The 1.0 line is below the max 1080 performance. /clap Nvidia
It's a very common graph style. You put the standard at 100% then rank everything else accordingly.
Yeah, I'm not sure how to understand that graph. Not a good comparison at all.
35% increase for the 2080 over the 1080 in existing games would bode well, imo.
So 40-45% in cherry-picked games (and even four of them with HDR, which we know Pascal isn't good at). Reviews will show a ~35% increase IMO, with a few titles here and there doing more or less.
But besides that, we don't even know what is being measured there. Sigh.
35% increase for the 2080 over the 1080 in existing games would bode well, imo.
It will likely only get better as drivers improve.
It's a very common graph style. You put the standard at 100% then rank everything else accordingly.
35% increase for the 2080 over the 1080 in existing games would bode well, imo.
It will likely only get better as drivers improve.
Sounds like what we were hearing about Vega.
If we are already at a decent performance increase, in a month it will probably be even better, and as drivers mature it should get a lot better.
The 1080 Ti is also 66.6% faster than the 1070 at 4K, and I don't think the 2070 will be anywhere near that much faster than the 1070.
Well, that lends credibility to the 1080 Ti +8% rumour. They saw those slides beforehand, maybe?
https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/images/perfrel_3840_2160.png
108 / 74 ≈ 1.46, i.e. about 46% faster, which seems about the average of the non-DLSS games on that chart. Whether or not we actually get that in non-cherry-picked games is less assured. Consider not only the cherry picking, but also that it's probably FE vs FE, and the 2080 will throttle less thanks to its open-air dual-fan cooler.
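That arithmetic is easy to sanity-check; a quick sketch using the two relative-performance figures quoted from the TPU chart (108 and 74 are the numbers from the post, not independently verified):

```python
# Relative 4K performance values read off TechPowerUp's chart,
# both expressed as percentages of the chart's baseline card.
gtx_1080 = 74
gtx_1080_ti = 108

# Uplift of the 1080 Ti over the 1080:
uplift = gtx_1080_ti / gtx_1080 - 1
print(f"{uplift:.0%}")  # -> 46%
```

The same ratio trick works for any pair of bars on a normalized chart, regardless of which card the chart picked as its 100% baseline.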
It's not Amdahl's law. GPU load is essentially 100% parallel.
It's the memory bandwidth, which I pointed out for the 2080 Ti comparison and which you neglected here.
While shader performance could have jumped 80% for GP104, memory bandwidth only went up about 40%, thus limiting performance.
For the 2080, shader performance and memory bandwidth both increase around 30%, so it won't be bandwidth-starved.
We can only go by the specified boost for clock speed; we don't know how high the 2080 Ti boosts in real loads, but going by the much-improved cooler design, we can bet that will be much higher as well.
30% is a reasonable baseline.
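The bandwidth-starvation argument above can be sketched numerically. The percentages below are the ones claimed in the post, not measured figures:

```python
# Back-of-envelope for the bandwidth-starvation argument:
# a mostly-parallel GPU workload can't scale much past whichever
# resource grew the least between generations.
def effective_cap(shader_gain, bandwidth_gain):
    return min(shader_gain, bandwidth_gain)

# GM204 -> GP104 (per the post): +80% shader, but only +40% bandwidth.
print(effective_cap(0.80, 0.40))  # -> 0.4 (bandwidth-limited)

# GTX 1080 -> RTX 2080 (per the post): ~+30% on both sides.
print(effective_cap(0.30, 0.30))  # -> 0.3 (balanced, not starved)
```

This is obviously a crude model — real scaling depends on how bandwidth-bound each game is — but it captures why matched shader and bandwidth growth matters.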
No, they are not. Nothing is ever 100% parallel unless the individual tasks are 100% independent (which they aren't in a rendering pipeline).
Besides, you seem to be of the belief that Amdahl's law only refers to the compute load. It doesn't, it refers to all the loads and bottlenecks involved in delivering the final result, so it also includes stuff like I/O load.
If you include a bunch of optimistic bets, sure, but then we're not talking about a baseline anymore.
Baseline without including any sort of optimistic assumptions (i.e. a proper baseline) is 15-20%.
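For reference, Amdahl's law as invoked above is speedup = 1 / ((1 - p) + p / s), where p is the fraction of the work that benefits and s is the speedup of that fraction. A minimal sketch with made-up numbers (the 1.5x and 10% below are purely illustrative, not figures from either post):

```python
# Amdahl's law: overall speedup when a fraction p of the work is
# sped up by a factor s and the remaining (1 - p) is not.
def amdahl(p, s):
    return 1.0 / ((1.0 - p) + p / s)

# Illustrative only: if the parallel GPU work gets 1.5x faster but
# 10% of frame time (driver overhead, I/O, serial setup) doesn't move:
print(round(amdahl(0.90, 1.5), 3))  # -> 1.429, not the full 1.5

# A truly 100% parallel load would keep the full factor:
print(round(amdahl(1.0, 1.5), 3))   # -> 1.5
```

Which is the crux of the disagreement: whether p for a real rendering pipeline is close enough to 1.0 that the serial term can be ignored.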
Sounds like what we were hearing about Vega.
Vega didn't get better over time?
I hope they show 1080 Ti vs 2080 Ti. Sounds promising, and I should finally be able to go from 1440p to 4K.
https://www.techradar.com/news/the-...ally-twice-as-powerful-as-the-nvidia-gtx-1080
Twice as fast as the 1080? It's only twice as fast in a few areas, and only if you turn on DLSS.
35% increase for the 2080 over the 1080 in existing games would bode well, imo.
It will likely only get better as drivers improve.