NVIDIA Geforce GTX 1070 Thread


happy medium

Lifer
Jun 8, 2003
14,387
480
126
Relative performance in 15 games @ 1440p. If the 1070 is a 1080p card, then all others below it are worse.

Maybe you missed my explanation/post...

It's not Nvidia or AMD shifting the goalposts; it's the game developers.

You could buy a GTX 1070 now and think you're set for 1080p, and bam, Crysis 4 is released, you're in the 40s @ 1080p, and you're suddenly turning down shadows and godrays.
Then bam, Battlefield 5 is released, and now you're using high settings instead of ultra.

Would it bother me? Not really, but some people like their ultra settings.

Or maybe later game releases will be like Doom, where just about anyone with a mid-range card can play with healthy fps.

It depends on whether you're the type of person who likes to play it safe and not worry about settings for a bit, or whether you're willing to turn down a few settings.

It's really quite simple: game developers move OUR goalposts.

Make sense? There is no definitive answer unless you have a crystal ball.

You're either a "better safe than sorry" person or an "ahh, I can turn down a few settings" person.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Just an update. According to Chiphell, the GP104-150 SKU, first mentioned by HardwareBattle back in April, is actually a mobile (notebook) design. This means the VGA 'leaked' by Inno3D is most likely GP106-400 based, and the 'GeForce GTX 1060' could be next in line to launch.

 

amenx

Diamond Member
Dec 17, 2004
4,012
2,284
136
Maybe you missed my explanation/post...



Make sense? There is no definitive answer unless you have a crystal ball.

You're either a "better safe than sorry" person or an "ahh, I can turn down a few settings" person.
Well, I know my answer applies to me, and it has held up well in the 2 1/2 years I've been on 1440p. No game has ever let me down at that res. I could ALWAYS find settings where things looked good at a reasonable FPS. In fact, even in the most crippling games, I would prefer 1440p at lower settings to higher settings at 1080p.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Settings, how do they work?! :hmm:

Ok, let's go with that view. If the 1070 is only a MAX-settings 1080p 60Hz card, and it's 60% faster than a 780Ti/970/290, do you agree then that the 780Ti/290/970 are only MAX-settings 1280x1024 cards? Then 950/960/280X/770 users should all have been gaming at 1280x1024 or 720p in 2014-2015, because the 980 was also a 1080p 60Hz card? Do you realize how insane your defense of "1070 = 1080p 60Hz card" sounds?!

It's amazing how for 15+ years gamers aimed for 60 FPS averages at 1080p/1200p, but now suddenly I should want 60 FPS minimums in a slow-paced game like ROTTR and every other game? Next thing you know, when GV104 comes out with yet another 60-70% more performance, we'll still call it a 1080p 60Hz card, but now with DSR maxed out.

Some people here are just either too cheap to throw away peasant 1080p 60Hz monitors or have never used a 28-40" 1440p/4K one. The defense for 1080p 60Hz screens is pathetic. Did people defend 1024x768, then 1280x1024, then 1600x1200 as we moved to more affordable high resolution CRTs?

Also, we have a guy claiming 25" 1080p is a superior gaming experience to a larger monitor. And yet almost all the guys buying high-end rigs on this forum have moved to 1440p/3440x1440/4K or multi-monitors? They must all be stupid, wasting their money.

1080p itself is just an arbitrary number, set as a result of the HDTV resolution standard. For PC gaming, I want the best visuals, not console visuals.

Care to guess how many people with a 22-25" 1080p 60Hz panel on these forums, if they were given a FREE 43" 4K 60Hz monitor, would say: "No, please, I want my 2004 screen size and resolution back."
https://www.amazon.com/gp/aw/d/B01E18XRY2/ref=mp_s_a_1_1?qid=1465656074&sr=8-1&pi=SX200_QL40&keywords=phillips+43+4k+monitor&dpPl=1&dpID=515p3zoLJhL&ref=plSrch

tential is right-- almost all the people defending 1080p 60Hz gaming either have never gamed on a good large monitor, are just too cheap to buy a new high-end card and a new monitor together, or simply don't have the $/budget to afford it. Not having the budget or enough $ is a perfectly logical reason to stick with 1080p 60Hz, but stop claiming that 1080p 60Hz is a superior gaming experience.

Even for competitive gaming, all the top guys who play competitive FPS use 100-165Hz screens.

I'll just say there will always be people who will pay $700-1000 for a card and game on a crap $100-200 22-25" 1080p 60Hz monitor. Same way there will be guys in the hood with $3-4K expensive rims and tires on their POS $10-15K car.

Point is, if someone was given a 1070 and a 4K 120Hz OLED, are they going to claim that IPS/TN 1080p 60Hz gaming is better? Give me a break. Call it what it is: a lot of people defend 1080p 60Hz gaming because they can afford to buy a $200-400 card and sell it in 2 years with a $100 loss in value, but they balk when asked to pay $500-800 for a 1440p/4K monitor. It's too expensive for them. That's why 1080p 60Hz and lower dominates Steam, not because 1080p 60Hz is somehow superior.

Let's face it, if tomorrow you could buy a 27-28" 1440p 120Hz G-Sync HDR panel for $100-150, you'd get one for your 1070. That's the truth that 1080p 60Hz gamers hide behind. I am man enough to admit that I don't have the budget for a 34" 3440x1440 100Hz G-Sync Panel and 1080 SLI but I sure as hell don't defend 1080p 60Hz or 1440p 60Hz gaming in that context.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
Well, I know my answer applies to me, and it has held up well in the 2 1/2 years I've been on 1440p. No game has ever let me down at that res. I could ALWAYS find settings where things looked good at a reasonable FPS. In fact, even in the most crippling games, I would prefer 1440p at lower settings to higher settings at 1080p.
Most people, myself included (back in the day, anyway), wouldn't believe you. Since VSR and resolution scaling became available, though, I've tested it myself. On my 1080p screen, resolution makes the most difference. If I can render at a higher resolution, it's far more worth it than slightly better shadows.

Even when I turned the settings down a TON in Overwatch so I could get 4K working, it wasn't the IQ regression I was expecting.

It's sad that people are married to their resolutions rather than trying to get the best IQ possible. Maxing out every ultra setting isn't worth it if you're enabling settings that kill fps instead of getting a massive IQ boost from resolution increases.

I was hesitant myself, but now I'd game at even medium settings if I could get 4K resolution. Resolution is so important I can't believe I settled for 1080p at one point. I'm ashamed to have put 1440p monitors and 1440p gamers on blast. I now see that every resolution step you can get is pretty important up to 4K. Maybe higher; I'm not sure yet, since I can't use VSR above 4K because AMD's VSR is limited, unlike Nvidia's DSR.

Edit: I swear my win rate in Overwatch increased significantly once I did that, too, but I could be wrong. I may just have gotten better at support, and my guess is that good supports will have high win rates.
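The pixel cost of the downsampling described above is easy to sketch. A quick back-of-the-envelope calculation (the function name is my own, just for illustration): rendering at 4K via VSR/DSR on a 1080p panel means shading 4x the pixels, which is roughly the performance you trade away.

```python
def pixel_ratio(render, native):
    """Ratio of pixels rendered internally vs. pixels on the physical panel."""
    rw, rh = render
    nw, nh = native
    return (rw * rh) / (nw * nh)

# 4K rendered and downsampled onto a 1080p panel: 4x the pixels shaded.
print(pixel_ratio((3840, 2160), (1920, 1080)))  # 4.0
# 1440p onto 1080p is a much milder ~1.78x step.
print(round(pixel_ratio((2560, 1440), (1920, 1080)), 2))  # 1.78
```

That 4x factor is why 4K downsampling usually costs more fps than turning shadows from medium to ultra ever gives back.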
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Ok, let's go with that view. If 1070 is only a MAX settings 1080p 60Hz card, and it's 60% faster than a 780Ti/970/290, you agree then 780Ti/290/970 are only MAX settings 1280x1024 cards.

It's amazing how for 15+ years gamers aimed 60 FPS averages at 1080p/1200p, but now suddenly I should want 60 FPS minimums in a slow-paced ROTTR game and every other game? Next thing you know when GV104 comes out with yet another 60-70% more performance, we'll still call it a 1080p 60Hz card* but now with DSR maxed out.

Some people here are just either too cheap to throw away peasant 1080p 60Hz monitors or have never used a 28-40" 1440p/4K one. The defense for 1080p 60Hz screens is pathetic. Did people defend 1024x768, then 1280x1024, then 1600x1200 as we moved to more affordable high resolution CRTs?

Also we have a guy claiming 25" 1080p is superior gaming experience to a larger monitor. That's why almost all the guys buying high end rigs on this forum have moved to 1440p/3440x1440/4K or multi-monitors? They must be all stupid wasting their money.

1080p itself is just an arbitrary number, set as a result of the HDTV resolution standard. For PC gaming, I want the best visuals, not console visuals.

Care to guess how many people with a 22-25" 1080p 60Hz panel on these forums, if they were given a FREE 43" 4K 60Hz monitor, would say: "No, please, I want my 2004 screen size and resolution back."
https://www.amazon.com/gp/aw/d/B01E...4k+monitor&dpPl=1&dpID=515p3zoLJhL&ref=plSrch

tential is right-- almost all the people defending 1080p 60Hz gaming either have never gamed on a good large monitor or are just too cheap to buy a new high-end card and a new monitor together.

Even for competitive gaming, all the top guys who play competitive FPS use 100-165Hz screens.

I'll just say there will always be people who will pay $700-1000 for a card and game on a crap $100-200 22-25" 1080p 60Hz monitor. Same way there will be guys in the hood with $3-4K expensive rims and tires on their POS $10-15K car.

Point is someone was given 1070 and a 4K 120Hz OLED, are they going to claim that IPS/TN 1080p 60Hz gaming is better? Give me a break. Call it what it is -- a lot of people defend 1080p 60Hz gaming because they can afford to buy a $200-400 card, sell it in 2 years with a $100 loss in value, but they shake when asked to pay $500-800 for a 1440p/4K monitor. It's too expensive for them. That's why 1080p 60Hz and lower dominates Steam, not because 1080p 60Hz is somehow superior.

Let's face it, if tomorrow you could buy a 27-28" 1440p 120Hz G-Sync HDR panel for $100-150, you'd get one for your 1070. That's the truth that 1080p 60Hz gamers hide behind. I am man enough to admit that I don't have the budget for a 34" 3440x1440 100Hz G-Sync panel and 1080 SLI, but I sure as hell don't defend 1080p 60Hz or 1440p 60Hz gaming in that context.
You are absolutely ridiculous, putting words in my mouth left and right. I NEVER said a 1080p screen is better than a 4K screen; I was talking about a 50" 1080p screen.

Talking about a 4K screen: do you realize that it scales perfectly to 1080p? That's a luxury a 1440p screen doesn't have. If I were given a 4K screen as a gift, I'd just play most of my games at 1080p, or I'd sell it off if I didn't enjoy that option because of the poor PPI. Ultra settings + lower resolution >>> medium settings + higher resolution.
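The "scales perfectly" point is integer scaling, and the arithmetic is easy to check (function names here are mine, just to illustrate): 1080p content on a 4K panel maps each source pixel onto an exact 2x2 block, while 1080p on a 1440p panel needs a blurry fractional 4/3 scale.

```python
from fractions import Fraction

def scale_factor(panel, content):
    """Per-axis scale factors when displaying `content` on `panel`."""
    return (Fraction(panel[0], content[0]), Fraction(panel[1], content[1]))

def is_integer_scale(panel, content):
    """True when every content pixel maps to a whole block of panel pixels."""
    sx, sy = scale_factor(panel, content)
    return sx == sy and sx.denominator == 1

print(is_integer_scale((3840, 2160), (1920, 1080)))  # True: clean 2x2 blocks
print(is_integer_scale((2560, 1440), (1920, 1080)))  # False: fractional 4/3 scale
```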

I have never denied the importance of high refresh rate. I would definitely consider it far more important than uber resolution.

Talking about an OLED screen: oh hell yeah, I would be willing to compromise settings for that. I'm dreaming of the day when I can afford one for my PC; I'll definitely be willing to break the bank for that.

And btw, CRT monitors could display any resolution they supported with pixel-perfect results. That's why back in those days it made no sense to stick with a lower-resolution CRT if you could afford a better one.

You can enjoy your needless pixels at the cost of performance and graphical fidelity. I am happy with ultra settings and 8xAA, thank you very much. I respect your preference, and you should respect those who have a different one. To say that people only stick to 1080p because they can't afford something better is just your invalid opinion, one that ignores the reality of PC gaming, where 1440p continues to be a dual-card resolution for maxed-out gaming, let alone 4K.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
Ultra settings are great to a point. Moving to extreme shadows or something like that kills fps but barely improves IQ. I'll take medium shadows every day if I can hit 4K resolution over 1080p "maxed out."

I mean, look at recent screenshots from games. In most games you can't tell the difference between the high settings. Like Mirror's Edge: hyper and ultra are barely different. I'd rather have more resolution than a slightly better shadow far away in the distance that I can't see anyway due to poor 1080p resolution.

The biggest issue with 4K is refresh rate. But I feel bad for anyone buying a 1070 to game at 1080p. Step your resolution/DSR game up.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
You are absolutely ridiculous putting words in my mouth left and right. I NEVER said a 1080p screen is better than a 4K screen I was talking about a 1080p 50" screen.

Get used to it, he'll make stuff up and then ramble on for paragraphs and paragraphs hoping no one notices.

Ok, let's go with that view. If 1070 is only a MAX settings 1080p 60Hz card, and it's 60% faster than a 780Ti/970/290, you agree then 780Ti/290/970 are only MAX settings 1280x1024 cards? Then 950/960/280X/770 users should have all been gaming at 1280x1024 or 720p in 2014-2015 because 980 was also a 1080p 60Hz card? Do you realize how insane your defense of 1070 = 1080p 60Hz card sounds?!!

Are you new to PC gaming? Your post would make sense if games didn't get more demanding, but guess what! They do!

I will admit I didn't read the rest of your post because I've been burned by that in the past, wasting minutes of my life reading pointless nonsense. So if you mentioned anything that contradicts your first paragraph (very likely), I apologize in advance.
 

sirmo

Golden Member
Oct 10, 2011
1,014
391
136
I have friends with car hobbies, motorcycles, even sneaker collections. They spend way more than $400 A YEAR on their hobbies.

I bet my father spends more than that on cleaning supplies for his motorcycle and truck.

Most adults have jobs and can easily afford a $400 or $500 card, especially every 2 or 3 years.

just my opinion.
You're living in a bubble if you think that's the majority. A $400-500 card also means a $1000+ computer build. The RX 480 enables decent $700 PC builds, which opens up a premium PC gaming experience to folks who wouldn't even have considered building a PC in the past.

I think it's also a bit shortsighted not to recognize that what AMD is doing is great for PC gaming as a whole. Even for us enthusiasts who can afford any piece of gear, this broadens the market, brings new people into it, and allows game publishers not to ignore PC gamers when it comes to console ports, etc. There are tons of advantages to AMD's RX 480 price point.

Even if the card isn't for you, you should recognize its benefits to PC gaming as a whole.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
Does anyone know if the Pascal chips released so far have H.265 with HDR (10-bit colour) acceleration built into the cards? I have read in some obscure place, while digging with Google, that the cards only support 8-bit H.265 with no HDR. Any information on this would be appreciated.

Yes, almost everyone I know uses their PC about 50/50 for gaming and media playback.

edit:
http://www.notebookcheck.net/Nvidia-Pascal-Architecture-Overview.165493.0.html
So, confirmation that the 1080 does have H.265 acceleration. Good. Now can someone confirm this for the 1070 too?
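One way to reason about the 1070 question: the video decode block lives on the GPU die, not the board, so cards sharing a die share decode capabilities. A hedged sketch (the dict names and capability flags are my own illustration; the die assignments are from public specs, and whether GP104's decoder covers 10-bit HEVC should be verified against NVIDIA's support matrix):

```python
# Which die each card uses (public spec sheets).
CARD_TO_DIE = {
    "GTX 1080": "GP104",
    "GTX 1070": "GP104",  # same die as the 1080
    "GTX 1060": "GP106",
}

# Illustrative per-die decode capabilities for Pascal's fixed-function block.
DIE_DECODE = {
    "GP104": {"hevc_8bit": True, "hevc_10bit": True},
    "GP106": {"hevc_8bit": True, "hevc_10bit": True},
}

def hevc_10bit_decode(card):
    """Look up 10-bit HEVC decode support via the card's die."""
    return DIE_DECODE[CARD_TO_DIE[card]]["hevc_10bit"]

print(hevc_10bit_decode("GTX 1070"))  # True: it shares GP104 with the 1080
```

The takeaway: since the 1070 and 1080 are both GP104, confirmation for one should carry over to the other.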
 
Last edited:

SteveGrabowski

Diamond Member
Oct 20, 2014
7,127
5,998
136
Care to guess how many people with a 22-25" 1080p 60Hz panel on these forums, if they were given a FREE 43" 4K 60Hz monitor, would say: "No, please, I want my 2004 screen size and resolution back."
https://www.amazon.com/gp/aw/d/B01E...4k+monitor&dpPl=1&dpID=515p3zoLJhL&ref=plSrch

You'd have a point on 1440p, but if you gave me a 43" 4k monitor I'd sell it and buy a 27"-32" 1440p one. With how horrible multicard support has been the last few months I would not want a 4k monitor. The Division and Ashes are about the only recent games to have reasonable scaling (and only for AMD on Ashes), and Ashes just seems like a DX12 demo. It's unbelievable how bad multi-gpu support has been lately. I mean check GameGPU's benches for ROTR, Hitman, Gears of War Ultimate, Forza, Quantum Break, Just Cause 3, Dark Souls III, Far Cry Primal, Doom, Total War Warhammer, and Mirror's Edge Catalyst.
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
You'd have a point on 1440p, but if you gave me a 43" 4k monitor I'd sell it and buy a 27"-32" 1440p one. With how horrible multicard support has been the last few months I would not want a 4k monitor. The Division and Ashes are about the only recent games to have reasonable scaling (and only for AMD on Ashes), and Ashes just seems like a DX12 demo. It's unbelievable how bad multi-gpu support has been lately. I mean check GameGPU's benches for ROTR, Hitman, Gears of War Ultimate, Forza, Quantum Break, Just Cause 3, Dark Souls III, Far Cry Primal, Doom, Total War Warhammer, and Mirror's Edge Catalyst.

Funny, because I would choose the 4K monitor every time and use lower settings than the ones they use in benchmarks, which tank FPS.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
7,127
5,998
136
Funny, because I would choose the 4K monitor every time and use lower settings than the ones they use in benchmarks, which tank FPS.

Are you going to get a consistent 60 fps at 4k medium on say Witcher 3, much less newer titles, even with a GTX 1080?
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Funny, because I would choose the 4K monitor every time and use lower settings than the ones they use in benchmarks, which tank FPS.
That's your preference, and that's fine, but not everyone has that preference.

Sent from my HTC One M9 using Tapatalk
 

Pandasaurus

Member
Aug 19, 2012
196
2
76
Are you going to get a consistent 60 fps at 4k medium on say Witcher 3, much less newer titles, even with a GTX 1080?

I would say that's entirely plausible, but you'll never see a benchmark at anything other than Ultra settings with every possible option enabled, which causes everyone to whine about low framerates. With the new (non-beta) drivers, HardwareUnboxed is showing the 1070 getting 33 FPS average (24 minimum) running Witcher 3 at 4K/Ultra+HairWorks+SSAO. In Crysis 3 at 4K/Ultra+4xSMAA, the 1070 drops to 23 FPS average (18 minimum).

Lose the AA (do you really need it at 4K on any size screen?), drop the settings down a bit, and I wouldn't be surprised if you got much closer to 60 FPS.

That said, I'm used to <40 FPS at 1920x1200/Low, so... Anything is an improvement for me.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
7,127
5,998
136
I would say that's entirely plausible, but you'll never see a benchmark at anything other than Ultra settings with every possible option enabled, which causes everyone to whine about low framerates. With the new (non-beta) drivers, HardwareUnboxed is showing the 1070 getting 33 FPS average (24 minimum) running Witcher 3 at 4K/Ultra+HairWorks+SSAO. Crysis 3 at 4K/Ultra+4xSMAA, the 1070 drops down to 23 FPS average (18 minimum).

Lose the AA (do you really need it at 4K on any size screen?), drop the settings down a bit, I wouldn't be surprised if you got much closer to 60 FPS.

That said, I'm used to <40 FPS at 1920x1200/Low, so... Anything is an improvement for me.

Tech of Tomorrow showed Witcher 3 running at 30-35 fps at 4K medium on a GTX 1070, from around the 6:00 mark to the 9:00 mark.

https://www.youtube.com/watch?v=2Z3PtF5jlZ4

He is pulling fps mostly in the 30s at 4K medium in ROTTR as well.

https://www.youtube.com/watch?v=r9q4V_eCx-Q

I do think it's monumentally stupid that he tests with an i5-6500, but at 4K I wouldn't think that would matter.
 
Last edited:

torlen11cc

Member
Jun 22, 2013
143
5
81
I've seen claims that the GTX 1070/1080 are hitting thermal limits and throttling after a few minutes in some games (like The Witcher 3).
Is that true?
 

Pandasaurus

Member
Aug 19, 2012
196
2
76
DigitalFoundry and HardwareUnboxed both tested with an OC'd 6700K and got 30+ FPS average on Max/Ultra (though DigitalFoundry was testing with HairWorks off). Same goes for TechPowerUp and Tom's Hardware (both with HairWorks off).

This would seem to either indicate a CPU bottleneck with the i5-6500, or that there is minimal performance hit going from Medium to Ultra settings. I would suspect it's the former, not the latter. I'm not an expert, though. I could be wrong.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Not even the FE 1080, now that they've fixed the fan curves. For some dumb reason the early review 1080s had a really, really passive fan curve, so they did drop off their boost clocks.
(So add a bit of fan noise to the review measurements.)

No idea about the 1070s. It's not really a great idea to buy an FE anyway; the better AIB coolers will be notably quieter for the same level of cooling performance.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
Hmm, yes, you absolutely need AA.



Sent from my HTC One M9 using Tapatalk
Not at 4K... I would never waste performance on AA at 4K. The whole reason I run at high resolutions is to get rid of the need for AA. That's especially why I downsample as well.

AA is the last option I would improve at 4K resolution. Honestly, if I need AA, I'm not adding it; I'm rendering at a higher resolution. AA is such a poor substitute for rendering at a higher resolution.
 
Last edited:

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Not at 4K... I would never waste performance on AA at 4K. The whole reason I run at high resolutions is to get rid of the need for AA. That's especially why I downsample as well.

AA is the last option I would improve at 4K resolution. Honestly, if I need AA, I'm not adding it; I'm rendering at a higher resolution. AA is such a poor substitute for rendering at a higher resolution.
What are you talking about? The need for AA is tied directly to PPI, not to the resolution itself, lol.

I can see perhaps no need for AA on a 24" 4K monitor, but I have played games at pretty high PPI, and you always visibly benefit from at least 2xAA. Admittedly I have not played on a 24" 4K screen, but I have played on a much bigger one (don't remember the exact size), and you absolutely do benefit from AA.
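The PPI point is worth quantifying, since it explains why big 4K screens still show aliasing. A quick sketch using the standard pixels-per-inch formula (the function name is mine, for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))  # ~92 PPI: 24" 1080p
print(round(ppi(3840, 2160, 24)))  # ~184 PPI: 24" 4K, double the density
print(round(ppi(3840, 2160, 43)))  # ~102 PPI: 43" 4K, close to 24" 1080p
```

So a 43" 4K panel sits at roughly the same pixel density as a 24" 1080p one, which is exactly why AA can still visibly help on a large 4K screen.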

Sent from my HTC One M9 using Tapatalk
 