AMD 7970 vs. Nvidia GTX 770

Feb 19, 2009
10,457
10
76
Indeed it doesn't.

Just a pinch of faith

Nope, just logic. Based on previous data with GE titles, logically, with even more focus on GCN on consoles, it's advantageous for AMD.

Now, does your own faith blind you to that simple logic?
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Nope, just logic. Based on previous data with GE titles, logically, with even more focus on GCN on consoles, it's advantageous for AMD.

Now, does your own faith blind you to that simple logic?

LOL, all I've said is that it takes a pinch of faith to believe that

games will run better on Radeons

It seems that alone is enough to question my logic skills

Anyway, based on previous data with GE titles... can you guess how these games are going to run on Radeons?

Based on that data, can you guess how many of those games are going to be GE,
or is it safe to assume Nvidia will pull out of gaming altogether and everything will be consumed by AMD and Gaming Evolved :ninja:
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
If you look at the benchmarks from GameGPU, there is a larger spread between 780 OC and 770 OC than there is between 760 OC and 770 OC. This makes sense, since the 780 has more overclocking headroom than the 770 does.

We've already been over this before. Those GameGPU benches aren't valid as far as I'm concerned, because they use barely playable or completely unplayable settings via excessively high levels of AA. You can make any game unplayable if you crank the AA high enough.

What's the point of benching a GPU using settings that you know don't reflect real-world usage? Who buys a GTX 770/780/Titan or Radeon 7970/7950 and then plays Crysis 3 using SMAA 4x at 2560x1600 on very high settings?

Nobody, at least no one with sense. Those settings are meant to be played with multiple GPUs, not single GPUs.

The 1GHz HD 7970 for $310 vs. the $450 4GB GTX 770 is not "slightly more". If you are talking about SLI vs. CF, then it's $620 vs. $900! You first said $1,300 is way too much for 780s in SLI vs. $900 for 770s, and yet 780 OC is 30-35% faster than 770 OC, while 770 SLI is barely faster than 7970 OC CF yet costs nearly $300 more. Logic does not compute. You just artificially created a constraint that you won't pay more than $1,000 for dual GPUs. That still doesn't relieve the 770s of their poor value proposition.

This is only valid if you take the GameGPU benches as gospel, which I don't. This may be a surprise to you, but there are plenty of other reputable hardware websites that have enough sense to know NOT to use excessively high levels of AA combined with high resolution and maxed-out settings to bench SINGLE-GPU cards.

Those are much more reasonable prices, but even then the 4GB version would be 29% more expensive than the 1GHz 7970.

And yet AMD finds itself having to lower its prices to compete. Why is that, I wonder, if the 7970 GHz is so comparable to the GTX 770 4GB?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Games like Tomb Raider, Hitman: Absolution, Sleeping Dogs, DiRT Showdown, GRID 2, and Company of Heroes 2 are running significantly faster on the AMD HD 7970 GHz than on the GTX 770. In Sleeping Dogs the HD 7970 GHz is close behind the Titan.

Some games are always going to favor one architecture more than another.

Also, performance can change over time with driver updates and patches. Some of the games you listed have only just been released, so while they may run faster on AMD hardware right now, that doesn't mean it's going to be the case a few months from now.

Crysis 3, BioShock Infinite, Far Cry 3, and Medal of Honor: Warfighter are just a few examples of games that ran faster on AMD hardware at release, but after a few months and some patches now run either equally fast or faster on Nvidia hardware.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
I don't know why you would say that. Even at 2560, the 770 and 680 are even with the 7970 when all three are OCed. Not to mention the 770 [H]ardOCP had was a dud of an overclocker. The 7970 does do much better in Tomb Raider, but that is with TressFX enabled; disable that and there goes that lead.

http://www.hardocp.com/article/2013/06/06/msi_n770_lightning_overclocking_review/3#.UdPRGPmyDf4

7.8 GHz on memory is a dud?

They probably could have upped the core with lower memory clocks, if my understanding of Nvidia OCing at this time is right.
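For context on what 7.8 GHz effective memory buys you (a quick sketch, assuming the 770's standard 256-bit bus and roughly 7 Gbps stock memory):

$$\frac{7.8\ \text{Gbps} \times 256\ \text{bit}}{8\ \text{bit/byte}} \approx 250\ \mathrm{GB/s}$$

versus roughly 224 GB/s at stock, so about an 11% bandwidth bump from the memory overclock alone.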
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What's the point of benching a GPU using settings that you know don't reflect real-world usage? Who buys a GTX 770/780/Titan or Radeon 7970/7950 and then plays Crysis 3 using SMAA 4x at 2560x1600 on very high settings?

What are you talking about? If you are going to complain about unrealistic AA settings for single GPUs, then it makes no sense to spend $100-130 more for a 770 4GB over a 7970 1GHz/GE. If you are going to compare your setup to 780 SLI, then the comparison is GTX 780 OC SLI vs. GTX 770 OC SLI.

This is the whole point of that review: it shows how much 780 OC beats 770 OC by. Now take those numbers and increase the framerates by 80-90% for SLI; the performance advantage of 780 OC vs. 770 OC remains at 30-35%, and 780 OC SLI is perfectly playable.
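To make the arithmetic explicit (a sketch, assuming both setups get roughly the same SLI scaling factor $s$, which the 80-90% figure implies):

$$\frac{f_{780\,\mathrm{OC\,SLI}}}{f_{770\,\mathrm{OC\,SLI}}} = \frac{s \cdot f_{780\,\mathrm{OC}}}{s \cdot f_{770\,\mathrm{OC}}} = \frac{f_{780\,\mathrm{OC}}}{f_{770\,\mathrm{OC}}} \approx 1.30\ \text{to}\ 1.35$$

Equal scaling cancels out, so the 30-35% single-card gap carries straight over to SLI.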

Again, the GTX 770 4GB is one of the most overpriced cards on the market right now. It needs to be $379-399 to make sense. With the 780 you are getting 30% overclocking headroom and the bragging rights of the fastest card, or a mini-Titan. With the 770 4GB, you pay $100-130 more than AMD's equivalent, which is barely 2-5% slower. Rip-off! Also, someone spending $900 on GPUs is going to use SMAA 4x at 1600p.

All your posts thus far have tried to justify why a 770 4GB SLI setup is worth its price. You keep bringing up the point that the $50 extra for 4GB is worth it, but 760 4GB SLI would have cost only $580. That means that even against 760s, the 770s are overpriced, never mind overclocked 7970s or 7950s.
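As a rough sketch of the value math being argued here, using the street prices quoted in this thread (the relative-performance numbers are the posters' estimates, and the 760 figure is an assumed ballpark for illustration, not a benchmark result):

```python
# Back-of-the-envelope dollars-per-performance comparison using the
# figures quoted in this thread. Performance is normalized to the
# 7970 GHz = 1.00. The 770 delta comes from the "barely 2-5% slower"
# claim above; the 760 number is an assumed ballpark, not a benchmark.
cards = {
    "7970 GHz (1GHz)": (310, 1.00),
    "GTX 770 4GB":     (450, 1.04),  # thread says 7970 is "barely 2-5% slower"
    "GTX 760 4GB":     (290, 0.85),  # assumption, for illustration only
}

for name, (price_usd, rel_perf) in cards.items():
    # Lower dollars-per-performance means better value for the money.
    print(f"{name:16} ${price_usd}  perf x{rel_perf:.2f}  "
          f"${price_usd / rel_perf:.0f}/perf")
```

For SLI/CF comparisons the same ratios simply double on the price side ($620 vs. $900), which is where the "nearly $300 more" figure comes from.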
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Originally Posted by BallaTheFeared
We both know that was, at best, a weak attempt at a member call-out. The only company I could possibly be shilling for is AMD, because they don't require disclosure.

That is incorrect. AMD requires full disclosure from any employees of the company or its marketing affiliates when posting on technical, gaming, or related forums.
 
Feb 19, 2009
10,457
10
76
LOL, all I've said is that it takes a pinch of faith to believe that

Based on that data, can you guess how many of those games are going to be GE,
or is it safe to assume Nvidia will pull out of gaming altogether and everything will be consumed by AMD and Gaming Evolved :ninja:

Are you trying to argue that next-gen consoles being on AMD ecosystems, with full development from the ground up to take advantage of the hardware, somehow does not lead to an advantage for AMD GPUs?

I'm trying to understand where you are coming from, but I sense too much Kool-Aid from wherever that is.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Are you trying to argue that next-gen consoles being on AMD ecosystems, with full development from the ground up to take advantage of the hardware, somehow does not lead to an advantage for AMD GPUs?

I'm trying to understand where you are coming from, but I sense too much Kool-Aid from wherever that is.

Are you trying to say that in 3-4 years AMD will have the same uarch?

Do you know the strengths of Maxwell?
 

24601

Golden Member
Jun 10, 2007
1,683
39
86
You don't? Look at the recent PC GE games. Now imagine GE being infinitely more effective when it's designed from the ground up to take advantage of the hardware inside the consoles: that's Radeon GCN and Jaguar.

Doesn't take a genius to figure out that future cross-platform games will run better on Radeons due to the overwhelming development focus.

Wow, my 7970 has infinitely multiplied performance!

Infinite > quad titan obviously.

I'll never upgrade again!

Nothing will be better than infinite!
 
Feb 19, 2009
10,457
10
76
Are you trying to say that in 3-4 years AMD will have the same uarch?

Do you know the strengths of Maxwell?

GCN is just the start; GCN v2 isn't even here yet. So yeah, I'm pretty certain GCN is gonna be around for a while, considering their push for HSA with every new gen.

Is this magical Maxwell going to replace AMD in the consoles that most developers will code for?

If you people can hint at TWIMTBP with The Witcher 3 and Batman giving NV an advantage, why is it so hard to accept that developing games from the ground up for consoles would give AMD an advantage? Hmm...
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
GCN is just the start; GCN v2 isn't even here yet. So yeah, I'm pretty certain GCN is gonna be around for a while, considering their push for HSA with every new gen.

Is this magical Maxwell going to replace AMD in the consoles that most developers will code for?

If you people can hint at TWIMTBP with The Witcher 3 and Batman giving NV an advantage, why is it so hard to accept that developing games from the ground up for consoles would give AMD an advantage? Hmm...

Problem is, GCN v2 and whatever comes after it will be so far removed from GCN v1 that it won't matter anymore. It's like saying the GPU in the Xbox 360 or PS3 affected anything on this end; let's not be silly.

Current GCN chips don't have anything to do with HSA; neither will 2.0, nor will any future dedicated graphics cards, as far as anyone knows.

No, but GCN's strengths could be GCN 2.0's weaknesses against Maxwell, and GCN 1.0 is the same tech in the consoles that you seem to think will create an advantage for AMD. Then what? If there is any truth to the matter, Nvidia will have an advantage in every game that comes out, unless AMD creates a new spin for the PC versions of course, but that is neither here nor there in the context you're discussing.

The only real advantage is in early access to driver development and adding effects that end users can enjoy, such as those with hybrid PhysX, SSAA where there would have been none, and TressFX.

The point is, in 3-5 years when the consoles are in full swing they'll still be using GCN 1.0, whereas both AMD and Nvidia will most likely be two uarchs removed.


I have both, in case I have a brain haemorrhage and decide to game with PhysX enabled... ooo, so much fluff, fluff everywhere!!

I game with both at the same time... In fact, I'm going to forgo my case plans, because nothing will accommodate what I want to do, and get a PCIe extension and a faster dedicated PhysX card, because my 9800GT won't cut the mustard with CFX. Though so far in Metro LL actual gameplay has been flawless, at least with a single 7950, which isn't that bad without SSAA. It's not like CF scales well anyway.
 
Last edited:
Feb 19, 2009
10,457
10
76
The only real advantage is in early access to driver development and adding effects that end users can enjoy, such as those with hybrid PhysX, SSAA where there would have been none, and TressFX.

And there we have it. Games developed from the ground up to take advantage of AMD features and optimized for GCN would obviously have flow-on benefits for AMD GPUs on the PC. The benefits vary: it could be a TressFX-like feature that runs better on AMD, or it could be pure optimization, like in CoH 2, where a 7870 is stomping all over a GTX 680.

It reminds me of NV titles in TWIMTBP's glory days, where a lowly NV GPU would stomp all over a top Radeon. Not sure why NV isn't spending some of their $$ on more developers to push TWIMTBP harder...

As to the Xbox 360 and PS3: entirely different architectures, so developers had to make sure their games ran well on the different hardware, namely two different console archs plus DX9/11 on PC. The situation is a bit different this time, when both major consoles have essentially the same ecosystem. How much simpler is it to optimize your game in that scenario?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
And there we have it. Games developed from the ground up to take advantage of AMD features and optimized for GCN would obviously have flow-on benefits for AMD GPUs on the PC. The benefits vary: it could be a TressFX-like feature that runs better on AMD, or it could be pure optimization, like in CoH 2, where a 7870 is stomping all over a GTX 680.

You completely missed the point: what happens on consoles will be for consoles. When it comes to PC, AMD doesn't have a lock, and tomorrow Nvidia could do just what you describe in the next paragraph: buy everyone out and turn everything into TWIMTBP. Dev'ed for consoles or not, there is no hand-in-hand taking place.

Or it could purely be Nvidia not having time to tweak their June 25th driver release for a product that released on June 25th.

As to the Xbox 360 and PS3: entirely different architectures, so developers had to make sure their games ran well on the different hardware, namely two different console archs plus DX9/11 on PC. The situation is a bit different this time, when both major consoles have essentially the same ecosystem. How much simpler is it to optimize your game in that scenario?


They were based on desktop products of their time, but you're right... they're entirely different architectures compared to what we have today. Didn't I just try to make that point to you a post ago? The GCN that is in the upcoming consoles will have very few similarities to desktop products in a few years, just as the last-gen consoles have barely anything in common with current-gen desktop uarchs.

The two consoles are actually pretty different at the technical level, and will require different coding methods to take advantage of each.

The desktop doesn't stand still; unlike fixed hardware, what is a strength today might be a weakness tomorrow. Next year Nvidia will have a new uarch out, and it will be the end of the Fermi line. The year after that, AMD will have a new uarch out, and it will be the end of GCN as it is today, as it is in the consoles, which will be even further detached with a possible DX12 and DX13... just like current consoles are now with today's graphics cards.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
What are you talking about? If you are going to complain about unrealistic AA settings for single GPUs, then it makes no sense to spend $100-130 more for a 770 4GB over a 7970 1GHz/GE. If you are going to compare your setup to 780 SLI, then the comparison is GTX 780 OC SLI vs. GTX 770 OC SLI.

And yet the benches you showed me were from single-GPU cards. I'm talking about these.

And I'm not comparing my 770 OC SLI to 780 OC SLI. Why would I want to do that? The whole point of overclocking is free performance. I know my cards will never catch up with overclocked 780s, but they can catch up with stock-clocked 780s, which are already ridiculously fast.

If I wanted overclocked 780s, I would have spent the additional 400 dollars for them... but I didn't think it was worth it, so I didn't.

This is the whole point of that review: it shows how much 780 OC beats 770 OC by. Now take those numbers and increase the framerates by 80-90% for SLI; the performance advantage of 780 OC vs. 770 OC remains at 30-35%, and 780 OC SLI is perfectly playable.

Yes, but what you don't seem to understand is that using such high levels of AA in tandem with high resolution skews the benchmarks towards cards with wider memory buses and more VRAM, i.e. the GTX 780 and the 7970.
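A rough sense of why (a sketch, taking 4x MSAA as the example and assuming a standard 32-bit color plus 32-bit depth per sample): at 2560x1600 with 4x MSAA, the color and depth buffers alone need

$$2560 \times 1600 \times 4\ \text{samples} \times 8\ \text{bytes} \approx 125\ \mathrm{MiB},$$

and every one of those samples has to be written and resolved each frame, so both the VRAM footprint and the bandwidth demand grow with the sample count.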

That's why the difference is so great. If you read reviews from websites like AnandTech, Guru3D, Tech Report, etc., you'll notice they don't use high levels of AA in demanding games when benching single-GPU cards at high resolution. They bench at the highest IQ the card can play at comfortably.

What's the point of benching The Witcher 2 at 2560x1600 with ubersampling enabled, when not even an overclocked Titan can get playable frame rates?

That proves nothing.

Again, the GTX 770 4GB is one of the most overpriced cards on the market right now. It needs to be $379-399 to make sense. With the 780 you are getting 30% overclocking headroom and the bragging rights of the fastest card, or a mini-Titan. With the 770 4GB, you pay $100-130 more than AMD's equivalent, which is barely 2-5% slower. Rip-off!

Well, you can always argue prices, but one of the reasons Nvidia is able to get such prices for their GPUs is that AMD isn't a strong competitor and isn't a good bargain for high-end gamers, due to the fact that Crossfire is [redacted] and doesn't work properly.

So until AMD fixes Crossfire, they are not on equal footing. Personally, I wouldn't care if the 7970 GE were 200 dollars; I still wouldn't buy it.

Also, someone spending $900 on GPUs is going to use SMAA 4x at 1600p.

I recently spent $900 on GPUs, and I game at 1440p and have never used SMAA 4x. Using high amounts of AA at high resolution is unnecessary, since the high pixel density smooths out the edges quite a bit already.

I play Crysis 3 at 2560x1440 with everything on very high and SMAA 2x, and I can see no noticeable jaggies. In Far Cry 3 I don't even use MSAA; I just use the in-game FXAA via PostFX, and that gets rid of 98% of all the jaggies on screen for almost no performance hit. The other 2% I never see and can't be bothered to look for.

In BioShock Infinite I use the in-game FXAA at ultra settings. In Batman: Arkham City I use CSAA 8x, which is supposedly the equivalent of 4x MSAA but runs much faster.

If I had 780 SLI, I would still use the exact same settings that I do now, because there's no tangible increase in IQ from using more expensive forms of AA... unless I take screenshots and obsess over jaggies.

All your posts thus far have tried to justify why a 770 4GB SLI setup is worth its price. You keep bringing up the point that the $50 extra for 4GB is worth it, but 760 4GB SLI would have cost only $580. That means that even against 760s, the 770s are overpriced, never mind overclocked 7970s or 7950s.

I don't have to justify anything, as it's my money and I can spend it how I please.

I'm just responding to your and others' asinine comments saying the GTX 770 is the worst bang-for-the-buck high-end gaming card and the 7970 is the best bang for the buck for high-end gaming, when Crossfire doesn't even work.

And then using cherry-picked benchmarks with ridiculous levels of anti-aliasing to prove your point. Also, the 760 is a midrange card, so of course it's going to have better value than the 770, the 770 better value than the 780, and the 780 better value than the Titan.

See a pattern yet?

In the end, though, AMD is the one having to drop their prices, not Nvidia.

So don't forget that.

Warning issued for inappropriate language.
--stahlhart
 
Last edited by a moderator:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
What are you talking about? If you are going to complain about unrealistic AA settings for single GPUs, then it makes no sense to spend $100-130 more for a 770 4GB over a 7970 1GHz/GE. If you are going to compare your setup to 780 SLI, then the comparison is GTX 780 OC SLI vs. GTX 770 OC SLI.

This is the whole point of that review: it shows how much 780 OC beats 770 OC by. Now take those numbers and increase the framerates by 80-90% for SLI; the performance advantage of 780 OC vs. 770 OC remains at 30-35%, and 780 OC SLI is perfectly playable.

Again, the GTX 770 4GB is one of the most overpriced cards on the market right now. It needs to be $379-399 to make sense. With the 780 you are getting 30% overclocking headroom and the bragging rights of the fastest card, or a mini-Titan. With the 770 4GB, you pay $100-130 more than AMD's equivalent, which is barely 2-5% slower. Rip-off! Also, someone spending $900 on GPUs is going to use SMAA 4x at 1600p.

All your posts thus far have tried to justify why a 770 4GB SLI setup is worth its price. You keep bringing up the point that the $50 extra for 4GB is worth it, but 760 4GB SLI would have cost only $580. That means that even against 760s, the 770s are overpriced, never mind overclocked 7970s or 7950s.

The GTX 770 is not overpriced compared to the competition as a whole; it only looks that way if one cherry-picks a cheaper SKU and treats that as the whole price/performance picture.

http://www.newegg.com/Product/Produc...=-1&isNodeId=1

http://www.newegg.com/Product/Produc...=-1&isNodeId=1
 
Feb 19, 2009
10,457
10
76
They were based on desktop products of their time.

Cell was certainly nothing like desktop products. The GPU arch was also completely different between the two consoles, so we had two different CPU archs and two different GPU archs. It would not have been feasible for cross-platform development to focus on optimizing for a single arch... unless it was a platform-exclusive title, e.g. the Metal Gear series on PS3: amazing graphics for what the hardware was on paper.

My point is that moving forward, developers have a single ecosystem to code to, which they can heavily optimize for, since their efforts reap benefits for both major consoles as well as a portion of PC users (it's an "infinitely" better situation than wasting effort/time/$$ optimizing for the PS3 only and having games run like crap on Xbox/PC). The consoles are DX11.1 capable with some HSA functionality; if developers take advantage of that, which GPU hardware do you think benefits more on the PC: AMD, NV, or neither/neutral?

I don't disagree that the PC is a moving platform, but my point stands. We are at the first iteration of the GCN micro-uarch; AMD plans on evolving it in subsequent generations, not revolutionizing it into a new micro-uarch. It's not a VLIW-to-GCN jump. It's like VLIW going from the Radeon 3xxx series -> 4xxx -> 5xxx and 6xxx, which is still going on in APUs and mobile discrete parts, where optimizations that benefit VLIW benefit all the downstream generations.

Looking at it objectively, I cannot see a scenario where game devs focusing on a complete AMD ecosystem, utilizing it to the max, somehow would not lead to benefits for AMD's desktop GPUs. Taking that position is completely contrary to logic.

Edit: The other scenario is that Maxwell is very GCN-like and suddenly whatever is optimized for GCN runs flawlessly on Maxwell as well... Likely? Not sure.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
GCN is just the start; GCN v2 isn't even here yet. So yeah, I'm pretty certain GCN is gonna be around for a while, considering their push for HSA with every new gen.

Is this magical Maxwell going to replace AMD in the consoles that most developers will code for?

If you people can hint at TWIMTBP with The Witcher 3 and Batman giving NV an advantage, why is it so hard to accept that developing games from the ground up for consoles would give AMD an advantage? Hmm...

It may be an advantage, but I am not going to buy a PC product based on the consoles unless it shows clear advantages beyond talk. Similar things were said when ATI/AMD won the Nintendo and Xbox contracts years ago and their graphics technologies were used.
 
Feb 19, 2009
10,457
10
76
The Wii was rubbish; it never influenced graphics like the Xbox and PS3 did. But as I've said, that was a different scenario: two different CPU archs and two different GPU archs (AMD and NV) across those two consoles. It made no sense for cross-platform games to be heavily optimized for one architecture; it only happened on PS3 exclusives. Now, the situation is entirely optimal for game developers to focus on optimizing their games for one architecture.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Why would developers ignore where most of the PC market share is? What if future Intel and Nvidia PC iterations are much more flexible and robust architectures?
 
Feb 19, 2009
10,457
10
76
Why would developers ignore where most of the PC market share is? What if future Intel and Nvidia PC iterations are much more flexible and robust architectures?

1. Cross-platform games outsell on consoles vs. PC by a massive margin; I'm pretty certain this is the reason for the "console portitis" syndrome. If devs are going to devote manpower and $ to optimization, they are going to do it for the most valuable segment: consoles. As a byproduct, AMD's PC GPUs will benefit. I'm not even going to pretend devs will devote any effort to optimizing for PC Radeons... it's a bonus side effect.

2. That is the unknown part. If Maxwell is very flexible, then it's smooth sailing, no worries! If not, IMO, it's going to look a lot like the CoH 2 performance deltas going forward.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I like the idea more so for the possibility of developers embracing x86 multi-cores and garnering more efficiency!
 