So why buy the 6xx series? (August '12)


AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
In summary you are saying some people will buy Nvidia products. No way. :sneaky:
Kepler offers nice efficiency and performance; and this balance may be appealing to some.

nVidia offers a strong brand name that may be appealing to some!

nVidia offers some feature differentiation that may be appealing to some!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Indeed! Some may purchase AMD as well for these potential strengths:

GCN offers strong compute capabilities and may be appealing to some!

GCN offers strong over-clocking prowess and may be appealing to some!

GCN offers the fastest single GPU performance and may be appealing to some!

GCN offers the most default ram and may be appealing to some!

GCN offers strong performance/value and may be appealing to some!

The key is that they both have strengths. Personally, I'd let each individual find the product that best fits their subjective tastes, tolerances and wallet - and simply enjoy their purchase and hopefully share views with others, pros and cons alike.

As for which one is better for everyone? Personally, I'd let the marketplace decide that.
 

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
Kepler offers nice efficiency and performance; and this balance may be appealing to some.

nVidia offers a strong brand name that may be appealing to some!

nVidia offers some feature differentiation that may be appealing to some!

I can almost visualize this being chanted to a beat, with cheerleaders clad in green.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
Well, I'm convinced. Anybody want to buy a couple of crappy 680 Lightnings? 100 bucks each.
 

PCboy

Senior member
Jul 9, 2001
847
0
0
so? scaling in 2 games doesn't mean anything when your library of 10 other titles doesn't work. Plus microstutter makes a big difference. Look at some of the reviews that show minimum fps too. A lot of the time the AMD cards have higher averages and maximums but the minimums are much lower than they should be. The experience during gameplay has to be affected by that.

Oh boy, not this again. I'm sure both of you have had quad-SLI experience and zero hard lock/CTD issues. There's a reason why people like myself and others (l88bastard, levesque, tsm106, vega, etc.) use 7970s for multimonitor setups instead of our 680/690s, and microstuttering is obviously not one of them. There's barely any microstutter from either vendor unless I max out the AA/AF at 5760x1200.

Anyways, I know I won't sway anyone's beliefs here so I'll just show myself out of this thread.
 

Axon

Platinum Member
Sep 25, 2003
2,541
1
76
Because VIA is the last x86 maker besides Intel and AMD. And they are still alive, tho heavily downsized. And AMD keeps getting smaller and smaller.

Intel needs AMD to avoid pesky monopoly lawsuits.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Here's a quote from the 7970 GHz Edition review from AnandTech: "The end result is that while AMD has tied NVIDIA for the single-GPU performance crown with the Radeon HD 7970 GHz Edition, the GeForce GTX 680 is still the more desirable gaming card. There are a million exceptions to this statement of course (and it goes both ways), but as we said before, these cards may be tied but they're anything but equal."
That's actually a really good quote that I think sums things up nicely. With performance as close as it is we have to fall back to evaluating products based on "softer" metrics, such as heat/noise and feature sets. At times GTX 600 and Radeon HD 7000 are widely different in these regards, and since these are "soft" metrics there is no right answer. Either card would be quite good depending on what your needs are, IMHO.

PS: Just going to add that when I opened this thread I was full of dread, but despite how the question has been posed so far you guys are staying on topic and focused. Kudos for that
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
so? scaling in 2 games doesn't mean anything when your library of 10 other titles doesn't work. Plus microstutter makes a big difference. Look at some of the reviews that show minimum fps too. A lot of the time the AMD cards have higher averages and maximums but the minimums are much lower than they should be. The experience during gameplay has to be affected by that.



Most games overstate the amount of memory in use, and you specifically said bandwidth. I know for a fact that memory bandwidth won't make the difference between unplayable and smooth fps with a new title that stresses the GPU like, say, the original Crysis did a few years back.


I'm sensitive to MS and tearing and am just an overall bit of a perfectionist about these things. I notice MS in Crossfire at anything <60fps with vsync on. Makes for fun times catering to a 60fps minimum, but so far so good.

With single cards I also don't like the dip from 60 to 40/30, though.

I'll say I prefer 1 card at 40fps over xxxfps with MS. But for me it's 60fps locked or bust these days.

Vsync is a must for MS, though this was the same with my 460 SLI setup. Some games are worse than others. Skyrim is a standout; Heaven also shows MS well.
 

The Alias

Senior member
Aug 22, 2012
647
58
91
That's actually a really good quote that I think sums things up nicely. With performance as close as it is we have to fall back to evaluating products based on "softer" metrics, such as heat/noise and feature sets. At times GTX 600 and Radeon HD 7000 are widely different in these regards, and since these are "soft" metrics there is no right answer. Either card would be quite good depending on what your needs are, IMHO.

The thing is, their statement that the 680 is the more desirable card no longer holds, because the 7970 GHz is only available in non-reference flavors that eliminate the heat and noise issues, so you're left with a stalemate between a $500 card and a $470 card.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
5 fps at 50 fps avg is a huge difference. It doesn't matter at 20fps b/c they're both slideshows, and it doesn't matter at 120 fps b/c the display wouldn't show a difference, but at 50 fps even a casual gamer would immediately see a difference.

Below 50fps, both would feel like garbage because the minimums are around 10 at times. Not fun.

SLI yes; Nvidia features PhysX, which I think will be replaced by DirectCompute; 3D Vision yes (but people hardly use that); adaptive v-sync, which I can't really tell apart from a frame rate limiter and need clarification on how the technologies are different, as the hwcanucks article didn't differentiate the two; then we have TXAA, which has only been implemented in one game.

Times have changed. The 680 retails for $30 more and performs worse in the majority of games at max settings (info garnered in other forums from people who had both cards). Not to mention the GHz Edition cards are all non-reference and don't have heat issues like the reference did.

DirectCompute =/= PhysX.

Framerate limiter will keep your cards just fast enough so as to keep your FPS at the level you select. For example, Skyrim has issues when you run 200fps. So I lock it to 80fps and all the flashing textures and AI weirdness goes away. The bonus is your cards will run cooler, use less power, and be much quieter. When I limit the FPS to 80 I still have vsync off to eliminate input lag. It just lets my cards run just enough to keep that 80fps and not waste power when it's not needed.
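
To picture what a limiter is actually doing: it just sleeps off whatever is left of the frame budget after each frame finishes, which is exactly why the cards end up cooler and quieter. A toy sketch of the idea in Python (run_capped and render_frame are made-up names for illustration; the real limiter obviously lives in the driver or a third-party tool):

Code:
import time

def run_capped(render_frame, cap_fps=80):
    # Toy frame limiter: render a frame, then sleep off the rest of the frame budget.
    budget = 1.0 / cap_fps                # 12.5 ms per frame at an 80 fps cap
    while True:
        start = time.perf_counter()
        render_frame()                    # stand-in for the game's real per-frame work
        left = budget - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)              # the GPU idles here instead of racing ahead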

Adaptive vsync is not the same, and I cannot believe you couldn't figure out the difference yourself. It's like regular vsync except without the FPS drops. Vsync without triple buffering drops to even divisors of your refresh rate when the fps goes down (assuming a 60Hz refresh rate). You don't go from 60fps to 45fps; instead you go to 30fps, then 20, then 15. That is why games sometimes stutter if you cannot keep a constant 60fps and you get a drop. It drops by a huge amount. Adaptive vsync will dynamically turn vsync off when you get the drops, so you don't get the stuttering effect and no 30fps hit, and turn vsync on again when you are at 60fps (or 120fps if you have a 120Hz monitor, or even 85Hz on a CRT). It never eliminates screen tearing completely, but it does eliminate the drastic FPS drop due to the lack of triple buffering.
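
The 30/20/15 staircase is just the arithmetic of having to wait a whole number of refresh intervals per frame. A quick back-of-the-envelope sketch (simplified double-buffered case; real drivers are messier):

Code:
from math import ceil

def vsynced_fps(raw_fps, refresh=60):
    # Double-buffered vsync: each frame waits for a whole number of refresh
    # intervals, so the displayed rate snaps to refresh/1, refresh/2, refresh/3, ...
    return refresh / ceil(refresh / raw_fps)

for raw in (59, 45, 25, 17):
    print(raw, "->", vsynced_fps(raw))    # 59 -> 30, 45 -> 30, 25 -> 20, 17 -> 15

So dropping just below 60fps costs you a whole step down to 30fps, which is exactly the stutter being described.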


Oh boy, not this again. I'm sure both of you have had quad-SLI experience and zero hard lock/CTD issues. There's a reason why people like myself and others (l88bastard, levesque, tsm106, vega, etc.) use 7970s for multimonitor setups instead of our 680/690s, and microstuttering is obviously not one of them. There's barely any microstutter from either vendor unless I max out the AA/AF at 5760x1200.

Anyways, I know I won't sway anyone's beliefs here so I'll just show myself out of this thread.

Who was talking about multimonitor? Who mentioned quad SLI? Nobody...so you're replying to a ghost or something.

I'm sensitive to MS and tearing and am just an overall bit of a perfectionist about these things. I notice MS in Crossfire at anything <60fps with vsync on. Makes for fun times catering to a 60fps minimum, but so far so good.

With single cards I also don't like the dip from 60 to 40/30, though.

I'll say I prefer 1 card at 40fps over xxxfps with MS. But for me it's 60fps locked or bust these days.

Vsync is a must for MS, though this was the same with my 460 SLI setup. Some games are worse than others. Skyrim is a standout; Heaven also shows MS well.

I've grown accustomed to turning vsync off no matter what; that way I get max performance with no input lag. To me input lag is far worse than any tearing (if I'm above 70fps I don't see much tearing anyhow). Microstutter is something I ignore if I see it. It's never bad enough to affect me.


The thing is, their statement that the 680 is the more desirable card no longer holds, because the 7970 GHz is only available in non-reference flavors that eliminate the heat and noise issues, so you're left with a stalemate between a $500 card and a $470 card.

Power consumption too...can't forget that. Plus did you notice? The way AMD chose to compete with Nvidia is to market Boost clocks! That's right...take the same idea Nvidia had and use it on their cards. Only wait, they upped the voltage too much and as a result shot the power consumption up unnecessarily. Nvidia still does Boost better than AMD.
 

The Alias

Senior member
Aug 22, 2012
647
58
91
Framerate limiter will keep your cards just fast enough so as to keep your FPS at the level you select. For example, Skyrim has issues when you run 200fps. So I lock it to 80fps and all the flashing textures and AI weirdness goes away. The bonus is your cards will run cooler, use less power, and be much quieter. When I limit the FPS to 80 I still have vsync off to eliminate input lag. It just lets my cards run just enough to keep that 80fps and not waste power when it's not needed.

Adaptive vsync is not the same, and I cannot believe you couldn't figure out the difference yourself. It's like regular vsync except without the FPS drops. Vsync without triple buffering drops to even divisors of your refresh rate when the fps goes down (assuming a 60Hz refresh rate). You don't go from 60fps to 45fps; instead you go to 30fps, then 20, then 15. That is why games sometimes stutter if you cannot keep a constant 60fps and you get a drop. It drops by a huge amount. Adaptive vsync will dynamically turn vsync off when you get the drops, so you don't get the stuttering effect and no 30fps hit, and turn vsync on again when you are at 60fps (or 120fps if you have a 120Hz monitor, or even 85Hz on a CRT). It never eliminates screen tearing completely, but it does eliminate the drastic FPS drop due to the lack of triple buffering.

Power consumption too...can't forget that. Plus did you notice? The way AMD chose to compete with Nvidia is to market Boost clocks! That's right...take the same idea Nvidia had and use it on their cards. Only wait, they upped the voltage too much and as a result shot the power consumption up unnecessarily. Nvidia still does Boost better than AMD.


No, my point was that adaptive v-sync turns it off and on depending on whether your GPU will drop below your monitor's refresh rate. So my question was: why not set your framerate limiter to, like, five fps below the refresh rate (i.e. 70fps for a 75Hz monitor) and benefit the exact same way as having adaptive v-sync AND not have input lag?

As for power, let's break this down:

At full load, according to Guru3D:

7970 GHz - 680 = 55W
7970 - 670 = 50W
7950 - 660 Ti = 5W

And at long idle, according to AnandTech:

7970 GHz - 680 = -10W
7970 - 670 = -7W
7950 - 660 Ti = -8W

Now let's say you game 4 hours a day, leaving 8 hours for sleep, 8 hours for work and 4 hours for family time (the family uses another computer when they're home and need one). So the equation I'm going to use is:

4 x the full-load difference of a matchup + 20 x the long-idle difference of a matchup. Then I'll convert that into cost using a rate of $0.09 per kWh (my town's rate).

Now with that said, let's get some results:

7970 GHz - 680 = 4*(55W) + 20*(-10W) = 220Wh - 200Wh = 20Wh = $0.0018
7970 - 670 = 4*(50W) + 20*(-7W) = 200Wh - 140Wh = 60Wh = $0.0054
7950 - 660 Ti = 4*(5W) + 20*(-8W) = 20Wh - 160Wh = -140Wh = -$0.0126

And this is over the course of a day. So it's really not even worth arguing over, since the difference you pay over a month is not even a dollar, let alone something substantial.
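
If anyone wants to sanity-check that math or plug in their own hours and electricity rate, it only takes a few lines of Python (same assumptions as above: 4 hours at load, 20 hours at long idle, $0.09 per kWh, and the Guru3D/AnandTech deltas):

Code:
RATE = 0.09                      # $ per kWh
LOAD_HOURS, IDLE_HOURS = 4, 20   # gaming hours vs long-idle hours per day

pairs = {                        # (load delta W, long-idle delta W)
    "7970 GHz vs 680": (55, -10),
    "7970 vs 670":     (50, -7),
    "7950 vs 660 Ti":  (5, -8),
}

for name, (load_w, idle_w) in pairs.items():
    wh_per_day = LOAD_HOURS * load_w + IDLE_HOURS * idle_w      # watt-hours per day
    dollars_per_day = wh_per_day / 1000 * RATE
    print(f"{name}: {wh_per_day:+.0f} Wh/day -> {dollars_per_day:+.4f} $/day,"
          f" {dollars_per_day * 30:+.2f} $/month")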

I wouldn't call Nvidia's way better, as their "boost" feature makes their clocks go up way beyond specification, and I find that really sneaky.

Overall I don't like the "boost" feature on either architecture, although I prefer AMD's because "boost" goes out the window when you OC, whereas on Nvidia systems you have to tweak that as well.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
No, my point was that adaptive v-sync turns it off and on depending on whether your GPU will drop below your monitor's refresh rate. So my question was: why not set your framerate limiter to, like, five fps below the refresh rate (i.e. 70fps for a 75Hz monitor) and benefit the exact same way as having adaptive v-sync AND not have input lag?

Simple, with vsync off you get a ton of screen tearing when your FPS is low like that. I've noticed that when you get up past 70fps the screen tearing seems to lessen but when I'm at 50fps it's terrible.
 

The Alias

Senior member
Aug 22, 2012
647
58
91
Simple, with vsync off you get a ton of screen tearing when your FPS is low like that. I've noticed that when you get up past 70fps the screen tearing seems to lessen but when I'm at 50fps it's terrible.
Oh, I never noticed it below 75fps, tbh.
 