Gigabyte GTX680 retail pictures

Eugene86

Member
Dec 18, 2007
160
0
71
What is this NVidia-weighted benchmarks nonsense? People who buy a card buy it for the games they play. I couldn't care less if Crysis 2, Metro 2033, Skyrim or BF3 is optimized for NV or AMD. Why does this always turn into NV vs. AMD "optimized" to try to defend one card or another? In those very games, the HD7970 thrashes the HD6970. Are they HD7970 optimized then? Gamers buy the fastest card for the games they play. In those games, NV has the fastest card. It's like saying all the benchmarks where SB wins vs. Bulldozer are "Intel optimized". People don't care - they buy the fastest processor for their specific tasks.

It's not my opinion. In Tom's review it beats the HD7970 at 1080P in a bunch of key games (Crysis 2, Metro 2033, Skyrim, Dirt 3, Battlefield 3). That's not conclusive, but it has an 11-25% lead in that review.



If that review is accurate, then even if GTX680 is $550, HD7970 would need a price cut.



It doesn't negate all other factors. I never said it does. But in this case, the HD7970 doesn't even win and it has a louder cooler (that's 0/2). The HD7970's reference cooler is way too loud for serious overclocking at comfortable noise levels, and here a stock GTX680 is on avg. 18% faster to begin with. Obviously there are some HD7970s for $550 with non-reference coolers (such as the PowerColor PCS+), but for people who want a quieter blower card, the GTX680 seems to have the edge already. Plus, rumored overclocks of 1300MHz on a stock cooler seem promising. The HD7970 can't run at those speeds with a reference cooler at comfortable noise levels.

Again from that review, GTX680 is a better card in almost every way imaginable. Let's wait for other reviews to see if this holds.

Thank you for saving me the time to type this out
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
What is this NVidia-weighted benchmarks nonsense? People who buy a card buy it for the games they play. I couldn't care less if Crysis 2, Metro 2033, Skyrim or BF3 is optimized for NV or AMD. Why does this always turn into NV vs. AMD "optimized" to try to defend one card or another? In those very games, the HD7970 thrashes the HD6970. Are they HD7970 optimized then? Gamers buy the fastest card for the games they play. In those games, NV has the fastest card. It's like saying all the benchmarks where SB wins vs. Bulldozer are "Intel optimized". People don't care - they buy the fastest processor for their specific tasks.
Not sure if serious...

So you don't even question why they didn't use MSAA in Skyrim or why they didn't use FXAA in BF3? Where's the multi-monitor performance, especially since that's a major new feature on the GTX 680? Yet these 6 tests are enough for you to proclaim:
So as an all around card, GTX680 looks better already.
I guess I like to see more games tested at higher IQ settings before forming a definitive opinion.
It's not my opinion. In Tom's review it beats the HD7970 at 1080P in a bunch of key games (Crysis 2, Metro 2033, Skyrim, Dirt 3, Battlefield 3). That's not conclusive, but it has an 11-25% lead in that review.

If that review is accurate, then even if GTX680 is $550, HD7970 would need a price cut.
So now this review is no longer surefire evidence that the GTX 680 is the all around better card? I'm not quite sure what you're arguing here since you've switched your stance now.
It doesn't negate all other factors. I never said it does. But in this case, the HD7970 doesn't even win and it has a louder cooler (that's 0/2). The HD7970's reference cooler is way too loud for serious overclocking at comfortable noise levels, and here a stock GTX680 is on avg. 18% faster to begin with.
A 4dB difference is "way too loud"? You realize that 4dB isn't even a perceptible difference in most environments without a quantitative measure from sound equipment?
Obviously there are some HD7970s for $550 with non-reference coolers (such as the PowerColor PCS+), but for people who want a quieter blower card, the GTX680 seems to have the edge already. Plus, rumored overclocks of 1300MHz on a stock cooler seem promising. The HD7970 can't run at those speeds with a reference cooler at comfortable noise levels.
But there hasn't even been a meta-analysis of overclocking results yet. Besides, a 7970 would only have to hit ~1200MHz to match the same overclock of a GTX 680 at 1300MHz. 1200MHz is easily doable on the reference cooler without touching the stock fan profile. I'm not sure where you're getting your information on the 7970. Also, how are you so sure that the GTX 680 did 1300MHz at "comfortable" noise levels?
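For what it's worth, here's the rough arithmetic behind that: a minimal sketch assuming the published reference clocks (925MHz for the HD 7970, 1006MHz base for the GTX 680) and assuming an equal percentage overclock is the fair comparison.

# Rough sketch: what clock gives each card the same percentage overclock.
# Assumed reference clocks: 925 MHz (HD 7970), 1006 MHz (GTX 680 base).
HD7970_STOCK_MHZ = 925
GTX680_STOCK_MHZ = 1006

gtx680_oc_mhz = 1300
oc_ratio = gtx680_oc_mhz / GTX680_STOCK_MHZ      # ~1.29, about a 29% overclock
hd7970_match_mhz = HD7970_STOCK_MHZ * oc_ratio   # ~1195 MHz

print(f"GTX 680 at {gtx680_oc_mhz} MHz is a {oc_ratio - 1:.0%} overclock")
print(f"The same percentage on a 7970 is roughly {hd7970_match_mhz:.0f} MHz")

That's where the ~1200MHz figure comes from.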
Again from that review, GTX680 is a better card in almost every way imaginable. Let's wait for other reviews to see if this holds.
Waiting for other reviews would be a good idea.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Not sure if serious...

So you don't even question why they didn't use MSAA in Skyrim or why they didn't use FXAA in BF3? Where's the multi-monitor performance, especially since that's a major new feature on the GTX 680? Yet these 6 tests are enough for you to proclaim:
I guess I like to see more games tested at higher IQ settings before forming a definitive opinion.
So now this review is no longer surefire evidence that the GTX 680 is the all around better card? I'm not quite sure what you're arguing here since you've switched your stance now.
A 4dB difference is "way too loud"? You realize that 4dB isn't even a perceptible difference in most environments without a quantitative measure from sound equipment?
But there hasn't even been a meta-analysis of overclocking results yet. Besides, a 7970 would only have to hit ~1200MHz to match the same overclock of a GTX 680 at 1300MHz. 1200MHz is easily doable on the reference cooler without touching the stock fan profile. I'm not sure where you're getting your information on the 7970. Also, how are you so sure that the GTX 680 did 1300MHz at "comfortable" noise levels?
Waiting for other reviews would be a good idea.


1) He's right, it doesn't matter if the game is NV or AMD optimized or any BS like that. You buy the card for the games that you play, it's not our fault that NV has 10x the developer relations that AMD has. If you have a problem with it, bring it up with AMD.

2) 4dB is definitely perceivable. I don't think you realize dB is a logarithmic scale and that a 4dB increase is almost 50% more perceived noise.

3) Stop comparing overclocks. Overclocks are always an unknown.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I think you will see those with 7970s already in their sig 'downplay' the GTX 680, and everyone else identify the 680 as a better card for the price vs. the 7970 (at the current price). I agree that IF the 680 is $499, it's a better card for everything except 1600P (appears to be a wash) and 2x/3x displays, where the extra RAM and bandwidth help AMD. It will be really interesting to see where the 4GB cards come into play down the line.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
1) He's right, it doesn't matter if the game is NV or AMD optimized or any BS like that. You buy the card for the games that you play, it's not our fault that NV has 10x the developer relations that AMD has. If you have a problem with it, bring it up with AMD.
That's not the point I'm arguing, so I have no idea what you're getting at. My point is that it isn't prudent to generalize overall performance from a set of cookie cutter press box reviews. However, if you're playing HAWX 2 with tessellation disabled, then this is the card you've been waiting for.
2) 4dB is definitely perceivable. I don't think you realize dB is a logarithmic scale and that a 4dB increase is almost 50% more perceived noise.
No it's not, and this is a common error I see made by people who don't understand the decibel unit (or sound in general). In short, a decibel relates a quantity, in this case intensity, to a reference. Sound pressure does not equate to perceived loudness. In practice, perceived loudness roughly doubles with an order-of-magnitude increase in sound intensity (about +10dB), but this only holds in the middle range of human hearing. Something as soft as 40dB isn't in that range, and as I said, a difference of 4dB there would be hard to perceive. This also isn't even taking the frequency/tonal quality into consideration, which is probably what most reviews will report on.
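If it helps, here's a rough sketch of the usual rules of thumb (and they are only rules of thumb, valid mainly for mid-level, mid-frequency sound measured under identical conditions): intensity scales as 10^(dB/10), sound pressure as 10^(dB/20), and perceived loudness roughly doubles per +10dB.

# Rough psychoacoustics sketch; rules of thumb only.
delta_db = 4.0

intensity_ratio = 10 ** (delta_db / 10)   # physical intensity: ~2.5x
pressure_ratio = 10 ** (delta_db / 20)    # sound pressure: ~1.6x
loudness_ratio = 2 ** (delta_db / 10)     # perceived loudness: ~1.3x

print(f"+{delta_db:.0f} dB -> intensity x{intensity_ratio:.2f}, "
      f"pressure x{pressure_ratio:.2f}, perceived loudness x{loudness_ratio:.2f}")

So even by the generous rule of thumb, +4dB works out to something like 30% louder, and that's before any of the measurement caveats.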

3) Stop comparing overclocks. Overclocks are always an unknown.
And yet most of us here do it, so it will be discussed.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
That's not the point I'm arguing, so I have no idea what you're getting at. My point is that it isn't prudent to generalize overall performance from a set of cookie cutter press box reviews. However, if you're playing HAWX 2 with tessellation disabled, then this is the card you've been waiting for.
No it's not, and this is a common error I see made by people who don't understand the decibel unit (or sound in general). In short, a decibel relates a quantity, in this case intensity, to a reference. Sound pressure does not equate to perceived loudness. In practice, perceived loudness roughly doubles with an order-of-magnitude increase in sound intensity (about +10dB), but this only holds in the middle range of human hearing. Something as soft as 40dB isn't in that range, and as I said, a difference of 4dB there would be hard to perceive. This also isn't even taking the frequency/tonal quality into consideration, which is probably what most reviews will report on.

And yet most of us here do it, so it will be discussed.

What cookie cutter press box reviews? Tom's (as bad as that site is) benched all the regular, popular games (BF3, Dirt 3, Skyrim, Crysis 2), most of which are present in the set AT uses. When Ryan's review comes out, are you going to say those games are also cookie cutter press box? Nobody cares about HAWX 2, and that's not why people will buy the GTX 680. They will buy it because (in order of importance)

1) Faster
2) More features (CUDA, Physx, FXAA/TXAA, Adaptive Vsync, and so on)
3) Cheaper (?), or at least more perf/$
4) Quieter
5) Less power
6) Cooler
7) I could go on but I'll stop.

NV has had points 1 and 2 down for quite a bit now, but it's been a while since they've had points 3, 4, 5, and 6. There is now absolutely zero reason to buy a 7970, unless it gets at least a $100 price cut.

And in regards to dB, I think it's you who doesn't understand. Open any audio editing program and give your waveform a +4dB volume increase, and please tell me if you can hear the difference.

Discuss overclocking all you want; it means nothing when comparing products. Do you ever hear a car company trying to sell you their car because, if you mod the engine, it becomes better than the competitor's?
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
What cookie cutter press box reviews? Tom's (as bad as that site is) benched all the regular, popular games (BF3, Dirt 3, Skyrim, Crysis 2), most of which are present in the set AT uses. When Ryan's review comes out, are you going to say those games are also cookie cutter press box? Nobody cares about HAWX 2, and that's not why people will buy the GTX 680.
Ryan benches a wider spectrum of games at much higher quality settings. No one buys a $500 graphics card to play HAWX 2 with tessellation disabled, or to play Skyrim without any MSAA, yet you're using those results to argue that the GTX 680 is a better buy. You don't see the irony?
They will buy it because (in order of importance)

1) Faster
2) More features (CUDA, Physx, FXAA/TXAA, Adaptive Vsync, and so on)
3) Cheaper (?), or at least more perf/$
4) Quieter
5) Less power
6) Cooler
7) I could go on but I'll stop.
I think the power draw and the features are the main selling points for the GTX 680. The "faster," "better performance/$," and "quieter" points are all circumstantial at best. As I said, there aren't enough reviews out to form an informed opinion, yet you're already championing the GTX 680. It's fine to be excited about new hardware or to be a fan of nvidia, but if I see opinions I disagree with or statements that are just plain incorrect (such as what was written about dB), I will post my own opinion or correction.
NV has had points 1 and 2 down for quite a bit now, but it's been a while since they've had points 3, 4, 5, and 6. There is now absolutely zero reason to buy a 7970, unless it gets at least a $100 price cut.
Not really, I haven't seen any definitive data on high-resolution performance. I'll be quite surprised if the GTX 680 can match the 7970 at 1600p+ gaming, especially once they're overclocked and the settings are cranked. Hence I'm eagerly awaiting [H]'s enthusiast-targeted reviews. If I had to extrapolate, I think the GTX 680 will take the lead at 1080p, but high-resolution gaming has been an Achilles' heel for them.
And in regards to dB, I think it's you who doesn't understand. Open any audio editing program and give your waveform a +4dB volume increase, and please tell me if you can hear the difference.
Again, you've simplified the concept to the point that you're incorrect. As I've said, it's not only the change in intensity, but also its position relative to the reference as well as its sound signature. That's a lot of data to interpret from a single reading by a decibel meter. Furthermore, I guarantee someone could run into your room, change your stereo's gain by 4dB, and you wouldn't notice a bit of difference when you turned your music on. Also consider that they could literally move the sound meter a foot in any direction and that would be enough to change the reading by 4dB due to resonance and interference within the test bed. So not only is it a question of the human ear simply not being that sensitive, but also a matter of testing fidelity and the large margin of error this type of testing entails. If you want to learn more about this, I really like this link: http://www.sengpielaudio.com/calculator-levelchange.htm . You can play with the calculators to better understand these concepts. If not, ask away or start another thread and I'll be happy to explain it in detail.

Discuss overclocking all you want; it means nothing when comparing products. Do you ever hear a car company trying to sell you their car because, if you mod the engine, it becomes better than the competitor's?
I take it by your use of a car analogy that you have no defense for your weak point. Let me reiterate that this is an enthusiast community and most of us here overclock. Overclocking meta-analyses and averages are important information for consumers; it's the reason the 5850s, GTX 460s, and 7970s are popular among enthusiasts. If you don't overclock or don't care for it, kindly ignore the posts about it.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
There is now absolutely zero reason to buy a 7970, unless it gets at least a $100 price cut.

Hehehe. Wait till you get a bigger screen, go multi-monitor, use 2049MB of frame buffer, overclock to 1250/1800, or perhaps bench higher for HWBot. An OC'd 7970 CF setup is already beating GTX680 SLI in 3DMark 11 Extreme.

The 680 is an awesome card - I'm not knocking it, I'm just giving the reasons you'd want a 7970. The higher the resolution, the faster the 7970 is in comparison, to the point that the 680 loses at 1600p in Metro. High-resolution setups don't stop at 1600p - there's SSAA and multi-display beyond that.

Fuad wrote that the 680's "price just dropped this morning" to $499. Since it's already listed for pre-order at $556-$590, it would appear you are correct about a $100 price cut.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Ryan benches a wider spectrum of games at much higher quality settings. No one buys a $500 graphics card to play HAWX 2 with tessellation disabled, or to play Skyrim without any MSAA, yet you're using those results to argue that the GTX 680 is a better buy. You don't see the irony?
I think the power draw and the features are the main selling points for the GTX 680. The "faster," "better performance/$," and "quieter" points are all circumstantial at best. As I said, there aren't enough reviews out to form an informed opinion, yet you're already championing the GTX 680. It's fine to be excited about new hardware or to be a fan of nvidia, but if I see opinions I disagree with or statements that are just plain incorrect (such as what was written about dB), I will post my own opinion or correction.
Not really, I haven't seen any definitive data on high-resolution performance. I'll be quite surprised if the GTX 680 can match the 7970 at 1600p+ gaming, especially once they're overclocked and the settings are cranked. Hence I'm eagerly awaiting [H]'s enthusiast-targeted reviews. If I had to extrapolate, I think the GTX 680 will take the lead at 1080p, but high-resolution gaming has been an Achilles' heel for them.
Again, you've simplified the concept to the point that you're incorrect. As I've said, it's not only the change in intensity, but also its position relative to the reference as well as its sound signature. That's a lot of data to interpret from a single reading by a decibel meter. Furthermore, I guarantee someone could run into your room, change your stereo's gain by 4dB, and you wouldn't notice a bit of difference when you turned your music on. Also consider that they could literally move the sound meter a foot in any direction and that would be enough to change the reading by 4dB due to resonance and interference within the test bed. So not only is it a question of the human ear simply not being that sensitive, but also a matter of testing fidelity and the large margin of error this type of testing entails. If you want to learn more about this, I really like this link: http://www.sengpielaudio.com/calculator-levelchange.htm . You can play with the calculators to better understand these concepts. If not, ask away or start another thread and I'll be happy to explain it in detail.

I take it by your use of a car analogy that you have no defense for your weak point. Let me reiterate that this is an enthusiast community and most of us here overclock. Overclocking meta-analyses and averages are important information for consumers; it's the reason the 5850s, GTX 460s, and 7970s are popular among enthusiasts. If you don't overclock or don't care for it, kindly ignore the posts about it.


Haha, speaking of irony, that's cute. Those benchmarks aren't CPU limited; you don't need to add MSAA to see the potential of the card. As the workload gets tougher, performance will scale proportionally until you hit a bottleneck (buffer or bandwidth). There is absolutely nothing (minus exotic hardware/software scenarios) that requires over 2GB of buffer, and the bandwidth is comparable to a GTX 580. So please. Any discrepancy you see at 2560 in particular games is most likely just caused by the early driver (wouldn't be the first time for NV).
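On the bandwidth point, a quick check from the published memory specs (256-bit at 6008MT/s effective for the GTX 680, 384-bit at 4008MT/s for the GTX 580 - treat those as the assumed reference numbers) bears this out:

# Memory bandwidth = (bus width in bytes) * (effective transfer rate).
# Assumed reference specs: GTX 680 = 256-bit @ 6008 MT/s, GTX 580 = 384-bit @ 4008 MT/s.
def bandwidth_gb_s(bus_bits, effective_mts):
    return (bus_bits / 8) * effective_mts / 1000  # GB/s

print(f"GTX 680: {bandwidth_gb_s(256, 6008):.1f} GB/s")  # ~192.3 GB/s
print(f"GTX 580: {bandwidth_gb_s(384, 4008):.1f} GB/s")  # ~192.4 GB/s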

I don't understand why you keep beating this dB drum and bringing in all sorts of variables. I've already stated: apples to apples, 4dB is almost 50% more perceived noise over whatever the baseline is (if the baseline is low, obviously the change will be hard to perceive). All your BS about pitch, positioning, and room is irrelevant when comparing two products in an identical environment. My god, have you ever taken a science course?

My car analogy is perfect. And this is far from an enthusiast community. Anandtech is probably the largest tech site in the world; it has perhaps millions of forum users, and there are tons of average users here who wouldn't even consider overclocking. The kind of community you describe is at [H] or XtremeSystems.



Hehehe. Wait till you get a bigger screen, go multi-monitor, use 2049MB of frame buffer, overclock to 1250/1800, or perhaps bench higher for HWBot. An OC'd 7970 CF setup is already beating GTX680 SLI in 3DMark 11 Extreme.

The 680 is an awesome card - I'm not knocking it, I'm just giving the reasons you'd want a 7970. The higher the resolution, the faster the 7970 is in comparison, to the point that the 680 loses at 1600p in Metro. High-resolution setups don't stop at 1600p - there's SSAA and multi-display beyond that.

Fuad wrote that the 680's "price just dropped this morning" to $499. Since it's already listed for pre-order at $556-$590, it would appear you are correct about a $100 price cut.


I already have a 2560x1440 display, and I'm the only one with that resolution in my circle of real-life acquaintances. That goes to show you how "common" this resolution is. I know a total of zero people who game with multiple displays or 3D. The only scenario where I've witnessed 1536MB not being enough is BF3 at 1440p with 4xMSAA. I'm already outside the norm; what kind of setup would put the 7970 in a favorable light? 3x30" monitors? 12 megapixels? Fine, for the whole handful of those folks we'll recommend 7970s.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
I already have a 2560x1440 display, and I'm the only one with that resolution in my circle of real-life acquaintances. That goes to show you how "common" this resolution is. I know a total of zero people who game with multiple displays or 3D. The only scenario where I've witnessed 1536MB not being enough is BF3 at 1440p with 4xMSAA. I'm already outside the norm; what kind of setup would put the 7970 in a favorable light? 3x30" monitors? 12 megapixels? Fine, for the whole handful of those folks we'll recommend 7970s.

Agreed.

The 680 will be the most popular by far. <- I bet you agree with this.

Profit margins on a low-power 294mm2 die selling for the same price as a high-power, low-yield 550mm2 die are going to make Nvidia a killing, too.
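For a rough sense of scale, here's the standard dies-per-wafer approximation applied to those two die sizes. It ignores yield and defect density entirely, so treat it as a sketch, not real numbers:

# Crude dies-per-wafer estimate (ignores defects and yield):
# dies ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A), for wafer diameter d and die area A.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return gross - edge_loss

print(f"~{dies_per_wafer(294):.0f} candidate dies per 300mm wafer at 294 mm^2")  # ~200
print(f"~{dies_per_wafer(550):.0f} candidate dies per 300mm wafer at 550 mm^2")  # ~100

Roughly twice the candidate dies per wafer before yield even enters the picture.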

World records will still be set with 1.8GHz 7970s, unless GK104 has some amazing scaling with overclocks. I've seen very few 3x30" setups - Vega and Ryan Shrout are about all I can think of off the top of my head.
 

Diceman2037

Member
Dec 19, 2011
54
0
66
Intel HD3000 will not appear in your device manager unless you are using Lucid Virtu software on a Z68 board. It is not in my DM - P8Z68 board. Secondly, HD3000 will never show if you run Heaven DX11 in fullscreen (at least it didn't with the older version). I seriously doubt many people use it because it's a buggy piece of crap that lowers your discrete GPU performance (benchmarks on AT or TH prove this, which is why most don't use it).

I dunno, it just seems sketchy. We'll find out soon enough though; if it's true, it looks like new toys this week for me. Then I will bug balla to figure out the ways of water cooling.

It can and does, and it has and will again for many people.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Haha, speaking of irony, that's cute. Those benchmarks aren't CPU limited; you don't need to add MSAA to see the potential of the card. As the workload gets tougher, performance will scale proportionally until you hit a bottleneck (buffer or bandwidth). There is absolutely nothing (minus exotic hardware/software scenarios) that requires over 2GB of buffer, and the bandwidth is comparable to a GTX 580. So please. Any discrepancy you see at 2560 in particular games is most likely just caused by the early driver (wouldn't be the first time for NV).
You couldn't be more wrong. Different AA modes, resolutions, etc. tax different components of the GPU. Where shader power might limit one scenario, texture fillrate can easily limit another within the same game.
I don't understand why you keep beating this dB drum and bringing in all sorts of variables. I've already stated: apples to apples, 4dB is almost 50% more perceived noise over whatever the baseline is (if the baseline is low, obviously the change will be hard to perceive). All your BS about pitch, positioning, and room is irrelevant when comparing two products in an identical environment. My god, have you ever taken a science course?
Yes, many. I've actually interned with an audiologist and worked on hearing aids; you really don't want to bark up that tree. As I've stated (three times now), you've reduced the concept to the point that you're wrong. I'm not sure if this discussion simply eclipses your understanding of sound physics, but the real world behaves much differently than your high school-level physics class led you to believe. I suggest you read through the link that I posted to become educated on the subject for a future discussion, and stop derailing this thread further.
My car analogy is perfect. And this is far from an enthusiast community. Anandtech is probably the largest tech site in the world; it has perhaps millions of forum users, and there are tons of average users here who wouldn't even consider overclocking. The kind of community you describe is at [H] or XtremeSystems.
[H] forums actually get more traffic than AT forums, so no. Really, are you just arguing anything here to support the GTX 680/nvidia? If so, I won't waste my time.
I already have a 2560x1440 display, and I'm the only one with that resolution in my circle of real-life acquaintances. That goes to show you how "common" this resolution is. I know a total of zero people who game with multiple displays or 3D. The only scenario where I've witnessed 1536MB not being enough is BF3 at 1440p with 4xMSAA. I'm already outside the norm; what kind of setup would put the 7970 in a favorable light? 3x30" monitors? 12 megapixels? Fine, for the whole handful of those folks we'll recommend 7970s.
$550 graphics cards are enthusiast graphics cards for enthusiast setups. Nobody competent buys a $550 graphics card to play at 1680x1050.
 

Diceman2037

Member
Dec 19, 2011
54
0
66
Hard to call that from one set of reviews. Every single one of those games has performed better on NVIDIA cards in the past, so it just seems they're playing to their strengths. For instance, why no 4xMSAA or 8x MSAA in Skyrim, especially since it runs so easily? Who cares about HAWX 2 results? BF3 with MSAA has been a strength for NVIDIA in the past, why all of a sudden do they not use FXAA there? Is FXAA all of a sudden no good for BF3?

Tom's has historically been in someone's pocket, so I'm not surprised. Can't wait for more reviews though. :thumbsup:

MSAA is used in Skyrim as part of the quality profile; High automatically sets 8x MSAA. You can't exactly claim to be using the High preset without MSAA at 8x.
 
Last edited:

Eugene86

Member
Dec 18, 2007
160
0
71
$550 graphics cards are enthusiast graphics cards for enthusiast setups. Nobody competent buys a $550 graphics card to play at 1680x1050.

I play at 1680x1050 and I am planning on buying a 680 once it's released. Why? Because my GTX 560s in SLI are not adequate to run Battlefield 3 at my preferred settings at a minimum of 60 fps.

The 680 looks like the perfect solution for me, as I will get a great-performing card and will be able to get rid of my SLI setup.

You seem to think that only enthusiasts would buy this card. What better option do I have? Should I go Tri-SLI? Quad-SLI? What other option is going to be a better choice in terms of performance and price?
 

chimaxi83

Diamond Member
May 18, 2003
5,456
61
101
I would put zero stock in a benchmark where AMD users can simply disable tessellation at the driver level.

The 3DMark report page (web page, not a screenshot) shows if tessellation settings were changed.

I don't know why people don't post up the URL instead of just a screenshot to shut down the bs nitpicking that happens when someone realizes another person has better shinies than them.

My best extreme score. All it says up top is "driver not approved," which it shows for us all. Oh, and my clocks are actually 1250/1750.
 
Last edited:

Crap Daddy

Senior member
May 6, 2011
610
0
0
There are two things that most 7970 owners or AMD supporters come up with to defend their purchase or their beloved brand in this situation:

overclockability and higher resolutions.

I can bet that GK104 will overclock at least as well as Tahiti, and at higher resolutions neither card alone will pump out the needed frames in those very few demanding games. So you will need two. If the 680 is indeed $50 cheaper, I'd certainly buy those, losing a few frames in the process.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
There are two things that most 7970 owners or AMD supporters come up with to defend their purchase or their beloved brand in this situation:

overclockability and higher resolutions.

I can bet that GK104 will overclock at least as well as Tahiti, and at higher resolutions neither card alone will pump out the needed frames in those very few demanding games. So you will need two. If the 680 is indeed $50 cheaper, I'd certainly buy those, losing a few frames in the process.

I'll gladly bet you that on average at 2560x1600 and eyefinity/surround resolutions the 7970 is faster than the 680.

The 680 is a great card - if you game at 1080P.

Otherwise it's meh - 20% faster than a 580 at 1600P? Making it on par with the 7970. All using Tom's results. Really? Only 20% faster than their past flagship. This is a performance increase you are impressed with?

Most people game @ 1080P, so the 680 is great, but it's just awful in that it brings absolutely nothing new to the table for high-resolution gaming.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
20% faster is still 10% faster than the 7970. Also twice as good.

Actually, 20% faster at 1600P is just catching up to the 7970. See here: the 7970 made a 46% improvement at 1600P over the 6970.



This chart with the 680 results will be a good data point to judge the card when reviews are out.
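To keep the arithmetic straight (the inputs below are just the percentages being thrown around in this thread, treated as hypothetical, not measured data), gains quoted over a common baseline convert to a head-to-head gap like this:

# Gains over a common baseline (here a GTX 580) convert to a head-to-head gap.
def gap(a_over_base, b_over_base):
    """How much faster A is than B, given each one's gain over the same baseline."""
    return (1 + a_over_base) / (1 + b_over_base) - 1

# If the 680 is 20% over a 580 and the 7970 is 10% over a 580 at a given resolution:
print(f"680 vs 7970: {gap(0.20, 0.10):+.0%}")           # about +9%
# If at 1600P the 7970 is itself ~20% over a 580, the gap closes to nothing:
print(f"680 vs 7970 at 1600P: {gap(0.20, 0.20):+.0%}")  # about +0%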

Puzzling to see a positive reception for the same thing I saw you complaining about at the 7970's launch: not enough of a performance increase. You expected 60% over a GTX580; it looks like nvidia delivered about 35%.

680 is a great card for 1080P and it's nice to have options. This is an appalling increase in performance for a flagship though.

Hopefully for people in the market for one or the other there will be price wars. With no clear across the board winner, pricing will become a point to differentiate on.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
The 3DMark report page (web page, not a screenshot) shows if tessellation settings were changed.

I don't know why people don't post up the URL instead of just a screenshot to shut down the bs nitpicking that happens when someone realizes another person has better shinies than them.

My best extreme score. All it says up top is "driver not approved," which it shows for us all. Oh, and my clocks are actually 1250/1750.

The program cannot tell if you've adjusted tessellation levels within the drivers.

That's why no AMD driver has ever been approved for 3DMark11.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Actually, 20% faster at 1600P is just catching up to the 7970. See here: the 7970 made a 46% improvement at 1600P over the 6970.



This chart with the 680 results will be a good data point to judge the card when reviews are out.

Puzzling to see a positive reception for the same thing I saw you complaining about at the 7970's launch: not enough of a performance increase. You expected 60% over a GTX580; it looks like nvidia delivered about 35%.

680 is a great card for 1080P and it's nice to have options. This is an appalling increase in performance for a flagship though.

Hopefully for people in the market for one or the other there will be price wars. With no clear across the board winner, pricing will become a point to differentiate on.

That's a solid one, and Computerbase.de is another site that offers a nice overall gauge, too.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Not sure if serious...

snip


But there hasn't even been a meta-analysis of overclocking results yet. Besides, a 7970 would only have to hit ~1200MHz to match the same overclock of a GTX 680 at 1300MHz. 1200MHz is easily doable on the reference cooler without touching the stock fan profile. I'm not sure where you're getting your information on the 7970. Also, how are you so sure that the GTX 680 did 1300MHz at "comfortable" noise levels?

How is 1200MHz easily doable on the stock reference cooler when the XFX BEDD version was unstable above 1150MHz?
http://www.anandtech.com/show/5314/...ouble-dissipation-the-first-semicustom-7970/6

Is this a driver release fix, or something?
 