AMD goes Fermi


Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Steam still shows AMD with 60% of the DX11 market, and the 5870 and 5850 having sold more than the 580, 570, 480, 470, 560, etc. Also, the 5770 is the #1 DX11 card sold.

What exactly were you trying to prove? That there are not many people who buy $700+ dual-GPU cards over $500 single-GPU cards? Or that the 6XXX series refresh did not sell as well as the 5XXX series it replaced?

I think your tap-dancing is ill-placed. Keep us posted on the Steam numbers over the next few months as the 7XXX series starts to saturate its way into the numbers.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Why would you think the 7XXX series would saturate the market when the 6XXX series didn't and was much lower in price? I can understand the 5XXX series, considering it offered nice value, was very balanced, and competed against a DirectX 10 product mix from nVidia.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Steam still shows AMD with 60% of the DX11 market, and the 5870 and 5850 having sold more than the 580, 570, 480, 470, 560, etc. Also, the 5770 is the #1 DX11 card sold.

And what did the "small dies" bring AMD?
Profit?
Or increased market share (by sacrificing said profits)...

What exactly were you trying to prove? That there are not many people who buy $700+ dual-GPU cards over $500 single-GPU cards? Or that the 6XXX series refresh did not sell as well as the 5XXX series it replaced?

Nice spin...but not biting.

I think your tap-dancing is ill-placed. Keep us posted on the Steam numbers over the next few months as the 7XXX series starts to saturate its way into the numbers.

I think your putting words into people's mouths is ill-placed...but we are all different.
Again, the majority of your post has nothing to do with what I wrote...what is up with that?
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
When the HD 5000 series cards were taking the sales lead and showing a shift in the margins in AMD's favor, people on this forum howled that Steam's survey didn't count.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
When the HD 5000 series cards were taking the sales lead and showing a shift in the margins in AMD's favor, people on this forum howled that Steam's survey didn't count.

I think the testament to the 5XXX series was that it not only had strong Steam numbers but actually had overall discrete leadership as well -- that was indeed impressive.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
As to who can't be bothered: there is so much wrong with your tying in the value of Cayman and some huge swing in GPU share while discounting facts like those I posted from August on add-in boards.
I can't imagine how embarrassing it must feel to have your own link prove you wrong, especially after I recommended you start reading the source material not five posts ago. In your own link, AMD says:
AMD said:
Graphics segment revenue was $367 million, down 11% compared to the prior quarter, mainly due to lower discrete mobile unit shipments and seasonality in the desktop discrete graphics add-in-board market. Graphics segment operating loss was $7 million, down $26 million from the prior quarter, primarily due to lower revenue and increased investments in our next-generation 28-nanometer leadership graphics offerings.
I find it funny that you think you can post any article with a vague reference to the discussion and it adds, never mind proves, your point. Do you think people here won't read them? That stuff works in middle school, not in the real world. I made a point about efficient chips being better buys and posted a source that showed NVIDIA was falling out of favor in consumer machines. The "facts" you posted on add-in boards have nothing to do with the argument we were discussing. In fact, the article itself states:
JPR said:
Over 16 million AIBs shipped in Q2 2011. Nvidia was the leader in unit shipments for the quarter, elevated by double attach and GPU-compute/CUDA sales.
Embarrassing. That's your argument: "b-b-b-b-but CUDA!"?
And information like AMD's GPU division losing money in Q2, which would support low share in the Steam survey.
edit : fixed link
One of the golden rules of statistics - correlation does not imply causality. Of course, anyone with the most basic knowledge of statistics would know that, thanks for proving where you stand to the forum. :thumbsup:
All I see is a badly disguised personal attack...with no arguments.

Steam is 30+ million users...if you call that niche, you should take your own advice...:ninja:
There are 60-80 million AIBs shipped a year, never mind GPUs and integrated graphics. 30 million is a drop in the bucket in comparison, and the fact that you thought it was something shows how out of touch you are with what we're discussing. Now stop deflecting; the burden of proof is on you. If you want to use it as a source, you prove it to the rest of us.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
And what did the "small dies" bring AMD?
Profit?
Or increased market share (by sacrificing said profits)...



Nice spin...but not biting.



I think your putting words into people's mouths is ill-placed...but we are all different.
Again, the majority of your post has nothing to do with what I wrote...what is up with that?

I don't care about AMD's profits; what do I care how much money they make? I'm a gamer, I play games, and all I care about is performance and whether my cards give me problems. Only whacked-out fanboys give a toss about which company makes more money.

My post again shows you talking out of your backside, saying the Steam numbers show things in favor of one company over the other when they show the opposite.
 
Sep 19, 2009
85
0
0
Steam's survey is so right; it is obvious the GTX 580 outsold the much slower (and unpopular) GTS 450, HD 5750, and HD 5670.

The ultra-high-end market is an expanding market; I estimate that by 2016, 1/3 of the world's population will be enjoying their $500+ cards.

I mean, who gives a f*** about low-end gaming? Everyone is rich now.
 

nsavop

Member
Aug 14, 2011
91
0
66
Market share only tells part of the story: AMD gained market share but sacrificed profits in doing so. Nvidia, on the other hand, priced their cards at a premium, favoring margins over market share. It's pretty obvious which business model was more successful.

Looking at the new CEO's quotes about being the aggressor/predator, and the premium they're charging for their upcoming GPUs (rightly so), I think AMD wants to maximize profits rather than market share going forward.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Were you worried when the GTX 480 was released 6 months after the 5870
Yes, taking an extra 6 months is a serious botch, and a company cannot survive repeats of that.
nVidia lost market share and distributors.

and was only 10% faster but consumed 50% more power
You mean, was inefficient in comparison. It could easily have been clocked to consume an identical amount of power, for the same noise and temp. It would have been slower if that had been done, though.

NVIDIA's working with the same 28nm process that AMD is, and they already botched one new architecture/new node launch.
Did they? AMD claims to have lunched the 7970, but I just checked on Newegg...
http://www.newegg.com/Product/Produc...scription=7970
I see a carpet, Lego, headphones, and a ring. I don't see a video card.
What about Google Shopping?
https://www.google.com/search?ix=hc...urce=og&sa=N&tab=wf&ei=Ghn9TtLIGIibsgKA0q2oAQ
Nope, nothing for the AMD 7970... except for reviews.

This is because it was not really lunched; AMD lied when they said they lunched the 7970, you literally cannot buy it in any store.
If nVidia lunches 3+ months after AMD on this one, I would call it botched.
But if they lunch within 1-2 months, then that is normal execution. If they lunch ahead, then it is well executed.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Yes, taking an extra 6 months is a serious botch, and a company cannot survive repeats of that.
nVidia lost market share and distributors.


You mean, was inefficient in comparison. It could easily have been clocked to consume an identical amount of power, for the same noise and temp. It would have been slower if that had been done, though.


Did they? AMD claims to have lunched the 7970, but I just checked on Newegg...
http://www.newegg.com/Product/Produc...scription=7970
I see a carpet, Lego, headphones, and a ring. I don't see a video card.
What about Google Shopping?
https://www.google.com/search?ix=hc...urce=og&sa=N&tab=wf&ei=Ghn9TtLIGIibsgKA0q2oAQ
Nope, nothing for the AMD 7970... except for reviews.

This is because it was not really lunched; AMD lied when they said they lunched the 7970, you literally cannot buy it in any store.
If nVidia lunches 3+ months after AMD on this one, I would call it botched.
But if they lunch within 1-2 months, then that is normal execution. If they lunch ahead, then it is well executed.


http://hardforum.com/showthread.php?t=1662199



Bought here: http://www.plaisio.gr/

Did 1125 core/1575 memory, the maximum possible in AMD Overdrive; no voltage can be added yet. Same overclocking results many of the reviewers' cards got, but this is a retail card. Notably, at this speed the card is about as fast as a GTX 590 or 6990 -- a monster.

I'm looking forward to whatever nv releases eventually too, but I'm a realist and have seen all the current rumours. Enjoy the 4+ month wait, and let's talk about it when it's actually here, and not an unannounced and unconfirmed product.

Initially, reading the OP, I thought this was just a discussion thread; now, seeing some more of your posts, you should have just called the thread 'Come in here and bitch about the 7970, nvidia fanboys'.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76

A bit of investigating shows that is a Greek store... which I searched and found does not offer the 7970. (Search 7970 on it; you find nothing.)

http://hardforum.com/showpost.php?p=1038200146&postcount=6

The guy claims he ordered it from that store VIA PHONE, because they apparently have them in stock but are not actually listing them on their website
(I checked; I don't speak Greek, but they have no product with 7970 in the name).

Either he is full of it, they are full of it, or something shady is going on...
Some guy on some forum CLAIMING to have bought one via phone is not proof that the product is actually available for purchase. Show me a link to a webstore page that actually sells it.

It's not on Amazon, it's not on Newegg, it's not on Google Shopping, and no, it's not on http://www.plaisio.gr/ either.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76

Seriously? You cannot buy it! The only proof you gave otherwise is that a forum poster on the HardOCP forum claims to have bought one via phone from a Greek web store... the web store he claims to have bought it from does NOT list it as a product they carry...
All I asked for is one single link to a product page with an actual buy option (not preorder -- buy now).

Also, please avoid personal attacks.
 
Last edited:

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Seriously? You cannot buy it! The only proof you gave otherwise is that a forum poster on the HardOCP forum claims to have bought one via phone from a Greek web store... the web store he claims to have bought it from does NOT list it as a product they carry...
All I asked for is one single link to a product page with an actual buy option (not preorder -- buy now).

Also, please avoid personal attacks.

But that guy actually has PROOF (he showed the Sapphire card he bought via phone, and it's not a Photoshop); now it's your turn to disprove it.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I've been following this argument and I can't say I agree, especially when you look at it from several perspectives.....

Which part? The part where there is only a 50-55W difference in power between high-end cards? You think that's material when it comes time for you to buy a high-end graphics card, in the context of overall system power consumption and in the context of us enthusiasts who add 100-150W of power to the CPU through overclocking alone?

Your entire section of that post addressed how the HD6950 and HD6970 are better values for the $$ vs. the 580, etc. How that came up I have no idea...because the only thing that was actually being talked about was that a total system with a GTX580 consumes about 15% more power than a similar system with an HD6970. You can't run a graphics card without a CPU, motherboard, RAM and hard drives, can you? So all in all, arguing about 50-60W of power difference between top GPUs is very strange to me. I mean, if I was sweating so hard about 50-60W of power, I'd disconnect all the lights in my house and live in total darkness, never put up a Xmas tree, hang dry all my laundry, bike to 7/11, never use a blow dryer, etc. The total system is already using 350-400W of power. People who care about power consumption so much are probably using i3s and gaming on HD5770s. For example, if the HD7970 used 250W of power instead of 200W, but came out clocked at 1.1GHz from the factory, you wouldn't be more impressed? That would be FAR more impressive to me.
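As a rough sketch of the arithmetic behind this line of argument (the four hours of gaming per day and the $0.12/kWh electricity rate below are illustrative assumptions, not figures from the thread):

```python
# Back-of-the-envelope arithmetic for what a ~50 W GPU power delta means at
# the system level and on the electricity bill. All inputs are illustrative.

def share_of_system_power(gpu_delta_w: float = 50, base_system_w: float = 375) -> float:
    """GPU power delta as a fraction of the total system draw."""
    return gpu_delta_w / (base_system_w + gpu_delta_w)

def extra_cost_per_year(extra_w: float = 50, hours_per_day: float = 4,
                        usd_per_kwh: float = 0.12) -> float:
    """Yearly cost of drawing `extra_w` watts for `hours_per_day` every day."""
    kwh_per_year = extra_w * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

print(f"Share of system power: {share_of_system_power():.0%}")      # ~12%
print(f"Extra electricity per year: ${extra_cost_per_year():.2f}")  # ~$8.76
```

Under these assumptions the delta is roughly 12% of total system draw and under $10 a year in electricity; heavier use or higher rates push it toward the $20/year figure mentioned later in the thread.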

This is why the 7970 is that much more impressive. To an average consumer, one might look at it and say "oh, it's only 20-25% faster than the GTX 580, that's not much." But then, once you start analyzing different metrics and become a better consumer, it really shines.

No one is arguing that GTX580 is a better buy than HD7970. I don't know why you keep defending it because obviously the 7970 is better. I am just shaking my head at "enthusiasts" who think there are *large* differences in power consumption among high-end GPUs, because 50W is minor, considering most of us here are running overclocked CPUs, etc. From an engineering perspective, it makes the 7970 more impressive, and it should considering it's a 28nm GPU.

I think what you are saying is that because the GTX580 has higher power consumption to begin with, with overclocking the power consumption difference will rise far above 50W, because the 28nm process is more power efficient. Agreed.

Any consumer regardless of technical knowledge can do a quick price/performance comparison and say, "wait, that's 20-25% extra performance for only $50 at the enthusiast level - that's unheard of, and it has double the vRAM!"

HD7970 > GTX580. No argument here.

Then, if that consumer is an enthusiast, one can look at the technical merits of the chip itself, such as the lower power consumption and overclockability, and the picture just gets better and better. As I stated above, if you crank the 7970 to a GTX 580's power consumption level, it will slaughter it.

Again, you are talking about 200-250W videocards. But like I said, you can't use a videocard by itself. So at the end of the day, it's still a 400 or 450W system. If, hypothetically, the GTX580 system consumed 300W but a 7970 system consumed 400W, would you buy the GTX580 instead of the faster HD7970? No, you wouldn't. The lower power consumption of the HD7970 over the GTX580 is a nice bonus, but in the global context it's meaningless for enthusiasts who buy $500 graphics cards. They are going to buy the faster card, assuming it has a reasonably quiet cooler. In the context of high-end GPUs, the HD6970, GTX580, and HD7970 are all power-hungry cards. To me, anything hitting almost 200W+ is a power-hungry GPU. If I could buy a high-end GPU with 50-75W of load power, then I'd consider that a material difference worth talking about.

With the 7xxx series, I think you'll see all those advantages disappear.

I don't think so, because HD7970 isn't much faster than GTX580 in Dragon Age 2, Crysis 2, Metro 2033 and BF3. It doesn't make sense that HD7970 is barely faster than GTX580 in the latest games. That means there are some fundamental things holding it back (like GPU clocks, or something else).

My comment was addressing an implication made by another poster that certain games shouldn't count because they are NV sponsored and therefore the performance differences in them are less relevant. And again you went on about poor Tessellation implementation in Crysis 2, etc., none of which had anything to do with my post. My point is simple, any game whether it's NV or AMD sponsored should count, especially if it's a popular game.

If there are 20 games that run better on AMD cards, I want to know that so it makes my buying choice easier next time I upgrade. From where I am standing, the HD7970 is barely better than the GTX580 in BF3, Crysis 2 and Metro 2033. It really doesn't matter to me as a gamer that NV spent $ ensuring that those games work better on its hardware, because AMD could have done that too. I hope AMD spends more $ and works more closely with developers to optimize game code for their hardware too. The point still stands that the HD7970 still does poorly with deferred antialiasing compared to the GTX580 in BF3. That's a red flag to me (not because I play BF3, but because future games will use Frostbite 2). If deferred AA is the direction of future game engines, then I am going to wait to see how the Kepler generation does, because I am not going to drop $500 on a new generation of graphics card that is showing less than stellar gains in the most demanding games over 2-year-old Fermi tech. For gamers who can afford to upgrade often, this isn't even an issue. They'll sell the 7970 and move on to the next best thing (HD7980, GTX680, etc.).
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Yes, taking an extra 6 months is a serious botch, and a company cannot survive repeats of that.
nVidia lost market share and distributors.


You mean, was inefficient in comparison. It could easily have been clocked to consume an identical amount of power, for the same noise and temp. It would have been slower if that had been done, though.
Exactly.
Did they? AMD claims to have lunched the 7970, but I just checked on Newegg...
http://www.newegg.com/Product/Produc...scription=7970
I see a carpet, Lego, headphones, and a ring. I don't see a video card.
What about Google Shopping?
https://www.google.com/search?ix=hc...urce=og&sa=N&tab=wf&ei=Ghn9TtLIGIibsgKA0q2oAQ
Nope, nothing for the AMD 7970... except for reviews.

This is because it was not really lunched; AMD lied when they said they lunched the 7970, you literally cannot buy it in any store.
If nVidia lunches 3+ months after AMD on this one, I would call it botched.
But if they lunch within 1-2 months, then that is normal execution. If they lunch ahead, then it is well executed.
Beyond any "lunch" jabs, I'm actually baffled by this post. How have you been a member of this forum for 7 years and not seen a soft/paper launch? Furthermore, all sources point to January 9th for retail availability, which is 10 days away, so what exactly are you arguing here? That the card isn't amazing until everyone who wanted one has one? AMD is done, they have their 28nm flagship, it's gone gold, and it's impressive. Now they're just building up stock for January 9th. Meanwhile we haven't heard a thing from NVIDIA. If they had anything to throw out for an attention grab, now would be the time. However, they haven't, so that would lead me to believe they're still a ways off.
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
I responded to the comment that GTX500 series cards were power hungry. Compared to their direct competitors (HD6900 series cards on 40nm), relative to the performance, they weren't hot. Comparing the HD7970's performance vs. power consumption vs. the GTX580/HD6970 is obvious - the HD7970 will blow them both away. His comment should apply to both the 6970 and the GTX580. Relative to their performance, they were both power hungry.

You insinuated that we shouldn't put too much weight into those 4 games I linked since they are NV sponsored (otherwise why would you even bring up the NV-sponsored part?). I responded that they should count 100% since gamers play them. If the HD7970 isn't faster than the GTX580 in them by more than 20%, that's just the reality, regardless of whether the game is NV sponsored. It should count towards the overall performance delta of the videocard vs. its competitor. I have an AMD card and I don't just ignore the performance advantages NV has in Crysis 2, HAWX2, Lost Planet 2 and Civ5 because those games run better on NV.

Do me a favor and don't be a GPU reviewer if you can't be unbiased. Anandtech acknowledged the performance discrepancy but followed it up by specifically saying it was an NVIDIA-sponsored game. I have no loyalty to cards; in fact, my next card will be an NVIDIA, but I wanted to state that your comparison was misleading.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Your entire section of that post went on and on about how the HD6950 and HD6970 are better values for the $$, etc. While the only thing that was actually being talked about was that a total system with a GTX580 consumes about 15% more power than a similar system with an HD6970. You can't run a graphics card without a CPU, motherboard, RAM and hard drives, can you? So all in all, arguing about 50-60W of power difference between top GPUs is just laughable to me. I mean, if I was sweating so hard about 50-60W of power, I'd disconnect all the lights in my house and live in total darkness, never put up a Xmas tree, hang dry all my laundry, bike to 7/11, etc. The total system is already using 350-400W of power. People who care about power consumption so much are probably using i3s and gaming on HD5770s. And honestly, if the HD7970 used 300W of power and came out clocked at 1.3GHz, that would be FAR more impressive than what they did. High-end cards are all about performance, not wattage.

No one is arguing that the GTX580 is a better buy than the HD7970. I don't know why you keep defending it, because obviously the 7970 is better. I am just shaking my head at "enthusiasts" who think there are *large* differences in power consumption among high-end GPUs, because 50W is minor, considering most of us here are running overclocked CPUs, etc. From an engineering perspective, it makes the 7970 more impressive, and it should, considering it's a 28nm GPU.
But it seems like you still don't understand why that 50W in pure graphics card power consumption is important, as you keep trivializing it. Go back and re-read the walkthrough I posted, as you're still missing a big part of my argument. I compared the 7970 to a GTX 580 to show why that 50W matters. It's not just that you need a bigger heatsink, or the next-tier PSU, or a louder fan, or that it costs $20 more a year to run. At the core, it's about having as many options in front of you as possible, and great performance over a range of TDPs and therefore applications, and that's what a high performance/watt chip does.

Again, you are talking about 200-250W videocards. But like I said, you can't use a videocard by itself. So at the end of the day, it's still a 400 or 450W system. If a GTX580 system consumes 400W, would you buy it instead of the faster HD7970? No, you wouldn't. The lower power consumption of the HD7970 over the GTX580 is a nice bonus, but in the global context, it's meaningless for enthusiasts. If the HD7970 consumed 100W more than the GTX580, are you telling me you wouldn't buy it as long as the cooler was adequate and it ran cool and quiet?
Again, you seemingly are completely missing the point of why the lower TDP is valuable from an electrical engineering standpoint. Go re-read my post.

I don't think so, because HD7970 isn't much faster than GTX580 in Dragon Age 2, Crysis 2, Metro 2033 and BF3. It doesn't make sense that HD7970 is barely faster than GTX580 in the latest games. That means there are some fundamental things holding it back (like GPU clocks, or something else).

My comment was addressing an implication made by another poster that certain games shouldn't count because they are NV sponsored and therefore the performance differences in them are less relevant. And again you went on about poor Tessellation implementation in Crysis 2, etc., none of which had anything to do with my post. My point is simple, any game whether it's NV or AMD sponsored should count, especially if it's a popular game.

If there are 20 games that run better on AMD cards, I want to know that so it makes my buying choice easier next time I upgrade. From where I am standing, the HD7970 is barely better than the GTX580 in BF3, Crysis 2 and Metro 2033. It really doesn't matter to me as a gamer that NV spent $ ensuring that those games work better on its hardware, because AMD could have done that too. I hope AMD spends more $ and works more closely with developers to optimize game code for their hardware too. The point still stands that the HD7970 still does poorly with deferred antialiasing compared to the GTX580 in BF3. That's a red flag to me (not because I play BF3, but because future games will use Frostbite 2). If deferred AA is the direction of future game engines, then I am going to wait to see how the Kepler generation does, because I am not going to drop $500 on a new generation of graphics card that is already showing signs of struggle in those very games that are based on next-generation engines.
That's a fine point and I agree with it: all games should count. However, my post wasn't just concerning your argument. I think a lot of the games that NVIDIA cards had an advantage in will run differently on the GCN architecture. "Fundamental things holding it back" are definitely drivers, as any new architecture will take time to be tweaked. Look at Metro 2033, for example - an NVIDIA-sponsored game, and the 6970 is darn close to the GTX 580 while the 7970 is substantially faster (31% @ 2560x1600). Give it time.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Do me a favor and don't be a GPU reviewer if you can't be unbiased. Anandtech acknowledged the performance discrepancy but followed it up by specifically saying it was an NVIDIA-sponsored game. I have no loyalty to cards; in fact, my next card will be an NVIDIA, but I wanted to state that your comparison was misleading.


In post #114 of HD6990 vs. HD7970, you said, "Aren't most of those games favored by NVIDIA GPUs?"

1) Can you please elaborate how my comparison of HD6990 vs. HD7970 was misleading?

2) Can you please elaborate how someone who plays Shogun 2, Crysis 2, Metro 2033 and BF3 should ignore those benchmarks?

3) Can you also elaborate how the fact that a game is AMD or NV sponsored impacts a gamer who wants to upgrade to play it? Should we just toss out all those games where HD7970 is barely faster because they are "NV sponsored"?

It's hilarious that you are calling me biased because NV's cards work well with deferred MSAA and NV spent $ to work more closely with developers. I guess in your world BF3, Crysis 2, and Metro 2033 don't count. What's next, any game that runs faster on AMD cards shouldn't count because it gives AMD an unfair advantage?

It's not just that you need a bigger heatsink, or the next tier PSU, or a louder fan, or it costs $20 more a year to run.

You seriously just went there? You are now going to tell me I should care about $20 a year in electricity costs when I am buying a $500 graphics card that will depreciate $200 in 12-15 months? If AMD released an HD7970 with 300W TDP and 50% more performance than the current 7970, you wouldn't buy it because it would have consumed 100W more than the GTX580?

Again, you seemingly are completely missing the point of why the lower TDP is valuable from an electrical engineering standpoint. Go re-read my post.

No, I am not missing that point at all. I am discussing power consumption from a gamer's point of view. To me, if card A consumes 250W and card B consumes 175W, but card A is much faster, I am going to buy it instead. If I cared about saving electricity, I'd buy an Xbox 360. The fact that the HD7970 uses less power than the 580 is a nice bonus, but if it used more power than the 580, it wouldn't be a deal breaker either, since it's 20% faster.

Look at Metro 2033 for example - NVIDIA sponsored game, and the 6970 is darn close to the GTX 580 and the 7970 is substantially faster (31% @ 2560x1600). Give it time.

In % terms it's much faster, but not in playable terms. Ya, and all 3 of them are unplayable at those settings, which means I'd be upgrading from 17 fps to 32 fps.

I look at this and see that the HD7970 is a whopping 44% faster in BF3 at 2560x1600, and at 27 fps minimum and 43 fps avg, it's still not good enough for a fast-paced FPS game. A 10% improvement in drivers and 30% overclocking, and it will be great.

 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
You seriously just went there? You are now going to tell me I should care about $20 a year in electricity costs when I am buying a $500 graphics card that will depreciate $200 in 12 months?

It's not just that you need a bigger heatsink, or the next tier PSU, or a louder fan, or it costs $20 more a year to run.
You really do have an issue with reading comprehension.
No, I am not missing that point at all. I am discussing TDP from a gamer's point of view. To me, if 1 card consumes 250W and the other consumes 175W, but card 1 is faster, I am going to buy it instead. If I cared about saving electricity, I'd buy an Xbox 360.
Then why are you on an enthusiast's forum? Go to GameSpot or the like. With that basal outlook on hardware, you're always going to be paying more/getting less. If you refuse to learn how hardware operates or why we do what we do, then you're wasting your time here. That is the kind of thinking I would hope this forum would change, not perpetuate.
Ya, and all 3 of them are unplayable at those settings, which means I'd be upgrading from 17 fps to 32 fps. Great.
I think you read the wrong chart - http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/17
The 7970 runs the game at 36 FPS, which is playable by all metrics. The GTX 580, at 27.5 FPS, is not. And it's only going to get better with the driver updates.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If you refuse to learn how hardware operates or why we do what we do, then you're wasting your time here. That is the kind of thinking I would hope this forum would change, not perpetuate.

Personal attack?

I take it you are calling my knowledge into question because you couldn't come up with a good reason why someone should care about 50W of extra power consumption for their $500 graphics card on an enthusiast rig that is already power hungry? Lower TDP does not correlate directly with overclocking. My HD6950 has a far lower TDP than a GTX580, and it can barely overclock.

You also went on and on about the HD7970 in this thread when I clearly showed a minimal power consumption difference between the HD6970 and GTX580 in my post, which was directed at a comment that the GTX500 series was power hungry. See a trend? You keep defending the 7970 when the discussion was never about that card in regard to power consumption.

I think you read the wrong chart - http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/17
The 7970 runs the game at a 36 FPS, which is playable by all metrics. The GTX 580, at 27.5FPS, is not. And it's only going to get better with the driver updates.

I didn't read the chart wrong. The comment about going from 17 to 32 fps was not related to AnandTech's Metro 2033 benchmark; it was about going from unplayable to unplayable.

If you want exact #s, that's going from 26.5 fps to 31.3 fps in Metro 2033 @ 1920x1080 4AA.

Here is BF3 from [H], your favourite website:

HD7970 = 42.6 fps average, 27 fps min
GTX580 = 37.2 fps average, 24 fps min
HD6970 = 29.6 fps average, 21 fps min

I would love for you to go and play BF3 at 36 fps online since 36 fps is perfectly playable for you.
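For reference, a quick check of the percentage deltas implied by those [H] averages (the fps figures are the ones quoted above; the helper itself is only illustrative):

```python
# Percentage deltas implied by the [H]ardOCP BF3 averages quoted above.
avg_fps = {"HD7970": 42.6, "GTX580": 37.2, "HD6970": 29.6}

def pct_faster(a: str, b: str) -> float:
    """How much faster card `a` is than card `b`, in percent."""
    return (avg_fps[a] / avg_fps[b] - 1) * 100

print(f"HD7970 vs GTX580: {pct_faster('HD7970', 'GTX580'):.0f}% faster")  # ~15%
print(f"HD7970 vs HD6970: {pct_faster('HD7970', 'HD6970'):.0f}% faster")  # ~44%
```

Which suggests the 44% figure is relative to the HD6970; against the GTX580, the average-fps gap in this particular test is closer to 15%.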

One of the golden rules of statistics - correlation does not imply causality. Of course, anyone with the most basic knowledge of statistics would know that, thanks for proving where you stand to the forum. :thumbsup:

For someone with a membership status from 2004, you sure have a lot of respect for other forum members. How about you try to address your disagreement with another member respectfully, as opposed to responding in the manner that you did?

Do you even realize the way you respond to people's arguments sometimes?
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Personal attack?
Just because I address you personally does not mean it's an attack. If you can't have a discussion, then don't reply to my posts.
I take it you are calling my knowledge into question because you couldn't come up with a good reason why someone should care about 50W of extra power consumption for their $500 graphics card on an enthusiast rig that is already power hungry? Lower TDP does not correlate directly with overclocking. My HD6950 has a far lower TDP than a GTX580, and it can barely overclock.
I'm not calling your knowledge into question, but I'm certainly questioning your reading comprehension. I've presented the same argument multiple times, and it seems you still don't understand it. However, you have not pointed out anything specifically that you do not understand, but instead repeat ad nauseam "50W extra power consumption is nothing". The 50W extra power consumption has nothing to do with extra heat, bigger heatsinks, the cost of electricity, or anything of that nature. These are all byproducts of higher power consumption that are trivial in this argument, but you seem dead set focused on them for some reason. The reason your aforementioned power-hungry $500 card is power hungry in the first place is because it's being pushed past its optimum performance/watt niche in order to reach some arbitrary performance level. You can consider this "factory overclocked" even though it's at its supposed stock settings. Again, go back to where I described performance as being on a spectrum with voltage and temperature.

In buying a $500 GTX 580, for example, you're essentially paying NVIDIA to overclock the card for you. Now if you take my 6950 that runs at 1GHz/1400MHz, I only paid $230 for that same $500 GTX 580-level performance. Why? A more efficient chip. You're correct in that not all chips are good overclockers, and this concept must be taken with the caveat that some cards are duds (my condolences to your 6950), and that's why rated specs are truly your only "guaranteed" performance. Still, at the end of the day, I have GTX 580 performance I only paid $230 for.
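Taking that claim at face value, the price side of the performance-per-dollar argument is simple arithmetic (the equal-performance assumption is the post's own, not an established benchmark result):

```python
# Price comparison under the post's assumption that an overclocked HD 6950
# (1GHz/1400MHz) lands at roughly GTX 580-level performance.
gtx580_price_usd = 500
hd6950_price_usd = 230

savings_usd = gtx580_price_usd - hd6950_price_usd
cost_ratio = hd6950_price_usd / gtx580_price_usd

print(f"Roughly the same claimed performance for ${savings_usd} less")  # $270 less
print(f"i.e. about {cost_ratio:.0%} of the GTX 580's price")            # ~46%
```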
I didn't read the chart wrong. I don't have a PC to play FPS games at 36 fps, especially not after spending $500 on a graphics card.
Then you need to buy a better monitor setup.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
With the 7xxx series, I think you'll see all those advantages disappear. Crysis 2 and Lost Planet 2 ran better on NVIDIA hardware simply due to the tessellation performance advantage the 5xx series had. With the drastically improved tessellation engine on the 7xxx series, I don't think AMD will have trouble in those two games anymore. On a side note, I hope this removes the garbage "tessellate the ocean under the whole level" sucker punches NVIDIA pulls - I'm not losing FPS in general because of some squabble. I also think that shows how much NVIDIA really cares about its gamers if it's willing to hurt EVERYONE's performance just to make its cards look a little better. But I digress, cold hard facts: http://www.guru3d.com/article/amd-radeon-hd-7970-review/23 . I'm not sure what the performance advantage was in HAWX 2 - tessellation as well? If so, same point.

Civ5 was a unique situation as far as I'm aware, as there's code/algorithms in the game that are specifically designed to run on NVIDIA's hardware and speed it up. I forget where I read about it, but I also remember it being mentioned that it would possibly be very easy to port the same code to AMD's GCN architecture since it can run the same C++ code. I might have screwed up that concept in translation, but it would be something if AMD could suddenly "port" all of NVIDIA's tweaks over to GCN (sucks for NVIDIA, good for consumers).
That would be the color look-up tables that NVDA included in the game code, which are not accessible to AMD's architecture.
For all those others, such as Notty, Firebird, taltamir, etc., intent on bitching about the HD7970 as much as they can... get over it.
AMD now owns NVDA in the high-end graphics market... both in fastest GPU and overall fastest card (2xGPU).
*Payback's a bitch, ain't it*:whistle:


Knock it off with the flamebaiting and member callouts. This approach to the VC&G community is a non-starter.


Administrator Idontcare
 
Last edited by a moderator: