40nm Battle Heats Up


Cookie Monster

Diamond Member
May 7, 2005
Originally posted by: chizow
55nm hasn't been offered anywhere close to 2 years; the first 55nm part was RV670, which launched in Nov 07. It's obvious production capacity for older processes is going to be greater than for new processes, as building new fabs for every new process would be far too costly. The upgrade cycle will go something like new process > ramp up production > ramp down > retrofit/transition to a new process. Obviously retrofit and transition are going to incur significant capital expenditures and, as such, will also incur significant premiums for products based on that process. Seriously, it goes against all business/manufacturing/accounting principles to think a newer, faster process technology costs the same or less than a slower, older process technology based on the price of raw materials alone.

There is a difference between the first 55nm product being released by the major IHVs and the actual process technology being available. The 55nm GP process went "online" during the March 07 timeframe, when the technology was offered to any company needing it, so roughly 21 months of service. With the new 40nm process going "online" in late 2008, the argument that you're making is kind of moot. It's a well-known fact that nVIDIA has been behind in the process race ever since the NV30 fiasco compared to ATi/AMD, which however is not a bad thing depending on the situation (there's no right or wrong method here, but in this case it left too much 65nm stock). With both IHVs being among the biggest customers for TSMC, I doubt these transitional costs would have a large impact on the deals made between them and the other IHVs. But you are right that a newer process technology should cost more than an older process technology; it's just premature to say that newer process technologies should be dismissed because they cost more.

The latest Steam Survey certainly indicates the 4800 series hasn't gained as much traction as many expected:

ATI Radeon HD 4800 Series (+0.56%) 7.09%
NVIDIA GeForce GTX 260 (+0.40%) 1.88%
NVIDIA GeForce GTX 280 (+0.07%) 1.29%
NVIDIA GeForce 9800 (+0.71%) 6.06%

So 9.23% competing NV parts to ATI's 7.09% 4800 Series, neither of which comes close to the 24.58% listed under GeForce 8800. It's certainly a better picture than the 3870/2900 days, but there's still a long way to go considering ATI no longer has any price advantage or the performance crown.

Well this clearly indicates that the HD4800 series has had more traction than the GTX series (the GTX280 being the worst of the bunch, and anyone can guess why that is). I'm guessing most of it has to do with the introduction of the HD4850 and now the HD4830.

You're claiming Nvidia can't sustain this pricing because they're concerned about lower margins and lower market share, when lower prices would actually result in lower margins but higher market share and higher sales volume. This is in contrast to higher margins with lower market share and sales volume, but ultimately the same profit. In reality Nvidia isn't gaining or losing market share from AMD so much as they're adjusting market share within their own product lines. By lowering pricing and increasing market share for their single-GPU parts, they're moving away from high-end pricing for a single GPU, which will be replaced by a dual-GPU part for the halo effect.

This last sentence characterizes AMD's business strategy in the GPU market ever since the RV670. nVIDIA, on the other hand, is still on the "single monolithic GPU" bandwagon, with GT212 and GT300 scheduled for 09 releases, both of which are single monolithic GPUs. Your theory above doesn't quite match what's been happening this generation, especially since the competition is providing a product that is much cheaper while providing ~90% of the performance. nVIDIA has obviously lowered prices to increase demand for the cards, yet the required demand is not there, as indicated by the old 65nm inventory a la the GTX260 (which also suggests that the yield rate for full-fledged GT200 chips was poor) and the lack of a transition to 55nm for these performance parts, unlike the GTX285. Lower prices don't necessarily result in higher market share and sales volume (the GTX280 is an example), and the same goes for higher margins meaning lower market share/sales volume (G71 is a key example: it provided high margins yet resulted in high sales volume, which led to higher market share). Presuming RV770/GT200 had good margins at their rated MSRPs (note that nVIDIA launched their cards first), you can already guess which IHV has been hurt the most since the price war began.

Thinking about it, nVIDIA's G71 closely resembles the RV770 IMO.

Also, I'd disagree that AMD's problems are isolated to their CPU division. While their GPU division is certainly improving, they weren't exactly blowing anyone away in the two years prior to RV770. Also, while they've shown some hints of profitability on the itemized income statements, that's before any impairments, write-offs or expenses are prorated. Also, I'd say you're underestimating the impact of Nvidia's discrete GPU business. Again, look back at FY2008, where they enjoyed record sales, profits, and margins on the strength of the 8800 series, G80 and G92.

Add the G7x generation also. Sure, ATi fumbled with R600 (it happens to companies all the time), but the RV670 generation did show what they were capable of, forcing nVIDIA to release its G92-based products earlier than expected, which screwed up nVIDIA's lineup (same thing with the 9800GTX+ and the release of the HD4850). I think you're overestimating, or just forgetting, that nVIDIA has numerous other markets in which it competes, which can offset the potential losses (compared to estimates compiled by their financial teams) caused by the lack of sales of their GT200-based products.
 

dguy6789

Diamond Member
Dec 9, 2002
Originally posted by: chizow
Originally posted by: apoppin
we were kinda expecting this .. last Summer
- AMD's answer to Nvidia's response to the 4870
AMD still hasn't responded to GT200, as it's still the fastest single GPU, and will increase that lead later this week with the GTX 285 based on GT200b.

GT200b and GTX 295 were more of a response to the 4870X2, but if these roadmaps are to be believed, AMD won't enjoy a 6-8 month process edge any longer, as Nvidia has made it clear they're no longer willing to lose the performance crown due to process technology alone, especially when they have access to that technology as well.

Still, if these reports are anything close to accurate, we should be in for some serious improvements in GPU capability in the next 12 months. GT300 and RV8XX should make good frame rates in Crysis at max settings at 1920 a reality with just a single GPU.

lol
 

dadach

Senior member
Nov 27, 2005
Originally posted by: dguy6789
Originally posted by: chizow
Originally posted by: apoppin
we were kinda expecting this .. last Summer
- AMD's answer to Nvidia's response to the 4870
AMD still hasn't responded to GT200, as it's still the fastest single GPU, and will increase that lead later this week with the GTX 285 based on GT200b.

GT200b and GTX 295 were more of a response to the 4870X2, but if these roadmaps are to be believed, AMD won't enjoy a 6-8 month process edge any longer, as Nvidia has made it clear they're no longer willing to lose the performance crown due to process technology alone, especially when they have access to that technology as well.

Still, if these reports are anything close to accurate, we should be in for some serious improvements in GPU capability in the next 12 months. GT300 and RV8XX should make good frame rates in Crysis at max settings at 1920 a reality with just a single GPU.

lol

yeah no kidding... someone give that man an nVidia focus job, so we can file him accordingly with the other nVidia technology promoters
 

qbfx

Senior member
Dec 26, 2007
Originally posted by: chizow

AMD still hasn't responded to GT200, as it's still the fastest single GPU, and will increase that lead later this week with the GTX 285 based on GT200b.

GT200b and GTX 295 were more of a response to the 4870X2, but if these roadmaps are to be believed, AMD won't enjoy a 6-8 month process edge any longer, as Nvidia has made it clear they're no longer willing to lose the performance crown due to process technology alone, especially when they have access to that technology as well.

Still, if these reports are anything close to accurate, we should be in for some serious improvements in GPU capability in the next 12 months. GT300 and RV8XX should make good frame rates in Crysis at max settings at 1920 a reality with just a single GPU.

That's an odd response from nVidia IMO

As seen here, with the new Cat 9.x betas the HD4870X2 not only manages to compete with the GTX295 but actually leaves it far behind once you turn on some AA, even at a resolution as low as 1680x1050. Going up to 2560x1600 8xAA/16xAF, the GTX295 performs as poorly as 61% of the HD4870X2's performance.

Then you realize these things are actually made to compete at these sorts of resolutions and AA/AF modes. (I think it's safe to say that everything at the moment, except for Crysis, runs fine at 1920x1200 on a single HD4870 1GB or GTX260+, so that's not where either the GTX295 or the HD4870X2 is meant to compete.)

Then combine that with the $100 price premium for the GTX295, and I don't see how anyone, even someone as clearly biased as you are, can claim nVidia's got the crown in the enthusiast market.
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: chizow
Originally posted by: SlowSpyder
Isn't Nvidia going to GDDR5? In another thread the Nvidia guys kept pointing out how that is more expensive, thus offsetting the cost of Nvidia's larger 65nm GPU and 512-bit memory. Why would they be making this change if that's the case? Obviously GDDR5 does not add to cost the way a lot of people try to portray it. Is it more expensive than GDDR3? I'm sure it is. But I highly doubt GDDR5 costs so much that it pushes the production cost of a 4870 up to the cost of a GTX 2x0 card. With the 2900 cards AMD tried the 512-bit bus and had lots of bandwidth as well; obviously there is a benefit to sticking with 256-bit and GDDR5 over 512-bit and GDDR3.
Yep, rumor has it Nvidia is rearranging ROPs and memory controllers for a move to GDDR5, at which point direct comparisons to AMD parts will be more accurate, since they'll also be on the same process. But again, it's not surprising certain people think GDDR5 and GDDR3 cost the same for an OEM, considering they think 55nm and 65nm wafers also cost the same.

Unfortunately, reality sets in and we see this isn't how things play out in the real world. DDR3 costs a lot more than DDR2. 45nm CPUs cost more than 65nm CPUs. Newer hard drives cost more than older hard drives. New cars cost more than old cars. All this despite raw material costs that are very much the same... there's got to be a missing piece in there... perhaps we need to factor in things like depreciation, amortization, capitalization of assets, R&D, etc.

Also, doesn't AMD get those same volume discounts? I'm willing to bet AMD has shipped a pretty similar number of GPU's as Nvidia.
Based on what? Market share shows Nvidia ships 2 GPUs for every one of AMD's, and although AMD has regained some of that market share, it's still somewhere between 2:1 and 3:2.




I have no hard numbers whatsoever, so you could be right that Nvidia still sells more GPUs than AMD. But the point is that AMD isn't Steve's GPUs & Graphics Cards. They sell tens of thousands of cards per quarter, no doubt. I'm sure they are getting a competitive price on wafers.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Cookie Monster
There is a difference between the first 55nm product being released by the major IHVs and the actual process technology being available. The 55nm GP process went "online" during the March 07 timeframe, when the technology was offered to any company needing it, so roughly 21 months of service. With the new 40nm process going "online" in late 2008, the argument that you're making is kind of moot.
It's not a moot point, as 65nm and soon 55nm will be that much older and that much cheaper relative to the newest offered process, which in this case will be 40nm. So now that 40nm is being offered by TSMC, do you really think Nvidia and AMD will get it for the same price per wafer as 55nm?

It's a well-known fact that nVIDIA has been behind in the process race ever since the NV30 fiasco compared to ATi/AMD, which however is not a bad thing depending on the situation (there's no right or wrong method here, but in this case it left too much 65nm stock).
Yep, but this was clearly by choice, erring on the side of caution with die shrinks and new architectures due largely to NV30's failure. AMD recently experienced something similar with a shrink and new architecture in R600; they just quickly swept it under the rug and went with a 55nm shrink and RV670 within 5-6 months.

The significance of this latest roadmap is that it shows Nvidia is pushing the process envelope sooner so that they're no longer behind the 8-ball when it comes to process technology. In the last 8 months Nvidia has shifted from 65nm to 55nm and now 40nm when their typical progression would have been 12-18 months in the past.

With both IHVs being among the biggest customers for TSMC, I doubt these transitional costs would have a large impact on the deals made between them and the other IHVs. But you are right that a newer process technology should cost more than an older process technology; it's just premature to say that newer process technologies should be dismissed because they cost more.
Who's saying it should be dismissed? I'm saying any cost analyses claiming wafer prices are the same between different processes are clearly flawed. Market pricing and financials have once again proven such claims to be false.

As for transitional cost not having a significant impact on process pricing, this blurb gives some insight into the significance of Nvidia pushing for newer processes sooner:

Doug Freedman, an analyst at American Technology Research, said in a research note that Nvidia was not getting the 55nm capacity it needs from the silicon foundry giant and that the problem was likely to get even worse once Nvidia starts to utilize more advanced fabrication processes, 45nm and 40nm. It is crucial for Nvidia to utilize the most advanced process technologies possible, as this allows the company to build very powerful GPUs at lower costs.

Nvidia and AMD's GPU segment are certainly two of TSMC's larger partners, but Nvidia holds at least a 2:1 edge based on GPU shipments alone, without considering their chipset business. It's obvious additional pressure for new process capacity will force TSMC to either transition more fabs to newer processes or limit production based on capacity. This is obviously less favorable for TSMC than when Nvidia was placing large orders on older processes that did not require additional capital, hence the resulting tension between TSMC and Nvidia.

Well this clearly indicates that the HD4800 series has had more traction than the GTX series (the GTX280 being the worst of the bunch, and anyone can guess why that is). I'm guessing most of it has to do with the introduction of the HD4850 and now the HD4830.
Certainly; the problem is the 4850 and 4830 don't compete with the GTX series, which is why the 9800 numbers were included. Once you properly factor in competing price segments, it's obvious the 4800 didn't have nearly as much impact as recent-generation GPUs, particularly the 8800 series with its commanding 25% of DX10 parts and 12% overall GPU share.

This last sentence characterizes AMD's business strategy in the GPU market ever since the RV670. nVIDIA, on the other hand, is still on the "single monolithic GPU" bandwagon, with GT212 and GT300 scheduled for 09 releases, both of which are single monolithic GPUs. Your theory above doesn't quite match what's been happening this generation, especially since the competition is providing a product that is much cheaper while providing ~90% of the performance.
Yep, it's AMD's little-core strategy. Nvidia has shifted their strategy so that their monolithic core can beat AMD's little core, but they can also double up and beat AMD's multi-GPU solution at any given time as well. The only hurdle in the past has been being on a different process node, which shouldn't be an issue now that they're both pushing 40nm simultaneously.

Also, core size alone doesn't tell the whole story, as Nvidia has shown twice now with both GX2 parts. Both times Nvidia's part has come in at a higher transistor count and larger die size, and both times their parts have drawn less power while outperforming AMD's parts. Obviously core frequency has a significant impact in this regard: Nvidia has gone with larger chips and lower clocks while maintaining some overhead, whereas AMD has gone with smaller chips and fewer transistors at higher clocks and with less headroom.

nVIDIA has obviously lowered prices to increase demand for the cards, yet the required demand is not there, as indicated by the old 65nm inventory a la the GTX260 (which also suggests that the yield rate for full-fledged GT200 chips was poor) and the lack of a transition to 55nm for these performance parts, unlike the GTX285.
Again, your assertions are not only contradictory but haven't been reflective of what's happened in the marketplace either. The abundance of the GTX 260, Nvidia's ability to cut prices on it continuously, and their ability to seamlessly introduce variants with more cores (Core 216) show yields were quite good. Not to mention GTX 260s clock as well as or better than full-fledged GTX 280s, by most reports. In reality Nvidia is simply self-neutering fully capable parts in order to meet market demand and pricing segments, which really isn't any different from what they've done for years. Recently they've shown the ability to quickly burn off remaining inventory by dropping prices, as shown with the 65nm Core 192, Core 216 and soon the GTX 280.

Lower prices don't necessarily result in higher market share and sales volume (the GTX280 is an example), and the same goes for higher margins meaning lower market share/sales volume (G71 is a key example: it provided high margins yet resulted in high sales volume, which led to higher market share). Presuming RV770/GT200 had good margins at their rated MSRPs (note that nVIDIA launched their cards first), you can already guess which IHV has been hurt the most since the price war began.

Thinking about it, nVIDIA's G71 closely resembles the RV770 IMO.
Sure it does. Lower prices are always going to result in higher market share and sales volume. You don't think the GTX 280 was selling better after its price drop from $650 to $450? It wasn't even statistically significant in Steam's survey until November. Both vendors take advantage of this basic economic principle in their product offerings, offering essentially the same parts with some performance limiters or enhancers at vastly different mark-ups. As for which IHV has been hurt the most: again, I'd say the one still posting a profit and maintaining 40% gross margins has been hurt the least. After the initial price cut forced by AMD, Nvidia has consistently undercut pricing with the GTX 260 and G92-based parts.

Add the G7x generation also. Sure, ATi fumbled with R600 (it happens to companies all the time), but the RV670 generation did show what they were capable of, forcing nVIDIA to release its G92-based products earlier than expected, which screwed up nVIDIA's lineup (same thing with the 9800GTX+ and the release of the HD4850). I think you're overestimating, or just forgetting, that nVIDIA has numerous other markets in which it competes, which can offset the potential losses (compared to estimates compiled by their financial teams) caused by the lack of sales of their GT200-based products.
Actually if you look at Note 16 of their Q3 10-Q you'll see exactly how much of their business is still focused on the sale of GPUs:

GPU = 461M
PSB = 199M
MCP = 197M
CPB = 34M
Other = 5M

Total Revenue = 897M

PSB is their Professional Solutions Business, i.e. Quadro and Tesla, which are still wholly based on their GPU architectures. From that you can see roughly 75% of revenue and nearly 100% of their profit is generated by their GPU business. Their chipset and Other segments actually lost money, and consumer products barely broke even.
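That roughly-75% claim can be checked directly from the segment figures quoted above (a quick sketch; the 1M gap between the summed segments and the stated 897M total is just per-segment rounding in the filing excerpt):

```python
# Segment revenue from the Q3 10-Q excerpt above, in $M.
segments = {"GPU": 461, "PSB": 199, "MCP": 197, "CPB": 34, "Other": 5}

total = sum(segments.values())                 # 896; the post's 897M reflects segment rounding
gpu_based = segments["GPU"] + segments["PSB"]  # GPU plus Quadro/Tesla, both GPU-architecture businesses

share = gpu_based / total                      # just under 75% of revenue
print(f"GPU-based revenue: {gpu_based}M of {total}M = {share:.1%}")
```

So GPU and GPU-derived products account for 660M of roughly 897M, i.e. just under three-quarters of revenue, consistent with the ~75% figure in the post.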
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Janooo
Correct me if I am wrong. Does this mean that HD4800 is outselling GT200 more than 2:1? That's interesting.
Sure, but does the 4850 compete in price or performance with any of the GT200 parts? No it doesn't, which is why the 9800 parts are included in a correct analysis.

Originally posted by: dadach
yeah no kidding...someone give that man nvidia focus job, so we can place him accordingly with other nvidia technology promoters
It's amazing how certain folks spew garbage like this under the guise of being impartial or unbiased.

I'd have to check with all my other employers first. Let's see, I've been accused of being an employee of the likes of AMD, Intel, EA, Creative, Antec, Corsair.....go figure!
 

SlowSpyder

Lifer
Jan 12, 2005
How do we "KNOW" that 55nm costs more than 65nm? Is there a link to TSMC pricing showing that to be the case? How do we know they didn't move to the new process and are charging more for the older technologies, to keep them online after they have moved on to a new process?

Lots of speculation.

What we do know is that a bigger chip takes up more space on a wafer. You will get fewer usable chips per wafer from a comparably larger chip than from a smaller one, assuming yield is up to par for both chips.
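That geometry point can be made concrete with the standard dies-per-wafer approximation. This is a sketch only: the die areas (~576 mm² for GT200, ~256 mm² for RV770) are widely reported approximations, and the defect density `D0` is an assumed illustrative value, not a TSMC figure.

```python
import math

WAFER_DIAMETER_MM = 300.0  # both chips were fabbed on 300mm TSMC wafers

def gross_dies(die_area_mm2: float, wafer_d: float = WAFER_DIAMETER_MM) -> int:
    """Classic dies-per-wafer estimate: wafer area / die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_d / 2) ** 2
    edge_loss = math.pi * wafer_d / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def yielded_dies(die_area_mm2: float, d0_per_cm2: float = 0.3) -> int:
    """Simple Poisson yield model Y = exp(-D0 * A); D0 = 0.3/cm^2 is an assumed value."""
    y = math.exp(-d0_per_cm2 * die_area_mm2 / 100.0)  # convert mm^2 to cm^2
    return int(gross_dies(die_area_mm2) * y)

for name, area in [("GT200 ~576mm^2", 576.0), ("RV770 ~256mm^2", 256.0)]:
    print(name, gross_dies(area), "gross,", yielded_dies(area), "good at assumed D0")
```

Even before yield, the big die gets roughly 2.5x fewer candidates per wafer (94 vs 234 gross under this model), and any area-sensitive defect model widens the gap further, which is the core of the per-chip cost argument both sides are circling.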
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: qbfx
That's an odd response from nVidia IMO

As seen here, with the new cat 9.x betas the HD4870X2 not only manages to compete with the GTX295 but actually leaves it far behind once you turn on some AA even as low as 1680x1050 res. Going up to 2560x1600 8AAx16AF the GTX295 performs as poorly as 61% of the HD4870X2's performance.

Then you realize these things are actually made to compete in this sort of resolutions and AA/AF modes (I think it's safe to say that everything at the moment (except for Crysis) runs fine at 1920x1200 on a single HD48701GB or GTX260+ so that's not where neither the GTX295 nor the HD470X2 is meant to compete).

Then combine that with the $100 price premium for the GTX295, and I don't see how anyone, even clearly biased as you are, can claim nVidia's got the crown in the enthusiast market.
Yep, I'm very familiar with that ComputerBase review and I've also noted the problems and inconsistencies with their Performance Rating aggregates. Going game by game and seeing a bigger green line above two smaller red lines doesn't seem to jibe with their summaries. Also, how exactly do you think the 4870X2's issues with AA in Bioshock are figured in? The 4870X2 gets 8 FPS and the GTX 295 gets 2.8 FPS in STALKER at 2560 4xAA, which counts as 250% in the Performance Rating, but what does that actually mean? Lastly, you have titles like RS: Vegas showing 200%+ across the board in favor of AMD (similar to something like Dead Space favoring NV parts), which will clearly skew any aggregate Performance Rating based on percentages. Simply put, do you think a 100% advantage in 1 game outweighs a 10% advantage in 9 other games?
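The skew being argued about here is easy to demonstrate with made-up numbers (hypothetical ratios for illustration, not the ComputerBase data): nine titles where card A leads by 10%, and one outlier where card B leads by 150%.

```python
import math

# Hypothetical per-game performance ratios of card B relative to card A:
# nine games where A is 10% faster (ratio ~0.91), one outlier where B is 2.5x faster.
ratios = [1 / 1.10] * 9 + [2.50]

arith = sum(ratios) / len(ratios)  # the %-based aggregate style under discussion
geo = math.exp(sum(math.log(r) for r in ratios) / len(ratios))  # geometric mean
wins_a = sum(r < 1 for r in ratios)  # games won by card A

print(f"arithmetic mean: {arith:.3f}")  # above 1.0 -> B looks faster overall
print(f"geometric mean:  {geo:.3f}")    # close to 1.0 -> near parity
print(f"games won by A:  {wins_a}/10")
```

One lopsided title flips the arithmetic %-aggregate in B's favor even though A wins 9 of 10 games; a geometric mean (or a simple win count) is far less sensitive to such outliers, which is exactly the objection to percentage-based Performance Ratings.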

Anyways, here's a recent review by AT with the same drivers and they come to the conclusion the GTX 295 is the faster part:

NVIDIA GeForce GTX 295: Leading the Pack

Guess AT and every other review site is biased also?

 

SlowSpyder

Lifer
Jan 12, 2005
Video Drivers: Catalyst 8.12 hotfix
ForceWare 181.20

The AT review does not use Catalyst 9.1... doesn't 9.1 bring some pretty impressive bumps in speed to the Radeon line?
 

Janooo

Golden Member
Aug 22, 2005
Originally posted by: chizow
Originally posted by: Janooo
Correct me if I am wrong. Does this mean that HD4800 is outselling GT200 more than 2:1? That's interesting.
Sure, but does the 4850 compete in price or performance with any of the GT200 parts? No it doesn't, which is why the 9800 parts are included in a correct analysis.
...

True, but the 4800 makes much more money than GT200. The future looks good for AMD's graphics. Currently it's old technology that keeps NV afloat. GT200's poor financial performance will show later.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: SlowSpyder
The AT review does not use Catalyst 9.1... doesn't 9.1 bring some pretty impressive bump ups in speed to the Radeon line?
From what I've read, the 8.12 Hot Fix is the same driver build as the 9.1 Beta. Perhaps someone familiar with the driver designations can compare the real driver versions and confirm, as it's been clear for some time the marketing designations aren't indicative of which driver is newer/better.

Originally posted by: Janooo
True, but 4800 makes much more money than GT200. The future looks good for AMD's graphics. Currently it's old technology that keeps NV afloat. The GT200 poor financial performance will show later.
Again, I'm not sure what you're basing that assumption on; certainly not financial reports or market pricing. As for GT200, now that they've fully transitioned to 55nm and are transitioning to a full line of 40nm parts, I'd say the financial outlook is good.
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: Janooo
Originally posted by: chizow
Originally posted by: Janooo
Correct me if I am wrong. Does this mean that HD4800 is outselling GT200 more than 2:1? That's interesting.
Sure, but does the 4850 compete in price or performance with any of the GT200 parts? No it doesn't, which is why the 9800 parts are included in a correct analysis.
...

True, but the 4800 makes much more money than GT200. The future looks good for AMD's graphics. Currently it's old technology that keeps NV afloat. GT200's poor financial performance will show later.



So the "old" tech is the top single GPU, and the top multi-gpu solution? What does that say about the competitor?
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: chizow

Originally posted by: dadach
yeah no kidding...someone give that man nvidia focus job, so we can place him accordingly with other nvidia technology promoters
It's amazing how certain folks spew garbage like this under the guise of being impartial or unbiased.

I'd have to check with all my other employers first. Let's see, I've been accused of being an employee of the likes of AMD, Intel, EA, Creative, Antec, Corsair.....go figure!

I wouldn't worry about him... I have a feeling he is someone's second account.



 

AzN

Banned
Nov 26, 2001
Originally posted by: chizow
Originally posted by: apoppin
we were kinda expecting this .. last Summer
- AMD's answer to Nvidia's response to the 4870
AMD still hasn't responded to GT200, as it's still the fastest single GPU, and will increase that lead later this week with the GTX 285 based on GT200b.

GT200b and GTX 295 were more of a response to the 4870X2, but if these roadmaps are to be believed, AMD won't enjoy a 6-8 month process edge any longer, as Nvidia has made it clear they're no longer willing to lose the performance crown due to process technology alone, especially when they have access to that technology as well.

Still, if these reports are anything close to accurate, we should be in for some serious improvements in GPU capability in the next 12 months. GT300 and RV8XX should make good frame rates in Crysis at max settings at 1920 a reality with just a single GPU.


Is there really a point with you? Not likely. :brokenheart:

This thread is lame. It's the same old Nvidia-lover flame-bait thread, started by none other than the guy who was praising ROPs as the driving force behind the biggest performance gains on modern GPUs. :laugh:
 

qbfx

Senior member
Dec 26, 2007
Originally posted by: chizow

Yep, I'm very familiar with that ComputerBase review and I've also noted the problems and inconsistencies with their Performance Rating aggregates. Going game by game and seeing a bigger green line above two smaller red lines doesn't seem to jibe with their summaries.

You must have been looking at the 1280x1024 results to see the green line bigger than the two red ones, because at 2560x1600 with AA I see bigger red lines on top of the green one in 9/12 titles.

Also, how exactly do you think the 4870X2's issues with AA in Bioshock are figured in?

In Bioshock at 2560x1600 4xAA/16xAF it's 0 fps, just like the GTX295's 0 fps at 2560x1600 8xAA/16xAF in Crysis Warhead, and the crippled "performance" of every nV card at higher AA/AF modes at any resolution, let alone 2560x1600.

The 4870X2 gets 8 FPS and the GTX 295 gets 2.8 FPS in STALKER at 2560 4xAA, which is 250% in Performance Rating, but what does that actually mean?

I guess it would mean that AMD/ATi products handle AA/AF better at high resolutions in graphics-intensive titles like STALKER (and Crysis Warhead, for that matter).

Lastly, you have titles like RS: Vegas showing 200%+ across the board in favor of AMD (similar to something like Dead Space favoring NV parts) which would clearly skew any aggregate Performance Rating using %. Simply put, do you think a 100% advantage in 1 game outweighs a 10% advantage in 9 other games?

No I don't think so. nV has titles too, like LP and FC2, in which it performs better (overall), but that doesn't make the test subjective. On the contrary, I'd call it very objective: 12 titles, all very popular. And where did you see those 9 titles in favor of nV anyway? Oh, and I don't see a more correct way to display a performance rating across multiple benchmarks other than %-based.

Anyways, here's a recent review by AT with the same drivers and they come to the conclusion the GTX 295 is the faster part:

NVIDIA GeForce GTX 295: Leading the Pack

Guess AT and every other review site is biased also?

The AT team didn't use the 9.1s for their tests. As SlowSpyder pointed out, they used the 8.12 hotfix. So I guess not, just you and some other folks.

 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Ocguy31
I wouldnt worry about him...I have a feeling he is someone's second account.
Oh I'm not; he pops up every 4-5 months spewing similar garbage, like a few other 100-post launch pedestrians.

Originally posted by: Azn
Is there really a point with you? Not likely. :brokenheart:

This thread is lame. It's the same old Nvidia vs AMD flame-bait thread, started by none other than the guy who was praising ROP performance over everything else. :laugh:
Actually, the news of Nvidia's shift in GPU process strategy is quite significant, as this will be the first time since NV30 that AMD won't enjoy a process advantage over Nvidia. Furthermore, GT212 is expected to be more than just a shrink of GT200, with a reported additional 400M transistors and ~50% more SPs, to go along with the higher clock frequencies typically associated with process node shrinks.

As for old debates, I think it's been pretty clearly demonstrated ROPs still have a significant impact on performance. While it's diminished with GT200 due to lower shader clocks, it's clear that ROPs and, to a lesser degree, memory bus and VRAM are the main differences between full-spec GT200 parts and the neutered GPUs going into GTX 260 or 295 parts. Certainly more than texture fillrate or SP performance, as you've claimed many times. Case in point is the launch GTX 260, which actually has equal or lower texture fillrate and SP performance compared to the 9800 GTX+, yet the GTX 260 always outperforms the 9800GTX+. Is the performance difference proportional to the TMU/SP performance? Or is it closer to the ROP/pixel fillrate difference? Rhetorical questions, really; I already know the answer.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: qbfx
You must have been looking at the 1280x1024 results to see the green line bigger than the 2 red ones because at 2560x1600 with AA I see bigger red lines on top of the green one in 9/12 titles.
Actually the 295 dominates anything lower than 2560, and when looking only at 2560, results are split between the 295 and 4870X2 at 5 titles each, with 2 splits:

295 wins: COD5, Bioshock, FC2, Lost Planet, WIC
4870X2 wins: Jericho, Grid, RS, Call of Juarez, Crysis
Splits: Ass. Creed, STALKER

In Bioshock at 2560x1600, 4xAA/16xAF - 0fps, just like GTX295's 0fps in 2560x1600, 8xAA/16xAF in Crysis Warhead and the crippled "performance" of every nV card at higher AA/AF modes in any resolution, let alone 2560x1600.
Except 2560 8xAA isn't compared under Performance Rating, while Bioshock is. Again, referencing the above, it's obvious that the Performance Rating aggregates don't tell the whole story.

I guess it would mean that AMD/ATI products handle AA/AF better at high resolutions in graphics-intensive titles like STALKER (and Crysis Warhead for that matter).
No, it means %-based aggregates are meaningless when neither part is capable of providing playable frame rates. Also, in the case of STALKER, you do realize the 295 wins at every resolution and AA setting, up to and including 2560 with no AA, right? The only win for the 4870X2 is at 2560 with AA, where neither part is producing double-digit frame rates...

No I don't think so. nV too have titles like LP and FC2 in which they perform better (overall) but that doesn't make the test subjective. On the contrary, I'd call it very objective - 12 titles, all very popular. And where did you see those 9 titles in favor of nV anyway? Oh and I don't see a more correct way to display performance rating in multiple benchmarks, other than %-based.
Rofl, while it's nice to see a variety of benchmarks, I don't think many would agree with you about Jericho and Call of Juarez being "very popular", or even good titles. Like I said though, individual titles showing massive gains are going to skew any results based on aggregates. When you build aggregates, individual titles that clearly favor one vendor over another are going to have a significant impact. Again, if they had reviewed Dead Space, what do you think would happen to those performance rating graphs?

AT team didn't use 9.1's for their tests. As SlowSpyder pointed out, they used 8.12 hotfix. So I guess not, just you and some other folks
Since it's obvious you're both going to cling to that excuse instead of finding out for certain, here's the site's update (translated from German):
Update: ATi has released the Catalyst 9.x beta version as the Catalyst 8.12 Hot Fix for Windows Vista 32-bit and 64-bit. The driver officially supports only Radeon HD 4000-series graphics cards and can be downloaded from ATi free of charge.

 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Actually the news of Nvidia's shift in GPU process strategy is quite significant, as this will be the first time since NV30 that AMD won't enjoy a process advantage over Nvidia. Furthermore, GT212 is expected to be more than just a shrink of GT200 with a reported additional 400m transistors and ~50% more SP to go along with higher clock frequencies typically associated with process node shrinks.

It won't matter anyways. AMD can easily raise their performance level by adding more TMUs and SPs to their design without better process technology. You do know RV770 is slightly smaller than the 9800GTX and much smaller than Nvidia's behemoth GT200. Your whole logic makes very little sense, if any sense at all, other than that you are swayed by Nvidia. It's quite amusing.


Originally posted by: chizow

As for old debates, I think it's been pretty clearly demonstrated that ROPs still have a significant impact on performance. While it's diminished with GT200 due to lower shader clocks, it's clear ROPs and, to a lesser degree, memory bus and VRAM are the main differences between full-spec GT200 parts and the neutered GPUs going into GTX 260 or 295 parts. Certainly more than texture fillrate or SP performance, as you've claimed many times. Case in point is the launch GTX 260, which actually has equal or lower texture fillrate and SP performance than the 9800 GTX+, yet the GTX 260 always outperforms the 9800GTX+. Is the performance difference proportional to the TMU/SP performance? Or is it closer to the ROP/pixel fillrate performance? Rhetorical questions really, I already know the answer.

You haven't demonstrated a thing, and nowhere have ROPs been acknowledged as the driving force in GPU performance.

As for your 9800GTX+ vs GTX 260 comparison, you are quite wrong in your assessment, like always. The biggest differences between these cards are pixel fillrate and memory bandwidth. Take out AA performance, which is bottlenecked by memory bandwidth, and you can clearly see the 9800GTX+ is within 10% of the performance of the GTX 260. Let's not forget the GTX 260 has slightly more shading power as well, which is where that 10% difference also comes from. :brokenheart:
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: chizow
...
Originally posted by: Janooo
True, but 4800 makes much more money than GT200. The future looks good for AMD's graphics. Currently it's old technology that keeps NV afloat. The GT200 poor financial performance will show later.
Again, not sure what you're basing that assumption on, certainly not financial reports or market pricing. As for GT200, now that they're fully transitioned to 55nm and transitioning to a full line of 40nm parts, I'd say the financial outlook is good.
The 4800 outsells GT200 2:1 and GT200 is more expensive. What else do I need to know? It didn't bring in the money NV was expecting.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
If we're not going to count exceedingly low frame rate situations, why do we factor in exceedingly high frame rate situations, say over 60FPS? Who makes these 'rules' on how to measure a card's performance? I don't get it. If you discount exceptionally low frame rates, then discount performance above 60FPS, since the playing experience will be pretty much identical.

http://www.computerbase.de/art...ia_geforce_gtx_295/15/

But the GTX285 cleans up at 1680x1050 1xAA/1xAF.

I'm just not sure why people are putting these conditions on the testing to skew results. Is 8FPS vs. 2FPS going to create a better experience? No. They are both not playable. But then why do we look at 120FPS vs. 110FPS and call that a 'win' for a card, seeing as the playing experience will be identical? Once you get to the higher resolutions and AA levels (which is where people use this level of card), there isn't such a clear-cut 'winner'. Just one card that is $100 more than another.
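The argument above can be made concrete: clamp every result to a playable window before averaging, so that neither a 120-vs-110fps "win" nor an 8-vs-2fps "win" sways the aggregate. This is a minimal sketch with hypothetical per-game frame rates, not data from any review cited in this thread.

```python
# Sketch: aggregate one card's fps relative to another's, with optional
# clamping so unplayable (below lo) and identical-feeling (above hi)
# results stop dominating the average. All fps numbers are made up.

def rating(results, lo=0, hi=float("inf")):
    """Mean of card A's fps over card B's, after clamping both to [lo, hi]."""
    clamp = lambda fps: min(max(fps, lo), hi)
    ratios = [clamp(a) / clamp(b) for a, b in results]
    return sum(ratios) / len(ratios)

# (card A fps, card B fps) per game -- hypothetical data
games = [(120, 110), (45, 40), (8, 2)]

raw = rating(games)                    # the 8-vs-2 outlier dominates
capped = rating(games, lo=10, hi=60)   # only the 45-vs-40 game still matters
```

With these made-up numbers, `raw` comes out above 2x while `capped` lands near parity, which is exactly the point being made: the headline aggregate depends on which situations you let count.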
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Azn
You haven't demonstrated a thing, and nowhere have ROPs been acknowledged as the driving force in GPU performance.

As for your 9800GTX+ vs GTX 260 comparison, you are quite wrong in your assessment, like always. :laugh: The biggest differences between these cards are pixel fillrate and memory bandwidth. Take out AA performance, which is bottlenecked by memory bandwidth, and you can clearly see the 9800GTX+ is within 10% of the performance of the GTX 260. Let's not forget the GTX 260 has slightly more shading power as well, which is where that 10% difference also comes from. :brokenheart:
Rofl, and what do you think drives pixel fillrate and AA performance? ROPs. Sure bandwidth and VRAM also play a factor but the fact remains the GTX 260 is always going to be faster than the 9800GTX+ even in non-bandwidth limited situations. All of this despite higher or equal theoreticals on the 9800GTX+ in areas you claimed offered the most performance gain, texture fillrate and shader performance.

1. 9800GTX+
2. GTX 260 c192

SP
1. 128x1836=235008
2. 192x1242=238464

TMU
1. 64x738=47232
2. 64x576=36864

ROP
1. 16x738=11808
2. 28x576=16128

Bandwidth/VRAM
1. 512MB 256-bit @ 1100MHz = 70GB/s
2. 896MB 448-bit @ 999MHz = 111GB/s

So again, is the performance difference between the parts more proportional to the TMU/SP differences, or to the ROP/bandwidth/VRAM differences? You can extend these comparisons to GTX 260 c192/216 vs GTX 280 or, better yet, GTX 295 vs GTX 280 SLI, and see that SP and TMU alone aren't enough to close the gap:

The GeForce GTX 295 performed pretty much where we expected: between the GTX 260 SLI and the GTX 280 SLI setups. In some games, the GTX 295 performed very nearly at GTX 260 performance, indicating a bottleneck somewhere in memory bandwidth or with the ROPs.

Even though GT200 does benefit more from SP clock increases than G80 or G92, if you ever test one for yourself, you'll see it still clearly benefits the most from higher core clocks (ROP/TMU/staging units).
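The theoretical figures in the list above can be reproduced with a short sketch. The specs are as listed in the post; the throughput products are in M ops/s (clocks in MHz), and the bandwidth formula assumes GDDR3's double data rate.

```python
# Sketch reproducing the theoretical throughput figures quoted above.
# Specs are taken from the post; each product is units x clock (MHz).

def throughput(units, clock_mhz):
    # Theoretical peak = number of functional units x clock
    return units * clock_mhz

def bandwidth_gbs(bus_bits, mem_clock_mhz):
    # bits -> bytes, x2 for GDDR3 double data rate, MHz -> GB/s
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000

# name: (SPs, shader MHz, TMUs, core MHz, ROPs, bus bits, memory MHz)
cards = {
    "9800GTX+":     (128, 1836, 64, 738, 16, 256, 1100),
    "GTX 260 c192": (192, 1242, 64, 576, 28, 448, 999),
}

for name, (sp, sclk, tmu, cclk, rop, bus, mclk) in cards.items():
    print(f"{name}: SP={throughput(sp, sclk)} "
          f"TMU={throughput(tmu, cclk)} "
          f"ROP={throughput(rop, cclk)} "
          f"BW={bandwidth_gbs(bus, mclk):.0f}GB/s")
```

Running it shows why the argument centers on ROPs and bandwidth: the SP products are nearly identical, the 9800GTX+ actually leads on texture fillrate, and the GTX 260 pulls ahead only in pixel fillrate and memory bandwidth.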
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Originally posted by: Azn
You haven't demonstrated a thing, and nowhere have ROPs been acknowledged as the driving force in GPU performance.

As for your 9800GTX+ vs GTX 260 comparison, you are quite wrong in your assessment, like always. :laugh: The biggest differences between these cards are pixel fillrate and memory bandwidth. Take out AA performance, which is bottlenecked by memory bandwidth, and you can clearly see the 9800GTX+ is within 10% of the performance of the GTX 260. Let's not forget the GTX 260 has slightly more shading power as well, which is where that 10% difference also comes from. :brokenheart:
Rofl, and what do you think drives pixel fillrate and AA performance? ROPs. Sure bandwidth and VRAM also play a factor but the fact remains the GTX 260 is always going to be faster than the 9800GTX+ even in non-bandwidth limited situations. All of this despite higher or equal theoreticals on the 9800GTX+ in areas you claimed offered the most performance gain, texture fillrate and shader performance.

1. 9800GTX+
2. GTX 260 c192

SP
1. 128x1836=235008
2. 192x1242=238464

TMU
1. 64x738=47232
2. 64x576=36864

ROP
1. 16x738=11808
2. 28x576=16128

Bandwidth/VRAM
1. 512MB 256-bit @ 1100MHz = 70GB/s
2. 896MB 448-bit @ 999MHz = 111GB/s

So again, is the performance difference between the parts more proportional to the TMU/SP differences, or to the ROP/bandwidth/VRAM differences? You can extend these comparisons to GTX 260 c192/216 vs GTX 280 or, better yet, GTX 295 vs GTX 280 SLI, and see that SP and TMU alone aren't enough to close the gap:

The GeForce GTX 295 performed pretty much where we expected: between the GTX 260 SLI and the GTX 280 SLI setups. In some games, the GTX 295 performed very nearly at GTX 260 performance, indicating a bottleneck somewhere in memory bandwidth or with the ROPs.



It's quite proportionate considering it has more VRAM, a whole lot more bandwidth, and more ROPs and SPs, while the 9800GTX+ is bandwidth-starved however you look at it. The fact that you think ROPs are the driving force is laughable.

Let's not forget the GTX 295 is clocked lower than GTX 280s in SLI, which Anandtech forgot to mention, and they mostly tested bandwidth-limited situations.


Even though GT200 does benefit more from SP clock increases than G80 or G92, if you ever test one for yourself, you'll see it still clearly benefits the most from higher core clocks (ROP/TMU/staging units).

No doubt about it, because it's core-hungry and doesn't have enough texture fillrate to take advantage of that bandwidth.
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: chizow

Actually the 295 dominates anything lower than 2560

That's not true, the HD4870X2 dominates:

1280x1024, 8xAA/16xAF
1680x1050, 8xAA/16xAF
2560x1600, 4xAA/16xAF
2560x1600, 8xAA/16xAF
and, logically derived from the above, 1920x1200, which they skip in this test.

and when looking only at 2560, results are split between the 295 and 4870X2 at 5 titles each, with 2 splits:

295 wins: COD5, Bioshock, FC2, Lost Planet, WIC
4870X2 wins: Jericho, Grid, RS, Call of Juarez, Crysis
Splits: Ass. Creed, STALKER

I was looking at the 2560x1600, 8xAA/16xAF benches where the HD4870X2 clearly runs circles around the GTX295. If we consider all of the 2560x1600 tests I'd put FarCry 2 (the HD4870X2 beats the GTX295 in 2560x1600, 1xAA/1xAF) and Lost Planet (the HD4870X2 beats the GTX295 in 2560x1600, 8xAA/16xAF) in splits too.

So it's more like:

HD4870X2: Jericho, GRID, RS, CoJ, Crysis
GTX295: COD5, Bioshock, WIC
Splits: Assassin's Creed, STALKER, FarCry 2, Lost Planet

Except 2560 8xAA isn't compared under Performance Rating, while Bioshock is. Again, referencing the above, it's obvious that the Performance Rating aggregates don't tell the whole story.

2560x1600, 8xAA/16xAF IS compared under Performance Rating, there's a little + on the bottom of the page.

No, it means %-based aggregates are meaningless when neither part is capable of providing playable frame rates. Also, in the case of STALKER, you do realize the 295 wins at every resolution and AA setting, up to and including 2560 with no AA, right? The only win for the 4870X2 is at 2560 with AA, where neither part is producing double-digit frame rates...

%-based results are meaningful when you have multiple benches and you want to summarize the results, as there is simply no other/better way to do that.
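How you average those %-based results matters, though, which is the crux of the skew argument earlier in the thread. A quick sketch with made-up per-title ratios shows a single outlier title swaying a simple arithmetic mean far more than the geometric mean some review sites use for their performance ratings:

```python
# Sketch: two ways to summarize %-based results across titles.
# The ratios (card A fps / card B fps per title) are hypothetical;
# the last entry is an outlier title where card A is 4x faster.
import math

ratios = [1.05, 0.95, 1.10, 0.90, 4.00]

# Arithmetic mean of ratios: the 4x outlier drags the average way up.
arith = sum(ratios) / len(ratios)

# Geometric mean: multiplicative, so one outlier counts far less.
geo = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
```

Here `arith` lands at 1.6 (card A looks 60% faster overall) while `geo` stays near 1.32, even though in four of the five titles the cards are within 10% of each other, so the choice of aggregate can decide the "winner".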

Rofl, while it's nice to see a variety of benchmarks, I don't think many would agree with you about Jericho and Call of Juarez being "very popular", or even good titles. Like I said though, individual titles showing massive gains are going to skew any results based on aggregates. When you build aggregates, individual titles that clearly favor one vendor over another are going to have a significant impact. Again, if they had reviewed Dead Space, what do you think would happen to those performance rating graphs?

This review includes 12 titles that are not handpicked, the way nV handpicks titles when testing its new drivers. Lost Planet and FarCry 2 both favor nV, yet you see the HD4870X2 is on par with the GTX295.


"Here we want our results to benchmark the Radeon HD 4870 and expand the driver, presumably as a Catalyst 9.1 will be released against his predecessor Catalyst 8:10, 8:11 and Catalyst Catalyst 8:12"



 

nosfe

Senior member
Aug 8, 2007
424
0
0
"Here we want our results to benchmark the Radeon HD 4870 and expand the driver, presumably as a Catalyst 9.1 will be released against his predecessor Catalyst 8:10, 8:11 and Catalyst Catalyst 8:12"

it says that the 9.x beta was released to the public as 8.12 hotfix
 