nVidia GT200 Series Thread


ronnn

Diamond Member
May 22, 2003
3,918
0
71
Yikes, are we going to get graphs from both companies comparing their new cards to last year's stuff? The important part is the girl on the box, and they seem to be keeping that part secret.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Lol, that's hilarious: chop the bottom off the graph and make it look like it's 10x faster than the competition... gotta love marketing.
 

ddarko

Senior member
Jun 18, 2006
264
3
81
The main problem with the graph is that it's completely devoid of details. At what resolution? With AA? If AA, what kind of AA? Etc. etc. Nvidia can't be blamed for benchmarking against the 3870 X2 since, presumably, it doesn't have access to unreleased AMD cards. But it can be criticized for not offering a comparison with its own current-gen cards. This marketing propaganda is so silly and counterproductive. The only people who are going to see leaked graphs and charts like this are enthusiasts who can smell the BS from a mile away. I don't know what marketing expects to achieve with transparently manipulated "data" like this.
 

Avalon

Diamond Member
Jul 16, 2001
7,567
152
106
Very misleading graph, but interesting. If true, it's kind of surprising that in UT3, FEAR, and COD4 they claim the GTX 260 is only going to be 20% faster than the 3870 X2. That means it'll only be ~15% faster than a single 4870 in those games at whatever settings Nvidia was running. Why such a massive leap in some games and very little in others? Could the drivers still be this ill-prepared so late in the game, less than three weeks before launch?
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Avalon:
Well, the GeForce 9600 GT lost to the HD 3850 256MB three weeks before launch, but once the launch drivers were out, the 9600 GT beat the HD 3870 512MB.
---

The HD3870 X2 can already pretty much max out COD4 and FEAR. The X2's memory bandwidth and framebuffer limits won't become obvious in these two games, and since the GTX 200 cards were ahead anyway, they can also max out those games. The HD3870 X2 can pull playable frame rates at 2048x1536 with 0xAA/16xAF.


 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Oh wow, it took me a while to get what you are talking about... they start the graph at 0.8 instead of 0, where 1 = the performance of the ATI card. So if the ATI card gets a 1 and the GTX 280 gets a 2.5, it looks like ATI has 0.2 and NV has 1.7 in Call of Juarez, for example, which is an 8.5x bigger bar.
And they used upcoming drivers for themselves and outdated AMD drivers. The Catalyst 8.5 drivers claim a 12% performance increase in that specific game; I don't know if 8.4 changed anything for it too, but they used 8.3.
Not THAT outdated (considering 8 = year and 3 = month), but still shady.
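To put numbers on the baseline trick, here is a minimal Python sketch using the illustrative Call of Juarez figures from the post above (1.0 for the 3870 X2, 2.5 for the GTX 280; these are the thread's example numbers, not measured data):

```python
# Minimal sketch of the truncated-axis trick. Scores are the illustrative
# figures from the post above (3870 X2 = 1.0, GTX 280 = 2.5 in Call of
# Juarez), not measured data.

def apparent_ratio(score_a, score_b, baseline=0.0):
    """Ratio of bar lengths when bars are drawn from `baseline` instead of 0."""
    return (score_a - baseline) / (score_b - baseline)

ati, nv = 1.0, 2.5

print(apparent_ratio(nv, ati, baseline=0.0))  # 2.5 -> the honest ratio
print(apparent_ratio(nv, ati, baseline=0.8))  # 8.5 -> what the chart implies
```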
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: taltamir
Oh wow, it took me a while to get what you are talking about... they start the graph at 0.8 instead of 0, where 1 = the performance of the ATI card. So if the ATI card gets a 1 and the GTX 280 gets a 2.5, it looks like ATI has 0.2 and NV has 1.7 in Call of Juarez, for example, which is an 8.5x bigger bar.
And they used upcoming drivers for themselves and outdated AMD drivers. The Catalyst 8.5 drivers claim a 12% performance increase in that specific game; I don't know if 8.4 changed anything for it too, but they used 8.3.
Not THAT outdated (considering 8 = year and 3 = month), but still shady.

Apparently I'm the only person in this thread who understood that graph the first time he/she looked at it.

It's a comparison relative to the 3870X2; it's very straightforward, very easy to read.

It's not saying "ATi got .2", it's saying "3870X2 got framerate X. GTX260 was Y% higher, GTX 280 was Z% higher".

I have seen marketing graphs that make 10-20% look like huge differences. These are huge differences.

The only thing that can be construed in the slightest as misleading here is that we don't know what went into the benchmarks. By this I mean presumably NVIDIA used areas of the games that showed their products in a favorable light. This could happen inadvertently with ANY review, and it's also possible there are areas in the games which could show their products in an even more favorable light, or that these benches are representative of the average difference in these games.

What I've just written is the only logical way that graph can be interpreted to my knowledge.

 

dv8silencer

Member
May 7, 2008
142
0
0
Originally posted by: nRollo
Originally posted by: taltamir
Oh wow, it took me a while to get what you are talking about... they start the graph at 0.8 instead of 0, where 1 = the performance of the ATI card. So if the ATI card gets a 1 and the GTX 280 gets a 2.5, it looks like ATI has 0.2 and NV has 1.7 in Call of Juarez, for example, which is an 8.5x bigger bar.
And they used upcoming drivers for themselves and outdated AMD drivers. The Catalyst 8.5 drivers claim a 12% performance increase in that specific game; I don't know if 8.4 changed anything for it too, but they used 8.3.
Not THAT outdated (considering 8 = year and 3 = month), but still shady.

Apparently I'm the only person in this thread who understood that graph the first time he/she looked at it.

It's a comparison relative to the 3870X2; it's very straightforward, very easy to read.

It's not saying "ATi got .2", it's saying "3870X2 got framerate X. GTX260 was Y% higher, GTX 280 was Z% higher".

I have seen marketing graphs that make 10-20% look like huge differences. These are huge differences.

The only thing that can be construed in the slightest as misleading here is that we don't know what went into the benchmarks. By this I mean presumably NVIDIA used areas of the games that showed their products in a favorable light. This could happen inadvertently with ANY review, and it's also possible there are areas in the games which could show their products in an even more favorable light, or that these benches are representative of the average difference in these games.

What I've just written is the only logical way that graph can be interpreted to my knowledge.

The graphs are not inaccurate. The function of graphs is to make comparisons easier to see. All they needed to do was start the y-axis at 0 and you would have a better graph. My main point is that it is obvious that Nvidia is trying to make it look as if product A is x times more powerful than product B. If you give me the power to change the y-minimum at will, I can make product A look x times better than product B, where x is of arbitrary magnitude.

You can construct the graph in many ways. Some ways are better than others. Some are obviously trying to mislead people who just take a glance. I never said the numbers themselves were inaccurate.
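That last point can be made exact: for real scores a > b, setting the apparent ratio (a - m)/(b - m) equal to a target x and solving gives an axis minimum m = (x*b - a)/(x - 1). A small sketch, again using the thread's illustrative 2.5 vs 1.0 scores:

```python
# Sketch of the point above: pick any target "apparent" ratio x and solve
# (a - m) / (b - m) = x for the axis minimum m. Scores are the thread's
# illustrative 2.5 (GTX 280) vs 1.0 (3870 X2) figures, not measured data.

def baseline_for_apparent_ratio(a, b, x):
    """Axis minimum that makes score a look x times longer than score b."""
    return (x * b - a) / (x - 1)

a, b = 2.5, 1.0
for x in (2.5, 5.0, 8.5, 20.0):
    m = baseline_for_apparent_ratio(a, b, x)
    print(f"start the axis at {m:.3f} and a 2.5x lead looks like {x}x")
# x = 2.5 gives m = 0.0 (honest axis); x = 8.5 gives m = 0.8 (the leaked chart).
```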
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Rusin
Originally posted by: Extelleron
Originally posted by: Janooo
HD 3870 X2 against the GeForce GTX 280 and GTX 260

Mixed impression...

Actually looks very good for AMD. 4870 X2 should be ~2x 3870 X2 in performance, so it should beat the GTX 260 just about everywhere (except World in Conflict) and it will beat the GTX 280 in a lot of cases as well. Add in that those were tests done by nVidia using old AMD drivers, and it's actually looking good for the 4000-series.

If R700 rumors prove true as well, HD 4870 X2 should > GTX 280 in Crysis.
Well, if the HD4870 is around HD3870 X2 level in performance as expected, then the HD4870 X2 would sit pretty close to GTX 260 level, depending on:
1. How efficient CrossFire is with RV770 (the 4870 X2 is a traditional CrossFire setup, with bridge chip and all). The HD3870 X2 was around 65% faster than the HD3870.
2. What clocks they'll have for the HD4870 X2? I mean, the HD4870's TDP is already 157W..16%
----
With those numbers the GTX 260 is (on average) 65% faster in games than the HD3870 X2, and the GTX 280 90% faster. The smallest differences are in games which the HD3870 X2 can run decently at 2560x1600 4xAA 16xAF.

You can't really predict 4870 X2 performance before seeing how a single-GPU 4870 performs first. If the expected 4870 performance is almost twice (1.9x) that of a 3870, then the 4870 will easily beat the 3870 X2. And going by the NV graph, we can say the 4870 X2 will beat the GTX 260 and be closer to GTX 280 performance (and even beat it in some cases).

I'm also guessing the average performance increase this round for the 4870 X2 will be higher than 65% when compared to the 4870. They have time to improve the drivers, as the 4870x2 is not supposed to release till August.
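As a rough sanity check of that chain of estimates, here is a short sketch that normalises everything to a single HD 3870 = 1.0; all inputs are the ballpark figures quoted above, not measurements:

```python
# Rough sanity check of the scaling chain above, normalised to a single
# HD 3870 = 1.0. All inputs are the posters' ballpark figures, not
# measurements.

hd3870 = 1.0
hd3870x2 = 1.65 * hd3870       # Rusin: X2 was ~65% faster than a single 3870
hd4870 = 1.9 * hd3870          # Kuzi's assumption: ~1.9x a single 3870
gtx260 = 1.65 * hd3870x2       # leaked chart: ~65% over the 3870 X2 on average
gtx280 = 1.90 * hd3870x2       # leaked chart: ~90% over the 3870 X2 on average

for cf_scaling in (1.65, 1.80):               # last-gen vs. hoped-for CrossFire
    hd4870x2 = cf_scaling * hd4870
    print(f"4870 X2 at {cf_scaling:.2f}x scaling: {hd4870x2:.2f} "
          f"(GTX 260 = {gtx260:.2f}, GTX 280 = {gtx280:.2f})")
# At ~1.65x scaling the 4870 X2 already lands right on the GTX 280 estimate;
# ~1.8x scaling would edge past it, which is roughly Kuzi's conclusion.
```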
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: nRollo
Originally posted by: taltamir
Oh wow, it took me a while to get what you are talking about... they start the graph at 0.8 instead of 0, where 1 = the performance of the ATI card. So if the ATI card gets a 1 and the GTX 280 gets a 2.5, it looks like ATI has 0.2 and NV has 1.7 in Call of Juarez, for example, which is an 8.5x bigger bar.
And they used upcoming drivers for themselves and outdated AMD drivers. The Catalyst 8.5 drivers claim a 12% performance increase in that specific game; I don't know if 8.4 changed anything for it too, but they used 8.3.
Not THAT outdated (considering 8 = year and 3 = month), but still shady.

Apparently I'm the only person in this thread who understood that graph the first time he/she looked at it.

It's a comparison relative to the 3870X2; it's very straightforward, very easy to read.

It's not saying "ATi got .2", it's saying "3870X2 got framerate X. GTX260 was Y% higher, GTX 280 was Z% higher".

I have seen marketing graphs that make 10-20% look like huge differences. These are huge differences.

The only thing that can be construed in the slightest as misleading here is that we don't know what went into the benchmarks. By this I mean presumably NVIDIA used areas of the games that showed their products in a favorable light. This could happen inadvertently with ANY review, and it's also possible there are areas in the games which could show their products in an even more favorable light, or that these benches are representative of the average difference in these games.

What I've just written is the only logical way that graph can be interpreted to my knowledge.

And that, in a nutshell, is why I stopped posting here. Starting a graph like this at 0.8 instead of 0 is misleading, period. Every thread becomes an argument with comments like that, and it's just a waste of everybody's time.

As usual there's a logical fallacy in your argument: just because you've seen graphs more skewed than this one doesn't mean this one isn't misleading.

Yes, those of us who know how to read graphs and check that they don't start or end at some odd value can tell they're skewed, but as Klinky1984 said, these Nvidia graphs make the lead look like 8x the competition when it's really up to about 2.5x.

-------

As for these graphs - looks impressive. Next gen can't get here soon enough!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: jiffylube1024
Originally posted by: nRollo
Originally posted by: taltamir
Oh wow, it took me a while to get what you are talking about... they start the graph at 0.8 instead of 0, where 1 = the performance of the ATI card. So if the ATI card gets a 1 and the GTX 280 gets a 2.5, it looks like ATI has 0.2 and NV has 1.7 in Call of Juarez, for example, which is an 8.5x bigger bar.
And they used upcoming drivers for themselves and outdated AMD drivers. The Catalyst 8.5 drivers claim a 12% performance increase in that specific game; I don't know if 8.4 changed anything for it too, but they used 8.3.
Not THAT outdated (considering 8 = year and 3 = month), but still shady.

Apparently I'm the only person in this thread who understood that graph the first time he/she looked at it.

It's a comparison relative to the 3870X2; it's very straightforward, very easy to read.

It's not saying "ATi got .2", it's saying "3870X2 got framerate X. GTX260 was Y% higher, GTX 280 was Z% higher".

I have seen marketing graphs that make 10-20% look like huge differences. These are huge differences.

The only thing that can be construed in the slightest as misleading here is that we don't know what went into the benchmarks. By this I mean presumably NVIDIA used areas of the games that showed their products in a favorable light. This could happen inadvertently with ANY review, and it's also possible there are areas in the games which could show their products in an even more favorable light, or that these benches are representative of the average difference in these games.

What I've just written is the only logical way that graph can be interpreted to my knowledge.

And that, in a nutshell, is why I stopped posting here. Starting a graph like this at 0.8 instead of 0 is misleading, period. Every thread becomes an argument with comments like that, and it's just a waste of everybody's time.

As usual there's a logical fallacy in your argument: just because you've seen graphs more skewed than this one doesn't mean this one isn't misleading.

Yes, those of us who know how to read graphs and check that they don't start or end at some odd value can tell they're skewed, but as Klinky1984 said, these Nvidia graphs make the lead look like 8x the competition when it's really up to about 2.5x.

-------

As for these graphs - looks impressive. Next gen can't get here soon enough!

TBH, I like the graph starting at 0.8 and showing something for the ATi card rather than just having the graph axis be ATi performance.

It gives you the opportunity to read the graph by color, for example. In any case, my point in posting this wasn't to say "NVIDIA marketing can do no wrong with their graphs", it was only to say I was surprised to see this reaction to that graph.

(and the 8X thing never even occurred to me, that's so far outside any launch I never considered it)

Also, I only gave the example of graphs that make tiny differences look large as evidence that this graph is pretty descriptive.

As far as you posting here goes, personally I'd rather you did. Seems to me we had some good discussions back in the day.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
What reaction? Those graphs are BS misleading marketing... and that is something I fully expect from nvidia, AMD, intel, and most other companies...
To claim that "there is nothing really misleading here" is BS, nRollo. It is VERY misleading. (If the figures are right, AMD could use them to make a graph that starts at -10 and make a 2.5x performance difference seem like a 0.1x difference; oh yeah, and compare the 8.6 beta drivers to nvidia's 169 driver while they're at it.)

Anyways, this isn't the end of the world, and it isn't gonna make me stop buying nvidia, but it IS pure bullshit, and there is no reason to defend such behavior.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: taltamir
What reaction? Those graphs are BS misleading marketing... and that is something I fully expect from nvidia, AMD, intel, and most other companies...
To claim that "there is nothing really misleading here" is BS, nRollo. It is VERY misleading. (If the figures are right, AMD could use them to make a graph that starts at -10 and make a 2.5x performance difference seem like a 0.1x difference; oh yeah, and compare the 8.6 beta drivers to nvidia's 169 driver while they're at it.)

Anyways, this isn't the end of the world, and it isn't gonna make me stop buying nvidia, but it IS pure bullshit, and there is no reason to defend such behavior.

TBH I'm pretty amazed by everyone's reaction to it, but I feel you're all entitled to your opinions. So I guess if you want to think those graphs are "teh devils work", it's your option to do so and buy whatever card you like based on whatever criteria you'd like.

It's a nice day here, so I'm going to head to my cabin and grill steaks, have some Jack Daniels and coke, try to catch a fish with my son.

Hope your Sunday goes as well or better. :beer:
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,230
2
0
Originally posted by: taltamir
What reaction? Those graphs are BS misleading marketing... and that is something I fully expect from nvidia, AMD, intel, and most other companies...
To claim that "there is nothing really misleading here" is BS, nRollo. It is VERY misleading. (If the figures are right, AMD could use them to make a graph that starts at -10 and make a 2.5x performance difference seem like a 0.1x difference; oh yeah, and compare the 8.6 beta drivers to nvidia's 169 driver while they're at it.)

Anyways, this isn't the end of the world, and it isn't gonna make me stop buying nvidia, but it IS pure bullshit, and there is no reason to defend such behavior.

It's Rollo, after all; the joke is on those who seriously think he changed.

Obviously, if AMD does the same thing now, he will also defend them so as not to have a double standard, but if it were AMD doing this first you can bet he wouldn't be defending the graph.

Point is: yes, people really are that stupid, and yes, they will look at a graph and think the card is 8 times faster or whatever without looking at the baseline.

Then again, Nvidia has always excelled at PR, which is why I hate them.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Other than the argument about the graph, let's look at the big picture.

-- In the most demanding games (Crysis, Oblivion, Call of Juarez, COH, and World in Conflict), the GTX 280 is between 80% and 145% faster than the 3870 X2.

-- Discounting the useless 3DMark bench, the ancient FEAR, and the extremely well-scaling UT3 engine, ATI has their hands full if they intend to match the performance of the GTX 280 part (unless I am missing something?).

-- The 4870 would have to be at least as fast as a single 3870 X2, and up to 20% faster, and scale at least 80% in its X2 form to be performance competitive (a rough check is sketched below).

However, since ATI plans to undercut the GTX series on pricing, things aren't looking so bad for anyone outside the performance crowd.
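The same kind of arithmetic can be run in reverse against the bullets above: assuming the hypothetical 80% CrossFire scaling, how fast would a single 4870 have to be relative to a 3870 X2 for a 4870 X2 to match the chart's GTX 280 lead? A sketch under those assumptions:

```python
# Inverse check of the bullets above, normalised to the HD 3870 X2 = 1.0:
# given the chart's GTX 280 lead and an assumed CrossFire scaling factor,
# how fast must a single 4870 be (vs. a 3870 X2) for a 4870 X2 to match it?
# All numbers are the post's assumptions, not measurements.

def required_single_gpu(target_multiple, x2_scaling):
    """Single-GPU speed (3870 X2 = 1.0) needed for its X2 card to hit target."""
    return target_multiple / x2_scaling

for gtx280_lead in (1.80, 2.45):   # "80-145% faster" than the 3870 X2
    needed = required_single_gpu(gtx280_lead, x2_scaling=1.80)  # 80% scaling
    print(f"to match a GTX 280 at {gtx280_lead:.2f}x, "
          f"the 4870 needs to be {needed:.2f}x a 3870 X2")
# 1.80 / 1.8 = 1.00x and 2.45 / 1.8 ~ 1.36x: "as fast as a 3870 X2" covers the
# easier titles, but the most demanding ones would need well over +20%.
```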
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: RussianSensation
Other than the argument about the graph, let's look at the big picture.

-- In the most demanding games (Crysis, Oblivion, Call of Juarez, COH, and World in Conflict), the GTX 280 is between 80% and 145% faster than the 3870 X2.

-- Discounting the useless 3DMark bench, the ancient FEAR, and the extremely well-scaling UT3 engine, ATI has their hands full if they intend to match the performance of the GTX 280 part (unless I am missing something?).

-- The 4870 would have to be at least as fast as a single 3870 X2, and up to 20% faster, and scale at least 80% in its X2 form to be performance competitive.

However, since ATI plans to undercut the GTX series on pricing, things aren't looking so bad for anyone outside the performance crowd.

HD 4870 X2 should be 2x the speed of the HD 3870 X2, unless AMD runs into power issues, so it should beat the GTX 280 in a lot of big games (definitely UT3 and probably Crysis too).

nVidia really has their hands full, because they are competing against two 55nm chips with a huge, power-hungry 65nm chip with low yields and high costs.


 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
Originally posted by: Extelleron
HD 4870 X2 should be 2x the speed of the HD 3870 X2, unless AMD runs into power issues, so it should beat the GTX 280 in a lot of big games (definitely UT3 and probably Crysis too).

nVidia really has their hands full, because they are competing against two 55nm chips with a huge, power-hungry 65nm chip with low yields and high costs.


However, considering news that the 55nm GT200 chip has been taped out, by the time the HD4870X2 hits the market, the 55nm GT200 may be fast on its heels.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: HOOfan 1
Originally posted by: Extelleron
HD 4870 X2 should be 2x the speed of the HD 3870 X2, unless AMD runs into power issues, so it should beat the GTX 280 in a lot of big games (definitely UT3 and probably Crysis too).

nVidia really has their hands full, because they are competing against two 55nm chips with a huge, power-hungry 65nm chip with low yields and high costs.


However, considering news that the 55nm GT200 chip has been taped out, by the time the HD4870X2 hits the market, the 55nm GT200 may be fast on its heels.

That's the wildcard for nVidia.

If the G200 architecture were able to reach the clocks of G92, the GTX 280 would be much faster. Right now we are looking at a 600MHz core and 1.3GHz shaders, while the 9800GTX is 675MHz/1.7GHz. And the 9800GTX is clocked a lot lower than it needs to be; vendor-overclocked versions are available with 750MHz+ cores. If G200 could reach those speeds, it would blow everything away.

Personally I think the big parts for AMD are the 4850 & 4870, even if the 4870 X2 were to capture the performance lead in some games. Even if GTX 280 or its 55nm refresh beats the X2, nVidia will still have to compete with G92b in the <$300 range. Unless G92b hits some insane clocks, it will be unable to compete with the HD 4870. At ~$200, the HD 4850 should be a great value as well.
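For a rough sense of what those clocks would be worth, here is a back-of-the-envelope sketch that simply assumes performance scales linearly with clock and nothing else becomes the bottleneck; the clock figures are the ones quoted above:

```python
# Back-of-the-envelope only: assumes performance scales linearly with clock
# and nothing else (memory bandwidth, setup rate, etc.) becomes the
# bottleneck. Clock figures are the ones quoted in the post above.

gt200_core, gt200_shader = 600, 1300   # MHz, GTX 280 as quoted above
g92_shader = 1700                      # MHz, 9800 GTX reference shader clock
g92_oc_core = 750                      # MHz, vendor-overclocked 9800 GTX core

print(f"shader-clock headroom: +{(g92_shader / gt200_shader - 1) * 100:.0f}%")
print(f"core-clock headroom:   +{(g92_oc_core / gt200_core - 1) * 100:.0f}%")
# ~+31% shader and ~+25% core if G200 ever reached G92-class clocks -- a big
# "if", since the huge 65nm die is exactly what holds those clocks back.
```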

 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
Well, don't forget that the 9800GTX launched at a $450 MSRP and can now be had for as little as $280... and with a die shrink the G200 core should be cheaper... so a 55nm GTX 280 successor could easily be as low as $400-$450. The HD4870X2 will more than likely be about that price. Then the GTX 260's 55nm successor could easily drop to the $300-or-below range... if the 65nm part doesn't do it first, as the 9800GTX dropped from $450 to $300 in actual price.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: HOOfan 1
Well, don't forget that the 9800GTX launched at a $450 MSRP and can now be had for as little as $280... and with a die shrink the G200 core should be cheaper... so a 55nm GTX 280 successor could easily be as low as $400-$450. The HD4870X2 will more than likely be about that price. Then the GTX 260's 55nm successor could easily drop to the $300-or-below range... if the 65nm part doesn't do it first, as the 9800GTX dropped from $450 to $300 in actual price.

I don't think you are going to see any GT200 parts, even at 55nm, below $350-400. It's still a huge chip, over 400mm^2, and likely 55nm yields are a bit lower than 65nm yields.

Also, you have the 448-bit or 512-bit bus, which makes the PCB very complex. Then you have the cooling, which needs to be high-end to keep the card cool.

You might see a GT200 at that sort of price if 55nm yields are very good and they move from 512-bit/GDDR3 to 256-bit/GDDR5, which I could see being a possibility... nVidia can't stay with GDDR3 forever, and by this fall GDDR5 supply should be very good.
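For reference, peak memory bandwidth is just bus width times effective data rate, so a 256-bit GDDR5 configuration can approach a 512-bit GDDR3 one with a far simpler PCB. A quick sketch; the data rates are ballpark 2008-era figures used for illustration, not confirmed specs for any particular card:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate.
# The data rates below are ballpark 2008-era figures used for illustration,
# not confirmed specs for any particular card.

def bandwidth_gb_s(bus_bits, data_rate_gtps):
    """Peak bandwidth in GB/s for a bus width (bits) and data rate (GT/s)."""
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gb_s(512, 2.2))   # ~141 GB/s: 512-bit GDDR3 at ~2.2 GT/s
print(bandwidth_gb_s(256, 3.6))   # ~115 GB/s: 256-bit GDDR5 at ~3.6 GT/s
print(bandwidth_gb_s(256, 4.4))   # ~141 GB/s: the GDDR5 rate needed to match
```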

 

chewietobbacca

Senior member
Jun 10, 2007
291
0
0
The 9800GTX dropped in price because Nvidia had to compete with ATI cutting prices. Keep in mind the 9800GTX was basically an OC'd / higher-binned 8800GTS 512, and people weren't willing to pay $200 more for a card of similar performance. If Nvidia had had its way, it would have kept the price high (much like the GX2 was priced high before it dropped). A big reason was that ATI was selling cards at $150-200, and Nvidia didn't want to lose market share in that price range.

We'll see how much a die shrink helps - if they follow ATI's route from R600 to RV670, narrowing the wide bus and going to faster memory might be necessary to really cut down the die size.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Extelleron

HD 4870 X2 should be 2x the speed of the HD 3870 X2, unless AMD runs into power issues, so it should beat the GTX 280 in a lot of big games (definitely UT3 and probably Crysis too).

In ATI's case you'll have a card which might not even scale in some games (and most likely you'll have to wait for constant driver updates to get proper scaling) - COD4.

Also, it's highly unlikely NV will lose in Crysis - Crysis bench 3870 x2

ATI's card would have to be significantly cheaper or faster, or scale extremely well -- otherwise it's hard to recommend a dual-GPU solution over a single GPU. Also, the UT3 benchmark is largely irrelevant since my low-end 8800GTS 320MB already runs the game at 1920x1080 very smoothly.
 