nVidia GT200 Series Thread

Page 8

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Denithor
Originally posted by: taltamir
Snow level: the first half of the game was "playable" (but by no means smooth) at a low resolution and high settings, but it was stuttering too much later on, so I lowered it to medium.
And really, there is no justification for such atrocious quality. All the games I mentioned play at absolute max settings (maybe with AA below max) at 1920x1200 and are smooth as cream. Crysis is simply not optimized for this generation of cards. It might be nice to play on a G200 when they come out, but it looks much, MUCH worse than any other game I own on an 8800GTS 512.

How is that in any way related to quality? Just because you haven't spent enough money to have a GX2 doesn't make the game's quality any different...

People grumbled about Oblivion the same way for the first year it was out, it brought the entire then-current generation of video cards to their collective knees. It wasn't until the launch of the 8800GTS/X and the 2900 series cards that we could really play Oblivion at high res with eye candy turned up.

The new 48x0 and GTX 280/260 cards should provide enough performance to play Crysis reasonably well on a single GPU. And there will be much rejoicing.

you carefully ignore the next sentence...
Originally posted by: taltamir
Now it doesn't bother me that it is called medium and not max... it bothers me that the "medium" setting that gets the same FPS as "max" in another game looks much worse than that game's max... high looks comparable to or better than other games, but has atrocious FPS, and very high looks better than anything else, but is a slideshow.

If they had only put more effort into optimizing the medium setting, the one that the vast majority of people can play at, they would have had a much better product.

The quality issue is that other games look much better at the same FPS on the same video card...

30 FPS BioShock on 8800GTS 512
30 FPS Assassin's Creed on 8800GTS 512
30 FPS etc. on 8800GTS 512

Other games look better than Crysis on a reasonably high-end card, and on any low-end card. I am saying that the engine is severely lacking for existing hardware.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: taltamir
Originally posted by: Denithor
Originally posted by: taltamir
Snow level: the first half of the game was "playable" (but by no means smooth) at a low resolution and high settings, but it was stuttering too much later on, so I lowered it to medium.
And really, there is no justification for such atrocious quality. All the games I mentioned play at absolute max settings (maybe with AA below max) at 1920x1200 and are smooth as cream. Crysis is simply not optimized for this generation of cards. It might be nice to play on a G200 when they come out, but it looks much, MUCH worse than any other game I own on an 8800GTS 512.

How is that in any way related to quality? Just because you haven't spent enough money to have a GX2 doesn't make the game's quality any different...

People grumbled about Oblivion the same way for the first year it was out, it brought the entire then-current generation of video cards to their collective knees. It wasn't until the launch of the 8800GTS/X and the 2900 series cards that we could really play Oblivion at high res with eye candy turned up.

The new 48x0 and GTX 280/260 cards should provide enough performance to play Crysis reasonably well on a single GPU. And there will be much rejoicing.

you carefully ignore the next sentence...
Originally posted by: taltamir
Now it doesn't bother me that it is called medium and not max... it bothers me that the "medium" setting that gets the same FPS as "max" in another game looks much worse than that game's max... high looks comparable to or better than other games, but has atrocious FPS, and very high looks better than anything else, but is a slideshow.

If they had only put more effort into optimizing the medium setting, the one that the vast majority of people can play at, they would have had a much better product.

The quality issue is that other games look much better at the same FPS on the same video card...

30 FPS BioShock on 8800GTS 512
30 FPS Assassin's Creed on 8800GTS 512
30 FPS etc. on 8800GTS 512

Other games look better than Crysis on a reasonably high-end card, and on any low-end card. I am saying that the engine is severely lacking for existing hardware.

I'm repeating myself here but what you say is simply false. An 8800GTS 512MB is playable at High settings @ 1680x1050 in Crysis. I don't know what is wrong with your setup, but that's the way it is for me and others.

Well, GTX 280 could be a true terror compared to the 9800 GX2 at higher settings with AA. I mean, look at how the 9800 GX2 performs at 1680x1050 8xAA 16xAF: the 8800 GTX scores 40% better FPS on average. At higher settings the 9800 GX2's memory bandwidth and possibly frame buffer limitations obviously come forward, but these shouldn't be an issue with GTX 280 and GTX 260.

I think GTX 280 will provide a serious improvement over the 9800GX2 in every way; as you said, the 9800GX2 falls apart with AA or high resolutions because of its 256-bit bus and only 16 ROPs. With 32 ROPs / 1GB / 512-bit, GTX 280 will dominate at high resolutions. And remember, you can't compare the number of SPs/TMUs between the GX2 and GTX 280; the GX2 relies on SLI and cannot effectively use all those resources.
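The bus argument above can be sanity-checked with the standard peak-bandwidth formula. A minimal sketch; the effective memory clocks below are assumptions for illustration, not confirmed specs:

```python
def mem_bandwidth_gbps(bus_bits, data_rate_gtps):
    """Peak theoretical bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_bits / 8 * data_rate_gtps

# 9800 GX2: each GPU sees its own 256-bit bus; assuming ~2.0 GT/s effective GDDR3
gx2_per_gpu = mem_bandwidth_gbps(256, 2.0)   # 64.0 GB/s per GPU
# Rumored GTX 280: 512-bit bus; assuming ~2.2 GT/s effective
gtx280 = mem_bandwidth_gbps(512, 2.2)        # 140.8 GB/s
```

Even with similar memory clocks, doubling the bus width roughly doubles peak bandwidth per GPU, which is exactly what AA at high resolution eats.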
 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
If you think any game looks graphically better than Crysis on the right hardware, then I don't know where you are coming from.

If you are complaining that Crysis won't run well on your equipment, then that is your problem.

The designers of Crysis specifically stated that the game wouldn't be able to be played maxed out on the current hardware.

Faulting a game for having graphics so advanced that they demand a top of the line system is shortsighted...that is how PC gaming has been happening for years.

If you think Bioshock at high settings looks better than Crysis on Medium settings...again that is your opinion, but many others disagree.

The fact of the matter is that Crysis is pretty much the only game right now that demands the most out of a high end system and therefore it is a great game for benchmarking...you are just going to have to get used to that.
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Originally posted by: HOOfan 1
If you think any game looks graphically better than Crysis on the right hardware, then I don't know where you are coming from.

If you are complaining that Crysis won't run well on your equipment, then that is your problem.

The designers of Crysis specifically stated that the game wouldn't be able to be played maxed out on the current hardware.

Faulting a game for having graphics so advanced that they demand a top of the line system is shortsighted...that is how PC gaming has been happening for years.

If you think Bioshock at high settings looks better than Crysis on Medium settings...again that is your opinion, but many others disagree.

The fact of the matter is that Crysis is pretty much the only game right now that demands the most out of a high end system and therefore it is a great game for benchmarking...you are just going to have to get used to that.

I'd have to agree with this statement.
I've read people stating that BioShock, COD4 and Gears of War all look better and play better than Crysis.
Which is simply not true; none of those games has anywhere near the draw distance and open-endedness (is that a word?) that Crysis has.
The Crysis engine is plain and simple drawing more on screen at longer distances than those other games.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: Rusin
GTX 280 compared to 9800 GX2
-93.75% of the SP units (should still have more SP performance, given the shader update)
-62.5% of the TMUs (this worries me, but not much...)
-100% wider memory bus
-100% more VRAM

Data is duplicated, but data accesses aren't duplicated, so 9800GX2 is effectively 512 MB of memory with a 512 bit bus.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: superbooga
Originally posted by: Rusin
GTX 280 compared to 9800 GX2
-93.75% of the SP units (should still have more SP performance, given the shader update)
-62.5% of the TMUs (this worries me, but not much...)
-100% wider memory bus
-100% more VRAM

Data is duplicated, but accesses aren't, so 9800GX2 is effectively 512 MB of memory with a 512 bit bus.

Cards like the 9800GX2 almost never perform at 2x the speed of the single-chip card; you end up losing some performance. My guess is that the GTX 280 will be noticeably faster than the GX2, and even the GTX 260 might equal or beat it in some cases (lower resolutions).
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: superbooga
Originally posted by: Rusin
GTX 280 compared to 9800 GX2
-93.75% of the SP units (should still have more SP performance, given the shader update)
-62.5% of the TMUs (this worries me, but not much...)
-100% wider memory bus
-100% more VRAM

Data is duplicated, but data accesses aren't duplicated, so 9800GX2 is effectively 512 MB of memory with a 512 bit bus.

I don't think this is correct.

You have two independent slots of 512MB of RAM, each on a 256 bit bus...

Either you have two 256 bit buses transferring the same data simultaneously, or you have two different 256 bit buses transferring different instances of the same data sequentially... No matter how you slice it, you still only have a 256 bit bus.

I could be wrong about my reasoning, but I'm pretty sure that dual independent 256-bit buses transferring duplicate data to different memory slots does not effectively equal a 512-bit bus.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Nitro is correct.

That being said, GTX 280 looks like one hell of a card. I won't be surprised if this thing demolishes the current lineup of benchmarks. And yes, even Crysis.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
Nitro is correct.
Yep. Not only that, but taking advantage of 2x256 relies on multi-GPU scaling to work properly. If that's not happening it's just 1x256 or even less if the driver has major issues.

It's kind of like how a dual-core 3 GHz CPU is 6 GHz in theory, but in practice it would only reach that with perfectly threaded load distribution.
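That CPU analogy is essentially Amdahl's law. A minimal sketch, with illustrative fractions:

```python
def effective_speedup(cores, parallel_fraction):
    """Amdahl's law: overall speedup from N cores when only
    parallel_fraction of the work can be distributed across them."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Perfectly threaded load: a dual-core 3 GHz really behaves like "6 GHz"
ideal = effective_speedup(2, 1.0)     # 2.0
# Only half the workload threads: nowhere near 2x
partial = effective_speedup(2, 0.5)   # ~1.33
```

The same formula describes SLI: if the driver can't parallelize the frame work, the second GPU (and its second 256-bit bus) contributes little.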
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Rusin
I think it's the opposite way around: GTX 260 performs better at high resolutions and settings. If the Geforce 8800 GTX owns the 9800 GX2 at 1600x1200 8xAA 16xAF... then what will the GTX 260 do, which looks better than the 8800 GTX in every aspect?

My bad, I missed that you said in your first post that the Geforce GTX 260 has a 448-bit bus. I made my assumption thinking it had only a 256-bit memory bus, because that's what I read somewhere.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Cookie Monster
That being said, GTX 280 looks like one hell of a card. I won't be surprised if this thing demolishes the current lineup of benchmarks. And yes, even Crysis.

GTX280 will be a BEAST. It can serve both as a great gaming card and as a space heater, which can be good or bad depending on where you live.

Just a few safety tips to remember in case you plan on buying one:

* Keep all furniture and combustible items at least three feet away from your GTX280.
* Never use the GTX280 to dry clothing, towels or other combustibles.
* The GTX280 should not be left on while you are asleep or away from the room.
* Keep young children away from your GTX280.
* Avoid using the GTX280 to heat the bathroom, and never touch it when you are wet.
* Make sure that your smoke and carbon monoxide detectors are working before gaming with your GTX280.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: Rusin
I think it's the opposite way around: GTX 260 performs better at high resolutions and settings. If the Geforce 8800 GTX owns the 9800 GX2 at 1600x1200 8xAA 16xAF... then what will the GTX 260 do, which looks better than the 8800 GTX in every aspect?

Where do you see the 8800GTX owning the GX2? Maybe in MSAA, but not CSAA, which is probably what most people will use.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: nitromullet
You have two independent slots of 512MB of RAM, each on a 256 bit bus...

Either you have two 256 bit buses transferring the same data simultaneously, or you have two different 256 bit buses transferring different instances of the same data sequentially... No matter how you slice it, you still only have a 256 bit bus.

I could be wrong about my reasoning, but I'm pretty sure that dual independent 256 buses transferring duplicate data to different memory slots does not effectively equal a 512 bit bus.

Data is duplicated across all GPU memory in SLI. Each GPU is rendering something different, so they are accessing different regions of memory at the same time. Basically it would be two different 256 bit buses transferring different instances of the same data AT THE SAME TIME.

It's like giving the same copy of a 50 problem test to two people, telling one person to do the first 25 problems and the other to do the last 25 problems. Of course, the last 25 problems may be a lot harder than the first 25, resulting in poor load balancing. But when it's well balanced it's a lot faster than having one person do 50 problems.
 

ghost recon88

Diamond Member
Oct 2, 2005
6,196
1
81
So nVidia went all out and just decided to crank out the fastest core there is, without thinking about power consumption or heat production? They should start attaching waterblocks to the cards straight outta the factory, as apparently air cooling isn't gonna cut it anymore.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: MTDEW
Originally posted by: HOOfan 1
If you think any game looks graphically better than Crysis on the right hardware, then I don't know where you are coming from.

If you are complaining that Crysis won't run well on your equipment, then that is your problem.

The designers of Crysis specifically stated that the game wouldn't be able to be played maxed out on the current hardware.

Faulting a game for having graphics so advanced that they demand a top of the line system is shortsighted...that is how PC gaming has been happening for years.

If you think Bioshock at high settings looks better than Crysis on Medium settings...again that is your opinion, but many others disagree.

The fact of the matter is that Crysis is pretty much the only game right now that demands the most out of a high end system and therefore it is a great game for benchmarking...you are just going to have to get used to that.

I'd have to agree with this statement.
Ive read people stating that Bioshock, COD4 and Gears of war all look better and play better than Crysis.
Which is just simply not true, none of those games have have near the draw distance and open endedness(is that a word?) that Crysis has.
The Crysis game engine is just plain and simple drawing more onscreen at longer distances than those other games.

There were games in history, like Doom 3, which looked better than just about anything out there at medium settings at 800x600; or HL2, which blazed at 60-100 frames with everything maxed out on something like a 6800 GT. The point is that Crysis is a graphical masterpiece, but on older hardware it doesn't scale that well. It doesn't scale with quad-core processors either.

The graphics are great, but not significantly better than Gears of War, which runs on an Xbox 360! Although I personally think Crysis looks great even at 1024x768 at medium settings, its ratio of hardware requirements to graphics output quality is awful. I can run COD4 at 1600x1200 2xAA with everything on high and get smoother frames! And compared to Unreal Tournament 2004, which looks better at 1920x1080 and runs at 80-100 frames, Crysis is about the worst game in terms of hardware efficiency.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: superbooga
Originally posted by: nitromullet
You have two independent slots of 512MB of RAM, each on a 256 bit bus...

Either you have two 256 bit buses transferring the same data simultaneously, or you have two different 256 bit buses transferring different instances of the same data sequentially... No matter how you slice it, you still only have a 256 bit bus.

I could be wrong about my reasoning, but I'm pretty sure that dual independent 256 buses transferring duplicate data to different memory slots does not effectively equal a 512 bit bus.

Data is duplicated across all GPU memory in SLI. Each GPU is rendering something different, so they are accessing different regions of memory at the same time. Basically it would be two different 256 bit buses transferring different instances of the same data AT THE SAME TIME.

It's like giving the same copy of a 50 problem test to two people, telling one person to do the first 25 problems and the other to do the last 25 problems. Of course, the last 25 problems may be a lot harder than the first 25, resulting in poor load balancing. But when it's well balanced it's a lot faster than having one person do 50 problems.

This is exactly why you don't have an effective 512-bit bus.

Using your example, say the maximum number of problems that a 256-bit student can carry from the teacher's desk to their own in one trip is 50. Between the two of them, the students are carrying 100 problems, but combined they are still only solving the same 50 in one sitting. Now, if these kids were 512-bit, one of them could carry 100 problems at once and solve them in one sitting; that's 50 additional problems that your two kids haven't even started working on yet.
 

dv8silencer

Member
May 7, 2008
142
0
0
Originally posted by: Rusin
http://www.pcper.com/comments.php?nid=5679
Folding@home client for Nvidia.

"Apparently we aren't allowed to talk about specifics on performance of the client, mostly because the numbers we saw were based on an "upcoming NVIDIA GPU". I don't think I'll be spoiling anything by saying the new GPU was incredibly fast and the upcoming GPU will be faster than any Folding client today including the PS3; you will be impressed. "

nice
 

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: nitromullet
This is exactly why you don't have an effective 512 bit bus.

Using your example, say the maximum number of problems that a 256-bit student can carry from the teacher's desk to their own in one trip is 50. Between the two of them, the students are carrying 100 problems, but combined they are still only solving the same 50 in one sitting. Now, if these kids were 512-bit, one of them could carry 100 problems at once and solve them in one sitting; that's 50 additional problems that your two kids haven't even started working on yet.

Nitro, the students solve different problems at the same time.

Let's say a 256 bit student can solve 50 problems in ONE HOUR, and a 512 bit student can solve 100 problems in the same amount of time. If you have 2 256 bit students, then student A can solve the first 50 problems in one hour, and student B can solve the second 50 problems in one hour. You have 100 problems solved in one hour.

Perhaps you are confused by data duplication -- bus width has nothing to do with it. Data duplication just means effective memory size is limited, not bandwidth. Stored data is duplicated, but accessed data is not. The data is stored at the teacher's desk, not at the student's desk. =)

Now the framebuffer from one of the GPUs has to be written to the framebuffer in the primary GPU, since only that GPU can output to the display. This uses bandwidth, but it is far less than the bandwidth used to render the image, unless we are talking about very high frame rates. This is one reason why SLI usually doesn't help if a single GPU is already getting 200 fps -- combining the frame buffers 200 times a second starts eating up bandwidth.

As an analogy, the school principal (CPU) creates all the problems and uses the PCI-E bus to deliver the same set of 10000 problems to two teachers. Each teacher has one 256-bit student who can solve 50 problems in one hour. The students solve problems for 23 hours a day, then spend one hour delivering solved problems from the second teacher's desk to the first teacher's desk.

So in 24 hours, two 256 bit students solve 2300 problems, while a single 512 bit student solves 2400 problems.
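The arithmetic in that analogy, plus the framebuffer-copy overhead mentioned above, checks out in a few lines. The resolution and frame rate are assumptions for illustration:

```python
# Two 256-bit students: 23 hours solving, 1 hour merging results
two_students = 2 * 50 * 23        # 2300 problems per day
# One 512-bit student: no merge step, all 24 hours solving
one_student = 100 * 24            # 2400 problems per day

# The real-world merge cost: copying the secondary GPU's finished frame
# to the primary GPU. Assuming 1920x1200 at 4 bytes per pixel:
frame_bytes = 1920 * 1200 * 4                   # 9,216,000 bytes (~9.2 MB)
copy_gbps_at_200fps = frame_bytes * 200 / 1e9   # ~1.84 GB/s at 200 fps
```

At typical frame rates that copy is small next to the tens of GB/s spent rendering, which is why the overhead only starts to matter at very high fps.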
 