Originally posted by: Denithor
Originally posted by: taltamir
Snow level: the first half of the game was "playable" (though nowhere near smooth) at a low resolution on high settings, but it was stuttering too much later on, so I lowered it to medium.
And really, there is no justification for such atrocious quality. All the games I mentioned play at absolute max settings (maybe with AA below max) at 1920x1200 and are smooth as cream. Crysis is simply not optimized for this generation of cards. It might be nice to play on a G200 when they come out, but it looks much, MUCH worse than any other game I own on an 8800GTS 512.
How is that in any way related to quality? Just because you haven't spent enough money to have a GX2 doesn't make the game's quality any different...
People grumbled about Oblivion the same way for the first year it was out; it brought the entire then-current generation of video cards to its collective knees. It wasn't until the launch of the 8800 GTS/GTX and the 2900 series cards that we could really play Oblivion at high res with the eye candy turned up.
The new 48x0 and GTX 280/260 cards should provide enough performance to play Crysis reasonably well on a single GPU. And there will be much rejoicing.
Originally posted by: taltamir
Now it doesn't bother me that it is called medium and not max... it bothers me that the "medium" setting, which gets the same FPS as "max" in another game, looks much worse than that game's max. High looks comparable to or better than other games, but has atrocious FPS. And very high looks better than anything else, but is a slideshow.
If they had only put more effort into optimizing the medium setting, the one the vast majority of people can actually play at, they would have had a much better product.
Originally posted by: taltamir
Originally posted by: Denithor
Originally posted by: taltamir
Snow level: the first half of the game was "playable" (though nowhere near smooth) at a low resolution on high settings, but it was stuttering too much later on, so I lowered it to medium.
And really, there is no justification for such atrocious quality. All the games I mentioned play at absolute max settings (maybe with AA below max) at 1920x1200 and are smooth as cream. Crysis is simply not optimized for this generation of cards. It might be nice to play on a G200 when they come out, but it looks much, MUCH worse than any other game I own on an 8800GTS 512.
How is that in any way related to quality? Just because you haven't spent enough money to have a GX2 doesn't make the game's quality any different...
People grumbled about Oblivion the same way for the first year it was out; it brought the entire then-current generation of video cards to its collective knees. It wasn't until the launch of the 8800 GTS/GTX and the 2900 series cards that we could really play Oblivion at high res with the eye candy turned up.
The new 48x0 and GTX 280/260 cards should provide enough performance to play Crysis reasonably well on a single GPU. And there will be much rejoicing.
You carefully ignored the next sentence...
Originally posted by: taltamir
Now it doesn't bother me that it is called medium and not max... it bothers me that the "medium" setting, which gets the same FPS as "max" in another game, looks much worse than that game's max. High looks comparable to or better than other games, but has atrocious FPS. And very high looks better than anything else, but is a slideshow.
If they had only put more effort into optimizing the medium setting, the one the vast majority of people can actually play at, they would have had a much better product.
The quality issue is that other games look much better at the same FPS on the same video card...
30 FPS: BioShock on an 8800GTS 512
30 FPS: Assassin's Creed on an 8800GTS 512
30 FPS: etc., on an 8800GTS 512
Other games look better than Crysis on a reasonably high end card, and on any low end card. I am saying that the engine is severely lacking for existing hardware.
Well, the GTX 280 could be a true terror compared to the 9800 GX2 at higher settings with AA. I mean, look at how the 9800 GX2 performs at 1680x1050 8xAA 16xAF: the 8800 GTX scores 40% better FPS on average. At higher settings the 9800 GX2's memory bandwidth and possibly frame buffer limitations obviously come to the fore, but these shouldn't be an issue with the GTX 280 and GTX 260.
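The bandwidth gap being argued here can be sketched with simple arithmetic: peak bandwidth is just bus width (in bytes) times the effective memory clock. A quick back-of-the-envelope in Python; the clock and bus figures below are commonly cited spec values, assumed for illustration rather than confirmed in this thread:

```python
# Rough sketch: peak theoretical memory bandwidth
# = (bus width in bytes) * (effective memory clock).
# Spec values below are assumptions for illustration only.

def bandwidth_gb_s(bus_bits, effective_clock_mhz):
    """Peak theoretical bandwidth in GB/s."""
    return bus_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Assumed: 9800 GX2 uses 256-bit GDDR3 at ~2000 MHz effective per GPU,
# GTX 280 uses 512-bit GDDR3 at ~2214 MHz effective.
gx2_per_gpu = bandwidth_gb_s(256, 2000)   # ~64 GB/s per GPU
gtx280 = bandwidth_gb_s(512, 2214)        # ~141.7 GB/s

print(round(gx2_per_gpu, 1), round(gtx280, 1))  # 64.0 141.7
```

On these assumed numbers, each GX2 GPU sees well under half the GTX 280's peak bandwidth, which is why AA at high resolutions would hurt the GX2 first.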
Originally posted by: HOOfan 1
If you think any game looks graphically better than Crysis on the right hardware, then I don't know where you are coming from.
If you are complaining that Crysis won't run well on your equipment, then that is your problem.
The designers of Crysis specifically stated that the game couldn't be played maxed out on current hardware.
Faulting a game for having graphics so advanced that they demand a top of the line system is shortsighted... that is how PC gaming has worked for years.
If you think Bioshock at high settings looks better than Crysis on Medium settings...again that is your opinion, but many others disagree.
The fact of the matter is that Crysis is pretty much the only game right now that demands the most out of a high end system, and therefore it is a great game for benchmarking... you are just going to have to get used to that.
Originally posted by: Rusin
GTX 280 compared to 9800 GX2
-93.75% of the SP units (should still have more SP performance, given the shader update)
-62.5% of the TMUs (this worries me, but not much...)
-100% wider mem bus
-100% more vram
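Those percentages check out arithmetically if you assume the commonly cited spec numbers (240 SPs / 80 TMUs / 512-bit / 1 GB for the GTX 280, versus 2x128 SPs / 2x64 TMUs / 256-bit / 512 MB per GPU for the GX2). Treat the spec numbers themselves as assumptions; the sketch just verifies the ratios:

```python
# Sanity-checking the ratios above. The spec numbers are assumptions
# (commonly cited figures), not confirmed in this thread.
gx2 = {"sp": 2 * 128, "tmu": 2 * 64, "bus_bits": 256, "vram_mb": 512}
gtx280 = {"sp": 240, "tmu": 80, "bus_bits": 512, "vram_mb": 1024}

print(f"SP units: {gtx280['sp'] / gx2['sp']:.2%}")                        # 93.75%
print(f"TMUs:     {gtx280['tmu'] / gx2['tmu']:.2%}")                      # 62.50%
print(f"Mem bus:  {gtx280['bus_bits'] / gx2['bus_bits'] - 1:+.0%} wider") # +100%
print(f"VRAM:     {gtx280['vram_mb'] / gx2['vram_mb'] - 1:+.0%} more")    # +100%
```

Note the bus and VRAM comparisons use the per-GPU figures for the GX2, which is exactly the point debated below: whether the GX2's two 256 bit buses should count as one 512 bit bus.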
Originally posted by: superbooga
Originally posted by: Rusin
GTX 280 compared to 9800 GX2
-93.75% of the SP units (should still have more SP performance, given the shader update)
-62.5% of the TMUs (this worries me, but not much...)
-100% wider mem bus
-100% more vram
Data is duplicated, but accesses aren't, so 9800GX2 is effectively 512 MB of memory with a 512 bit bus.
Originally posted by: superbooga
Originally posted by: Rusin
GTX 280 compared to 9800 GX2
-93.75% of the SP units (should still have more SP performance, given the shader update)
-62.5% of the TMUs (this worries me, but not much...)
-100% wider mem bus
-100% more vram
Data is duplicated, but data accesses aren't duplicated, so 9800GX2 is effectively 512 MB of memory with a 512 bit bus.
Yep. Not only that, but taking advantage of 2x256 relies on multi-GPU scaling to work properly. If that's not happening, it's just 1x256, or even less if the driver has major issues. Nitro is correct.
Originally posted by: Rusin
I think it's the opposite way around: the GTX 260 performs better at high resolutions and settings. If the GeForce 8800 GTX owns the 9800 GX2 at 1600x1200 8xAA 16xAF, then what will the GTX 260 do, which beats the 8800 GTX in every aspect?
Originally posted by: Cookie Monster
That being said, the GTX 280 looks like one hell of a card. I won't be surprised if this thing demolishes the current lineup in benchmarks. And yes, even Crysis.
Originally posted by: nitromullet
You have two independent slots of 512MB of RAM, each on a 256 bit bus...
Either you have two 256 bit buses transferring the same data simultaneously, or you have two different 256 bit buses transferring different instances of the same data sequentially... No matter how you slice it, you still only have a 256 bit bus.
I could be wrong about my reasoning, but I'm pretty sure that two independent 256 bit buses transferring duplicate data to different memory slots do not effectively equal a 512 bit bus.
Originally posted by: MTDEW
Originally posted by: HOOfan 1
If you think any game looks graphically better than Crysis on the right hardware, then I don't know where you are coming from.
If you are complaining that Crysis won't run well on your equipment, then that is your problem.
The designers of Crysis specifically stated that the game couldn't be played maxed out on current hardware.
Faulting a game for having graphics so advanced that they demand a top of the line system is shortsighted... that is how PC gaming has worked for years.
If you think Bioshock at high settings looks better than Crysis on Medium settings...again that is your opinion, but many others disagree.
The fact of the matter is that Crysis is pretty much the only game right now that demands the most out of a high end system, and therefore it is a great game for benchmarking... you are just going to have to get used to that.
I'd have to agree with this statement.
I've read people stating that BioShock, COD4, and Gears of War all look better and play better than Crysis.
Which is simply not true; none of those games has anywhere near the draw distance and open-endedness (is that a word?) that Crysis has.
The Crysis engine is, plain and simple, drawing more onscreen at longer distances than those other games.
Originally posted by: superbooga
Originally posted by: nitromullet
You have two independent slots of 512MB of RAM, each on a 256 bit bus...
Either you have two 256 bit buses transferring the same data simultaneously, or you have two different 256 bit buses transferring different instances of the same data sequentially... No matter how you slice it, you still only have a 256 bit bus.
I could be wrong about my reasoning, but I'm pretty sure that two independent 256 bit buses transferring duplicate data to different memory slots do not effectively equal a 512 bit bus.
Data is duplicated across all GPU memory in SLI. Each GPU is rendering something different, so they are accessing different regions of memory at the same time. Basically it would be two different 256 bit buses transferring different instances of the same data AT THE SAME TIME.
It's like giving the same copy of a 50 problem test to two people, telling one person to do the first 25 problems and the other to do the last 25 problems. Of course, the last 25 problems may be a lot harder than the first 25, resulting in poor load balancing. But when it's well balanced it's a lot faster than having one person do 50 problems.
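The test analogy can be made concrete with a toy simulation: total time is set by the slower of the two workers, so the speedup over a single worker depends entirely on how evenly the work splits. The problem timings below are made-up numbers purely for illustration:

```python
# Toy model of the two-students analogy: both hold the same copy of a
# 50-problem test (duplicated data), but each solves a different subset
# (different accesses). Total time = the slower worker's time.

def total_time(problem_times, split):
    """Two workers: one does problems [0:split), the other does [split:)."""
    return max(sum(problem_times[:split]), sum(problem_times[split:]))

problems = [1] * 25 + [3] * 25   # the last 25 problems are harder

solo = sum(problems)                  # one worker does everything
naive = total_time(problems, 25)     # split down the middle: poor balance
best = min(total_time(problems, s) for s in range(len(problems) + 1))

print(solo, naive, best)  # 100 75 51
```

With a naive 25/25 split the second worker carries 75 of the 100 units of work; with a well-balanced split the pair finishes in roughly half the solo time, which mirrors how multi-GPU scaling only approaches 2x when load balancing is good.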
Originally posted by: Rusin
http://www.pcper.com/comments.php?nid=5679
Folding@home client for Nvidia.
"Apparently we aren't allowed to talk about specifics on performance of the client, mostly because the numbers we saw were based on an "upcoming NVIDIA GPU". I don't think I'll be spoiling anything by saying the new GPU was incredibly fast and the upcoming GPU will be faster than any Folding client today including the PS3; you will be impressed. "
Originally posted by: nitromullet
This is exactly why you don't have an effective 512 bit bus.
Using your example, say the maximum number of problems a 256 bit student can carry from the teacher's desk to their own in one trip is 50. Between the two of them, the students are carrying 100 problems, but combined they are still only solving the same 50 in one sitting. Now, if these kids were 512 bit, one of them could carry 100 problems at once and solve them in one sitting: 50 additional problems that your two kids haven't even started working on yet.
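A minimal sketch of the distinction being argued here, with assumed toy numbers: mirroring data across two 256 bit buses can, at best and only with good multi-GPU scaling, double the raw transfer rate over the same working set, but it never doubles the amount of distinct data held:

```python
# Sketch of the capacity-vs-bandwidth distinction (toy numbers assumed).
bus_bits = 256
vram_mb_per_gpu = 512

# Two GPUs in SLI with mirrored data:
unique_data_mb = vram_mb_per_gpu      # still only 512 MB of DISTINCT data
aggregate_bus_bits = 2 * bus_bits     # up to 512 bits of raw transfer,
                                      # but ONLY if both GPUs stay busy
single_gpu_view_bits = bus_bits       # what either GPU actually sees

print(unique_data_mb, aggregate_bus_bits, single_gpu_view_bits)  # 512 512 256
```

So both sides of the argument are capturing something real: the aggregate transfer rate can approach that of a 512 bit bus, while the addressable working set and any single access path remain those of a 256 bit, 512 MB card.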