AzN - Banned - Joined Nov 26, 2001
Originally posted by: dug777
If all we cared about was fps/$, we'd all have 4850s or 9600 GSOs (at a guess).
The majority of us actually do have a 4850 or 9600 GSO, or whatever is equivalent.
Originally posted by: dug777
If all we cared about was fps/$, we'd all have 4850s or 9600 GSOs (at a guess).
Originally posted by: taltamir
well, no, because the performance is similar at that level. More ram is a technical choice. Just like ram type or how many bits your bus is.
Originally posted by: WaitingForNehalem
Originally posted by: taltamir
well, no, because the performance is similar at that level. More ram is a technical choice. Just like ram type or how many bits your bus is.
Current games need over 512 MB of VRAM at 1920x1200 and above. Future games will require more. That's why cards with 256 MB are inadequate. Look at Crysis at 2560x1600: the 512 MB 4870 can't be used because it doesn't have enough RAM. Besides, 896 MB obviously costs more than 512 MB, so it is an unfair comparison.
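As an editorial aside, the framebuffer arithmetic behind that 512 MB claim can be sketched. This is a rough, illustrative model only (assumed double-buffered color plus a depth/stencil buffer, both multiplied by the AA sample count); real games add textures, geometry, and driver overhead on top of these figures:

```python
def framebuffer_mb(width, height, aa_samples=1, bytes_per_pixel=4):
    """Rough VRAM estimate for the render targets alone at a given resolution."""
    color = width * height * bytes_per_pixel * aa_samples * 2  # front + back color buffer
    depth = width * height * 4 * aa_samples                    # 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 * 1024)

print(framebuffer_mb(1920, 1200, aa_samples=4))  # ~105 MB
print(framebuffer_mb(2560, 1600, aa_samples=4))  # 187.5 MB
```

Even at 2560x1600 with 4x AA the render targets alone fit well under 512 MB; it's the textures and other assets layered on top that push real games past the limit.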
Originally posted by: evolucion8
Originally posted by: taltamir
well, no, because the performance is similar at that level. More ram is a technical choice. Just like ram type or how many bits your bus is.
http://www.hardwarezone.com/ar...php?id=2684&cid=3&pg=8
This review showed that the RV770 GPU benefits from the additional RAM. I saw the HD 4870X2 review here http://www.techpowerup.com/rev...hire/HD_4870_X2/6.html and they disabled one GPU and saw those increases in performance.
Originally posted by: taltamir
Originally posted by: evolucion8
http://www.hardwarezone.com/ar...php?id=2684&cid=3&pg=8
This review showed that the RV770 GPU benefits from the additional RAM. I saw the HD 4870X2 review here http://www.techpowerup.com/rev...hire/HD_4870_X2/6.html and they disabled one GPU and saw those increases in performance.
yes, and when it benefits from that extra ram it outperforms the GTX260, doesn't it? making it worth more.
In order to get scores for single-GPU HD 4870 X2, I disabled one device in Windows Device Manager. These scores are interesting because they show how a 1024 MB GDDR5 HD 4870 would perform. The normal HD 4870 has only 512 MB of GDDR5.
Originally posted by: apoppin
Originally posted by: taltamir
yes, and when it benefits from that extra ram it outperforms the GTX260, doesn't it? making it worth more.
In order to get scores for single-GPU HD 4870 X2, I disabled one device in Windows Device Manager. These scores are interesting because they show how a 1024 MB GDDR5 HD 4870 would perform. The normal HD 4870 has only 512 MB of GDDR5.
that is their claim .. we don't really know that it is true at all, do we?
look at the benches .. the 512MB version is outperforming the Core disabled X2 at 16x10 and 19x12 in your own example
.. and in the others they trade places
nothing conclusive there
Originally posted by: MrSpadge
Originally posted by: evolucion8
http://www.hardwarezone.com/ar...php?id=2684&cid=3&pg=8
This review showed that the RV770 GPU benefits from the additional RAM
This 1GB 4870 is overclocked, which explains the performance advantage in all but 2 tests. The only cases where more memory helps are Crysis in 1600x1200 with 4x FSAA and Crysis in 1920x1440 with 4x FSAA. Actually you proved our point and not yours.
MrS
Well, it's up to you to believe it or not.
There's no overclocking that will help you when you are VRAM limited.
It would only "help" the X3's 2nd GPU in the 2nd PCIe 4x slot.
Originally posted by: dug777
Would a P45 help with your x3 setup? Or whatever the top doggie is now, x48?
Who said anything about "lying"?
Well, it's up to you to believe it or not. You weren't there when they performed those tests, and you weren't there when Anandtech performed the X2 tests, so how would you know if either is lying? Please...
Originally posted by: taltamir
When that 896 MB is GDDR3 and the 512 MB is GDDR5, the 896 MB is actually much CHEAPER; it does not cost more.
And while some games do benefit from the extra RAM, those same games also benefit from extra RAM bandwidth, where the 4870 wins.
It makes more sense to compare FPS and work out whether it comes down to RAM size, RAM speed, GPU speed, etc.
The 260 has more RAM; the 4870 has a faster GPU and faster RAM. It balances out.
And historically, by the time games need so much RAM that performance tanks, you are limited by the GPU even if you do have enough RAM.
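The bandwidth trade-off taltamir describes is simple arithmetic: peak bandwidth is bus width (in bytes) times the effective per-pin data rate. A quick sketch using the approximate launch specs (treat the clocks below as assumptions; board vendors varied them):

```python
def bandwidth_gbs(bus_width_bits, effective_gbps_per_pin):
    """Peak memory bandwidth in GB/s = bus width in bytes * per-pin data rate."""
    return bus_width_bits / 8 * effective_gbps_per_pin

# HD 4870: 256-bit bus, GDDR5 at 900 MHz base (quad-pumped -> 3.6 Gbps per pin)
print(bandwidth_gbs(256, 3.6))  # ~115.2 GB/s
# GTX 260: 448-bit bus, GDDR3 at ~1000 MHz (double-pumped -> 2.0 Gbps per pin)
print(bandwidth_gbs(448, 2.0))  # ~112.0 GB/s
```

So the narrow GDDR5 bus slightly out-delivers the much wider GDDR3 one, which is why a 256-bit interface was enough for the 4870.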
Originally posted by: WaitingForNehalem
Originally posted by: woolfe9999
Originally posted by: taltamir
that was obviously not his point. He was countering the same "style" of pointless "accusations" about nvidia. It is a sarcastic remark in reply to woolfe9999, obviously all "why"s in both posts are ridiculous.
My post was not a "pointless accusation" against Nvidia. It was not an accusation, period. It was an inference that the appearance of this new product suggests we more than likely will have to wait until Christmas to see 55 nm. As with all inferences, I may be incorrect. However, I doubt that I am.
- woolfe
Actually it wasn't a sarcastic remark. I'm aware of ATI using GDDR5. But when they start using 512 MB of VRAM with a 256-bit memory bus, it makes you wonder: is this a next-gen card? (Yes, I know a 1 GB card has been released.) The 2900XT had 1 GB of VRAM and a 512-bit memory bus. Instead of matching Nvidia's cards, why not beat them? The 4870 never demolished the GTX 260; they were always close. On the other side, what's up with Nvidia not including DirectX 10.1? When I bought my X850 XT, everyone said Shader Model 3.0 wasn't that important; now most games require it. Oh, and for all those saying a 4870 is way cheaper, there is a GTX 260 for $220 AR at Newegg. I've never seen a price tag like that for a 4870.
Originally posted by: cmdrdredd
Simple: GDDR5 does not need a 512-bit interface. The 4870 is NOT memory bandwidth limited in any way.
Originally posted by: apoppin
Originally posted by: cmdrdredd
Simple: GDDR5 does not need a 512-bit interface. The 4870 is NOT memory bandwidth limited in any way.
in ANY way?
sure it is .. try it at 25x16 or at 19x12 with everything completely maxed out
.. on a list of games i can give you
you will see definite limitations
However, the 4870 is a nice compromise and offers good bang for the buck in most practical gaming situations
- as does the GTX 260
Originally posted by: cmdrdredd
With a 1GB 4870 hooked up to a 1080p LCD TV (1920x1080) and Crysis on very high with 4x AA it's very playable and hovers around 30fps mostly. Sure it stutters at times, but that's cause Crysis is a pig.
I don't see limitations there. You always have to remember that the higher the resolution, the more impact AA and AF have on the experience. That's just what happens. But I have yet to find a game that isn't perfectly playable at 1920x1080 with everything maxed (though not necessarily with AA/AF).
My original point is that GDDR5 does not need a million bit bus to get bandwidth out of it.
Originally posted by: apoppin
Originally posted by: cmdrdredd
With a 1GB 4870 hooked up to a 1080p LCD TV (1920x1080) and Crysis on very high with 4x AA it's very playable and hovers around 30fps mostly. Sure it stutters at times, but that's cause Crysis is a pig.
I don't see limitations there. You do always have to remember that the higher resolution you go, the more impact AA and AF will have to the experience. That's just what happens. But I have yet to find a game that isn't perfectly playable at 1920x1080 with everything maxed (not necessarily AA/AF though).
My original point is that GDDR5 does not need a million bit bus to get bandwidth out of it.
perhaps you are correct .. you just don't see it
Crysis is not playable on a 4870 at very high, "hovering around 30FPS mostly"
- try using FRAPS and you will see frame rates regularly in the teens - 30s are your fastest frames, with low 20s as the average
.. as every credible reviewer reports for 19x12 .. with no AA
-- what magic card are you using to get 30FPS average with 4xAA?
... with the 512MB version of the 4870, you ... i mean i .. see even more limitations
A SINGLE 4870 1GB at 790/1100 is getting 30fps, using devmode in game to see an fps counter. Like I said, it stutters at times due to the engine being poor. I never said average, and I never mentioned a timedemo. Please read what I said, and NOT what you think.
Remember that the timedemos are designed as a worst-case scenario, not actual gameplay. Walking through the jungle and doing general gameplay, it's a good experience. There are only a few times where it really wasn't good at all.
Originally posted by: MrSpadge
Hi guys,
in the comments section to the article I posted the following:
"I read somewhere that the first batches of 48x0 cards had a bug in their bios which prevented power play from working properly. This is supposed to be fixed since some time now and idle power draw should be decreased significantly.
I'd say contact AMD or a card manufacturer. If it's true they should be more than happy to assist you in obtaining updated numbers. The current numbers are just plain horrible and may keep people from buying the Radeons.
Regards, MrS"
It seems it was a pretty bad choice to post there - apparently no one noticed. Do you know anything about this? I mean, the 38x0 series had such great idle power consumption; it was one of their primary strengths! Sure, the RV770 is more complex, but all of that logic should be switched off at idle anyway. Something seems wrong here!
MrS