CrystalBay
The whole problem is that early adopters are expected to eat an extra 33% markup at retail/etail. No thanks, I'm not getting gouged like that.
I'm pretty sure the GeForce 2 retailed at $300 at launch.
Originally posted by: n7
Patience is a virtue with these things.
Truth be told, the way the market for video cards is right now, I can see the $300 GT staying at that price for a long time.
nV & AMD have done a fantastic job over the last few years bumping the price of the high end up higher & higher, to the point where $300 seems like a great deal...
Maybe they could call it the x1975xxx. With that many x's it must be good, right?
Originally posted by: SickBeast
If they changed the shader tech and gave it AA power, then maybe...but really...that's a big maybe.
Originally posted by: n7
743/1000 was the official speed on the HD 2900 XT 1 GB.
But many came factory OCed to around 825/1050.
I know my 2900XT 1 GB did close to 1150 MHz on the memory IIRC.
Anyway, I don't see 825/2400 being unrealistic at all.
But I still don't see it beating the 8800 GT.
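For anyone who wants to sanity-check those clocks, here's a rough bandwidth sketch. The 512-bit bus is the HD 2900 XT's known spec; the 256-bit bus for the new card is only the rumored figure, so treat that result as a guess. GDDR3 and GDDR4 both transfer twice per clock, which is why the new card's memory gets quoted as 2400 (effective rate) while the 2900 XT's gets quoted as 1000 (real clock).

```python
# Peak memory bandwidth from the clocks quoted in this thread.
# 512-bit is the HD 2900 XT's known bus width; 256-bit for the
# rumored RV670 card is an assumption based on the leaks so far.

def bandwidth_gbs(bus_bits: int, mem_mhz: float, effective: bool = False) -> float:
    """Peak bandwidth in GB/s from bus width and memory clock.

    mem_mhz   -- memory clock as quoted in the thread
    effective -- True if mem_mhz is already the doubled (DDR) rate
    """
    rate = mem_mhz if effective else mem_mhz * 2  # GDDR3/GDDR4 are double data rate
    return (bus_bits / 8) * rate / 1000

print(bandwidth_gbs(512, 1000))                 # HD 2900 XT 1 GB stock -> 128.0 GB/s
print(bandwidth_gbs(512, 1150))                 # n7's memory OC        -> 147.2 GB/s
print(bandwidth_gbs(256, 2400, effective=True)) # rumored 825/2400 card ->  76.8 GB/s
print(bandwidth_gbs(256, 900))                  # 8800 GT stock         ->  57.6 GB/s
```

If the 256-bit rumor holds, 825/2400 would be a big step down in raw bandwidth from the 2900 XT's 128 GB/s, though still ahead of the 8800 GT's 57.6 GB/s.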
I wonder how hard it would be for them to juice up an X1900 and give it DX10.
Originally posted by: SickBeast
Perhaps because 128 vs. 112 shaders is not much difference at all, and the 384-bit bus has proven to be relatively useless.
Originally posted by: Azn
Why do you keep on mentioning that your GT is like a GTX? It's not a GTX. You have a GT. It performs smack dab between a GTS and a GTX. Once you raise the resolution and AA, it performs more like a GTS and not a GTX.
A 10% overclock on an 8800GT makes it perform on par with a GTX, but it has way more overclocking headroom than that. My guess is that with both cards overclocked, the GT will match the GTX.
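For reference, here's what that 10% works out to from the GT's stock clocks (600 MHz core / 1500 MHz shader / 900 MHz memory are the reference specs):

```python
# A 10% overclock from the 8800 GT's reference clocks.
stock = {"core": 600, "shader": 1500, "memory": 900}  # MHz, reference specs
oc = {domain: round(mhz * 1.10) for domain, mhz in stock.items()}
print(oc)  # {'core': 660, 'shader': 1650, 'memory': 990}
```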
Where did you see a GTX smoking a GT?
Originally posted by: ronnn
Originally posted by: SolMiester
Originally posted by: ronnn
Fudzilla doesn't even rate as rumours. Give it a couple of weeks and we will see. Looks to me like a nice mid-range card: low power usage, cheap to make, and good performance. ATI could lose the performance battle but still win the war. It is mid-range, after all.
ATI lost the performance battle over a year ago. With a new GTS coming shortly and a new GTX next year, the GT will become the mid-range card, and with the drop to $200, AMD will lose out there too! Maybe the integrated graphics chips?
Lots of maybes here. Anyway, what happens next year won't affect how ATI's mid-range cards sell this year.
Originally posted by: cmdrdredd
I stand by what I said. NEITHER card matters at all. Why? Because DX10 is still unplayable at any reasonable resolution with reasonable FPS...PERIOD!
I don't care what card is cheap...I want a card that is faster than a card from last year...(read: 8800GTX overclocked).
Originally posted by: thilan29
Well, if they lowered the clocks then maybe it performs better than it was thought to?
Originally posted by: yacoub
Originally posted by: thilan29
Well, if they lowered the clocks then maybe it performs better than it was thought to?
That doesn't jibe though. If it could handle higher clocks stock, they would use higher clocks to ensure it competes with or even dominates the competition. The reality is it probably runs too hot to handle anything higher, which would make sense given ATI's inability to produce a card that runs at decent temps without cranking up a noisy fan.
Originally posted by: keysplayr2003
Face it, Crysis is a beast, but that doesn't mean there aren't ways to play this game without breaking the bank. So, current technology graphics cards CAN play this game. Contrary to your findings.
Originally posted by: yacoub
Originally posted by: thilan29
Well, if they lowered the clocks then maybe it performs better than it was thought to?
That doesn't jibe though. If it could handle higher clocks stock, they would use higher clocks to ensure it competes with or even dominates the competition. The reality is it probably runs too hot to handle anything higher, which would make sense given ATI's inability to produce a card that runs at decent temps without cranking up a noisy fan.
Originally posted by: yacoub
Well, let's look at the info we have available to us to make an educated guess:
We know that the last three Radeon series cards (X1800, X1900, and HD2900) were hot beasts with noisy cooling.
We know that this new one is supposed to be 55nm, which should mean it runs cooler.
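A quick back-of-envelope on that shrink, using ideal area scaling only (real chips never shrink this cleanly, and power doesn't scale with area alone, so read it as a best case):

```python
# Ideal die-area scaling for the 80 nm -> 55 nm move (R600 -> RV670).
# Real designs never hit this, so this is an optimistic bound.
old_nm, new_nm = 80, 55
area_ratio = (new_nm / old_nm) ** 2
print(f"ideal area: {area_ratio:.0%} of the 80 nm die")  # ~47%
```

Even at roughly half the area there's no guarantee on temps, since that depends on clocks, voltage, and the cooler, but it's the main reason to expect a cooler card.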
Originally posted by: keysplayr2003
Originally posted by: cmdrdredd
I stand by what I said. NEITHER card matters at all. Why? Because DX10 is still unplayable at any reasonable resolution with reasonable FPS...PERIOD!
I don't care what card is cheap...I want a card that is faster than a card from last year...(read: 8800GTX overclocked).
Would you say that 1280x1024 is a reasonable resolution? It's actually a bit tougher than 1440x900, pixel-wise. And these guys seem to be getting pretty good playability under Vista.
Hard Forum Crysis 1280x1024 scores. All settings high, no AA
With a good CPU and an 8800 GT, or even a GTS, average framerates are not as abysmal as you make them out to be. And this is a beta.
I am seeing average framerates from the high 30s to mid 40s (36-45 fps) for the 8800 GT and up.
The GTS 640 averages low to mid 30s.
These scores are with single cards. SLI is not functional yet (as per these results).
When the prices of the 8800 GTs level out (once the gouging calms down), you can pick up two of these for less than the current cost of a single 8800 GTX, if you have an SLI-capable system.
That is an extraordinarily nice deal. Two 8800 GTs will be more than powerful enough for Crysis at 1680x1050 if a single 8800 GT is playable with all settings high at 1280x1024. You could maybe even throw some AA at the game with dual cards.
Face it, Crysis is a beast, but that doesn't mean there aren't ways to play this game without breaking the bank. So, current technology graphics cards CAN play this game. Contrary to your findings.
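The price math, for what it's worth. The $249 figure is the launch MSRP for the 512 MB GT; the ~$550 for a GTX is an assumed street price for illustration, since actual prices are all over the place right now:

```python
# Hypothetical comparison: two 8800 GTs at MSRP vs. one 8800 GTX
# at an assumed ~$550 street price.
gt_msrp, gtx_street = 249, 550
print(f"2x GT: ${2 * gt_msrp} vs 1x GTX: ${gtx_street}")  # 2x GT: $498 vs 1x GTX: $550
```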
Originally posted by: Bateluer
We'll find out for certain in a few weeks. Until then, I will reserve all judgment.
Higher-end Radeon HD 3870 will feature GDDR4 instead of GDDR3 while using the same RV670 core found on HD 3850. This GDDR4 memory is clocked at 1.2 GHz, and the core frequency is bumped to 775 MHz. The GDDR3 version of HD 3870 will feature the same core frequency as the GDDR4 card, but comes standard with lower frequency GDDR3 instead of GDDR4 to target a better price point.
The red flag is that Radeon HD 3850 touts exactly the same features found on the 80nm Radeon HD 2900 design, with the exception of lower-clocked GDDR3 memory. HD 3850 will reduce the thermal envelope when compared to the previous generation, but performance should be nearly identical to Radeon HD 2900.
Pricing on RV670 has not been confirmed. However, given that Radeon HD 3850 is essentially Radeon HD 2900 in a single-slot design, it's easy to expect AMD will price those cards similarly to R600 cards available today.
Originally posted by: Matt2
As I have said before, I don't expect anything spectacular from RV670.
Barring any architectural improvements (which I doubt have been made), I fully expect the HD3870 to be the same speed as, or a touch slower than, my 825/910 2900XT.
Originally posted by: yacoub
Well, let's look at the info we have available to us to make an educated guess:
We know that the last three Radeon series cards (X1800, X1900, and HD2900) were hot beasts with noisy cooling.
We know that this new one is supposed to be 55nm, which should mean it runs cooler.
We know that the rumors so far about this new card's hardware suggest it will underperform the 8800GT.
That means it will need all the speed it can get out of its hardware to compete as well as possible against the NVIDIA cards.