BespinReactorShaft
Diamond Member
- Jun 9, 2004
Originally posted by: R3MF
Gubbins = £100
Umm... what sort of gubbins for instance would take up £100?
Originally posted by: R3MF
Gubbins = £100
Speak for yourself!! Damn this... voice... in my head!!!
Originally posted by: biostud
Nobody forces anyone to play the games with 4xFSAA 16xAF 1600x1200+ Max details.
Originally posted by: tanishalfelven
to the above post.
if YOU insist on being an SOB who needs 1920x1200 to play games then it's your fault. most ppl game, yes even oblivion, at lower res and are perfectly happy even if we don't get all the bells and whistles (eg HDR, or having to turn off shadows... i am talking about 1024x768. i can use shadows at even lower res). last time i checked my card equaled a 6600gt, which is beaten by the 7300gt, which is supposed to cost <$100. so STFU ok.
I really feel like the graphics industry is just taking advantage of the enthusiasts now. I want decently priced video cards that don't take a power grid to run. They need to make the cards more efficient, and the same goes for game developers: they need to make their games run more efficiently.
no they're not. try buying a budget computer now. you'll be able to for <$300.
specs:
ati mobo with igp (i've heard the newer ones use an x700-level igp but the older ones use x300): $80
nice cheap sempron: $80
512 mb ram: $50
that's $210.
get a 7600gs for under $100 (if you really wanna play games well; otherwise the igp is ok too)
cost: $310 or $210.
with tax: $340 or $240.
ha. don't go around spreading FUD that gfx makers overcharge.
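The build quoted above can be tallied in a few lines. A minimal sketch: the $80/$80/$50 component prices and the <$100 7600GS are the poster's own figures, while treating "tax" as a flat ~$30 (the gap between the poster's $210 and $240 totals) is an assumption.

```python
# Tally the budget build from the post above.
# Component prices come from the post; the flat $30 tax is an assumption
# inferred from the poster's $210 -> $240 jump.
parts = {"ati igp mobo": 80, "cheap sempron": 80, "512 mb ram": 50}

subtotal = sum(parts.values())   # 210, IGP-only build
with_gpu = subtotal + 100        # 310 with a 7600gs added
tax_estimate = 30                # assumption, see above

print(subtotal + tax_estimate, with_gpu + tax_estimate)  # 240 340
```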
Originally posted by: rmed64
meh, my rig is holding up great for CS:S and BF2, the only games I play daily on my pc.
BF2142 is my next PC purchase, and it's gonna be using the same BF2 engine with some enhancements. So basically, I see no reason to dump my OCed 6800GS anytime soon. I game at 1024x768, so my card is a perfect fit.
I feel sorry for those who have huge monitors and need to game at much higher res (1600x1200 and up), therefore needing a much more powerful card or even cards (SLI, Crossfire). But hey, they are the "hardcore", so if that's what they need, then they gotta pay the price.
Talk of this kind of stuff is kinda gloomy, but hopefully they will come back down to earth.
As depressing as this news is, there is a small light at the end of the tunnel. Our sources tell us that after this next generation of GPUs we won't see an increase in power consumption, rather a decrease for the following generation. It seems as if in their intense competition with one another, ATI and NVIDIA have let power consumption get out of hand and will begin reeling it back in starting in the second half of next year.
Originally posted by: ElFenix
i like how the videocard sucks down far more power than the rest of the system these days.
are external power bricks just not good enough? frankly i think i'd rather have that than another heat producer INSIDE my computer.
Originally posted by: RampantAndroid
Originally posted by: ElFenix
i like how the videocard sucks down far more power than the rest of the system these days.
are external power bricks just not good enough? frankly i think i'd rather have that than another heat producer INSIDE my computer.
Ditto to this...
And I agree with the whole thing of SLI dumping this crap on us... my opinion of SLI is that it is the worst thing to ever happen to the gaming industry - for two main reasons (and Crossfire is just as evil here in my opinion):
It lets programmers be sloppy... just like the current attitude with RAM and HD space: "Memory/Storage is cheap, so we don't have to be efficient with it"
Or, it lets studios write games that require SLI just to run them right (Oblivion comes to mind here)... Oblivion does look good... but how much of that is just extraneous? In my opinion, they should program for mainstream SINGLE cards, and then SLI/Crossfire would just be a performance boost... rather than writing games for SLI and making everyone with a single card get lower performance and quality (even if your card is current generation).
Just my opinion there....what next, dual Soundcards? With X-Fis, that'd mean twice the "snap crackle pop"!!! YES!
Originally posted by: hans007
the same people who constantly say that intel sucks because its cpus use maybe 125 watts compared to an 89 watt amd cpu will go out and buy a 160 watt x1900xtx over, say, a much lower power 7900gtx or 7900gt.
Originally posted by: guoziming
Remember, Oblivion was programmed with XBOX360 in mind, NOT SLI.
Also, if you want programmers to be less sloppy, you should be prepared to pay for it, because development these days is far more difficult than back in the "efficient" days, since gamers' demands have risen and the level of standards has increased.
besides, if you want Oblivion to be less graphics-intensive, you could just turn down the settings....
Yep, I'm thinking one upgrade to either an x1900xt or a 7950gx2 to hold me over until the next-next gen, when they allegedly are bringing the power requirements back down.
Originally posted by: Jeff7181
I won't buy a separate power supply for the video card. I'll keep my 7900GTX for a while and then I'll be buying mainstream parts I guess until they find a way to get more performance without requiring a nuclear reactor to power your PC.
Originally posted by: mooncancook
wow. so once you start a DX10 game, you'll notice the lights in your house fluctuate?
Originally posted by: Powermoloch
Originally posted by: openwheelformula1
I will not buy any graphic card that is incapable of passive cooling, nor will I buy any graphic card that will require another psu on top of my Seasonic S12 500w.
Thanks to idiots with $1200 graphic cards in SLI/Crossfire, they are dumping this crap on us consumers. If the next generation can't be more efficient, then it shouldn't be the next generation.
Sorry I had to rant.
yeah, it's kinda like the same problem as "muscle cars" right before the oil crisis lol.
Originally posted by: hans007
yeah i honestly don't even understand this.
the same people who constantly say that intel sucks because its cpus use maybe 125 watts compared to an 89 watt amd cpu will go out and buy a 160 watt x1900xtx over, say, a much lower power 7900gtx or 7900gt.
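The wattage gap described above translates into a fairly small running cost, which may explain the inconsistency. A rough sketch: the 160 W X1900XTX figure is from the post, while the ~85 W 7900GTX draw, 4 hours of gaming per day, and $0.10/kWh electricity rate are all assumptions chosen for illustration.

```python
# Back-of-envelope yearly electricity cost of a GPU wattage gap.
# 160 W X1900XTX is from the post; the ~85 W 7900GTX draw, 4 h/day of
# gaming, and $0.10/kWh rate are assumptions, not measured figures.
def annual_cost(extra_watts: float, hours_per_day: float,
                usd_per_kwh: float = 0.10) -> float:
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

# ~75 W gap at 4 h/day comes to roughly $11/year under these assumptions.
print(round(annual_cost(160 - 85, 4), 2))
```

In other words, under these assumptions the "power hog" card costs on the order of a dollar a month extra, so the heat and noise arguments carry more weight than the electricity bill.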