Klinky1984
Is "Nemesis 1" really getting AMD cards? If that's the case there should be a sig disclaimer like nRollo. Frankly "Nemesis 1" is coming off quite annoying here.
Originally posted by: taltamir
if you are referring to nRollo... then you should know AMD is giving cards to Nemesis 1...
so that lovely back-and-forth was between a guy getting free cards from nVidia and a guy getting free cards from AMD. Both of them BECAUSE of their posts on forums...
Originally posted by: nRollo
Originally posted by: BolleY2K
NVidia is also struggling in the chipset market at the moment - which doesn't help either. Right now CrossFire is much more attractive than SLI in my opinion.
I don't know how the chipset market or SLi vs CF relates to the GTX280, but I'd be happy to discuss either with you if you would like to open threads in regard to these topics in the appropriate forums.
Originally posted by: Aberforth
On the other hand, no one has managed to decrypt my signature yet; it contains hidden marketing material for all.
Just because no one has come out and said it doesn't mean no one has done so. Your signature says "Space is a dangerous place if it's between your ears".
Originally posted by: taltamir
didn't you tell us how in one of the other forums...
Anyways... I am waiting for the power consumption figures for the new AMD and nVidia cards. I have perfected my watt-to-dollar calculation methodology and I know EXACTLY how much more it will cost me a year... and they WILL go into calculating the IQ of the purchase.
Originally posted by: JujuFish
Originally posted by: Aberforth
On the other hand, no one has managed to decrypt my signature yet; it contains hidden marketing material for all.
Just because no one has come out and said it doesn't mean no one has done so. Your signature says "Space is a dangerous place if it's between your ears".
Originally posted by: nRollo
25X16 @ 30" WS is a bigger improvement than anything else you can do for your computer in terms of immersion. (and I've done everything else short of shining it on my rec room wall with a projector) These cards are very nice for 25X16.
Originally posted by: bryanW1995
Originally posted by: nRollo
Originally posted by: BolleY2K
NVidia is also struggling in the chipset market at the moment - which doesn't help either. Right now CrossFire is much more attractive than SLI in my opinion.
I don't know how the chipset market or SLi vs CF relates to the GTX280, but I'd be happy to discuss either with you if you would like to open threads in regard to these topics in the appropriate forums.
translation: I'm very sad that gtx 280 isn't looking to be quite as nice as we had hoped. please just be nice to nvidia!
Originally posted by: Aberforth
Originally posted by: taltamir
didn't you tell us how in one of the other forums...
Anyways... I am waiting for the power consumption figures for the new AMD and nVidia cards. I have perfected my watt-to-dollar calculation methodology and I know EXACTLY how much more it will cost me a year... and they WILL go into calculating the IQ of the purchase.
This new one is based on the Radix-64 algorithm; look for a Radix-64 decoder online.
Also, as long as performance is high, nobody gives a damn about power. People usually talk about power only when performance is pathetic and power draw is high. Let Al Gore ban these cards; someone's gotta send him a mail.
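For anyone curious about the Radix-64 reference: Radix-64 uses the same 64-character alphabet as Base64 (it is the ASCII-armor encoding PGP uses, per RFC 4880), so an ordinary Base64 decoder handles the data portion. A minimal Python sketch; the encoded string here is a made-up stand-in, not Aberforth's actual signature:

```python
import base64

# Radix-64 shares Base64's 64-character alphabet (full PGP armor adds a
# CRC-24 checksum line, but the data itself is plain Base64), so Python's
# standard base64 module decodes it. The string below is a made-up
# example, not the actual forum signature.
encoded = "QnV5IG1vcmUgR1BVcyE="
print(base64.b64decode(encoded).decode("ascii"))  # prints: Buy more GPUs!
```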
Originally posted by: taltamir
Here in Texas the CHEAPEST electricity is 14 cents per kWh (thanks to nationalpowerco going out of business due to their 11-cent/kWh contracts, and everyone else raising their prices).
Some places are as cheap as 6 cents per kWh... and I know of some places in the US where it costs 24 cents per kWh.
I game for 2 hours a day (no savings there), use the computer for other things for 10 hours a day (this is where PowerPlay/HybridPower reduces power), and leave it off for the remaining 12 or so (zero power drawn, but not thanks to PowerPlay/HybridPower).
An idling G92 single-GPU card takes about 60-70 watts... an idle, underclocked (SpeedStep-like tech) 38xx single-GPU card takes about 40-50 watts, BECAUSE AMD already implemented a SpeedStep-like function that underclocks the card (unlike the HD2xxx series... those can go OVER 100 watts while idle).
An IGP takes an additional 5 watts at most.
That's a 35 to 65 watt reduction for a current-gen card, higher for the G2xxx, probably higher for the 48xx series, and definitely higher for additional GPUs (for the extra cards you don't have to subtract the 5 watts the IGP takes; that only counts once).
(40-5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $17.885 a year FOR ME, and I turn them off when not in use.
(70-5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $33.215 a year FOR ME, and I turn them off when not in use.
Semi-worst-case scenario (no point in using the HD2xxx cards... they are obsolete):
Leave the computer on 24 hours a day (I used to do this), with 3- or 4-card SLI/CF taking 200+ watts when IDLE, again not accounting for AC costs.
(200-5) watts * 0.001 kW/watt * 22 hours/day * 365 days/year * $0.14/kWh = $219.219 a year... in electricity costs just for idling a serious mGPU setup, for someone who leaves it on 24/7.
But since I turn mine off, let's recalculate at only 10 hours a day:
(200-5) watts * 0.001 kW/watt * 10 hours/day * 365 days/year * $0.14/kWh = $99.645 a year FOR ME, and I turn them off when not in use.
Since this is measured power draw from the wall, some PSU inefficiency loss is already accounted for. Increased AC costs are not.
This is the REAL dollar amount I will personally save... a person with a multi-GPU setup will save a multiple of this... a person who leaves his computer on 24 hours a day will save more than twice this, or exactly twice if he plays for 4 hours a day EVERY DAY.
And I was being generous by saying 2 hours a day... I play 10 hours a day when a new game like Mass Effect comes out, and then I don't play anything for a month or so...
Note that HybridPower and the SpeedStep-like tech DO NOTHING while you are gaming, or while the computer is off (or in S3 sleep). They only help while you are using the computer for things other than games, which is the majority of the time for most people.
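Taltamir's arithmetic is easy to check. Here is a minimal Python sketch of his watt-to-dollar method, using only the wattages, hours, and $0.14/kWh rate quoted above; the function and variable names are just illustrative:

```python
# A sketch of taltamir's watt-to-dollar idle-cost arithmetic.
# All figures (wattages, hours, $0.14/kWh) come from his post above;
# the names are illustrative, not from any real tool.

IGP_WATTS = 5  # the onboard GPU that takes over under HybridPower draws ~5 W


def annual_idle_cost(card_idle_watts: float, hours_per_day: float,
                     rate_per_kwh: float = 0.14) -> float:
    """Dollars per year to keep a card idling, net of the ~5 W IGP draw."""
    net_kw = (card_idle_watts - IGP_WATTS) / 1000.0
    return net_kw * hours_per_day * 365 * rate_per_kwh


print(annual_idle_cost(40, 10))   # 38xx idle, 10 h/day   -> ~17.885
print(annual_idle_cost(70, 10))   # G92 idle, 10 h/day    -> ~33.215
print(annual_idle_cost(200, 22))  # mGPU idle, 24/7 user  -> ~219.219
print(annual_idle_cost(200, 10))  # mGPU idle, 10 h/day   -> ~99.645
```

The 22-hour case also bears out his "more than twice" claim: a 24/7 user who games 2 hours a day idles 22 hours, or 2.2x the 10-hour figure.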
Originally posted by: Sentry2
Hey Psynaut, what firmware version does your Westy have?
Originally posted by: taltamir
First he said he was getting free AMD cards thanks to his posts ON THIS FORUM. Then, when nRollo called him on it, he backtracked: he wasn't supposed to admit it; actually he invented a cooler block for a water cooler and is getting cards to do other improvement work on. He couldn't say that before because he had to keep it secret, so he lied and said he was getting them for his posts. But he couldn't let a claim against his reliability stand, so now he is admitting the truth and breaking his business secrecy to explain why he is really getting the cards.
Originally posted by: Aberforth
Originally posted by: JujuFish
Originally posted by: Aberforth
On the other hand, no one has managed to decrypt my signature yet; it contains hidden marketing material for all.
Just because no one has come out and said it doesn't mean no one has done so. Your signature says "Space is a dangerous place if it's between your ears".
Glad to hear it :thumbsup:
Originally posted by: Psynaut
Originally posted by: Sentry2
Hey Psynaut, what firmware version does your Westy have?
I don't know; I bought it used off Craigslist for $650, which is about what I had been planning to spend on a 24" until I stumbled onto the threads about the 37" Westy. I hooked it up and never gave it another thought. Maybe I should look into this.
Originally posted by: bryanW1995
I'm frankly very surprised at the general tone of this thread since benchmarks have started leaking. I fully expected the 280 to slaughter SLI 4850s and be very competitive with the 4870X2 and/or SLI 4870s, but even the nvidia fanboys seem resigned to this not being the case. If the 4870 is really this good and they really do sell it for $299, then we could be in for some interesting times in the next few months.
Originally posted by: HOOfan 1
Originally posted by: bryanW1995
I'm frankly very surprised at the general tone of this thread since benchmarks have started leaking. I fully expected the 280 to slaughter SLI 4850s and be very competitive with the 4870X2 and/or SLI 4870s, but even the nvidia fanboys seem resigned to this not being the case. If the 4870 is really this good and they really do sell it for $299, then we could be in for some interesting times in the next few months.
Everything is still FUD. I remember a few days before the launch of the 9800GX2, TweakTown or some other site posted early benchmarks that showed the card was slower than a 3870X2 and barely faster than a single 8800GT in many games... then the card came out a few days later and blew everyone away.
I like the idea of the ATI cards: a tiny bit better performance than the last-gen high end, for a much lower price. But I dislike ATI's idea that the only road to the future of top-end graphics is multi-GPU cards or multi-card systems. Until I see all the compatibility issues ironed out with multi-GPU/multi-card systems, I will always believe a single-card/single-GPU high-end system is the superior way to go.
X800 beat 6800
X1800XT beat 7800GTX 256MB
X1900XTX beat 7900GTX