Um, isn't the reason clear? He gets pissy when you question any results.

What does Kyle's behavior on his forum have to do with the reviews authored by Brent Justice?
I worked with Brent years ago on the long-dead g256.com. Great guy, writes good reviews, but I wish they tested more games.
Kinda like Eyefinity right?
I started a third account and he banned that one too.
When the 6850 and 6870 are both OC'd to their respective typical max overclocks, they are within 5% of each other in most cases.

The only negative I can think of with the 6950/6970 series is that the 6950 is almost too fast for its own good. There is so little separation between the cards that really one SKU probably would have been fine.
Anyone know how these are OCing? If both of these hit the same ceiling, I really can't think why anyone would pick up the 6970. Thoughts?
On a serious note now, Kyle must have serious issues. I personally like his site, but I hear way too many complaints about the guy.
Others agreed; someone mentioned that if you're spending $1000 on a 2560x1600 monitor, you won't be interested in anything less than a GTX 580 anyway, because what's the point of saving a few dollars when you're spending that much on the screen?
Exactly. Just because I bought an expensive monitor to get the most enjoyment out of my games doesn't mean I'm some kind of spendthrift who wastes money on overpriced graphics cards. I'm still running a 5850 (bought for $250) because none of these new cards offer a performance improvement that warrants their prices, even the 6950/6970, as much of a deal as they are for us 2560x1600 gamers.

I think you'd be surprised. Why spend $500 on a video card that gives you 5-10% more performance on your 30" screen than a $370 card? Or, why not spend $600 for 50-60% more performance than the $500 solution? Either way the $500 card doesn't seem like a very good deal.
I actually like the way they ([H]) perform their reviews. However, it would be nicer if they compared results at HQ. But I learned years ago from Kyle's bipolar mood swings about manufacturers that he is the emotional equivalent of a child. I just roll my eyes when he is swinging back and forth so dramatically. Anybody who goes to those extremes is a drama queen.
Well, at least Eyefinity can be used on just about any road (that is, any game) and is able to run with just one engine.
But for me personally, yes, I find Eyefinity to be about as useless and gimmicky as Physx. I have no desire to have bezels in the middle of my gaming scene. I have no desire to have my GPU create physics that do not appear any better than what we already have on the CPU and lower my frame rates for those same physics.
Hardware Canucks used the 10.12 driver.
http://www.hardwarecanucks.com/foru...899-amd-radeon-hd-6970-hd-6950-review-28.html
Yes, but it was RC2. Not that I have any idea how RC2 would perform against final.
Will be interesting to see what the 11.1 and especially the 11.2 drivers (enough time for end-of-year vacations not to have affected the release) bring...
Chuck
So you claim that:
A) ALL games work with Eyefinity
B) You need not reduce ANY image-quality settings when running Eyefinity?
If so, I would call you less than honest.
Then stick with your gimped CPU physics, running at 100% load but yielding performance that is hilariously sad compared to GPGPU physics.

Just know that AMD, Intel, and NVIDIA are all going to leave you standing at the station, looking at the rear end of the train... :sneaky:
I'm not picking on any one member in particular; I'm just quoting this post as an example when I say that I want you guys to be cognizant of the fact that Kyle is a fellow forum colleague here at AT's VC&G, albeit an infrequent one, and your dialogue regarding Kyle is not exactly kosher with the forum guidelines.
It is OK to name names when discussing article authors, since author credibility is a relevant factor in determining article credibility, but we seem to be straying from the spirit of acceptable personal assessment for the purpose of judging article credibility and are heading into something that could rather easily be interpreted as catty and petty.
I want us to avoid being catty and petty.
Please take this conversation out of public view and into PMs or to PFI; this level of character analysis really doesn't belong in a technical forum.
Moderator Idontcare
What am I seeing here?
http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/2
Look at the GeForce GTX 480 image. Just above and slightly to the right of the middle of the image are two black dots that are clearly out of place. Is this a rendering flaw in NVIDIA's drivers? A problem with the GTX 480 card used in the review? Normal behavior for NVIDIA cards? I don't see any mention of the artifacts in the text on that page, and I'm a little curious.
It's not about a good deal; it's about having the best. What dual-GPU setup is better than 2x580? Nothing. For some people, dropping $1000 on a dual-GPU system, or $1500 for tri-SLI, is about having the best. It's like getting an EE CPU: you pay for the best, and it's not going to be the best bang for the buck.