Define significant. I personally don't think any game should be excluded, but only if 10+ games are tested; using 5 games, 4 of them GameWorks titles, is not good review practice.
Have you seen the benchmarks on Project Cars? That thing is ridiculously favoring Nvidia's arch.
"Dirt Showdown seemed to be a lone game that gave results not similar to any other game on the market. That's an outlier."

I agree, but then you say this:

"There is no way to come up with a cut off point as there is too much variability."

So if Showdown is the outlier, there had to be a cut-off point, so what is it?
Well, to be fair, they need widespread adoption of HBM; that way it gets cheaper for them to manufacture, which in turn improves supply.
"I'm still looking for proof that the game was removed due to 'poor Nvidia performance'."

I am not nice like rs, hence super hard to troll as a target. I also don't waste time ^_^
Maybe you can explain how they can be biased against AMD and still include Mantle in their benchmarks.
"I wonder if anandtech had to negotiate to get 1920x1080 results in there."

Huh? No. There's no negotiating of any kind about what's in our GPU reviews. I publish what I want to publish. In this case I didn't feel like 4K results would be especially useful, so I made sure to throw in 1080p as well.
"it is ok. did anyone ever tell you that? it is ok"

Or in other words, you don't have a rebuttal. Being proven wrong is fine; just don't make it worse by trying to deflect.
"Did nvidia send you a reviewers guide on how to test the AMD Fury Nano, the same thing they did when Hawaii launched?"

I honestly can't tell if this is serious or in jest... but either way the answer is no.
Anyone benching minecraft or league of legends?!
Kyle said:
This is what I was thinking... "The clearest bunch of ******** I have read today, Roy. Thanks for that pathetic reply trying to make it not look like AMD is fully cherry-picking review sites for 'good' Nano reviews."

The reason I think AMD is NOT sampling all its usual review sites is that thinking like Roy Taylor's is infecting the company. Roy wants AMD to duck and cover and only sample the journalists they feel give them a sure chance of a "good" review.
Brent said:
All I want to say is, when the "reviews" get published, look closely at what is reported, what is covered, and the comparisons made. I think how the clock speed operates, and what the real-world clock speeds are, might be glossed over in some reviews, or even incorrect if not tested right.
Something we do, that I know not everyone does, is test cards after the card has "warmed up" while gaming. Typically 15+ minutes into gaming the GPU will have warmed up, and clock speeds will be different from testing the card right out of the gate in a short benchmark. A short benchmark will show higher results than running the card for thirty minutes and then playing a game; the clocks will be different, and the results will be different. We test cards after they have gone through this warm-up period, which takes extra time, but it is worth it because you get real-world results.
I think some points will not be made about how the Fury X (also a small card) will fit into the same cases as the Nano. Granted, you need space for the rad and fan, but most SFF cases, even ITX, have room for this. I have seen Fury X ITX builds on YouTube and around the Internet. It is possible, and I want to see it talked about. mATX cases can certainly house a Fury X.
Finally, the price being the same as the Fury X's, I have a feeling this will be downplayed, as might availability.

We shall see; it will be interesting whether these things are left out, downplayed, swept under the rug, or just not discussed in the Nano reviews to come. These are all important factors.
"Fair" reviews would keep these topics in-mind and discuss them in reviews.
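Brent's warm-up methodology above can be sketched as a small test harness. Everything here is hypothetical (the function names and the 15-minute default are illustrative, not anyone's actual tooling), and the clock readout assumes an NVIDIA card exposing `nvidia-smi`; an AMD card would need a different query utility:

```python
import subprocess
import time

def gpu_core_clock_mhz():
    """Read the current graphics clock via nvidia-smi (assumes an NVIDIA card)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

def benchmark_after_warmup(run_benchmark, read_clock, warmup_minutes=15):
    """Benchmark only after the GPU has been under load long enough to heat up.

    run_benchmark: callable that renders a sustained load and returns avg FPS.
    read_clock:    callable returning the current core clock in MHz
                   (e.g. gpu_core_clock_mhz above).
    """
    deadline = time.time() + warmup_minutes * 60
    while time.time() < deadline:      # keep the card loaded until it is hot
        run_benchmark()                # throwaway passes: these heat the GPU
    settled_clock = read_clock()       # the real-world clock, after any throttling
    fps = run_benchmark()              # the pass that actually gets reported
    return fps, settled_clock
```

The point of the sketch is the ordering: the reported pass runs only after sustained load, so the recorded clock is the settled, real-world one rather than the short-benchmark boost clock.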
I understand that they think they do the best reviews, as I'm sure most sites do. But not only did Kyle question the integrity of any site that does get a card, Brent basically said AMD is going to use sites that don't do thorough, informative reviews. And that's assuming he meant those sites won't cover such details out of ignorance or laziness, rather than outright collusion.
Why are you quoting and replying to my post to Russian? Did we just find somebody who has a second account?
"The quote from brent is spot on."

I don't see how that has anything to do with what I posted. One of us is totally misunderstanding the other.
It feels like AMD's new line-up is tripping over its own feet, and no one seems to be talking about this:
If the Fury X is a 4K card, then why does the 390X need 8 GB of memory? Currently, I do not believe that any single GPU (from AMD or Nvidia) will give a reasonable 4K experience, so a multi-GPU setup is preferable. In that case it would be crazy to choose CrossFire Fury X over 390X: GPU memory is not pooled under CrossFire, so a pair of Fury X cards still has only 4 GB of effective VRAM against the 390X pair's 8 GB. Yet all the review sites push the whole Fury line-up as 4K cards.
Echoing Brent's comments: (imo) at 4K it seems that the whole Fury line-up is surplus to requirements. :\
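The CrossFire memory argument comes down to simple arithmetic. A toy sketch (the function name and GB figures are illustrative), contrasting the mirrored AFR model of DX11-era CrossFire/SLI with the pooled model DX12 explicit multi-adapter is supposed to enable:

```python
def effective_vram_gb(per_card_gb, num_cards, pooled=False):
    """Usable VRAM in a multi-GPU setup.

    Under DX11-era AFR (CrossFire/SLI) every card mirrors the same
    resources, so the effective pool is one card's worth no matter
    how many cards you add. Only with explicit pooling (the DX12
    multi-adapter promise) do the cards' memories sum.
    """
    return per_card_gb * num_cards if pooled else per_card_gb

print(effective_vram_gb(4, 2))               # dual Fury X under CrossFire: 4
print(effective_vram_gb(8, 2))               # dual 390X under CrossFire: 8
print(effective_vram_gb(4, 2, pooled=True))  # dual Fury X if DX12 pools: 8
```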
I don't see how that has anything to do with what I posted. One of us is totally misunderstanding the other.

On what you've said, though, I haven't seen a single review, on H or anywhere else, that backs up what you just posted.

The 390s have 8 GB of RAM purely to differentiate them and add perceived value over the 290s.
Sorry, my fault; I should have quoted the Brent quote. I was just being lazy.
Just as Brent suggests the Fury X is stepping on the toes of the Nano, I was suggesting that the 390X is stepping on the toes of the Fury X as a 4K solution. (I understand the motivation for sticking 8 GB of memory into the 390s, but it has the knock-on effect of making them a rather good choice for 4K when run in a CrossFire setup.)
Whether you think the Fury line-up (or the Maxwell line-up) is suitable as a single-GPU 4K solution is purely subjective. From perusing the review sites, I do not find any results from AMD or Nvidia particularly compelling at this resolution (for a single GPU!).
Sorry for the confusion!
The 290X had 8 GB of RAM, though; released November 2014.
It's cool.
No, I don't think a single GPU is ideal for 4K, unless you don't mind no AA or just FXAA (I don't see the point of all those pixels just to blur them together) and maybe high instead of ultra shadows.
Dual Fury typically works as well as or better than a 980 Ti at 4K, even with less VRAM (which hasn't shown itself to be a problem). Then there's always the promise of DX12, with memory pooling and other dual-GPU goodness, assuming the cards are capable enough in DX12 to take advantage of it. I'm in the camp that thinks GCN will prove superior to Maxwell in DX12. YMMV.
Pooled memory :wub:
I just can't get my head around the Fury X being a 4K card, even in multi-GPU setups, due to the memory limitation. Banking on a fairly quick and ubiquitous uptake of DX12 and, critically, memory pooling is just too large a gamble to justify opting for Fury cards.
Back on topic: as innovative as the Fury line-up is, it is also very confusing.
I commend Ryan for including 1080p results. They should have been included in the Fury X reviews too, yet for the majority of sites this was not the case. :hmm: