Malogeek
Golden Member
I bet AMD will like the results of Kyle's subjective test. Especially since the difference in GPU+Monitor is actually $500 instead of $300 due to the 1080ti.
For those who didn't watch:
The test was done on ultrawide 1440p 100 Hz monitors.
- 1 preferred the 1080 Ti - G-Sync system
- 6 rated them equal
- 3 preferred the RX Vega - FreeSync system
That gaming test was done using...Doom, which is optimized to all hell; you could probably swap an overclocked 1070 in there and see no statistically significant difference in results.
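On the "statistically significant" point, here's a minimal sketch of how little a 1-vs-6-vs-3 split can tell us, assuming we treat the six "equal" votes as uninformative and run a plain two-sided sign test on the four decisive votes (the choice of test is mine, not anything from the article):

```python
from math import comb

# Blind test outcome quoted above: 1 preferred the 1080 Ti system,
# 6 rated them equal, 3 preferred the RX Vega system.
prefer_nvidia, ties, prefer_amd = 1, 6, 3

# Sign test: drop the ties and ask how likely a split at least this lopsided
# would be if each decisive voter were just flipping a fair coin.
n = prefer_nvidia + prefer_amd          # decisive votes only
k = max(prefer_nvidia, prefer_amd)      # the larger side of the split

# Two-sided p-value: probability of a split at least as extreme as k-vs-(n-k).
p_two_sided = min(sum(comb(n, i) for i in range(k, n + 1)) * 2 / 2 ** n, 1.0)

print(f"{k}-vs-{n - k} split, two-sided p = {p_two_sided:.3f}")
# => 3-vs-1 split, two-sided p = 0.625
```

With a sample this small, even a unanimous 4-0 split among the decisive voters would only reach p = 0.125, so a test like this can't really separate the two setups either way.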
Almost. JHH said they spent $3 billion developing Volta.
If you read the thread there, AMD asked him (Kyle) to use a 1080; he chose the 1080 Ti.
I believe that this was a test that AMD set the rules for, so you'd be a fool to believe it had any chance of coming out in a way that doesn't make them look good.
Well of course. It's all marketing with basically no actual empirical data, so no one in their right mind would base a buying decision on this kind of stunt.
Frankly, this is the exact same kind of crap Intel is doing with the "glue" comments on Epyc & Threadripper. This is beneath AMD.
I am a little more surprised that HardOCP repeated this kind of silly test that AMD was doing on its tour.
They could have used a 960, and had the same visual results.
Heck, just get a 1050, put it up against the RX Vega in the exact same setup, and then Nvidia could say: look, our GPU + monitor costs $500 less than the RX Vega + monitor, and you can't tell the difference.
This is yet more smoke & mirrors.
AMD asked him to use a GTX 1080 and he instead used a GTX 1080 Ti, and people are still moaning at him?
.
Because he could just as well have used a GTX 1070 and had the same results. All this kind of testing proves is that people are poor at determining frame rate once it gets over a reasonable threshold. Especially if you have some kind of variable sync on top.
Imagine if he had used a GTX 1080 and all the gamers said the Nvidia system was faster - would any of you be arguing about the result? Not at all.
Doom is an AMD Gaming Evolved title, so at this point do you really think that AMD wouldn't try to get some of its titles running a bit better before launch?
You aren't understanding what I was saying. With both FreeSync & G-Sync, you can use a worse card, and the tech compensates for that, allowing for a better visual experience.
...
Not even a GTX 1080 can break a 100 FPS average in parts of Doom, and that is from a recent TechPowerUp review.
As a G-Sync user, I think I'm qualified to say that even with G-Sync turned on, it is very obvious to see the difference between 50 fps and 100 fps. The difference between 90 fps and 110 fps, however, is a lot harder to see.
So it doesn't matter what card they used against the RX Vega; the results would have been the same.
For example, even if the RX Vega was at 100 fps and the 1050 was at 50 fps, both would still appear the same because G-Sync is kicking in.
If you want to make a point, don't use 50 fps in the first place; it's spreading false information.
The point is that there is some delta where the "experience" is the same. What that is depends on the person and the game in question.
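To put that delta in frame-time terms, here's a quick back-of-the-envelope sketch (assuming perfectly steady frame rates, and setting aside what FreeSync/G-Sync do about tearing and stutter): the per-frame gap between 50 and 100 fps is about 10 ms, while between 90 and 110 fps it's only about 2 ms, which is why the second gap is so much harder to notice.

```python
def frame_time_ms(fps: float) -> float:
    """Average time per frame in milliseconds at a steady frame rate."""
    return 1000.0 / fps

for low, high in [(50, 100), (90, 110)]:
    delta = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} fps vs {high} fps: frame times {frame_time_ms(low):.1f} ms "
          f"vs {frame_time_ms(high):.1f} ms, a {delta:.1f} ms gap per frame")

# 50 fps vs 100 fps: frame times 20.0 ms vs 10.0 ms, a 10.0 ms gap per frame
# 90 fps vs 110 fps: frame times 11.1 ms vs 9.1 ms, a 2.0 ms gap per frame
```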
It's not like NVidia is just in the graphics market, which itself isn't quite as simple as just gaming cards, but never mind that. They've also spent a lot of money building SoCs over the years and probably dumped a lot into building their own custom ARM core (project Denver) instead of just licensing the stock ARM designs like they used to.
Agree with the statements above: this was a dodgy 'test', the kind that shouldn't be featured on any 'serious' site, let alone encouraged by RTG themselves.
- Only one game.
- A game favoring the outcome AMD was banking on (and if the reasons are not obvious to you, my condolences).
- Limited in both resolution and monitor refresh rate, further favoring said same outcome.
- Addressing whom? "Gamerz"... and more to the point, addressing them in the most elusive, unfounded manner possible: the empirical, i.e. perception.
It was bad enough touting 'world tours' of this; now they have proceeded to convincing tech sites to go along with it? Can you grasp the kind of damage this can cause if we somehow stopped basing our arguments, or the conclusions drawn from them, on numbers and started relying solely on the empirical?
I do not want to see this from AMD; they are (were?) a different kind of company, with a different kind of approach. This is very disappointing for me to behold. Very. I can understand the difficulties, but they are not an excuse.
I find it amusing that lots of folks (including here) were saying how their Ryzen system felt much smoother but couldn't explain it with actual numbers. Guess that's just not a valid thing with a GPU.
Not defending the marketing, just pointing out the double standard.