monstercameron
Diamond Member
But that will be because of HBM, not the GPU itself right?
No, the card is efficient. Why does it matter whether that's due to HBM or something else?
But that will be because of HBM, not the GPU itself right?
You cannot fit an air cooler on such a small PCB and still dissipate a 275W TDP, unless it were twice as thick as a triple-slot cooler.
But that will be because of HBM, not the GPU itself right?
Probably there's a fancy new controller driving this; that could be the reason why there's no overvolting support yet. It turns out that PowerTune takes big strides toward improved resolution and gradation! In spite of us running up against the limits of our logging technology, yielding curves that no longer look as nice despite high-cut filtering, the bottom line is undeniable: PowerTune is now able to react to parameter changes in intervals of 10 microseconds or less. The following chart shows what happens over a time period of just 100 microseconds.
AMD’s engineers deserve some praise; these results look like they come from a card based on Nvidia's Maxwell architecture. Now we want to know how our observations are reflected in the individual load scenarios. Theoretically speaking, the power consumption at idle should be markedly lower, whereas the stress test might, unfortunately, trigger a massive increase. That is unless AMD set a conservative limit just like Nvidia did, since nothing can be predicted, and thus saved, under continuous full load. We’ll answer all of these questions in detail below.
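For a sense of why a 10-microsecond control interval matters, here is a toy Python sketch (this is not AMD's actual algorithm, and every number in it is invented for illustration) comparing how much of a brief power spike slips past a limiter that can only react once every 10 µs versus once every 100 µs.

```python
def simulate(trace, interval_us, limit_w):
    """Toy power limiter: it samples the trace once per `interval_us`
    microseconds and holds its decision (clamp or pass-through) until
    the next sample. Returns the trace as the limiter lets it through."""
    out, clamp = [], False
    for t, p in enumerate(trace):
        if t % interval_us == 0:   # controller wakes up on this tick
            clamp = p > limit_w
        out.append(min(p, limit_w) if clamp else p)
    return out

# 100 us trace at 200 W with a 30 us spike to 350 W starting at t=25 us
trace = [200.0] * 100
for t in range(25, 55):
    trace[t] = 350.0

fast = simulate(trace, 10, 275)    # 10 us PowerTune-style interval
slow = simulate(trace, 100, 275)   # coarse legacy-style interval

# energy above the 275 W cap that each limiter failed to clip (W*us)
excess_fast = sum(max(p - 275, 0) for p in fast)
excess_slow = sum(max(p - 275, 0) for p in slow)
print(f"excess above cap: fast={excess_fast:.0f} W*us, slow={excess_slow:.0f} W*us")
```

The fast controller catches the spike within one 10 µs interval; the slow one never samples during the spike at all, so the entire excursion slips through.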
I read the toms review, and I'm confused as to why the gaming power usage is so much lower than the compute one. It implies that all the games are poorly optimized or cpu limited.
But it struck me that an interesting thing I think you could do with that radiator is put strips of beef or kale in front of the exhaust to create jerky or kale chips.
THG's power consumption measurements are the most interesting assessments since FCAT's introduction, and have been for the last year. The Gigabyte 980Ti gets close to or over 300W while gaming, for example. Toms shows the Fury X at under 230W IIRC during gaming, lower than a stock 980Ti. I don't see why it HAS to be much worse while overclocked. Really depends on the voltage needed.
According to Tomshardware this card is more efficient than or about as efficient as a 980Ti
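As a back-of-the-envelope illustration of what "about as efficient or more efficient" means, here is a quick perf-per-watt calculation using the power figures quoted in the thread (~230 W gaming for Fury X, ~300 W for the Gigabyte 980Ti). The fps value is a hypothetical placeholder, assuming the roughly equal 4K performance that several posters describe.

```python
def perf_per_watt(fps, watts):
    """Efficiency as average frames per second divided by power draw."""
    return fps / watts

# hypothetical equal 60 fps average; power numbers quoted in the thread
fury_x   = perf_per_watt(60.0, 230.0)
gtx980ti = perf_per_watt(60.0, 300.0)

print(f"Fury X: {fury_x:.3f} fps/W")
print(f"980 Ti: {gtx980ti:.3f} fps/W")
# at equal fps, the 230 W card is ~30% more efficient (300/230 ~ 1.30)
```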
It was nice when Ryan was just the GPU guy. He now has far more responsibility as the EIC, and the timeliness of GPU launch reviews has suffered since Anand's departure.
The AT review is the one I look forward to reading the most and it's the only one I don't get to read today.
That [H] review is pure shill.
Witcher 3 with HairWorks, check.
FC4 with Enhanced Godrays, check.
Dying Light, check. (ok fine, but when their list of games is tiny, including a few NV biased titles determines the conclusion)
AMD would actually need Fury X to be 50% faster to compete when it's crippled so badly.
Then his GTA V result is out of line with almost every other site's, as are the BF4 results.
Compared to other sites that use a lot of games, Fury X matches the 980Ti at 1440p/4K. It takes a custom 980Ti to beat it (and those custom 980Tis beat Titan X too).
Also, DX12 is gonna be fun times for Fury.
Seriously, what is going on with AT nowadays? Oh, he is sick; sorry, I missed that. Get well soon, Ryan.
For those crying foul about testing only a few games, however popular they are... including RS, who USED TO say that testing the heaviest hitters was more relevant (since the difference between 40 and 30 fps is way bigger than the difference between 120 and 100 fps)...
http://www.hardwarecanucks.com/foru...682-amd-r9-fury-x-review-fiji-arrives-22.html
Fury X loses to 980 Ti at 1440p but can match the 980Ti at 4K (only 2% slower).
However OC vs OC I get the feeling Fury X gets spanked, and neither of them can really handle 4K with the heaviest hitters anyway. I'm still waiting for 14/16nm GPUs.
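The parenthetical claim above about 40 vs 30 fps holds up if you convert framerates to frame times; a quick sketch:

```python
def frame_time_ms(fps):
    """Milliseconds spent rendering each frame at a given framerate."""
    return 1000.0 / fps

# gap at the low end (30 -> 40 fps) vs the high end (100 -> 120 fps)
low_gap  = frame_time_ms(30) - frame_time_ms(40)     # ~8.33 ms saved per frame
high_gap = frame_time_ms(100) - frame_time_ms(120)   # ~1.67 ms saved per frame

print(f"30 -> 40 fps saves  {low_gap:.2f} ms per frame")
print(f"100 -> 120 fps saves {high_gap:.2f} ms per frame")
```

In frame-time terms the 30-to-40 fps jump is about five times larger than the 100-to-120 fps jump, which is the usual argument for benchmarking the heaviest games.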
These are new and popular games people play. I'm sorry you have an issue with using games people are actually playing today.
Perhaps you should speak to the game developer for their use and choice of 3D features in their own games?
We will continue to use new game releases, popular game releases, and games people are actually playing on the PC. We are open minded and do not cherry pick our games based on who has sponsored a game. That thought never enters our mind in the decision to use a game. I don't care what 3d features are in it, as long as there are 3d features that push gaming forward on the PC.
Here you go... this will help ease the pain of having no review to read; look at the bar charts instead.
http://www.anandtech.com/bench/product/1442?vs=1513 Fury vs 980
http://www.anandtech.com/bench/product/1496?vs=1513 Fury vs 980 Ti
"Don't worry guys, firmware updates, new bioses and a windows update will make Bulldozer perform like it should!"
Getting the exact same vibes from this current future-proofing DX12 discussion. People do realize that this card comes with a cutting-it-close 4GB frame buffer and is the last hurrah of 28nm, right?
Fury X is pretty much the opposite of future proof. It's a stopgap card with limited memory. DX12 or not.
Your subjective playable settings are also pretty crap. I see the intention, but they can't be used for benchmarking and should not take up the most space.
Yes, we have used each of those games. Naturally, we update our gaming suite over time. Most games get at least a year on our gaming suite, sometimes even much longer. TR stayed on there for a very very long time, it was time to move it out, and new games in.
It is quite impossible for one man to play 20 games for each review. We can get through 5 most of the time, 6 if we have a little extra time. Keep in mind we actually play the game and use real-world performance. It takes many hours to find the highest playable settings in each game, record Fraps data, and also do the apples-to-apples data which our readers demand, so we give them what they want. Each game is examined very thoroughly. We go for a quality-over-quantity approach. If you don't find use in quality, feel free to hit the X.
Awesome, thanks a lot! Only missing his conclusion at the moment :biggrin:
Fury sets AMD on the road to recovery; virtually all of its performance and power targets relative to its predecessor have been met.
Unfortunately for them, it isn't quite enough to take the fastest single-GPU crown in '15.
Fury cards are in very short supply as well, so even if people wanted to give AMD their money, they will be hard pressed to find a reseller selling one at this time.
Now, does any AT reader know of a cure to GPUitus? I seem to have come down with a bad case of it...
Give him our regards. I am looking forward to reading his easy-to-understand yet solid explanation of the new tech.
I can give you one...