They are also posting 8K Bioshock SSs on their Radeon twitter page....
https://twitter.com/amdradeon
8k doable with Fury? :O
That screenshot looks quite detailed to me. Any details on that?
Rumour wars! Wccftech just countered videocardz's claim of 30k Fury units; they report their "sources" say that claim is baseless and there will be plenty to go around.
For those that don't care one iota about power consumption, it will still come down to best performance after both cards are OC'd. Prices are similar and performance is pretty much a tie, but an OC'd Fiji consumes 200 more watts than an OC'd 980 TI........ well........ :/
That was before the 980Ti dropped to $650 and before after-market 980Ti cards with 1.3Ghz boost clocks started showing up at $690. The Zotac AMP! Extreme will have a 1355mhz Boost.
AMD is in a LOT of trouble. Even the 'mediocre' factory pre-overclocked EVGA 980Ti is wiping the floor with a 290X.
55% faster
60% faster
65% faster
36% faster
58% faster
57% faster
But here is the killer part - just 5W higher power usage than a similar rig with a stock 290X.
Source
Now you've seen me call the EVGA 980Ti after-market card mediocre and there is a reason for it. Firstly, it has a much lower factory pre-overclock than the better cards from Zotac, MSI, Asus and Gigabyte, and surely Galax too. Secondly, EVGA continues to make the most sub-par NV products among the top AIBs (MSI/Gigabyte/Asus/Galax/Zotac/Inno3D), with a track record of skimping on quality parts and keeping things reference, not to mention they don't do GPU binning like Gigabyte does, which basically means their only good cards are the Classified editions. The end result is a mediocre 1318mhz Boost overclock. But even at this crappy overclock, a 980Ti starts to obliterate the 290X by 70%+. Just imagine what a solid Gigabyte, Asus or MSI card will do at 1.45-1.5Ghz clocks. Ouch.
72%
71%
I hate to be the downer but my optimism is waning by the day. :| AMD almost needs a card 55-60% faster at 4K against the 290X to just keep up with an after-market 980Ti and based on guesstimates that would mean 1100mhz clocks with 4096 shaders. But then even if it does it, it still has 4GB of VRAM not 6 and it's likely going to use more power than a 980Ti OC. When things are this close, 50% extra VRAM and lower power usage sounds like a winning combination.
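For what it's worth, the 55-60% guesstimate above roughly checks out on raw shader math. A quick back-of-the-envelope sketch (pure arithmetic, not a benchmark — real games won't scale anywhere near linearly, and the 4096-shader/1100mhz Fiji figures are the thread's rumoured numbers, not confirmed specs):

```python
def relative_throughput(shaders, clock_mhz, base_shaders, base_clock_mhz):
    """Ratio of raw ALU throughput (shader count x clock) vs a baseline card."""
    return (shaders * clock_mhz) / (base_shaders * base_clock_mhz)

# 290X (Hawaii): 2816 shaders @ 1000MHz reference clock
# Rumoured Fiji: 4096 shaders @ 1100MHz (guesstimate from the thread)
gain = relative_throughput(4096, 1100, 2816, 1000)
print(f"{(gain - 1) * 100:.0f}% more raw throughput")  # 60% more raw throughput
```

So even a perfect linear-scaling card at those specs would only just clear the bar the after-market 980Ti cards are setting.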
I wouldn't do it at all for that rez. Grab a 970 and overclock it to 1.5Ghz (or a 290X for $270). Great 1080P stop-gap until Pascal. Everyone is different, but if you are gaming on a tiny 22-23" 1080P monitor, I'd strongly consider a lower-end card and, say, a monitor upgrade. Not necessarily this model, but just giving you an idea that today you can buy a $300 27" 2560x1440 IPS model. I personally would much rather get a larger-sized, higher-PPI monitor than buy a $690 USD 980Ti Gigabyte G1 and game on a small-sized 1080P monitor with VSR.
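The PPI difference between those two monitor options is easy to sanity-check (using the standard diagonal-resolution formula; the 23"/27" sizes are just the examples from the post above):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'23" 1920x1080: {ppi(1920, 1080, 23):.0f} PPI')  # ~96 PPI
print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
```

So the 27" 1440P panel is both bigger and noticeably sharper than the small 1080P one.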
They are also posting 8K Bioshock SSs on their Radeon twitter page....
https://twitter.com/amdradeon
8k doable with Fury? :O
No details beyond what they've said and linked.
Could be running at 2fps for all we know. But then, that would be kind of counterproductive, I think.
Something the team has been working on.... 8k gaming? Maybe Fury CrossFire or Tri-Fire?
Aaaannnddd.... 8K battlefield 4
http://download.amd.com/images/bf4%202015-05-13%2016-01-06-85.bmp?hootPostID=753a594e29997a5dc921fffdfc95135e
I'm thinking super demo rig with quadfire Fury WCE's.....running @ 8k VSR. That is really fun to think about.
There are typos and grammatical errors on that box text. I wonder if that means it's a fake, or just poor quality control by XFX?
To the right of the "AMD GCN Architecture" badge:
"If Mantles [SIC] is the language then Graphics Core Next (GCN) is the translator connects [SIC] software to hardware and fully take [SIC] advantage of performance potential in all and only AMD GPUs."
Even though "all and only" is technically correct, that's just a weird phrase.
that's exactly why we should be angrier -- the masses don't know better, and they abused that.
also, based on NVidia's prior behavior in this matter, we have no reason to think that code will stay in the driver implementation. For all we know, in a year's time they'll explicitly move things into that VRAM for certain games just to slow the card down.
I'm thinking this is a single-FuryX setup running 8K... :O
unlikely
i mean i'd *love* to be wrong, but 8k would be asking too much from a game as recent as that
Bigger question is - how much VRAM is that using? And is this a hint at 8GB HBM?
Or a hint 4 GB of HBM is enough.
funny if it was to show that 4GB of HBM is good enough even for 8k