[Rumor (Various)] AMD R7/9 3xx / Fiji / Fury


shady28

Platinum Member
Apr 11, 2004
2,520
397
126
I think that's a bit of an unfair assessment. We all know 4k is the next big thing. The industry is moving towards 4k. For people in the market for a new monitor/TV, 4k has to be on their radar. I ask you this. Wouldn't those potential buyers want to know if there is enough supporting hardware/software for 4k to make sense to them? Having reviews on 4k gaming is useful. Even if you don't own a 4k monitor, you might want to know if the GPUs out there are strong enough to push those pixels before you take the 4k plunge.

4K adoption isn't anywhere even close to the 720p->1080p pace. There is no broadcast 4K; there's not even a lot of broadcast 1080p. Most 4K content is upconverted 1080p Blu-ray.

For these very pragmatic reasons, it's not a big hit. A bit over 12,000 4k TVs were shipped in 2014 out of 6.4 Million sets. It's supposed to double this year, and again in 2016. At that rate, it will be 2018 before it's even in the 1% category.

And there is no single-GPU card that can run any top tier game decently at 4K. And by decently, I mean without dropping into the low teens on minimum frame rates (most drop into single digits). Average frame rates at 4K are very deceptive. All of this is why HDMI 2.0 doesn't matter except in a marketing sense.
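To make that average-vs-minimum point concrete, here's a tiny sketch with made-up frame times (hypothetical numbers, not from any benchmark) showing how a decent-looking average can hide dips into single digits:

Code:
# Hypothetical frame times in milliseconds for a rough patch of 4K gameplay.
frame_times_ms = [22, 23, 21, 24, 22, 95, 23, 22, 110, 24]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)

print(f"Average FPS: {avg_fps:.1f}")  # ~25.9 - looks borderline playable on a chart
print(f"Minimum FPS: {min_fps:.1f}")  # ~9.1  - the stutter you actually feel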

The reviews and focus on 4K and HDMI 2.0 are sorta like including a tractor pull in a Honda Accord vs Ford Taurus review. Does it really matter which one is better at pulling that trailer? Neither will do it worth a crap.
 

MrTeal

Diamond Member
Dec 7, 2003
3,587
1,748
136
I see. Makes sense.

Also, AMDMatt is double-checking:

https://community.amd.com/thread/183135

Hopefully he gets back with us today.

I don't think you'll get an answer you like.
amdmatt@Jun18-2:33am said:
Double checking on this guys, will let you know.
amdmatt@Jun19-1:03pm said:
The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.



In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.
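For what it's worth, a rough back-of-the-envelope bandwidth check (the effective link rates below are my own assumed figures, not anything from AMD's statement) shows why 4K@60Hz needs HDMI 2.0 or DP 1.2a rather than the Fury's native HDMI 1.4 port:

Code:
# Uncompressed 4K60 RGB 8-bit payload, ignoring blanking intervals.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 24
required_gbps = width * height * refresh_hz * bits_per_pixel / 1e9

# Approximate effective (post-overhead) data rates of the links being discussed.
links_gbps = {
    "HDMI 1.4": 8.16,            # Fury's native HDMI output: 4K caps at 30Hz
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2a": 17.28,   # hence the active DP-to-HDMI 2.0 adapter route
}

print(f"4K60 needs roughly {required_gbps:.1f} Gbps of pixel data")
for name, rate in links_gbps.items():
    print(f"{name}: {rate:.2f} Gbps -> {'enough' if rate >= required_gbps else 'too slow'}")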
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
I don't think you'll get an answer you like.
How expensive do you think these adapters are? Surely those that want to play on 4K TVs should have no issue getting one. At least they will be available, which is better than not at all, I think.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
4K adoption isn't anywhere even close to the 720p->1080p pace. There is no broadcast 4K; there's not even a lot of broadcast 1080p. Most 4K content is upconverted 1080p Blu-ray.

For these very pragmatic reasons, it's not a big hit. A bit over 12,000 4k TVs were shipped in 2014 out of 6.4 Million sets. It's supposed to double this year, and again in 2016. At that rate, it will be 2018 before it's even in the 1% category.

And there is no single-GPU card that can run any top tier game decently at 4K. And by decently, I mean without dropping into the low teens on minimum frame rates (most drop into single digits). Average frame rates at 4K are very deceptive. All of this is why HDMI 2.0 doesn't matter except in a marketing sense.

The reviews and focus on 4K and HDMI 2.0 are sorta like including a tractor pull in a Honda Accord vs Ford Taurus review. Does it really matter which one is better at pulling that trailer? Neither will do it worth a crap.

I think you've just proved my point. I didn't say people were buying 4K in droves. I said it would be on their radar. How would you know GPUs aren't able to drive 4K monitors? Reviews, right? It's just information we, the readers, want to know so we can make an informed decision on our purchase. That's the reason why reviewers should include 4K gaming.

I'm not arguing that 4K is popular. Clearly, it isn't. You're right. To most people, 4K doesn't matter, yet. I'm just saying it should be part of reviews. Why shouldn't it be?
 

007ELmO

Platinum Member
Dec 29, 2005
2,051
36
101
Only the early adopters push bleeding edge technology, so reviews should continue to push adoption. If we, the readers of AnandTech, don't play with 4K, who does? I am all for 4K movies and gaming via projectors.
 

007ELmO

Platinum Member
Dec 29, 2005
2,051
36
101
It's 5 more days? For someone like me, the GTX 980 Ti has HDMI 2.0, which I desperately want, but I don't get why you wouldn't wait 5 days and pick the better option if you're a DisplayPort user.

I think it's not that easy. The Gaming 980 Ti G1 was difficult to get, even with the nowinstock tracker. You're talking 15 minutes of availability at a time every 3-4 days. Kind of annoying, and I have no reason to believe the Fury X will be any different on availability.

I have the 980TI for a 1080p rig which is probably overkill, but I want 60FPS on every game there. I am definitely interested in what AMD can do in crossfire or dual GPU setup vs 980TI SLI to see what's closer to pushing 5760x1080p (3K) res.
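As a quick bit of arithmetic on how those resolutions compare in raw pixel load (just pixel counts, nothing GPU-specific):

Code:
# Pixels per frame for the resolutions mentioned above.
resolutions = {
    "1920x1080": 1920 * 1080,
    "5760x1080 (triple 1080p)": 5760 * 1080,
    "3840x2160 (4K UHD)": 3840 * 2160,
}

uhd = resolutions["3840x2160 (4K UHD)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / uhd:.0%} of 4K)")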
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
How expensive do you think these adapters are? Surely those that want to play on 4K TVs should have no issue getting one. At least they will be available, which is better than not at all, I think.

Nobody knows for sure, but from what I gathered it must be an active adapter because the Fury doesn't do HDMI 2.0 natively. For reference, a DP to DL DVI active adapter is $100+. Hopefully, because this adapter will probably be pretty popular, they'll price it at reasonable levels...
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
Nobody knows for sure, but from what I gathered it must be an active adapter because the Fury doesn't do HDMI 2.0 natively. For reference, a DP to DL DVI active adapter is $100+. Hopefully, because this adapter will probably be pretty popular, they'll price it at reasonable levels...
If I end up with the Fury X I'll need an HDMI-to-DVI adapter, so I'll look into how much those are. If they're going to be $100, I'd say it would have cost them less to just include one. But since I'm not gaming at 4K I guess it doesn't affect me, though I see why others are upset if they have to spend an additional $100.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
If I recall correctly, you have a 2560x1440 monitor with only a DVI-DL port? You may need an expensive adapter after all, to get full resolution...
 

looncraz

Senior member
Sep 12, 2011
722
1,651
136
With VSR and DSR we can downsample 4K on our 1080p displays. A lot of people are interested in that.
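(For anyone unfamiliar with what downsampling actually does, here's a minimal conceptual sketch; this is just a plain 2x2 box average, not AMD's or NVIDIA's actual filter:)

Code:
import numpy as np

# Stand-in for a frame rendered internally at 4K (height x width x RGB).
rendered_4k = np.random.rand(2160, 3840, 3)

# Average each 2x2 block of 4K pixels down to one 1080p pixel.
downsampled_1080p = rendered_4k.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))

print(rendered_4k.shape, "->", downsampled_1080p.shape)  # (2160, 3840, 3) -> (1080, 1920, 3)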

Yeah, I've seen 4K and 1080p side by side, both showing the same 4K content (obviously the 1080p was downscaled) and I could barely tell the difference on 65" TVs. In fact, as soon as we stepped back about 5 feet the difference was very minor... and I wasn't about to pay $2000 extra for so little.

I think, though, if you compare against a LOW quality 1080p panel with a coarse pixel pitch, the 4K will always look better... but so would any quality 1080p screen running high quality content.

I went into the store expecting to spend >$4k on a 4K TV and walked out with a quality $1k 1080p TV ($2k in the big name stores, I have connections ), then went out and had an expensive meal and had to find something else to do with my money... so I bought an R9 290. Then I bought a few SSDs. Then I bought a new case for the HTPC... and a shiny SSD for that. Then I ran out of things to buy with money left over :'(

"A fool and his money are soon parted."
-Some guy, who cares who it was

--

On a side note, AMD should still have supported HDMI 2.0 - AND DisplayPort 1.3...

Hopefully on the refresh...
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
If I recall correctly, you have a 2560x1440 monitor with only a DVI-DL port? You may need an expensive adapter after all, to get full resolution...
Nah, mine is only a 22 inch widescreen at 1920x1080. I haven't gone to 2560x1440 yet; maybe later if I can get a good deal when I upgrade the rest of my PC.
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
I'm betting as soon as DP 1.3 is ready for the market, AMD will have it. They've been pushing DisplayPort hard from the start, at least from the point where they were using it to facilitate Eyefinity.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Nobody knows for sure, but from what I gathered it must be an active adapter because the Fury doesn't do HDMI 2.0 natively. For reference, a DP to DL DVI active adapter is $100+. Hopefully, because this adapter will probably be pretty popular, they'll price it at reasonable levels...

What? Try $15:

http://www.monoprice.com/Product?c_id=104&cp_id=10428&cs_id=1042801&p_id=12784&seq=1&format=2

Connect your DisplayPort equipped PC to a DVI monitor using this DisplayPort 1.2a to DVI Active Adapter from Monoprice!

This adapter is designed to allow you to connect a DVI monitor to a DisplayPort video output. It conforms to the DisplayPort 1.2a standard and supports DVI Dual Link output to resolutions up to 2560x1600p @60Hz.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Sorry, I meant for the OC'd 2560x1440 monitors; they need an adapter that can provide enough bandwidth. A lot of the cheaper adapters will drop frames or be limited to 60Hz.

Nice find on that Monoprice adapter though, looks like a new product because it has no reviews. We'll see when people start trying them out if they achieve anything higher than 60Hz.
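Rough numbers on why the cheap DP-to-DL-DVI adapters tend to top out at 60Hz (the pixel clocks below are approximate reduced-blanking figures I'm assuming, not measured values):

Code:
# Approximate pixel clocks (MHz) for 2560x1440 with reduced-blanking timings.
modes_mhz = {
    "2560x1440 @ 60Hz": 242,    # fits comfortably within dual-link DVI
    "2560x1440 @ 96Hz": 390,    # already past the DL-DVI spec limit
    "2560x1440 @ 120Hz": 497,   # needs a serious (and pricier) adapter
}

DL_DVI_MAX_MHZ = 330  # dual-link DVI: two links at 165 MHz each

for mode, clock in modes_mhz.items():
    status = "within DL-DVI spec" if clock <= DL_DVI_MAX_MHZ else "exceeds DL-DVI spec"
    print(f"{mode}: ~{clock} MHz -> {status}")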
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
He means the good quality ones that are powered by a dedicated USB cable. These kinds usually have pretty poor performance and flake out a lot @ 25x16

You don't need a powered adapter for DL-DVI. I use one without issue. Unless you're using DisplayPort 1.1, at which point I believe you require a different adapter.

EDIT:
Sorry, I meant for the OC'd 120Hz monitors, they need an adapter that can provide enough bandwidth. A lot of the cheaper adapters will drop frames or be limited to 60Hz.

Nice find on that Monoprice adapter though, looks like a new product because it has no reviews. We'll see when people start trying them out if they achieve anything higher than 60Hz.


See, that's different. Chances are for that you'll need a powered adapter, either through USB or external A/C. Even then you can probably get one now for $60-70; still expensive, but not $100+.

EDIT #2: The Monoprice adapter might be new; the one I have is not a cable, just a straight adapter. I'd actually opt for a cable right about now. :/
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
I think it's not that easy. The Gaming 980TI G1 was difficult to get, even with nowinstock tracker. You're talking 15 minutes of availability at a time every 3-4 days. Kind of annoying, have no reason to believe Fury X will be any different on availability.

I have the 980TI for a 1080p rig which is probably overkill, but I want 60FPS on every game there. I am definitely interested in what AMD can do in crossfire or dual GPU setup vs 980TI SLI to see what's closer to pushing 5760x1080p (3K) res.
Meh, if I had the cash I'd probably have bought it too in your case. Doesn't hurt anything, and I really like the Gigabyte G1.

I guess I'll judge it soon; my expectation is that Fury X CrossFire will be the best way to game at 4K, though. I may just give up on 4K altogether, use VSR, and only get one card. It's all about seeing how feasible it really is at this point once the Fury X comes out.

Edit
Ugh, I see the pictures and am reminded of why I want the Fury X so much..... And the post I just read about 4K downsampling vs native 4K.... Hmmm, I'll really need to sleep on this. Ugh, the Fury X just means OCing on quiet water, which is so attractive!
 
Last edited:

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
Four more days until the reviews hit. Then it'll be easier for you to choose.


The card does look amazing. What a change from their previous efforts.
 

sze5003

Lifer
Aug 18, 2012
14,184
626
126
Agreed, it looks great. I just hope the wires for the AIO are long enough to reach the top part of my case, where I'll be removing my 120mm fan to put in the cooler.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Looking at that picture, any idea if I can use my own fan for the radiator? I'd rather put a nice Corsair SP120 on there.
 