[WCCF] AMD Radeon R9 390X Pictured


guskline

Diamond Member
Apr 17, 2006
5,338
476
126
I wouldn't want 4GB for 1440p, let alone 4K. 4GB is crazy low and will be obsolete out of the gate. Hopefully they use it for a mid-range chip or a small form factor version of the 390X, but still make a regular version with GDDR5 and 8GB.

Moonbogg: I have two 2560x1440 monitors: the BenQ BL3200 on the rig below, and an Achieva Shimian 27" on my 3770K rig, which has a single air-cooled Sapphire Tri-X OC R9 290.

The fps on the rig below is obviously higher because I'm running two R9 290s in CF. That said, I'm really not having any problems at 2560x1440.

The problems appear to be at 4K.

I agree that 4GB of HBM, if that's what it is, will hurt the 390X Fiji's perception. Since it's such a radical change of memory (GDDR5 vs. HBM), I think we have to wait for release and testing of games at 4K to see what effect 4GB of HBM has.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I wouldn't want 4GB for 1440p, let alone 4K. 4GB is crazy low

4GB isn't "crazy low" by any stretch of the imagination. You've seen benchmarks from the 980 and 290X, right? Was 4GB "crazy low" when the 980 launched just last September? Game VRAM requirements haven't changed all that much since then; they've increased some, but not massively. The only thing that's really changed is your perception. Conveniently right now, too, strangely enough... More than 4GB is nice to have, of course, but calling 4GB crazy low is 100% exaggeration.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Remember how 2GB from NV was suddenly all right vs. 3GB from AMD, and now people are pooping their pants at the thought of having to settle for 4GB? I think the 390X will have an 8GB variant, but gosh, how the goalposts move on this subforum.
 

Alatar

Member
Aug 3, 2013
167
1
81
Remember how 2GB from NV was suddenly all right vs. 3GB from AMD, and now people are pooping their pants at the thought of having to settle for 4GB? I think the 390X will have an 8GB variant, but gosh, how the goalposts move on this subforum.

That always goes both ways. When the teams switch positions (with efficiency, heat, perf crown, bus width, amount of memory) so do the people who prefer a certain team.

Back when Fermi was the hottest thing (huehue) out there, AMD supporters were making a huge deal about efficiency and heat output; not so much anymore. With Fermi/Tesla, the amount of memory and bus width were important for NV supporters, not so much during Kepler and Maxwell. With Tahiti vs. GK104, OCing was very important for AMD supporters, but that quickly stopped once the comparison switched to Hawaii vs. GK110. Etc., etc.

Now the "VRAM capacity matters" and "VRAM capacity doesn't matter" teams just switch places for the next generation assuming AMD does indeed ship with 4GB cards.

That's the way of things. :thumbsup:
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
GTA V easily breaks 3GB at resolutions under 4K. Add a little extra AA and that 4GB card is already choking. Going forward, games are likely to require at least as much VRAM as today's, and most likely more. I'm not comfortable with anything less than 6GB, and to be honest, I think any true high-end GPU should have 8GB given today's gaming requirements and the uptake of 4K monitors.
Last-gen cards had 4GB, although Nvidia skimped again and went with 3GB. Today's mid-range Nvidia cards have 4GB (well, some have 3.5GB). The high-end cards coming out should have 8GB, but Nvidia will likely release 6GB unless you're in the mood to pay the Titan X premium.

I ran out of VRAM on both my 570s and 670s. That gets kind of annoying if the GPU power is still there. I blame consoles. It's always a good idea to blame stuff on the consoles.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
I hope people aren't buying only for right now, but for the 2-3 years they'll keep their cards. Memory requirements are more likely to go up than down. Something to keep in mind...
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
HBM probably took a lot of money and manpower to create, but I'm not sure if that is a good move if true...
 

Stormflux

Member
Jul 21, 2010
140
26
91
Interesting articles coming out today about HBM. The 4GB capacity really does make you think, but I'll hold my judgement until reviews; we're only a month away from the real release. However... everything released before the big jump to 14/16nm and HBM2 should make people think.

Some people are able to see the larger picture. Those contemplating a 390X at $850 vs. a Titan X at $1000 are already at the luxury end, and future-proofing is a laughable metric there. If the 4GB 390X is questionable for future-proofing, is Maxwell not the same?

We're already seeing Kepler tank in recent games. The same CAN happen to Maxwell when Pascal releases. Suddenly your $1000 card is tanking, and you find yourself buying the luxury end every generation, which is getting more and more expensive. As for AMD, HBM1 is a very new technology, with HBM2 confirmed a year away, but it's paired with a proven GCN architecture that is already showing it's aging gracefully.

This generation is a stop-gap. Your money is yours, of course. The premium of the cutting edge, I suppose.

I do like the fact that people brought up streaming textures vs. caching full textures. It's a real issue and quite important for devs to consider.
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
Well, is it possible the extra bandwidth of HBM could offset having less VRAM? Let's say a game uses more than 4GB, but with the increased bandwidth the drivers could swap out or discard unneeded data more quickly and avoid falling back to system RAM.

Given how much faster HBM is, I just wonder if something like this is possible at the driver level.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Well, is it possible the extra bandwidth of HBM could offset having less VRAM? Let's say a game uses more than 4GB, but with the increased bandwidth the drivers could swap out or discard unneeded data more quickly and avoid falling back to system RAM.

Given how much faster HBM is, I just wonder if something like this is possible at the driver level.

I asked that earlier, and the consensus was no.

I also joked that 4GB of HBM would "act like" 6GB of GDDR5.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
As a hypothetical, we could compare DDR3 and GDDR5.

Take a fictional game that requires a minimum of 4GB of VRAM to play well.

Both cards use the same fictional GPU.

Does 2GB of GDDR5 beat 4GB of DDR3 in this fictional scenario?

I don't know, but I suspect it's not a realistic comparison.
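
A rough back-of-envelope sketch of that fictional scenario (all the numbers below are illustrative assumptions, not benchmarks): once the working set outgrows local VRAM, the overflow has to come across PCIe, and that link is slower than either memory type, so with these made-up figures neither fictional card comes out clearly ahead.

```python
# Back-of-envelope sketch with made-up numbers (not benchmarks):
# a fictional game touches a 4 GB working set.
# Card A: 2 GB of fast GDDR5-class VRAM, the overflow spills across PCIe.
# Card B: 4 GB of slow DDR3-class VRAM, everything fits locally.

GDDR5_BW = 320.0   # GB/s, assumed fast-VRAM bandwidth
DDR3_BW  = 30.0    # GB/s, assumed slow-VRAM bandwidth
PCIE_BW  = 16.0    # GB/s, assumed PCIe 3.0 x16 throughput

WORKING_SET = 4.0  # GB the fictional game needs resident

def access_time(vram_gb, vram_bw):
    """Seconds to stream the working set once: the local part at VRAM
    speed, the overflow at PCIe speed (the slowest link dominates)."""
    local = min(WORKING_SET, vram_gb)
    spill = WORKING_SET - local
    return local / vram_bw + spill / PCIE_BW

print("2 GB fast VRAM :", access_time(2.0, GDDR5_BW), "s")  # PCIe spill dominates
print("4 GB slow VRAM :", access_time(4.0, DDR3_BW), "s")
```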
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Well, is it possible the extra bandwidth of HBM could offset having less VRAM? Let's say a game uses more than 4GB, but with the increased bandwidth the drivers could swap out or discard unneeded data more quickly and avoid falling back to system RAM.

Given how much faster HBM is, I just wonder if something like this is possible at the driver level.

Not like that, but to address the same concern: many games that push 4GB or more only do so because they pre-cache game assets well before they are needed. Memory requirements are more elastic now than ever due to the increasing complexity and general capability of GPUs.

In many games the 780 and 780 Ti are running out of GPU horsepower long before VRAM, and they only have 3GB.

Realistically, the 390X/390/whatever they call them only has to last until the 16/14nm node. HBM2 and a massive node shrink are going to move high-end performance way, way up compared to 28nm -> 28nm, even with big architectural changes. Anyone trying to future-proof themselves right now on 28nm is on a fool's errand. If 16/14nm stays around as long as 28nm has, the first 16/14nm chips should have a much longer competitive lifespan than the current third-revision 28nm chips.
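
A minimal sketch of that elasticity (a hypothetical cache, not any real engine's code): a streaming engine that opportunistically pre-caches textures will fill whatever VRAM budget it is given, so observed usage on a big card overstates what the game strictly needs.

```python
# Hypothetical texture-cache sketch: a streaming engine fills whatever VRAM
# budget it is given with speculatively pre-cached assets, then evicts the
# least-recently-used ones when the budget is smaller. Observed "usage"
# tracks the budget, not the hard requirement.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # texture id -> size in MB, LRU order

    def request(self, tex_id, size_mb):
        """Bring a texture into VRAM, evicting cold entries if needed."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)   # mark as recently used
            return
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)   # evict least recently used
        self.resident[tex_id] = size_mb

    def used_mb(self):
        return sum(self.resident.values())

# The same request stream "uses" more VRAM on a bigger card simply because
# nothing gets evicted, not because the game needs it all resident.
for budget in (3500, 8000):
    cache = TextureCache(budget)
    for i in range(100):
        cache.request(f"tex_{i}", 64)           # 100 textures, 64 MB each
    print(budget, "MB budget ->", cache.used_mb(), "MB resident")
```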
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126

WTH does this even mean? Is Fiji gonna be the high-compute model, with something based on Tonga as the "sleek" gamer model?

At this point, don't even dangle the carrot. GM200 is a gamer card through and through, and at $1,000 it is a freaking rip-off. If AMD is following this path, screw them, Nvidia, and everyone else - I'm going back to consoles! (Some sarcasm)
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Well, is it possible the extra bandwidth of HBM could offset having less VRAM? Let's say a game uses more than 4GB, but with the increased bandwidth the drivers could swap out or discard unneeded data more quickly and avoid falling back to system RAM.

Given how much faster HBM is, I just wonder if something like this is possible at the driver level.

Swapping between VRAM and system RAM is limited by the PCI-E lanes and the system RAM speed, not by the bandwidth of the VRAM. If you go over 4GB, you're waiting on the slowest part. I tried to explain it as best I could here, but I'm not sure if it is completely accurate.

AMD could mitigate slowdowns by swapping before the asset is needed, but it seems like that would be on a game-by-game basis and would require a lot of work. They would almost have to anticipate what to swap long before it is needed.
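
Some rough arithmetic behind that (the bandwidth figures below are assumptions for illustration): the spill traffic crosses PCIe, which is an order of magnitude slower than either memory pool, so HBM's extra bandwidth doesn't help once you're over capacity, and hiding a transfer that large means anticipating the asset many frames ahead.

```python
# Rough, assumed figures: why spilling past VRAM hurts regardless of how
# fast the VRAM itself is. The swap traffic crosses PCIe, the slowest link.
HBM_BW   = 512.0   # GB/s, rumored Fiji HBM bandwidth (assumption)
GDDR5_BW = 320.0   # GB/s, Hawaii-class GDDR5 for comparison (assumption)
PCIE_BW  = 16.0    # GB/s, roughly PCIe 3.0 x16 in practice (assumption)

spill_gb = 1.0     # suppose a game needs 1 GB more than fits in VRAM

print("Read 1 GB from HBM   : %.2f ms" % (spill_gb / HBM_BW   * 1000))
print("Read 1 GB from GDDR5 : %.2f ms" % (spill_gb / GDDR5_BW * 1000))
print("Pull 1 GB over PCIe  : %.2f ms" % (spill_gb / PCIE_BW  * 1000))
# ~2 ms vs ~3 ms vs ~62 ms: at 60 fps a frame is ~16.7 ms, so one spilled
# gigabyte per frame costs several frames no matter which VRAM type you have.
```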
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
I wonder if it's possible to use GDDR5 as an LLC between the HBM and system memory.

Would that not negate the main use of HBM as a power saver?

Also not a single person has commented on this by Ryan Smith.
http://www.anandtech.com/print/9266/amd-hbm-deep-dive

" HBM in turn allows from 2 to 8 stacks to be used, with each stack carrying 1GB of DRAM. AMD’s example diagrams so far (along with NVIDIA’s Pascal test vehicle) have all been drawn with 4 stacks,"
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Would that not negate the main use of HBM as a power saver?

Also not a single person has commented on this by Ryan Smith.
http://www.anandtech.com/print/9266/amd-hbm-deep-dive

" HBM in turn allows from 2 to 8 stacks to be used, with each stack carrying 1GB of DRAM. AMD’s example diagrams so far (along with NVIDIA’s Pascal test vehicle) have all been drawn with 4 stacks,"

Yes, I've seen that a few times posted here as a link.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91

So to summarize:

- 390/390X is Hawaii?
- Fiji Pro/Fiji XT, which are the only new chips, will have a fancy name like "Titan" and be priced accordingly?

Here is my beef with that:
It explains the $850 price for the Fiji XT card. Nvidia has successfully dragged AMD into the "exclusive" war, and the previous $500 for top cards is now $850+. Great.

I just have to wonder, though: will AMD ever release Fiji in a cheaper package, like Nvidia will do with the GTX 980 Ti, or will they stay in the exclusive price bracket until the 400 series is ready?

GPUs just got expensive
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
So to summarize:

- 390/390X is Hawaii?
- Fiji Pro/Fiji XT, which are the only new chips, will have a fancy name like "Titan" and be priced accordingly?

Here is my beef with that:
It explains the $850 price for the Fiji XT card. Nvidia has successfully dragged AMD into the "exclusive" war, and the previous $500 for top cards is now $850+. Great.

I just have to wonder, though: will AMD ever release Fiji in a cheaper package, like Nvidia will do with the GTX 980 Ti, or will they stay in the exclusive price bracket until the 400 series is ready?

GPUs just got expensive

Or it's just worthless scuttlebutt...
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Or it's just worthless scuttlebutt...
Or it could be real. Sweclockers is not just a random news site.

Would that not negate the main use of HBM as a power saver?

Also not a single person has commented on this by Ryan Smith.
http://www.anandtech.com/print/9266/amd-hbm-deep-dive

" HBM in turn allows from 2 to 8 stacks to be used, with each stack carrying 1GB of DRAM. AMD’s example diagrams so far (along with NVIDIA’s Pascal test vehicle) have all been drawn with 4 stacks,"
There is no way 8 stacks is coming for Fiji XT. We know it's 4096-bit; 8 stacks would give 8192-bit.
I think they are referring to the technical limits of HBM in general. 8 stacks is what the dual Fiji is supposed to have: 4 stacks per GPU, still 4GB/4096-bit per GPU, but 8 stacks in total.
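
For reference, a quick sketch of the arithmetic behind those figures, using the 1GB-per-stack capacity from the quoted AnandTech piece and the 1024-bit-per-stack interface implied by the 4096-bit total; the dual-GPU reading of "8 stacks" is the rumor discussed above, not a confirmed spec.

```python
# Per-stack figures for first-generation HBM: 1 GB capacity (from the quoted
# AnandTech article) and a 1024-bit interface (implied by 4 stacks = 4096-bit).
STACK_CAPACITY_GB = 1
STACK_BUS_BITS    = 1024

for stacks in (4, 8):
    print("%d stacks -> %d GB, %d-bit bus"
          % (stacks, stacks * STACK_CAPACITY_GB, stacks * STACK_BUS_BITS))
# 4 stacks -> 4 GB, 4096-bit bus  (matches the rumored single-GPU Fiji)
# 8 stacks -> 8 GB, 8192-bit bus  (only plausible as 2 x 4 on a dual-GPU card)
```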
 