I doubt it; it's an Nvidia title, so I'm guessing AMD cards will need some day-one optimization from AMD when it's released.
And of course everyone will blame AMD.
That's why I said most likely. Certainly not all sources point to 20nm as guaranteed. Raghu provided a bazillion links over the last 6 months that also support the 28nm theory. I can't tell you 100% whether it will be 20nm or 28nm, as I don't have insider knowledge. However, neither GloFo nor TSMC has a high-performance 20nm node. The 20nm SoC process may be good enough, but I don't recall such a low-power node intended for SoCs ever being adopted for high-end GPUs.
There are some other factors involved. Considering NV goes for the largest dies in the industry, they would benefit from 20nm more than anyone. However, there are no rumours at all that GM200 is 20nm; it's actually the opposite, with most sites expecting GM200 on 28nm. One has to ask: why would NV, with much higher volume sales and a much stronger bargaining position on prices (a result of selling more than AMD), be unlikely to adopt 20nm for GM200? That makes it less likely for AMD to do it.
Further, 20nm would have brought a substantial reduction in power usage, yet rumours/leaks say AMD is considering hybrid water cooling, and ChipHell implied the 390X would have similar power usage to a 290X. As I said, I wouldn't rule out 20nm completely, but I don't think we can rule out a massive 500-550mm2 28nm die for the 390X either. Historically, ATI/AMD never made a 500mm2 die, but we have seen them go to 438mm2 with Hawaii. Now 28nm is even more mature and even cheaper, and considering that high voltage and high clocks contribute a lot to high power usage, going 1GHz or less on a chip that's very wide in terms of functional units is better if the costs and yields justify it. In the past, ATI/AMD would quickly transition from one node to the next, making this strategy too costly to execute. Now 28nm would be on its 3rd iteration (7970 --> 290X --> 390X?).
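To put rough numbers on that wide-and-slow argument: dynamic power scales roughly as units x clock x voltage^2, while throughput scales as units x clock, so adding functional units while dropping clocks and voltage buys performance faster than it costs power. Here's a minimal back-of-envelope sketch; the baseline loosely resembles Hawaii, but the "wide" chip's shader count, clock and voltage are illustrative guesses, not leaked 390X specs:

```python
# Back-of-envelope: why "wide and slow" can beat "narrow and fast" on the
# same 28nm node. Dynamic power ~ units * f * V^2; throughput ~ units * f.
# Voltage has to rise with clock speed, so cutting clocks pays off
# quadratically in power. All figures are illustrative, not real specs.

def gpu_estimate(units, clock_ghz, volts):
    perf = units * clock_ghz               # relative throughput
    power = units * clock_ghz * volts ** 2  # relative dynamic power
    return perf, power

# Hawaii-like baseline: 2816 shaders at 1.0 GHz, ~1.2 V (guessed voltage)
base_perf, base_power = gpu_estimate(2816, 1.0, 1.2)

# Hypothetical wider 28nm chip: 4096 shaders at 0.95 GHz, ~1.1 V
wide_perf, wide_power = gpu_estimate(4096, 0.95, 1.1)

print(f"perf:  {wide_perf / base_perf:.2f}x")    # ~1.38x throughput
print(f"power: {wide_power / base_power:.2f}x")  # ~1.16x dynamic power
```

Under these made-up numbers, a ~45% wider chip at slightly lower clocks lands in the 290X's power ballpark while gaining ~40% performance, which is exactly the kind of scaling a big 28nm 390X would need.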
Additionally, it's very risky to try to pull off a trifecta of a new architecture (GCN 2.0) + a new memory standard (HBM) + the most cutting-edge node (20nm) simultaneously. That leaves no room for error. Imo, pulling off 2 of those on proven 28nm with hybrid water cooling is less risky.
Finally, there is the matter of volumes, prices and yields. With 20nm so new, and Qualcomm, Samsung and Apple basically outbidding NV/AMD, it'll be very expensive for AMD to attempt a top-to-bottom mobile and desktop 20nm roll-out. They would have needed to start mass manufacturing now or next month to manage a Spring 2015 roll-out in volume.
Don't forget that there are constant rumours that Qualcomm's 20nm 810 and 615 are overheating. If 20nm has trouble with such small SoCs, do you think it's realistic to use this process on a 350+mm2 high-end GPU?
While AMD did state that they will have 20nm products out in 2015, that could easily be shrunken PS4/XB1 APUs.
As I said, whether AMD launches 20nm or 28nm GPUs, what matters more is that they hit the necessary performance/watt, absolute performance and price/performance targets. If 28nm allows them to scale GCN another 40-50% beyond the 290X, it will be perfectly fine to last them until 2016, when smaller nodes become accessible.
Anyway, tying all of this back to Witcher 3:
1) The game could still be delayed a 3rd time;
2) Until we know whether NV or AMD runs this game better, it's a guessing game, somewhat in favour of NV due to it being a GameWorks title;
3) 5 months is a long time in GPUs; something much faster could be out by May 19.
I wouldn't upgrade right now for the sole purpose of playing TW3. I also would not buy a 4690/4790K at this time. If I had to build a new system now, it would be a 5820K, or I'd wait until Broadwell.
AMD doesn't have a choice in the matter. They can't push GCN farther on 28nm, and they can't afford to deal with the problems of having a massive die and no new midrange chip, so the options are to try to force 20nm or to go more than 2 years without a new flagship while Nvidia is destroying their flagship with a cut-down version of the midrange Maxwell chip.
Recommended 8GB of RAM? Will it even utilize 6GB of that?
Seeing as Dragon Age Inquisition has been shown to use around 6 GB of RAM, yes. In that case, 8 GB is the logical recommendation, as you will need the extra 2 GB for the operating system, other processes, and a nice bit of wiggle room for worst case scenarios.
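Purely as a sanity check on that arithmetic, here's a minimal Python sketch (using the third-party psutil package) that computes how much headroom is left for the OS and background processes once a game grabs ~6 GB; the 6 GB worst case is just the Dragon Age: Inquisition figure quoted above, not something measured here:

```python
# Sanity check on the 8 GB recommendation: if the game alone can touch
# ~6 GB of system RAM, how much is left for everything else?
import psutil  # pip install psutil

GiB = 1024 ** 3
game_worst_case = 6 * GiB  # DA:I figure quoted in the thread

mem = psutil.virtual_memory()
print(f"installed RAM:       {mem.total / GiB:.1f} GiB")
print(f"headroom after game: {(mem.total - game_worst_case) / GiB:.1f} GiB")
```

On an 8 GB machine that leaves about 2 GiB for the OS and background processes, which is exactly the wiggle room the recommendation is budgeting for.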
The game's being made for consoles with 8 GB of shared CPU/GPU memory, people. Recommending 8 GB of system RAM on PC is not a shock.
6GB? At what resolution? 4K or higher? Maybe buying another Titan won't be such a bad idea; no other card has that much memory.
There's mounting evidence for 20nm, even statements directly from AMD, and people still post this doom & gloom crap. :/
I need to start collecting screen caps of these. When the 20nm Rx 300 series arrives, I can really laugh mightily.
I'm talking system RAM, not VRAM.
Oh, that explains a lot. I thought 6GB was complete overkill for a single GK110, but triple or quad SLI just might finally make use of that massive frame buffer and chug along where 4GB cards fail, or at least where 3GB cards fail, but so far I haven't seen such a situation. I remember some test where 3GB was not enough, though I'm not really sure my memory isn't deceiving me.
6GB is a lot even when it comes to main memory; that's more than any game I have ever bothered to check has used. From what I concluded, 4GB would be enough (as a bare minimum, with 16GB recommended), because games tended not to exceed 2GB. And all of that while users on this forum were already recommending 16GB, and now some even recommend 32GB for quad-channel memory systems. I checked DA3 and it used 2.1GB with everything on max quality.
I use a custom preset with all the sliders slid to the right for maximum quality, because the ultra preset isn't really the highest possible quality. But as I said, that 2.1GB was measured just after loading the game.
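For what it's worth, here's how I'd measure that more rigorously than glancing at Task Manager once: a small Python sketch (again assuming the third-party psutil package) that polls a process's resident memory while you play and reports the peak, since a reading taken right after loading understates in-game usage. The executable name below is a placeholder:

```python
# Sample a game's resident memory (RSS) over time and report the peak,
# since usage right after loading understates the in-game maximum.
import time
import psutil  # pip install psutil

GiB = 1024 ** 3

def peak_rss(process_name, samples=60, interval=5.0):
    """Poll a process's resident memory and return the peak seen, in bytes."""
    peak = 0
    for _ in range(samples):
        for proc in psutil.process_iter(['name', 'memory_info']):
            info = proc.info
            if info['name'] == process_name and info['memory_info'] is not None:
                peak = max(peak, info['memory_info'].rss)
        time.sleep(interval)
    return peak

# 'DragonAgeInquisition.exe' is a placeholder; use the real executable name.
print(f"peak RAM use: {peak_rss('DragonAgeInquisition.exe') / GiB:.2f} GiB")
```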
PS: What game uses the most graphics memory?
Fortunately, I'm not interested in 4K monitors; a 2560x1600 or even 2560x1440 120Hz 28-30 incher is what I would want. Just not TN: either IPS, some kind of PLS or xVA, or a monitor like the one Acer announced. I don't know why people call that one IPS; I guess it's out of ignorance.
What mounting evidence for 20nm?
It's the biggest thing. Higher quality goes hand in hand with higher FPS. If I can't have both of those, then yes, I may as well go console.
Are you saying that max settings are the only settings that are better than the console version? I don't buy that at all. Not to mention the higher FPS and mouse controls. I'm sure some games vary.
I don't think many of us will have problems running this game.
Right now, unless it's ultra settings and higher FPS, I'd rather play it on a console. And unless Witcher 3 is somehow better with a mouse/keyboard, a mouse isn't even a consideration. I ended up playing Witcher 2 with an Xbox controller, as the kb/mouse controls did not make the game better.
That's a given considering TW3 is coming out for XB1/PS4, whose GPUs are far behind Tahiti, Hawaii, GK110, GM204, etc.
I think it's a given that Witcher 3 will be at 30 FPS on consoles.