Witcher 3 system requirements


bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's the biggest thing. Higher quality goes hand in hand with higher FPS. If I can't have both of those, then yes, I may as well go console.

Well, I can play on high settings, and still get higher IQ and FPS than a console, even though I am not playing at maxed settings. I also get to use a mouse instead of a joystick. Not only that, I save myself a ton of money, because I don't have to go out and buy a new console and games to go with it.

And if my GPU gets really far behind, I can buy one for less than a console.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
That's why I said most likely. Certainly not all sources point to 20nm as guaranteed. Raghu provided a bazillion links over the last 6 months that also support the 28nm theory. I can't tell you 100% if it will be 20nm or 28nm as I do not have insider knowledge. However, neither GloFo nor TSMC has a high-performance 20nm node. The 20nm SoC process may be good enough, but I don't recall such a low-power node designed for SoCs ever being adopted for high-end GPUs.

There are some other factors involved. Considering NV goes for the largest dies in the industry, they would benefit from 20nm more than anyone. However, there are no rumours at all that GM200 is 20nm; it's actually the opposite, with most sites expecting GM200 on 28nm. One has to ask why NV, with much higher volume sales and a much stronger bargaining position on prices (as a result of higher sales than AMD), is not likely to adopt 20nm for GM200. That makes it less likely for AMD to do it.

Further, 20nm would have brought a substantial reduction in power usage, yet rumours/leaks say AMD is considering hybrid water cooling, and ChipHell implied the 390X would have similar power usage to a 290X. As I said, I wouldn't rule out 20nm completely, but I don't think we can rule out a massive 500-550mm² 28nm die for the 390X either. Historically ATI/AMD never made a 500mm² die, but we have seen them go to 438mm² with Hawaii. Now 28nm is even more mature and even cheaper, and considering that high voltage and high clocks contribute a lot to high power usage, going 1GHz or less on a chip that is very wide in terms of functional units is better if the costs and yields justify it. In the past ATI/AMD would quickly transition from one node to the next, making this strategy too costly for them to execute. Now 28nm would be on its 3rd iteration (7970 --> 290X --> 390X?).
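To put rough numbers on the wide-and-slow argument, here is a minimal sketch using the standard first-order dynamic power approximation, P ∝ C·V²·f. The unit counts, voltages, and clocks below are illustrative assumptions, not actual Hawaii or 390X figures:

```python
# First-order dynamic power model: P ~ units * V^2 * f
# (switched capacitance scales roughly with functional-unit count).
# All figures below are illustrative assumptions, not real GPU specs.

def relative_power(units: float, volts: float, mhz: float) -> float:
    """Relative dynamic power of a chip configuration (arbitrary units)."""
    return units * volts ** 2 * mhz

def relative_throughput(units: float, mhz: float) -> float:
    """Relative raw throughput: functional units times clock."""
    return units * mhz

narrow_fast = dict(units=1.0, volts=1.20, mhz=1100)  # smaller die, pushed clocks
wide_slow   = dict(units=1.4, volts=1.05, mhz=950)   # 40% more units, relaxed V/f

p_ratio = relative_power(**wide_slow) / relative_power(**narrow_fast)
t_ratio = (relative_throughput(wide_slow["units"], wide_slow["mhz"])
           / relative_throughput(narrow_fast["units"], narrow_fast["mhz"]))

print(f"power ratio (wide/narrow):      {p_ratio:.2f}")   # ~0.93
print(f"throughput ratio (wide/narrow): {t_ratio:.2f}")   # ~1.21
# With these made-up numbers, the wide low-clocked chip delivers ~21%
# more raw throughput for ~7% less dynamic power -- the basic case for
# a big, modestly clocked 28nm die.
```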

Additionally, it's very risky to try to pull off a trifecta of new architecture (GCN 2.0) + new memory standard (HBM) + the most cutting-edge node (20nm) simultaneously. That leaves no room for error. Imo, pulling off two of those on proven 28nm with hybrid water cooling is less risky.

Finally, there is the matter of volumes, prices and yields. With 20nm so new, and Qualcomm, Samsung and Apple basically outbidding NV/AMD, it would be very expensive for AMD to attempt a top-to-bottom mobile and desktop 20nm roll-out. They would have needed to start mass manufacturing now or next month to manage a Spring 2015 roll-out in volume.

Don't forget that there are constant rumours that Qualcomm's 20nm 810 and 615 are overheating. Now, if 20nm has trouble with such small SoCs, do you think it's realistic to use this process on a 350+mm² high-end GPU?

While AMD did state that they will have 20nm products out in 2015, those could easily be shrunken PS4/XB1 APUs.

As I said, whether AMD launches 20nm or 28nm GPUs, it matters more that they hit the necessary performance/watt, absolute performance and price/performance targets. If 28nm allows them to scale GCN another 40-50% beyond the 290X, that will be perfectly fine to last them until 2016, when smaller nodes become accessible.

Anyway, tying all of this back to Witcher 3:

1) The game could still be delayed a third time;
2) Until we know whether NV or AMD runs this game better, it's a guessing game, somewhat in favour of NV since it's a GameWorks title;
3) 5 months away is a long time in GPUs; something much faster can be out by May 19.

I wouldn't upgrade right now for the sole purpose of playing TW3. I also would not buy a 4690/4790K at this time. If I had to build a new system now, it would be a 5820K, or I'd wait for Broadwell.

AMD doesn't have a choice in the matter. They can't push GCN farther on 28nm, and they can't afford to deal with the problems of having a massive die and no new midrange chip, so the options are to try to force 20nm or to go more than 2 years without a new flagship while Nvidia is destroying their flagship with a cut-down version of the midrange Maxwell chip.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
AMD doesn't have a choice in the matter. They can't push GCN farther on 28nm, and they can't afford to deal with the problems of having a massive die and no new midrange chip, so the options are to try to force 20nm or to go more than 2 years without a new flagship while Nvidia is destroying their flagship with a cut-down version of the midrange Maxwell chip.

Mounting evidence for 20nm, even statements directly from AMD, and people still post this doom & gloom crap. :/

I need to start getting a collection of screen caps for these. When the 20nm Rx 300 series arrives, I can really laugh mightily. But I'd probably get a long ban for posting a thread with multiple screen shots.

Infraction issued for thread crapping.
-- stahlhart

edit:
You do not get to edit out mod edits for infractions
esquared
Anandtech Forum Director



editing the thread again to include a mod callout?
esquared
Anandtech Forum Director
 
Last edited by a moderator:

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Recommended 8GB of RAM? Will it utilize even 6GB of that?

Seeing as Dragon Age Inquisition has been shown to use around 6 GB of RAM, yes. In that case, 8 GB is the logical recommendation, as you will need the extra 2 GB for the operating system, other processes, and a nice bit of wiggle room for worst case scenarios.

The game's being made for consoles with 8 GB of shared CPU/GPU memory, people. Recommending 8 GB of system RAM on PC is not a shock.
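As a back-of-the-envelope check on that recommendation (the 6 GB working set comes from the DA:I reports above; the OS overhead and headroom figures are rough assumptions), a minimal sketch:

```python
# Rough RAM-recommendation arithmetic. The 6 GB game working set is
# taken from the DA:I reports in this thread; the other numbers are
# assumptions for illustration.
game_working_set_gb = 6.0
os_and_background_gb = 1.5  # OS, drivers, background apps (assumption)
headroom_gb = 0.5           # wiggle room for worst cases (assumption)

needed_gb = game_working_set_gb + os_and_background_gb + headroom_gb

# Round up to the next common memory-kit size.
common_kits_gb = [4, 8, 16, 32]
recommended = next(kit for kit in common_kits_gb if kit >= needed_gb)

print(f"estimated need: {needed_gb:.1f} GB -> recommend {recommended} GB")
# -> estimated need: 8.0 GB -> recommend 8 GB
```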
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Seeing as Dragon Age Inquisition has been shown to use around 6 GB of RAM, yes. In that case, 8 GB is the logical recommendation, as you will need the extra 2 GB for the operating system, other processes, and a nice bit of wiggle room for worst case scenarios.

The game's being made for consoles with 8 GB of shared CPU/GPU memory, people. Recommending 8 GB of system RAM on PC is not a shock.

6GB? At what resolution? 4K or higher? Maybe buying another Titan won't be such a bad idea; no other card has that much memory.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Mounting evidence for 20nm, even statements directly from AMD, and people still post this doom & gloom crap. :/

I need to start getting a collection of screen caps for these. When the 20nm Rx 300 series arrives, I can really laugh mightily.

I was pretty much agreeing with you, but okay...
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Mounting evidence for 20nm, even statements directly from AMD, and people still post this doom & gloom crap. :/

I need to start getting a collection of screen caps for these. When the 20nm Rx 300 series arrives, I can really laugh mightily.

What mounting evidence for 20nm?
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I'm talking system RAM, not VRAM.

Oh, that explains a lot, as I thought 6GB was complete overkill for a single GK110. But quad or triple SLI just might finally make use of that massive frame buffer and chug along where 4GB cards fail, or at least where 3GB cards fail, though so far I haven't seen such a situation. I remember some test where 3GB was not enough, but I'm not really sure my memory isn't deceiving me.

6GB is a lot of memory even when it comes to main memory; that's more than any game I have ever bothered to check has used. From what I concluded, 4GB would be enough (the bare minimum, with 16GB recommended) because games tended not to exceed 2GB, and all of that was while users on this forum were already recommending 16GB; now some even recommend 32GB for quad-channel memory systems. I checked DA3 and it used 2.1GB with everything on max quality.

I use a custom preset with all the sliders slid to the right for maximum quality, because the ultra preset isn't really the highest possible quality, but as I said, that 2.1GB was just after loading the game.
ps. What game uses the most graphics memory?
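(For anyone who wants to check a game's actual system-RAM use the same way, here is a minimal sketch using the third-party psutil package; the process name is a placeholder for whatever your game's executable is called. Note this reports system RAM, not VRAM.)

```python
import psutil  # third-party: pip install psutil

# Report the resident system RAM of a running game. The executable
# name below is a placeholder; check Task Manager for the real one.
TARGET = "DragonAgeInquisition.exe"

for proc in psutil.process_iter(["name", "memory_info"]):
    if proc.info["name"] == TARGET:
        rss_gb = proc.info["memory_info"].rss / 1024 ** 3
        print(f"{TARGET}: {rss_gb:.2f} GB resident")
        break
else:
    print(f"{TARGET} is not running")
```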
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Oh, that explains a lot, as I thought 6GB was complete overkill for a single GK110. But quad or triple SLI just might finally make use of that massive frame buffer and chug along where 4GB cards fail, or at least where 3GB cards fail, though so far I haven't seen such a situation. I remember some test where 3GB was not enough, but I'm not really sure my memory isn't deceiving me.

6GB is a lot of memory even when it comes to main memory; that's more than any game I have ever bothered to check has used. From what I concluded, 4GB would be enough (the bare minimum, with 16GB recommended) because games tended not to exceed 2GB, and all of that was while users on this forum were already recommending 16GB; now some even recommend 32GB for quad-channel memory systems. I checked DA3 and it used 2.1GB with everything on max quality.

I use a custom preset with all the sliders slid to the right for maximum quality, because the ultra preset isn't really the highest possible quality, but as I said, that 2.1GB was just after loading the game.
ps. What game uses the most graphics memory?

Same here; more tessellation and apparently better textures in DA:I. In my case it often gets close to 3GB of VRAM used.

4GB of system memory was good for years because games were designed with the 2GB 32-bit limit in mind, like you say. Now that consoles can manage 64-bit (and have more memory), 8GB is becoming a much more common requirement for 64-bit games.

The other thing with consoles is that their memory is combined, so the CPU and GPU can access the same pool. With PC architecture, some assets may need to reside in both RAM and VRAM.

I believe the game that can use the most VRAM is Shadow of Mordor. When you max out the textures it can use up to 6GB, which is really only needed if you have a 4K monitor. Some people claim they could play it smoothly on a 3GB 780. Personally I couldn't, but then I did find I could play Watch Dogs a lot smoother than most people.
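As a quick illustration of that 32-bit ceiling (the 2 GB figure is the default Windows user/kernel split; large-address-aware 32-bit processes can get more):

```python
# Why 32-bit era games capped out around 2 GB: address-space arithmetic.
total_32bit = 2 ** 32                 # 4 GiB of addressable space in total
user_mode_default = total_32bit // 2  # default Windows user/kernel split

print(f"32-bit address space:       {total_32bit / 1024**3:.0f} GiB")
print(f"default user-mode portion:  {user_mode_default / 1024**3:.0f} GiB")

# A 64-bit process can address vastly more, so the practical ceiling
# becomes physical RAM -- which is why console-era 64-bit ports can
# reasonably ask for 8 GB of system memory.
total_64bit = 2 ** 64
print(f"64-bit address space:       {total_64bit / 1024**4:,.0f} TiB")
```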
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Fortunately I'm not interested in 4K monitors, but a 2560x1600 or even 2560x1440 120Hz 28-30 incher would be what I would want. Not TN though: either IPS or some kind of PLS or xVA, like the monitor Acer announced. I don't know why people call it an IPS monitor; I guess it is out of ignorance. But maintaining 120fps at 2560x1440 will be harder than 60fps at 4K, though with adaptive V-sync I don't need 120Hz all the time. Not to mention how taxing that's going to be on the CPU; in many games it's going to be impossible, because you can't scale CPU performance as much as graphics performance. Four cards can get the job done on the GPU side, but on the CPU side all you can do is buy a hand-picked CPU that overclocks better than most, and even that will give you 10-15% more speed.
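The raw pixel rates for those two targets are actually close; the sketch below puts numbers on it, and the real gap is that CPU work scales with framerate rather than resolution (simulation and draw-call submission happen per frame):

```python
# Pixel throughput for the two display targets discussed above.
def pixels_per_second(w: int, h: int, fps: int) -> int:
    return w * h * fps

qhd_120 = pixels_per_second(2560, 1440, 120)
uhd_60  = pixels_per_second(3840, 2160, 60)

print(f"2560x1440 @ 120 Hz: {qhd_120 / 1e6:.0f} Mpx/s")
print(f"3840x2160 @  60 Hz: {uhd_60 / 1e6:.0f} Mpx/s")
# -> roughly 442 vs 498 Mpx/s: a similar GPU pixel load. The catch is
# the CPU: per-frame work means 120 fps needs about twice the
# single-threaded headroom of 60 fps at any resolution, which matches
# the point about CPU scaling above.
```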
 
Last edited:

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Fortunately I'm not interested in 4K monitors, but a 2560x1600 or even 2560x1440 120Hz 28-30 incher would be what I would want. Not TN though: either IPS or some kind of PLS or xVA, like the monitor Acer announced. I don't know why people call it IPS; I guess it is out of ignorance.

AHVA isn't a VA-style tech; it's more similar to IPS. That's why it's marketed as IPS.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's the biggest thing. Higher quality goes hand in hand with higher FPS. If I can't have both of those, then yes, I may as well go console.

Right, but I think what he means is that you can achieve higher IQ than consoles without necessarily maxing out every PC game. Basically, there is an area between console IQ and max PC IQ where the PC game still looks better and runs faster than the console version. I think his point is valid. Also, it becomes a grey area what AA mode/level is considered "maxing out" a PC game. Some gamers might prefer 4K with little to no AA, while others would take 1440P with some AA. I don't think maxed-out graphics is the primary motivator for most PC gamers. There are other important factors, such as a large game library with backwards compatibility, cheaper games, more variety of games, controls, mods, and portability (laptop gaming vs. PS Vita/Nintendo 3DS gaming).
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
Are you saying that max settings are the only settings that are better than the console version? I don't buy that at all. Not to mention the higher FPS and mouse controls. I'm sure some games vary.

Right now, unless it's ultra settings and higher FPS, I'd rather play it on a console. Unless Witcher 3 is somehow better with a mouse/keyboard, a mouse isn't even a consideration. I ended up playing Witcher 2 with an Xbox controller, as the kb/mouse controls did not make the game better.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Mounting evidence for 20nm, even statements directly from AMD, and people still post this doom & gloom crap. :/

I need to start getting a collection of screen caps for these. When the 20nm Rx 300 series arrives, I can really laugh mightily.

I don't think it's conclusive to say that AMD cannot improve perf/watt and absolute performance on 28nm, and there isn't enough evidence to throw out 28nm completely. I think you are focusing too much on 28nm vs. 20nm instead of the actual end result. If AMD can deliver an awesome product on 28nm that overclocks well, like NV managed to do, then having 20nm is not as material. You assert that AMD cannot really improve GCN on 28nm, but why is that conclusive? I would be happy to be proven wrong and for AMD to launch 20nm cards. It's not as if gamers are against a more advanced node.

The doom and gloom is coming because AMD hasn't shown an ounce of anything regarding the R9 300 series, while NV has already launched 850M/860M/965M/970M/980M/970/980 parts, and the 960 is coming this month. This is the reverse of GCN 1.0 vs. Kepler, except that unlike NV users, who will wait 6-9 months to upgrade to NV, AMD users are less likely to do that. If you look at AMD's market share when they are behind by 5-6 months, their share drops off tremendously compared to when NV is behind. Once a gamer upgrades to a 970/980, or SLI of those chips, chances are they are done for another 2 years. Guess what: almost every gamer that got Maxwell is one less sale for AMD.

If, as rumours suggest, AMD's 300 series won't be out until late Spring or early Summer 2015, this will be the lengthiest delay from AMD. Do you remember how disastrous the HD2900XT was? It launched May 14, 2007 against the GeForce 8800GTX that came out Nov 8, 2006, and lost. That single generation set off the trend of AMD-to-NV jumpers that AMD never recovered from. Until the HD2900XT, ATI and NV were fairly evenly matched in overall dGPU market share, with ATI even having the lead from time to time.



The reason so many people are concerned about AMD's future is that their CPU division's sales aren't going to cover losses in their GPU division, and AMD/ATI graphics now has one of the lowest, if not the lowest, dGPU market shares against NV. All of this is happening without NV even having launched their high-end cards. That's really the scary part here, since NV could easily drop the 970 to $249 and the 980 to $399-429, and/or refresh GM204 into even faster cards by Summer 2015. Can you even imagine the margins NV is riding right now on GM204 at $549? NV is using one GM204 chip to cover the 965M/970M/980M/970 and 980! They are literally milking this die, with the 965M showing that even a half cut-down GTX 980 is being sold instead of being thrown away.

Even when AMD had good perf/watt with Oland, Pitcairn, Mars, etc., they failed miserably to get mobile dGPU design wins. That's another risk factor: even if the R9 300 series flat out beats Maxwell in the key metrics, it doesn't guarantee AMD can get the design wins, since on the surface they appear to have worse customer relationships than NV does today.

I don't think many of us will have problems running this game.

That's a given considering TW3 is coming out for XB1/PS4, whose GPUs are far behind Tahiti, Hawaii, GK110, GM204, etc.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Right now, unless it's ultra settings and higher FPS, I'd rather play it on a console. Unless Witcher 3 is somehow better with a mouse/keyboard, a mouse isn't even a consideration. I ended up playing Witcher 2 with an Xbox controller, as the kb/mouse controls did not make the game better.

That simply sounds as if you are a console gamer at heart, but will play PC games only if they look a lot better.

That's fine, but there is absolutely nothing lost if a PC gamer plays at high settings, and still plenty gained.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
That's a given considering TW3 is coming out for XB1/PS4, whose GPUs are far behind Tahiti, Hawaii, GK110, GM204, etc.


I meant that people on this forum generally upgrade their hardware often enough that a new game being playable wouldn't be a concern. Elsewhere, where people aren't in the loop and probably bought something off the shelf, it might be a concern.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Right now, unless it's ultra settings and higher FPS, I'd rather play it on a console. Unless Witcher 3 is somehow better with a mouse/keyboard, a mouse isn't even a consideration. I ended up playing Witcher 2 with an Xbox controller, as the kb/mouse controls did not make the game better.

I think it's a given that Witcher 3 will be at 30 FPS on consoles.
 

kasakka

Senior member
Mar 16, 2013
334
1
81
Right now, unless it's ultra settings and higher FPS, I'd rather play it on a console. Unless Witcher 3 is somehow better with a mouse/keyboard, a mouse isn't even a consideration. I ended up playing Witcher 2 with an Xbox controller, as the kb/mouse controls did not make the game better.

The reason I choose PC versions over consoles is that, despite the high initial hardware investment, I'm guaranteed better image quality, higher resolution, better framerates, and the ability to choose between a gamepad and mouse+keyboard. Add on top of that cheaper prices for games and the possibility of mods, depending on the game.

That's actually what has kept me from buying a PS4: no exclusive must-have titles so far. 2015 might be different, with Uncharted 4 etc. coming out.
 