Witcher 3 system requirements


cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Well, I know for a fact Dragon Age didn't use close to 8GB. Probably to account for the OS and such running as well. Might be time to think about 16GB as the new standard heh.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Recommend 8GB of system memory? Is this common now? I haven't paid attention to recommended specs.

Yes. Most of the next-gen games, those developed with the 8th-gen consoles in mind, have been listing 8GB as the recommended requirement. Smart people on /r/buildapc have been going with 16GB in their builds.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
7,416
6,150
136
Eh, Shadow of Mordor wanted an i7-3770 in its recommended specs, and lo and behold, the game comes out and not even a Sandy Bridge i3 bottlenecks it:

 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Eh, Shadow of Mordor wanted an i7-3770 in its recommended specs, and lo and behold, the game comes out and not even a Sandy Bridge i3 bottlenecks it:

That's not really a good example though, because SoM isn't a seamless open world game, and it does not possess the density of games like AC Unity and the Witcher 3.

Witcher 3's biggest city, Novigrad, for instance will have about 2,000 unique NPC entities populating it.

All of that is going to take CPU power.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Well, I know for a fact Dragon Age didn't use close to 8GB. Probably to account for the OS and such running as well. Might be time to think about 16GB as the new standard heh.

4-5GB I think on my 16GB RAM system. 16GB would be the standard I'd say now, you won't need to worry about that part if you chuck a slab in.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Only Broadwell and GM204 currently support DX11.2 and higher.

Kepler is 11.0 and GCN is 11.1 unless Tonga supports 11.2.

Nope.

Read this: http://forums.guru3d.com/showthread.php?t=393510

AMD have their 'R' series cards listed as DX12 with footnote #11 as follows:

11. Based on our review of the Microsoft DirectX® 12 specification dated July 23, 2014, we are confident that devices based on our GCN architecture will be able to support DirectX® 12 graphics when available. We recommend that you check AMD.com prior to purchase to confirm that a particular device will support DirectX® 12 graphics. Note however, any changes to the DirectX® 12 specification after this date could impact or completely eliminate this ability – and AMD disclaims all liability resulting therefrom.

DirectX 11.3 will be supported by all existing DirectX 11 graphics cards, which means there is nothing that would prevent AMD/NV DX11 cards from using it, unless MS themselves provided misleading information.

More here:
"During GAME24, Microsoft revealed DirectX 11.3. DX11.3 will be supported by all existing DX11 GPUs (or at least that’s the plan) and will be made available next year. DX11.3 will feature almost all features of DX12 (albeit its lower CPU Overhead benefits). This means that DX11.3 will support Ordered Rasterizer View or Rasterizer Ordered Views, Typed UAV Load, Volume Tiled Resources and Conservative Grid."
http://www.dsogaming.com/news/direc...ill-be-packed-with-a-number-of-dx12-features/

None of this matters because:

1) By the time next gen cards use DX12, everything out today will be too slow.
2) You will need Windows 10 for DX12 and since DX11.3 ships alongside DX12 based on MS's comments, chances are you'll also need Windows 10 for it.
3) Only DX12 has the low-level, Mantle-like optimization that reduces CPU overhead.
4) No game to date, as far as I am aware, has shown any advantage from using anything above DX11.1. The better multi-threading is due to NV's own drivers, not anything to do with the DX feature set of their DX11 cards. The better 1080p-and-below performance from lower CPU overhead/better CPU utilization is present even on Kepler, without any DX11.2, 11.3 or DX12 games.
5) GCN and Kepler could not possibly have had DX11.3 or DX12 listed at the time of launch, since at that time MS had never officially announced DX11.3/12. You can't list something as officially supported without an official announcement from MS. However, that doesn't mean retroactive support can't exist. Yet you keep listing outdated information based on what's stamped on boxes at release.

Finally, you provide no credible evidence of how DX11.2 and 11.3 support will actually impact performance in The Witcher 3. Unless you can show that these extensions actually matter, either through claims from the developer or some feature of Witcher 3 that benefits from them, they don't matter.

The onus is on you to prove that Kepler at most supports DX11.0 and GCN at most 11.1, because a simple Google search shows both support far beyond that. No major game developer is going to throw 99.9% of the gaming market under the bus by coding a game around DX11.3/12 by May 2015. That's 100% guaranteed. These things won't matter at all for Witcher 3.
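
For reference, the DX11.3 additions quoted above (Rasterizer Ordered Views, typed UAV loads, conservative rasterization) are expected to show up to applications as optional per-device caps rather than a new feature level. A rough, forward-looking sketch of how that query would look, assuming the eventual Windows 10 SDK's d3d11_3.h header and its D3D11_FEATURE_D3D11_OPTIONS2 query (names could still change before release):

Code:
#include <d3d11_3.h>   // assumed Windows 10 SDK header exposing the DX11.3 option caps
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    // The headline DX11.3 features are per-device optional caps, not a feature level.
    D3D11_FEATURE_DATA_D3D11_OPTIONS2 opts2 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS2,
                                              &opts2, sizeof(opts2))))
    {
        printf("Rasterizer Ordered Views:        %s\n", opts2.ROVsSupported ? "yes" : "no");
        printf("Typed UAV loads (extra formats): %s\n", opts2.TypedUAVLoadAdditionalFormats ? "yes" : "no");
        printf("Conservative rasterization tier: %d\n", (int)opts2.ConservativeRasterizationTier);
    }
    device->Release();
    return 0;
}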


---

As far as 8GB of RAM goes, I can't think of any game GameGPU ever tested that used 8GB. Most games use 2-6.5GB. Even though a lot of system builders go for 16GB, it hardly shows a measurable, tangible benefit in gaming performance. That said, it's hard to imagine anyone wanting to play a high-end game like The Witcher 3 in 2015 without 8GB of system memory.

The CPU spec is also questionable. An i5 2500K at stock is not fast, but overclock it to 4.5-4.8GHz and it should not be far behind Haswell at 4.5GHz. I expect anyone with a modern i5/i7 to play the game fine, and the biggest bottleneck for overclocked i5/i7 systems will lie in the GPU instead.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106

They are not even 11.2 today as they claim. And yes, they will partly support DX11.3, just like any DX11.0 card using fallback. However, they won't fully support DX11.3, nor will they support DX12 in its full mode, only in the API-reduction mode.

There is absolutely NOTHING in what you link that points to anything different. So let's not make up stuff that isn't there.

DX Caps Viewer results posted in several threads on this forum have already shown what I said.

And nobody claimed Witcher would use anything besides deferred contexts. Do your homework for once.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I don't think that is correct; all GCN cards fully support DX11.2, so I'm not sure why you think deferred context rendering isn't supported.
As a matter of fact, AMD did a write-up on this at GDC 2011.

AMD does not support deferred context rendering. Here is a screenshot of DirectX Caps Viewer that Rvenger took of his R9 290:

[Image: DirectX Caps Viewer readout for the R9 290]

Source

As you can see, driver command list support says no.
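
For anyone who wants to check this on their own card without Caps Viewer, here is a minimal sketch using the standard D3D11 CheckFeatureSupport query; it reports the same DriverCommandLists flag shown in the screenshot:

Code:
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    // Same query Caps Viewer reports: does the driver itself execute
    // multithreaded command lists, or does the runtime emulate them?
    D3D11_FEATURE_DATA_THREADING threading = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                &threading, sizeof(threading));
    printf("DriverConcurrentCreates: %s\n", threading.DriverConcurrentCreates ? "yes" : "no");
    printf("DriverCommandLists:      %s\n", threading.DriverCommandLists ? "yes" : "no");

    device->Release();
    return 0;
}

Note that deferred contexts still work when DriverCommandLists is FALSE; the runtime emulates command lists in software, so it is a performance question rather than a compatibility one.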

A better speculation is that the game is being developed with Nvidia's help, and they will not help optimize AMD's rendering path. That could easily account for the difference.

That's not how PC development works. AMD and NVidia use the same rendering path in the Witcher 3, which is DX11. Only in Mantle-supported games does AMD have the capability to use a different rendering path than NVidia.

The only advantage performance wise that Gameworks gives NVidia, is that it allows them earlier access to the game code so they can get a head start on polishing their drivers.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
There doesn't exist a feature level 11_2. There is only an API level 11.2, with support for "Tiled Resources" and other features.

So with Windows 10 there will be at least a new API level 11.3 with new hardware features.
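
To make that distinction concrete, here is a minimal sketch assuming the Windows 8.1 SDK's d3d11_2.h header: it defines no D3D_FEATURE_LEVEL_11_2 value at all, and Tiled Resources support is instead queried as an optional cap.

Code:
#include <d3d11_2.h>   // 11.2 header; note it defines no D3D_FEATURE_LEVEL_11_2 value
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    // "DX11.2" features such as Tiled Resources are optional caps queried at
    // runtime, not a new feature level.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts1 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                              &opts1, sizeof(opts1))))
        printf("Tiled resources tier: %d\n", (int)opts1.TiledResourcesTier);
    else
        printf("11.2 runtime caps not available on this OS\n");

    device->Release();
    return 0;
}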
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Nope.

Read this: http://forums.guru3d.com/showthread.php?t=393510

AMD have their 'R' series cards listed as DX12 with footnote #11 as follows:

11. Based on our review of the Microsoft DirectX® 12 specification dated July 23, 2014, we are confident that devices based on our GCN architecture will be able to support DirectX® 12 graphics when available.

They can list them all they want.
The fact is that only Maxwell has hardware support for (some? we don't know) new DX12-exclusive features, while Fermi+ and GCN+ only "support" DX12.

As for the argument that none of this matters because today's cards will be too slow to run DX12: GM204 ran Forza just fine, and my GTX 460 did just fine with DX11.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
They are not even 11.2 today as they claim. And yes, they will partly support DX11.3, just like any DX11.0 card using fallback. However, they won't fully support DX11.3, nor will they support DX12 in its full mode, only in the API-reduction mode.

There is absolutely NOTHING in what you link that points to anything different. So let's not make up stuff that isn't there.

I've done my research as always but as usual you haven't.

"Intel has graciously agreed to make the source for their compelling Asteroids demo available to all developers in the DX12 Early Access program.
Oh, and the screenshot above? That was a screenshot from an Intel Haswell graphics DX12 machine running UE4.4’s Landscape Mountains demo."
http://blogs.msdn.com/b/directx/archive/2014/10/01/directx-12-and-windows-10.aspx

According to you Haswell doesn't even support DX12. Oops.

I am inclined to believe AMD, NV, Intel and MS over your unsubstantiated opinion. All four have committed to having their DX11 cards support DX11.3 and most of the DX12 feature set, going back to Fermi, Kepler, GCN 1.0/1.1/1.2 and Haswell GPUs. Even if all of those cards don't support the "full DX12" feature set, you haven't actually told us why this matters in terms of IQ or performance for those of us running overclocked i5/i7s that would barely benefit from the Mantle-like CPU overhead reduction of DX12.

Most importantly, we would need Windows 10 for those next-gen DX11 feature sets, and neither you nor anyone in this thread can provide any proof of the exact importance of DX11.2, 11.3 and 12 for the next wave of future games starting with Dead Light, Project CARS, The Division, The Witcher 3, etc. That makes your entire point about the outdated DX11 feature support of Kepler, Haswell and GCN irrelevant to the discussion of Witcher 3. Unless you or anyone else here can prove that a Tile-based Deferred Renderer, or any other DX11.2/11.3 feature, will be implemented in Witcher 3, and that its inclusion will have an adverse performance/IQ effect on all non-Maxwell GPUs (or alternatively a major IQ/performance boost for Maxwell), the point you are trying to make is moot.

More ironic is that for any serious Witcher fan, performance of 290/290X/780/780Ti/970/980 is irrelevant since if these cards don't perform well enough, they will upgrade to GM200/300 series by May 2015. Anyone else who purchased these cards months/year before Witcher 3 came out, guess you haven't learned a thing from Half-Life 2.

Next thing you are going to tell us that buying a DX11.3/12 GPU right now is more future-proof for BF5 in 2016, right?

The only advantage performance wise that Gameworks gives NVidia, is that it allows them earlier access to the game code so they can get a head start on polishing their drivers.

That's not how GW works.

NV provides specific in-house game code for free to game developers so that they can more easily and freely implement cutting edge graphical features and effects without dedicating their own resources to target 5-10% of the PC market that might have the latest GPU architectures to take advantage of these features.

"As part of their ongoing drive to improve the state of computer graphics, NVIDIA has a dedicated team of over 300 engineers whose primary focus is the creation of tools and technologies to make the lives of game developers better. What's truly interesting about GameWorks is that these libraries are free for any developers that want to use them. The reason for creating GameWorks and basically giving it away is quite simple: NVIDIA needs to entice developers (and perhaps more importantly, publishers) into including these new technologies, as it helps to drive sales of their GPUs among other things"
http://www.anandtech.com/show/8546/nvidia-gameworks-more-effects-with-less-effort

The GW effects libraries are NV-specific SDK game code, optimized for NV. It's not just about NV having early access to the overall game code of the game, but NV specifically providing their OWN NV-graphics card optimized code to be inserted into the game.

"300 visual effects engineers from NVIDIA who will be dispatched to developers across the globe offering library of SDKs, technology and algorithms and finally developer tools. Part of this strategy includes three new SDKs – Flex, GI Works and Flame Works.
‘We’ve dispatched our engineers to work onsite with top game developers and add effects, fix bugs, tweak performance, and train developers in open standards and work hand-in-hand with our game laboratory.’ NVIDIA

Read more: http://wccftech.com/nvidia-gamework...unified-gpu-physx-demonstrated/#ixzz3OEt7U0tz"

The purpose of NV's GW is to push next-generation graphical effects, primarily to sell NV graphics cards, not just to gain early access to developer code. Since the GW SDK libraries are NV's own code, rather than the developer writing the code for tessellation, new God rays, realistic water or physics effects, GW titles are naturally going to heavily favour NV cards most of the time. Consequently, NV gains an automatic advantage since NV-designed and optimized code is inserted into the game! This is why there were so many articles calling into question the highly controversial business practice of GW. If GW only allowed NV to gain access to the game engine's code and nothing more, it wouldn't be that controversial.

Well if you insist on ultra settings then yes. If you want simply playable then plenty of cards today will be fine.

Name one time in the history of a new DX generation when the first generation of cards supporting it could play that generation's games at Ultra settings maxed out. It never happened. By the time true DX9, 10 and 11 games came out, we needed 2nd- or 3rd-generation GPUs supporting that level of DX. History proved this for every single DX generation over the last 15 years. Since DX12's main differentiation is the lower-level Mantle-like API, its primary advantage over DX11.3 is the lower CPU overhead, not advanced graphical features.

Further, by the time MS releases Windows 10 and developers actually start making DX11.3/12 games from the ground up, all modern cards like the 7970GHz/680/770/280X/290/290X/960/970/980 will be obsolete. Since 2016, 2017 and 2018 games will be even more demanding regardless of DX11.3/12, most of us will upgrade to something way faster available at $400-500 to play those future titles. I would bet the same people who are worried that their card won't support DX12 will be upgrading to Pascal/Volta anyway. For those who bought GM204 and think that DX12 is some kind of future-proof feature, that's nice; see you in 2016 when 14nm/16nm FinFET Pascal/GCN 3.0 cards match 980 SLI at $550.

Considering the performance of a $650 780/$1,000 Titan can now be had in a $250 R9 290, and that of a $700 780 Ti in a $350 970, it's remarkable that experienced PC builders still try to push the virtues of DX future-proofing. If you think you'll be playing 2016-2017 DX12 games on a 970/980 on Ultra, I don't know what to tell you.

Since adoption of a new Windows version takes years, it'll be 2-3 years from today before developers start to make DX12 games. Chances are 2016 games will still be DX11, because development on them will have started in late 2014 or mid-2015. It's not going to make sense for developers to spend time on a DX12 code path until adoption of W10 and DX12 GPUs picks up. As for DX11.2 and DX11.3, DX8.1, 9.0b/c and DX10.1 all proved to be extremely niche. Don't forget that next-generation DX12 game engines are basically non-existent right now, as adoption of Unreal Engine 4.4 is MIA.

--

It's funny how this whole DX11/12 discussion started because the OP conjectured that the Witcher 3 specs had something to do with deferred tiled resources, when the real reason could be as simple as those specs being just as irrelevant as 90% of all PC minimum/recommended specs released by other developers.

For example AC Unity had minimum CPU requirements of:

"Intel Core i5-2500K @ 3.3 GHz or AMD Phenom II x4 940 @ 3.0 GHz or AMD FX-8350 @ 4.0 GHz"

Total BS, as the X4 940 and the 2500K were miles apart in the final game. Despite AC Unity recommending an Intel Core i7-3770 @ 3.4 GHz or AMD FX-8350 @ 4.0 GHz or better, the i5 2500K listed as the minimum requirement actually ran the game faster. Are we supposed to take CPU recommendations for games as gospel now when 90% of developers have gotten them wrong before? :whistle:



Recommended GPU requirements are just as bad since they never tell us what settings and FPS the developer is talking about.

AC Unity recommended:
"NVIDIA GeForce GTX 780 or AMD Radeon R9 290X (3 GB VRAM)"

But in reality one needed 780 SLI or faster to hit 60 fps with 4xMSAA in Unity.



Lately developers' minimum and recommended CPU/GPU specs are so off the mark that, after looking at Lords of the Fallen's requirements and then the final game, I don't pay attention to them at all. Just wait for the day-1 reviews and upgrade as necessary. This strategy never fails.
 
Last edited:

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Kind of fun seeing that if I still had my i3 2100 from 2011, then in AC Unity a 980 at 1080p maxed out would provide just about the same fps as the i3 could feed it.

As far as the requirements for TW3 go, they are surely a joke. The clock speed on the recommended P2 alone is a tale in itself. In what universe could an X4 940 @ 3GHz touch a 2500K?
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
As far as the requirements for TW3 go, they are surely a joke. The clock speed on the recommended P2 alone is a tale in itself. In what universe could an X4 940 @ 3GHz touch a 2500K?

"Minimum System Requirements
Intel CPU Core i5-2500K 3.3GHz
AMD CPU Phenom II X4 940
Nvidia GPU GeForce GTX 660
AMD GPU Radeon HD 7870

RAM 6GB
OS 64-bit Windows 7 or 64-bit Windows 8 (8.1)
DirectX 11
HDD Space 40 GB"

Using the HD 7870 or the GTX 660, there is no difference in performance between the Phenom II 940 and the Core i5 2500K if you are GPU-limited.
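
A back-of-the-envelope sketch of why, with made-up frame times (the numbers below are purely illustrative, not measurements):

Code:
#include <algorithm>
#include <cstdio>

int main()
{
    // Hypothetical per-frame costs in milliseconds, purely illustrative.
    const double gpu_ms       = 25.0;  // GTX 660 / HD 7870 class card at high settings
    const double cpu_940_ms   = 16.0;  // Phenom II X4 940
    const double cpu_2500k_ms = 9.0;   // Core i5-2500K

    // In a simple serial model the frame rate is capped by the slower stage.
    printf("X4 940 + midrange GPU: %.0f fps\n", 1000.0 / std::max(cpu_940_ms,   gpu_ms));
    printf("2500K  + midrange GPU: %.0f fps\n", 1000.0 / std::max(cpu_2500k_ms, gpu_ms));
    // Both land at ~40 fps: once the GPU is the bottleneck, the faster CPU
    // cannot add frames, which is why the two minimum-spec CPUs look equal.
    return 0;
}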
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
"Minimum System Requirements
Intel CPU Core i5-2500K 3.3GHz
AMD CPU Phenom II X4 940
Nvidia GPU GeForce GTX 660
AMD GPU Radeon HD 7870

RAM 6GB
OS 64-bit Windows 7 or 64-bit Windows 8 (8.1)
DirectX 11
HDD Space 40 GB"

Using the HD 7870 or the GTX 660, there is no difference in performance between the Phenom II 940 and the Core i5 2500K if you are GPU-limited.

Then the Intel CPU requirement should be lower, as there would be no difference in performance between a 2500K and older Nehalem/Westmere chips.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Then the Intel CPU requirement should be lower, as there would be no difference in performance between a 2500K and older Nehalem/Westmere chips.

I was thinking something like the i5 760; wasn't that pretty much the previous budget-king CPU from Intel?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Then the Intel CPU requirement should be lower, as there would be no difference in performance between a 2500K and older Nehalem/Westmere chips.

Maybe those two CPUs were the ones they used as a minimum reference for developing and testing the game?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I've done my research as always but as usual you haven't.

According to you Haswell doesn't even support DX12. Oops.

No, you didn't.

There are two modes of DX12. One is the full-feature mode; the other is the API-reduction mode that is backwards compatible down to DX11.0 hardware. We got DX11.3 for the same reason, since DX11.3 has feature parity with full-featured DX12.

There is one single GPU currently capable of the full-featured DX12 mode and DX11.3, and that's GM204.

Fermi (DX11.0), Kepler (DX11.0), GCN (DX11.1), Haswell (DX11.1) and Broadwell (DX11.2) all run only the API-reduction mode of DX12. None of them supports full-featured DX12 or DX11.3.
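
For anyone who wants to verify what feature level their own card exposes, here is a minimal sketch using the documented D3D11 device-creation path (standard Windows SDK headers assumed):

Code:
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Ask for the highest levels first; the runtime hands back the best one
    // the installed GPU and driver pair actually supports.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0
    };
    const UINT numWanted = sizeof(wanted) / sizeof(wanted[0]);

    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   wanted, numWanted, D3D11_SDK_VERSION,
                                   &device, &got, nullptr);
    if (hr == E_INVALIDARG)
    {
        // Pre-11.1 runtimes reject 11_1 in the list; retry without it.
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               wanted + 1, numWanted - 1, D3D11_SDK_VERSION,
                               &device, &got, nullptr);
    }
    if (FAILED(hr)) return 1;

    printf("Highest supported feature level: 0x%x\n", got);  // 0xb100 = 11_1, 0xb000 = 11_0
    device->Release();
    return 0;
}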

And you can pack the rest of your strawman away.
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
These specs rarely have any relevance to what you can expect when you get the game. Really looking forward to this one, and so long as it isn't broken like so many Nvidia GameWorks games end up being, I don't care what they expect you to have to run it.
 