Witcher 3 system requirements

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Do you have any credible links to back this up? You make this statement often, but every leak, rumor, foundry press release, and product release points to 20nm parts in 2015. AMD has flatly stated they will be shipping 20nm SKUs of something in 2015, Nvidia is shipping 20nm parts right now in the Tegra X1, etc.

That's why I said most likely. Certainly not all sources point to 20nm as guaranteed. Raghu provided a bazillion links over the last 6 months that also support the 28nm theory. I can't tell you 100% if it will be 20nm or 28nm as I have no insider knowledge. However, neither GloFo nor TSMC has a high-performance 20nm node. The 20nm SoC process may be good enough, but I don't recall such a low-power node intended for SoCs ever being adopted for high-end GPUs.

There are some other factors involved. Considering NV goes for the largest dies in the industry, they would benefit from 20nm more than anyone. However, there are no rumours at all that GM200 is 20nm; it's actually the opposite, with most sites expecting GM200 on 28nm. One has to ask: why is NV, with much higher volume sales and a much stronger bargaining position on prices (as a result of higher sales than AMD), not likely to adopt 20nm for GM200? That makes it less likely for AMD to do it.

Further, 20nm would have brought a substantial reduction in power usage, yet rumours/leaks say AMD is considering hybrid water cooling, and ChipHell implied the 390X would have similar power usage to a 290X. As I said, I wouldn't rule out 20nm completely, but I don't think we can rule out a massive 500-550mm2 28nm die for the 390X either. Historically ATI/AMD never made a 500mm2 die, but we have seen them go to 438mm2 with Hawaii. Now 28nm is even more mature and even cheaper, and considering that high voltage and high clocks contribute a lot to high power usage, running at 1GHz or less on a chip that is very wide in terms of functional units is the better approach if the costs and yields justify it (rough sketch below). In the past ATI/AMD would quickly transition from one node to the next, making this strategy too costly for them to execute. Now 28nm would be on its 3rd iteration (7970 --> 290X --> 390X?).
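
To put rough numbers on the wide-and-slow argument, here is a back-of-envelope sketch only, assuming dynamic power scales roughly with unit count x voltage^2 x clock; the unit counts, voltages and clocks are made up for illustration, not actual Hawaii/390X figures:

Code:
# Back-of-envelope: dynamic power scales roughly with C * V^2 * f.
# All numbers below are illustrative, not real chip specifications.

def relative_dynamic_power(units, voltage, clock_ghz):
    """Relative dynamic power for 'units' functional blocks switching
    at 'clock_ghz' GHz and 'voltage' volts (arbitrary units)."""
    return units * voltage ** 2 * clock_ghz

narrow_fast = relative_dynamic_power(units=2816, voltage=1.20, clock_ghz=1.10)  # hypothetical high-clock part
wide_slow = relative_dynamic_power(units=4096, voltage=1.05, clock_ghz=0.95)    # hypothetical wide ~1GHz part

throughput_gain = (4096 * 0.95) / (2816 * 1.10) - 1   # unit-GHz as a crude throughput proxy
print(f"power ratio (wide/narrow): {wide_slow / narrow_fast:.2f}")   # ~0.96
print(f"throughput gain (wide vs narrow): {throughput_gain:.0%}")    # ~+26%

The point: the wider, lower-clocked, lower-voltage chip delivers more raw throughput for slightly less dynamic power, which is exactly why a big 28nm die at around 1GHz is a plausible alternative to chasing 20nm.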

Additionally, it's very risky to try to pull off a trifecta of new architecture (GCN 2.0) + new memory standard (HBM) + the most cutting-edge node (20nm) simultaneously. That leaves no room for error. Imo, pulling off 2 of those on proven 28nm with hybrid water cooling is less risky.

Finally, there is the matter of volumes, prices and yields. With 20nm so new, and Qualcomm, Samsung and Apple basically outbidding NV/AMD for capacity, it would be very expensive for AMD to attempt a top-to-bottom mobile and desktop 20nm roll-out. They would have needed to start mass manufacturing now or next month to manage a Spring 2015 roll-out in volume.

Don't forget that there are constant rumours that Qualcomm's 20nm 810 and 615 are overheating. If 20nm has trouble with such small SoCs, do you think it's realistic to use this process on a 350+mm2 high-end GPU?

While AMD did state that they will have 20nm products out in 2015, that could easily be shrunken PS4/XB1 APUs.

As I said, whether AMD launches 20nm or 28nm GPUs, it matters more that they hit the necessary performance/watt, absolute performance and price/performance targets. If 28nm allows them to scale GCN another 40-50% beyond the 290X, it will be perfectly fine to last them until 2016 when lower nodes become accessible.

Anyway, tying all of this back to Witcher 3:

1) The game could still be delayed a 3rd time;
2) Until we know whether NV or AMD runs this game better, it's a guessing game, somewhat in favour of NV due to it being a GameWorks title;
3) 5 months away is a long time in GPUs; something much faster can be out by May 19.

I wouldn't upgrade right now for the sole purpose of playing TW3. I also would not buy a 4690/4790K at this time. If I had to build a new system now, it would be 5820K or wait until BW.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
One has to ask: why is NV, with much higher volume sales and a much stronger bargaining position on prices (as a result of higher sales than AMD), not likely to adopt 20nm for GM200? That makes it less likely for AMD to do it.

With 350mm2+ console APU dies and 20+ million console sales per year, I don't believe NVIDIA buys that much more volume than AMD at this time. And don't forget Kabini was also manufactured at TSMC, so AMD may have higher volumes than NVIDIA now.

While AMD did state that they will have 20nm products out in 2015, that could easily be shrunken PS4/XB1 APUs.

If they can use 20nm SoC for the console APU chips, they can use it for the GPU chips as well.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Well, if Witcher 3 is anything like Witcher 2, it will want a lot of CPU.

[Witcher 2 CPU scaling chart: i7-920 at various clocks vs. the stock Phenom II X4 980, with a GTX 590]
Those are fun numbers; giggles at seeing the i7 920 @ 2GHz still come up just inches under the stock X4 980. As if any respectable i7 920 owner would be underclocking these things.

I am certain they decided to put in those 2GHz i7 920 numbers just for LOLZ. It's like, oh, you think you can overclock your 980? Well, just take a lookie lou at this! Wasn't 2003-4 the last good year for 2GHz chips in games anyway?
 

SPBHM

Diamond Member
Sep 12, 2012
5,059
413
126
Witcher 2 would run at 24 FPS on my i3 2100 and 15 FPS on my 3.8GHz E5200 in one specific part (with very low GPU usage).
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
Those are fun numbers; giggles at seeing the i7 920 @ 2GHz still come up just inches under the stock X4 980. As if any respectable i7 920 owner would be underclocking these things.

I am certain they decided to put in those 2GHz i7 920 numbers just for LOLZ. It's like, oh, you think you can overclock your 980? Well, just take a lookie lou at this! Wasn't 2003-4 the last good year for 2GHz chips in games anyway?

Well, actually they put it in to show how the game's performance scaled with the CPU, i.e. what effect a 920 at 2GHz vs 3.8GHz had (a 58% increase in FPS). It's clear the game was CPU limited on a 2GHz i7-920 w/GTX 590. Looks like somewhere between 3.2GHz and 3.8GHz it became GPU limited.
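
As a rough sanity check on that reading (the only input is the 58% figure quoted from the old chart; the rest is simple arithmetic, not extra data):

Code:
# Sanity check on the CPU-scaling interpretation above.
# Only the +58% figure comes from the quoted Witcher 2 chart; the clocks are the tested i7-920 settings.

base_clock, oc_clock = 2.0, 3.8      # i7-920 clocks in GHz
observed_gain = 0.58                 # reported FPS increase going from 2.0 to 3.8 GHz

ideal_gain = oc_clock / base_clock - 1            # +90% if the game stayed purely CPU-bound
knee_estimate = base_clock * (1 + observed_gain)  # clock where gains would flatten if scaling were linear

print(f"ideal CPU-bound gain: {ideal_gain:.0%}, observed: {observed_gain:.0%}")
print(f"implied GPU-limit knee: ~{knee_estimate:.1f} GHz")   # ~3.2 GHz

The observed +58% falling well short of the ideal +90%, with an implied knee around 3.2GHz, lines up with the game going GPU limited on the GTX 590 somewhere between 3.2 and 3.8GHz.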
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Well, actually they put it in to show how the game's performance scaled with the CPU, i.e. what effect a 920 at 2GHz vs 3.8GHz had (a 58% increase in FPS). It's clear the game was CPU limited on a 2GHz i7-920 w/GTX 590. Looks like somewhere between 3.2GHz and 3.8GHz it became GPU limited.

Would be more fun if such an article could be revised today with an i7 4790K added in there. CPU scaling is fun. It's why I thought the stock X4 980 vs 2GHz i7 920 was kind of fun to see.

I always play around with scaling on my CPU to see where performance has landed relative to older CPUs, and match my numbers against older reviews.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Well, I know for a fact Dragon Age didn't use close to 8GB. Probably to account for the OS and such running as well. Might be time to think about 16GB as the new standard heh.

With Dragon Age running, my total system memory in use is 6.5GB, so yeah, 8GB should rightly be the recommendation. And to those that said HT doesn't make a difference, it certainly did in BF4 & Dragon Age 3 for me, much smoother.

Anyways, I'm looking forward to this!!! Finally a 2015 game that requires a high-end 2013 system!^^
 
Last edited:

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Anyways, I'm looking forward to this!!! Finally a 2015 game that requires a high-end 2013 system!^^

I wanted to laugh at this, but then it took a couple of seconds before I realized not much has happened since 2013. The 980 and 970 have been fun, but just barely.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
This game is unoptimized ...

The system requirements are a bit BS since the minimum GPU performance is practically above the PS4 ...

The game isn't even physically based either, which is a load of crap.

This just shows that CD Projekt Red are nothing more than a bunch of overrated shoddy developers ...

If a game isn't physically based then it's not next gen in my book, therefore these system requirements aren't justified IMO ...
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The system requirements are a bit BS since the minimum GPU performance is practically above the PS4 ...

Yeah, because consoles and PC are exactly the same :whistle:

The game isn't even physically based either, which is a load of crap.

Actually it is physically based. Not only that, but it uses no baked lighting whatsoever.

Source

This just shows that CD Projekt Red are nothing more than a bunch of overrated shoddy developers ...

Or perhaps you're ignorant of game development and game technology.

If a game isn't physically based then it's not next gen in my book, therefore these system requirements aren't justified IMO ...

Great way to make a post. Start with a false premise, and end with a false assumption..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That's not how GW works.

NV provides specific in-house game code for free to game developers so that they can more easily and freely implement cutting edge graphical features and effects without dedicating their own resources to target 5-10% of the PC market that might have the latest GPU architectures to take advantage of these features.

The GW effects libraries are NV-specific SDK game code, optimized for NV. It's not just about NV having early access to the overall game code of the game, but NV specifically providing their OWN NV-graphics card optimized code to be inserted into the game.

If you don't like it, disable it. Problem solved.

The purpose of NV's GW is to push next-generation graphical effects, primarily to sell NV graphics cards, not just to gain early access to developer code. Since the GW SDK libraries are NV's own code, with the developer not writing the code for tessellation, new God rays, realistic water or physics effects themselves, GW titles are naturally going to heavily favour NV cards most of the time. Consequently, NV gains an automatic advantage since NV-designed and optimized code is inserted into the game!

Keyword: most. The problem with this statement is that it doesn't take the facts into account. And the facts are that programs such as G.E. and G.W. don't guarantee that a particular game will perform best on the sponsoring IHV's hardware.

We see this with Crysis 3, Far Cry 3, Metro Last Light, Bioshock Infinite, etcetera.

If it weren't for programs like GameWorks, PC games would be near carbon copies of their console counterparts, with just the ability to use higher resolutions and higher framerates, and maybe some nicer looking textures.

Most developers aren't interested in pushing graphical boundaries beyond what the consoles are capable of doing.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
Curious to see side by side comparisons of the PS4 version vs ultra on the pc. Unless there is a significant difference, it's time to just retire the PC to the closet.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Curious to see side by side comparisons of the PS4 version vs ultra on the pc. Unless there is a significant difference, it's time to just retire the PC to the closet.

Now that's an inflammatory statement if I've ever seen one.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Curious to see side by side comparisons of the PS4 version vs ultra on the pc. Unless there is a significant difference, it's time to just retire the PC to the closet.

Is there reason to believe that PS4 graphics are equal to Ultra on PC? Usually the PC at Ultra has better graphics than its console counterpart.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Yeah, because consoles and PC are exactly the same :whistle:

They are, at least on the architecture front, so I expected tamer system requirements out of this game ...

Actually it is physically based. Not only that, but it uses no baked lighting whatsoever.

Source

Hooray for crappy energy-conserving Blinn-Phong, whereas the rest of the AAA game industry is moving towards Cook-Torrance ...

The guys at CD Projekt Red must like shoehorning everything ...
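
For anyone following along, the contrast being drawn here is roughly between a normalized Blinn-Phong specular term and a Cook-Torrance microfacet specular term. A simplified sketch of the textbook forms (not CDPR's actual shader code; the dot products and constants are just example inputs):

Code:
import math

# Simplified specular terms (textbook forms only, not any game's real shader).
# ndoth / ndotl / ndotv are clamped cosines of the usual shading angles.

def blinn_phong_spec(ndoth, shininess):
    """Normalized Blinn-Phong: (n + 8) / (8*pi) * (N.H)^n."""
    return (shininess + 8.0) / (8.0 * math.pi) * ndoth ** shininess

def cook_torrance_spec(ndoth, ndotl, ndotv, roughness, f0):
    """Cook-Torrance microfacet specular: D * F * G / (4 * N.L * N.V),
    using a GGX distribution, Schlick Fresnel (N.H as a stand-in for V.H)
    and a Smith-style geometry term."""
    a = roughness * roughness
    d = a * a / (math.pi * ((ndoth * ndoth) * (a * a - 1.0) + 1.0) ** 2)
    f = f0 + (1.0 - f0) * (1.0 - ndoth) ** 5
    k = (roughness + 1.0) ** 2 / 8.0
    g = (ndotl / (ndotl * (1.0 - k) + k)) * (ndotv / (ndotv * (1.0 - k) + k))
    return d * f * g / (4.0 * ndotl * ndotv)

print(blinn_phong_spec(0.95, 64))                     # shiny highlight, Blinn-Phong
print(cook_torrance_spec(0.95, 0.7, 0.8, 0.4, 0.04))  # dielectric, moderate roughness

Both can be energy conserving; the practical difference is that the microfacet model gives more physically plausible highlights across roughness values, which is what "physically based" usually refers to in this argument.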

Or perhaps you're ignorant of game development and game technology.

Perhaps you can't face the fact that CD Projekt Red are a bunch of hacks ...

Great way to make a post. Start with a false premise, and end with a false assumption..

Great way to make a response with some underhanded mudslinging ...
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,886
1,102
126
You know, you guys don't have to upgrade. Seems somewhat excessive to spend hundreds of dollars replacing components for just the one game. You can always just play on High instead of Ultra....

...a heretical idea for a pc gamer maybe....
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You know, you guys don't have to upgrade. Seems somewhat excessive to spend hundreds of dollars replacing components for just the one game. You can always just play on High instead of Ultra....

...a heretical idea for a pc gamer maybe....

The new forum goer seems to have fallen into the idea that a game is unplayable if it can't be played at maxed settings. I've seen a lot of other silly things, like thinking Ultra gives a specific level of detail shared across all games, so playing Crysis 3 at High would look worse than playing Divinity: Original Sin at Ultra. Others believe that consoles at their defaults look as good as their PC ports at max.

PC games have settings for a reason. Use them.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Honestly for me if I didn't play at max settings on a PC I would play console games. The only reason I care about pc games at this point is because of the increased visual quality. If I was forced down to default settings I may as well play on medium.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Honestly for me if I didn't play at max settings on a PC I would play console games. The only reason I care about pc games at this point is because of the increased visual quality. If I was forced down to default settings I may as well play on medium.

Are you saying that max settings are the only settings that are better than the console version? I don't buy that at all. Not to mention the higher FPS and mouse controls. I'm sure some games vary.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
I've never seen an i5-750 lose to a Phenom II 940 in a game. Very strange they chose a 2500K and a Phenom II quad as the minimums. It's like they copied the idiotic Assassin's Creed Unity requirements.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Are you saying that max settings are the only settings that are better than the console version? I don't buy that at all. Not to mention the higher FPS and mouse controls. I'm sure some games vary.

It's the biggest thing. Higher quality goes hand in hand with higher FPS. If I can't have both of those, then yes, I may as well go console.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Are you saying that max settings are the only settings that are better than the console version? I don't buy that at all. Not to mention the higher FPS and mouse controls. I'm sure some games vary.

A lot of vocal gamers believe that if you can't max out every setting it's not worth it, regardless of the IQ improvement.
 