A game like this could benefit heavily from DX12. It's a shame that Ubisoft couldn't get it ready in time for release. I really hope they implement a DX12 patch later down the road though.
It's too bad they didn't include a 7700K or 8700K in that comparison, to show how a much faster CPU with half the core count obliterates slower 16-thread CPUs.
I know my 8700K averages 120+ fps at those settings.
What comparison are you talking about exactly? Care to do a comparison with me? I have a 6900K at 4.2GHz. It would be interesting to see how your higher clocked 8700K compares against my lower clocked 6900K with quad-channel RAM. Let's run the internal benchmark at 1080p with the High global quality setting to make it more CPU-bound, with MSI Afterburner enabled, and take a screenshot of the final result.
I will try to do a final test at a later point, though, before giving the system away. Alexandria should be, what, 3-4 hours from where I am now?
In my runs of Origins at 1080p Ultra, RAM usage was around 7GB and VRAM usage around 3.5GB. My system had plenty more RAM and VRAM to spare if the game requested it, so if the devs took this approach deliberately, it's bad coding really. The game should allocate more resources when it finds them available.
I believe it has more to do with the copy protection, though. It must be doing some on-the-fly encryption/decryption or whatever. I intend to test the cracked version once it comes out, mostly out of curiosity, to see how the protection affects system resources. If it does, that is.
Hello. Sorry for the bump. I did some Assassin's Creed Origins benchmarks and I think the findings are semi-interesting.
My benchmark consists of an 8-minute run, doing various (but identical across runs) activities. The exception is the 5850, where performance was unplayable, so I stuck with the built-in benchmark there. The built-in benchmark was run on all the other systems as well, for reference.
Latest 1.5 patch on all systems, with the same latest drivers among identical GPUs.
Everything is documented in the following videos.
Assassin's Creed Origins 1920x1080 Ultra GTX 1070 @2GHz Core i5-8600K @5GHz - 82fps
Assassin's Creed Origins 1920x1080 Ultra GTX 1070 @2GHz Core i7-860 @4GHz - 66fps
Assassin's Creed Origins 1920x1080 Ultra GTX 970 @1.5GHz Core i7-860 @4GHz - 49fps
Assassin's Creed Origins 1920x1080 High 7950 @1.1GHz Core i7-860 @4GHz - 40fps
Assassin's Creed Origins 1920x1080 Medium 5850 @900MHz Core i7-860 @4GHz - 10fps
On the 5850, performance was roughly the same at Very Low, Low and Medium. It is quite probably hitting a VRAM limit so hard that the framerate stays fixed across all these settings. The HD 630 of the i5-8600K, which is not documented here, gave me 8fps at Medium. That doesn't mean this is the usual performance delta between the old 5850 and the HD 630, but that's a story for another time.
I also did a couple of 720p benchmarks, in order to better evaluate CPU differences.
Assassin's Creed Origins 1280x720 Ultra GTX 1070 @2GHz Core i5-8600K @5GHz - 113fps
Assassin's Creed Origins 1280x720 Ultra GTX 1070 @2GHz Core i7-860 @4GHz - 76fps
That's a 49% performance difference between the 8600K and the 860, but it's not the worst I have seen. In Grand Theft Auto V the delta was 60%, and that at 1080p with maxed graphics bar MSAA.
Grand Theft Auto V 1920x1080 Very High FXAA GTX 1070 @2GHz Core i5-8600K @5GHz - 115fps
Grand Theft Auto V 1920x1080 Very High FXAA GTX 1070 @2GHz Core i7-860 @4GHz - 72fps
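For anyone wondering where the 49% and 60% figures come from, they are just the ratio of the two average framerates; a quick sketch of the arithmetic (the helper function name is mine, just for illustration):

```python
# Sketch: the CPU performance deltas quoted above, derived from
# the average framerates as (faster fps / slower fps - 1).

def cpu_delta(fast_fps: float, slow_fps: float) -> int:
    """Percentage advantage of the faster CPU, rounded to a whole percent."""
    return round((fast_fps / slow_fps - 1) * 100)

# Assassin's Creed Origins, 720p Ultra: 8600K @5GHz vs 860 @4GHz
print(cpu_delta(113, 76))  # -> 49

# Grand Theft Auto V, 1080p Very High FXAA: 8600K @5GHz vs 860 @4GHz
print(cpu_delta(115, 72))  # -> 60
```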
(this 4GHz-user guy should change his username :lol)
In GTA V the run was more CPU-limited, yet the CPU load was lower. The reason is Assassin's Creed Origins' much better utilization of the i7-860's physical and logical cores.
Just look at the differences in the 860's CPU load during my runs of the two games at 1080p.
Assassin's Creed Origins
Grand Theft Auto V
I can't remember the last time I saw such a huge gaming CPU load on the 860. Probably never. Anything above 50% is impressive, to be frank. Most of the time hyperthreading does very little. Even so, it lost badly, since the 8600K's six threads are much stronger.
Of course I have seen worse, in AotS for example, but I have also seen better, such as in CoD WWII, where the difference was minimal. But I digress.
Also, here are some relevant 8600K vs 860 graphs from the 720p run, to better highlight the differences.
I am very curious to see how my 2500K @4.8GHz will fare here. That will be determined in a couple of weeks, since I have to finish some 860 testing on the 1070 before decommissioning it.
I am currently playing the game on the stock 8600K and my 970 @1.5GHz, at Ultra with shadows one tier lower, and the game plays absolutely fine. By fine I mean borderline 60fps.
Quite an interesting game, I might add. Someone could easily call it Assassin's Creed Origins: Witcher 3 Edition with Far Cry Primal DLC, but they would just be mean! I am perfectly OK with games borrowing good elements from other games.