I tried the beta on my GTX 570 SLI rig at 1080p, 1×SMAA, everything maxed, motion blur off.
With a single card the game was running at 30fps, which isn't to my liking, though it was still quite playable.
Turning SLI on, I got a solid 60fps. Pretty impressive, considering I don't recall my SLI scaling ever being this good before. I've seen up to 80%, maybe 90%, but never a solid 100%. Either Nvidia has worked some magic in their drivers, or Crytek built this game from the ground up with multi-GPU in mind.
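The scaling figure is just simple arithmetic on the two frame rates above; a quick sketch (the helper function is my own, purely illustrative):

```python
# SLI/CFX scaling: how much of a second GPU's theoretical contribution
# is actually realized. 100% means the frame rate exactly doubled.
def scaling_percent(single_gpu_fps, multi_gpu_fps):
    return (multi_gpu_fps / single_gpu_fps - 1) * 100

# Numbers from the post: 30fps on one GTX 570, 60fps with SLI enabled.
print(scaling_percent(30, 60))  # -> 100.0, i.e. perfect 100% scaling
```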
With these settings the game looks fantastic, and I really don't think motion blur serves any purpose if a game runs at a native 60fps anyway.
I wonder how my old 5850 CFX system will fare with this.
I took the time to test the Crysis 3 beta on my old Q9550 @ 4GHz + 5850 CFX @ 850MHz (13.2 beta 4 drivers).
At 1080p, all maxed, no motion blur, no AA, the game was very jerky and unplayable, yet GPU usage was only jumping between 50% and 70%. The primary suspects were the limited framebuffer of my 5850s, and then the CPU.
I reduced the texture size to High and the game ran very nicely indeed, hovering around 45fps.
GPU usage was still only around 75% for each card though, which is a clear indication of a CPU limit.
Indeed, monitoring my Q9550's usage showed the game really hammering the poor guy.
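The diagnosis above boils down to a simple rule of thumb: low frame rate with no GPU near full utilization points to a CPU (or other non-GPU) limit. A rough sketch, with thresholds that are my own choice and nothing official:

```python
# Crude bottleneck heuristic matching the reasoning in the post: if neither
# GPU is close to fully loaded while performance is lacking, the GPUs are
# waiting on something else, usually the CPU.
def likely_bottleneck(gpu_usages, gpu_threshold=90):
    """gpu_usages: percent utilization per GPU, e.g. from a monitoring tool."""
    if max(gpu_usages) >= gpu_threshold:
        return "GPU"
    return "CPU (or other non-GPU limit)"

print(likely_bottleneck([75, 75]))  # -> CPU (or other non-GPU limit)
print(likely_bottleneck([99, 97]))  # -> GPU
```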
This shows how far Crytek has progressed with their code, and that makes me very glad.
Speaking of CPUs, my 2500K @ 4GHz, tested above, was easily processing the data required for 65fps and still had some headroom to spare. That makes a 2500K more than 50% faster than an equally clocked Yorkfield.
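For the curious, here is the back-of-the-envelope math behind that comparison, using the frame rates reported above (the exact CPU-limited figure for the Q9550 is my reading of the ~45fps result):

```python
# Per-clock CPU comparison from the two test runs at 4 GHz each.
q9550_fps = 45    # assumed CPU-limited frame rate on the Q9550 (GPUs only ~75% busy)
i2500k_fps = 65   # frame rate the 2500K was sustaining, with headroom left over

speedup = i2500k_fps / q9550_fps - 1
print(f"at least {speedup:.0%} faster")  # -> at least 44% faster
# Since the 2500K still had spare headroom at 65fps, its true CPU-limited
# figure is higher, which is what pushes the real gap past 50%.
```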
In any case, it's good to know that two 5850s, with the above settings, a faster CPU, and a slightly higher clock of their own, could reach 60fps.