Nvidia is being quiet on VR by their standards, but that's because they've got nothing much to talk about yet. As Zlatan says, they don't have the hardware (no "real" ACEs) and they don't have the software.
They should have the hardware with Pascal, but matching Mantle is another thing entirely. AMD's lead in VR is assured for the next two years, I would say.
It's also important to realise that AMD hardware has more market share than Nvidia once you count the GCN consoles, and game developers have likely already shifted their work to GCN.
Seriously, what?
They are talking about VR:
Siggraph: http://www.nvidia.com/object/siggraph2015-best-gtc.html
Gamescom: https://www.youtube.com/watch?v=sN-qxmb57uU
Computex: http://www.anandtech.com/show/9305/nvidia-announces-gameworks-vr-branding-adds-multires-shading
nVidia doesn't operate on the leading edge. They wait until the market is profitable and then jump in, if possible with something they can make proprietary and charge their customers an added premium for. It works well for them, and their customers seem to be OK with it.
They've missed a trick with VR and now everybody who is anybody is making games on LiquidVR instead. I think this is the big difference this time around - we're not just looking at short-term AMD gains here, it's over a period of years at least. I think even the most avid Nvidia fans will tire of being second best over a long duration.
The biggest thing in VR is latency, and the late-latching AMD has through Mantle is a huge lead. It's hard to appreciate the difference until you see both side by side, but it will show up in latency benchmarks. It will be interesting to see how Nvidia counters.
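For those who haven't seen late-latching explained: normally the head pose gets captured when the CPU records the frame, so the GPU renders with data that's already a frame or more stale. With late-latching the command buffer just references a GPU-visible buffer, and that buffer keeps being refreshed with the newest sensor reading right up to (and after) submission, so whenever the GPU actually draws it picks up a much fresher pose. A rough conceptual sketch only; this is not LiquidVR's real API, and SampleHmdPose() / g_gpuVisiblePose are made-up stand-ins:

```cpp
// Conceptual sketch of late-latching, not LiquidVR's actual API.
// SampleHmdPose() and g_gpuVisiblePose are made-up stand-ins.
#include <atomic>
#include <chrono>
#include <cstring>
#include <thread>

struct Pose { float orientation[4]; float position[3]; };

// Stand-in for reading the HMD sensors "right now".
Pose SampleHmdPose() { return Pose{{0, 0, 0, 1}, {0, 0, 0}}; }

// Stand-in for a persistently mapped, GPU-visible constant buffer.
// The submitted command list references this memory rather than a
// value copied at record time.
static Pose g_gpuVisiblePose;

static std::atomic<bool> g_running{true};

// Classic path: the pose is captured when the CPU records the frame, so it
// is already a frame or more stale by the time the GPU samples it.
// Late-latch path: keep overwriting the GPU-visible buffer with the newest
// pose, even after the frame has been submitted. Whenever the GPU actually
// reads it, it picks up a pose that is milliseconds fresher.
void LateLatchThread()
{
    while (g_running.load()) {
        Pose p = SampleHmdPose();
        std::memcpy(&g_gpuVisiblePose, &p, sizeof(Pose));
        std::this_thread::sleep_for(std::chrono::microseconds(250));
    }
}

int main()
{
    std::thread latcher(LateLatchThread);
    // ... record and submit rendering work that reads g_gpuVisiblePose ...
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
    g_running = false;
    latcher.join();
    return 0;
}
```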
Right now... today, there is no money to be made from it. It's purely an expense. I will be shocked if AMD doesn't do like they always do and hand everything they've done over for the greater industry to use. nVidia will get involved if they need to manipulate anything for their benefit, like adding one or two features that their hardware supports and AMD's doesn't, and claiming a coup. Other than that, they'll just join in when there's money to be made. When they do join in they will announce that what they have is better, the masses will nod in agreement, and it's business as usual.
This could conceivably happen in a couple of years, but AMD's advantages are real. I think VR will really have taken off within a year, and that's probably too early for Nvidia.
Don't get me wrong, though: I still expect Nvidia will sell more cards for VR than AMD, but the gap will really close. I think even if AMD just gets close to 50% they'll be doing well, but I can't see how the better-informed enthusiasts won't shift over to them, assuming they actually care about their VR experience.
By the time more games heavily use async compute... and its timeframe is what, around late 2016?
We have a few on the way, but I'm not inclined to think they will really push DX12.
So the current hardware differences won't matter so much (they matter some, for sure); it will be Arctic Islands vs Pascal by then.
One of the talking points AMD has brought up about asynchronous compute is that it will be easy for developers to get a handle on because they're already using it in console games -- console games that run on AMD GCN architecture. I would love to find out just how many ACEs the console graphics chips have.
The AMD chips closest to the custom APUs in the PS4 and Xbox One are Pitcairn and Bonaire, which each have 2 ACEs. Since they're custom chips, though, AMD might have snuck in extra ACEs, particularly on the PS4. All GCN 1.0 chips have just 2 ACEs, but Hawaii, which is GCN 1.1, has 8. The console chips are both most similar to GCN 1.1, from what I've heard, so a GCN 1.1 equivalent to Pitcairn might be designed with more ACEs. Just a guess though.
It's well known that the PS4 has 8 ACEs with 8 queues each for a total of 64 queues, similar to the R9 290X (Hawaii), while the Xbox One is limited to 2 ACEs with 8 queues each, similar to the HD 7790 (Bonaire).
https://en.wikipedia.org/wiki/PlayStation_4_technical_specifications#APU
http://forums.anandtech.com/showthread.php?t=2316133
Thirdly, said Cerny, "The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we've worked with AMD to increase the limit to 64 sources of compute commands. The idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system."
http://www.tomshardware.com/reviews/microsoft-xbox-one-console-review,3681-4.html
http://www.dailytech.com/Under+the+...+PS4+1+TB+Ultimate+Player+Ed/article37409.htm
"The Xbox One has only 2 asynchronous compute engines (ACEs), where as the PS4 has 8 ACEs that can work in parallel with the GPU."
http://www.anandtech.com/show/6837/...w-feat-sapphire-the-first-desktop-sea-islands
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/2
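To make Cerny's "64 sources of compute commands" a bit more concrete from the PC side: in D3D12 the same idea surfaces as separate command queues, with async compute living on a COMPUTE-type queue next to the normal graphics queue. A rough sketch, with error handling and the actual command lists left out (the queue creation calls are real D3D12; everything else is illustrative):

```cpp
// Minimal D3D12 sketch: one graphics (DIRECT) queue plus one COMPUTE queue.
// The COMPUTE queue is what "async compute" rides on; on GCN it is serviced
// by the ACEs independently of the graphics front end.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Graphics queue: accepts graphics, compute and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> graphicsQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Compute queue: compute/copy only. Submitting compute command lists
    // here, in parallel with graphics on the DIRECT queue, is the D3D12
    // expression of the multiple compute queues Cerny is describing.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    // Per frame you would record two command lists and submit one to each:
    //   graphicsQueue->ExecuteCommandLists(1, &gfxList);
    //   computeQueue->ExecuteCommandLists(1, &compList);
    // Whether they genuinely overlap is up to the hardware's schedulers,
    // which is exactly where the ACE count and queue depth matter.
    return 0;
}
```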
By the time more games heavily use async compute... and its timeframe is what, around late 2016?
VR isn't a big deal because nothing good is on the immediate horizon. Also, I worry devs may have overestimated the enthusiasm for it. As long-term gamers here, do many of us really enjoy playing games with a helmet on? It's such a major shift in how we game that the change factor may be off-putting.
What I noticed is that the hype for DX12/VR comes from gamedev chatter on Twitter, Reddit, tech forums, conferences, etc. It's not like DX11, where there was little excitement coming from those who make games for a living. Because of this, I know it will be a game changer.
Sometimes the simplest things can be the most powerful things, and this is very much the case for Hyper-Q. Simply put, Hyper-Q expands the number of hardware work queues from 1 on GF100 to 32 on GK110. The significance of this being that having 1 work queue meant that GF100 could be under occupied at times (that is, hardware units were left without work to do) if there wasn't enough work in that queue to fill every SM or if there were dependency issues, even with parallel kernels in play. By having 32 work queues to select from, GK110 can in many circumstances achieve higher utilization by being able to put different program streams on what would otherwise be an idle SMX.
Dynamic Parallelism is NVIDIA's name for the ability for kernels to be able to dispatch other kernels. With Fermi only the CPU could dispatch a new kernel, which incurs a certain amount of overhead by having to communicate back and forth with the CPU. By giving kernels the ability to dispatch their own child kernels, GK110 can both save time by not having to go back to the CPU, and in the process free up the CPU to work on other tasks.
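Since those two features are basically Nvidia's side of the multi-queue story, here is a minimal CUDA sketch of both at once (my own toy example, nothing to do with the D3D12/LiquidVR path): the eight streams are the independent work queues Hyper-Q can run side by side, and the parent kernel launching a child kernel is Dynamic Parallelism. Needs a compute capability 3.5+ card and relocatable device code.

```cpp
// Minimal sketch: Hyper-Q style concurrency via multiple streams, plus a
// parent kernel launching a child kernel (Dynamic Parallelism).
// Build with: nvcc -arch=sm_35 -rdc=true -lcudadevrt hyperq_dp.cu
#include <cstdio>
#include <cuda_runtime.h>

// Child kernel: trivial per-element work.
__global__ void child(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * 2.0f + 1.0f;
}

// Parent kernel: Dynamic Parallelism means the kernel itself can dispatch
// the child, instead of round-tripping through the CPU to launch it.
__global__ void parent(float* data, int n) {
    if (blockIdx.x == 0 && threadIdx.x == 0) {
        child<<<(n + 255) / 256, 256>>>(data, n);
    }
}

int main() {
    const int kStreams = 8;  // independent work queues; with Hyper-Q these
                             // feed separate hardware queues instead of all
                             // being funnelled through a single one
    const int n = 1 << 20;
    cudaStream_t streams[kStreams];
    float* buffers[kStreams];

    for (int s = 0; s < kStreams; ++s) {
        cudaStreamCreate(&streams[s]);
        cudaMalloc(&buffers[s], n * sizeof(float));
        cudaMemsetAsync(buffers[s], 0, n * sizeof(float), streams[s]);
        // Each launch goes to its own stream, so the GPU can overlap them
        // whenever SMXs would otherwise sit idle.
        parent<<<1, 32, 0, streams[s]>>>(buffers[s], n);
    }

    cudaDeviceSynchronize();
    for (int s = 0; s < kStreams; ++s) {
        cudaFree(buffers[s]);
        cudaStreamDestroy(streams[s]);
    }
    printf("done\n");
    return 0;
}
```

Worth keeping in mind that Hyper-Q as described there is about CUDA compute streams; whether those queues can run alongside graphics work the way GCN's ACEs do is the part actually being argued about in this thread.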
VR isn't a big deal because nothing good is on the immediate horizon. Also, I worry devs may have overestimated the enthusiasm for it. As long-term gamers here, do many of us really enjoy playing games with a helmet on? It's such a major shift in how we game that the change factor may be off-putting.
Everyone I know who is into Star Citizen, Elite, or any other space sim, as well as the racing enthusiasts, is waiting for VR headsets to hit the commercial market, myself included. I know several (again, myself included) who plan to own the Oculus, SteamVR, and HoloLens. I'd add plans for a StarVR when it releases.
Ever since I had my first actual VR experience with DK2, I've known it'll be a hit. It was that impressive, memorable, and lasting of an experience. Anyone who has not had one on their head simply can't understand. I didn't before.
I reserve judgement until it's out and I can play with it. It's a concern I have since it's a revolutionary way to play games.
What, the Wii-mote didn't sell you on the experience of a new way to play games? Or Kinect?
You are sure that no iGPU can play The Witcher 3 at acceptable framerates? :sneaky:
There is the graphics market and the gaming market.
Most of those Intel iGPUs are not part of the gaming market.
For instance, the devs of The Witcher 3 really didn't care about Intel iGPUs. No iGPU can play The Witcher 3 at acceptable framerates.