Now, the problem with benchmarking cards aimed at the budget market is that virtually all tests are carried out using a Core i7 processor - and that makes sense, as it eliminates the CPU as a bottleneck and highlights graphics hardware performance. On top of that, there's usually little difference between i5 and i7 results anyway, since games are more often limited by the GPU. However, a card aimed at the budget segment is far more likely to be paired with a less capable processor - meaning that driver overhead becomes much more important.
Historically, AMD has had issues here. Watch the video above and you'll see more stutter in several titles on AMD hardware than on Nvidia's (The Witcher 3 being a particular case in point). So what happens when we re-bench the R7 370 and the GTX 950 on a Core i3? Stutter increases on AMD, but Nvidia is clearly impacted too. It's not entirely uniform across every game, though - Crysis 3 hammers the CPU, yet the R7 370 holds onto its performance just as well as the GTX 950 does. Also note the frame-time dips seen in Assassin's Creed Unity whether you're running on an i3 or an i7 - these aren't down to driver overhead, but rather to the 2GB VRAM limit on both cards. Even so, the latency spikes are more pronounced on the AMD side.
Visualising this kind of information in a bar chart or a table isn't easy - it's something we're working on behind the scenes - but we can tell you the lowest and average frame-rates on both the Core i3 and i7 with both cards. Lowest frame-rate is almost as blunt a metric as the average, but it does demonstrate that Nvidia is ahead and holds onto more of its performance - though again, the videos and the frame-time graphs are far more illuminating in terms of the actual experience. Our takeaway is that AMD's driver overhead is still higher than Nvidia's, but improvements made in the 300 series launch driver do appear to produce more stability when slower cards are paired with the Core i3.
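To show what those headline numbers actually measure, here's a minimal sketch of how lowest and average frame-rates - plus a crude stutter count - can be pulled from a frame-time capture. The frame_times_ms values and the summarise helper below are purely illustrative and aren't from our own capture pipeline; they simply demonstrate why a single lowest or average figure hides what the frame-time graphs reveal.

```python
# Illustrative frame-time capture in milliseconds, one entry per rendered frame.
# Real data would come from a tool such as FCAT or PresentMon; these values are made up.
frame_times_ms = [16.7, 16.9, 17.1, 33.4, 16.8, 16.6, 50.2, 16.7, 17.0, 16.8]

def summarise(frame_times_ms, spike_threshold_ms=33.3):
    """Return average fps, lowest fps and a count of frame-time spikes.

    A 'spike' here is any frame that takes longer than spike_threshold_ms
    (two 60Hz refresh intervals by default) - a rough stand-in for the
    stutter visible in the frame-time graphs.
    """
    # Average fps is total frames over total time, not the mean of per-frame fps.
    average_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    # Lowest fps corresponds to the single longest frame in the run.
    lowest_fps = 1000.0 / max(frame_times_ms)
    spikes = sum(1 for t in frame_times_ms if t > spike_threshold_ms)
    return average_fps, lowest_fps, spikes

avg, low, spikes = summarise(frame_times_ms)
print(f"average: {avg:.1f}fps, lowest: {low:.1f}fps, spikes over 33.3ms: {spikes}")
```

Note how a run like this can report a healthy-looking average while the spike count - the metric closest to perceived stutter - tells a very different story, which is exactly the gap between the bar-chart numbers and the videos above.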