Previously, on my GTX 960, I would get a constant 99% GPU usage when running Fire Strike.
Now though, on a GTX 1060, I see that it is instead maxing out at 97-98%. Sure, it's not a major difference, but nonetheless a difference exists and I would like to know why.
I looked at the CPU usage, and it doesn't seem to be the issue during Fire Strike's Graphics test. I have an i5-3330, and CPU usage doesn't go much higher than 50%.
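For what it's worth, here's a rough way to watch GPU load against per-core CPU load while the test runs. A ~50% average on a quad-core can hide one core pinned at 100%, which is enough to stall the GPU on a single driver or render thread. This is just a sketch, assuming the nvidia-ml-py (pynvml) and psutil Python packages are installed:

```python
# Sketch: sample GPU utilization alongside per-core CPU usage once a
# second while the benchmark runs. Assumes: pip install nvidia-ml-py psutil
import psutil
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    for _ in range(60):  # ~one minute of samples
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        # interval=1 blocks for a second and returns a per-core list,
        # so a single maxed core stands out even at a low average.
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print(f"GPU {util.gpu:3d}%  CPU per-core {per_core}")
finally:
    pynvml.nvmlShutdown()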
I also have a PCIe 3.0 x16 slot, so it doesn't look like it should be losing out there.
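One thing worth double-checking is that the card is actually negotiating PCIe 3.0 x16 under load, since the driver can drop the link to a lower generation at idle. A minimal sketch, again assuming pynvml, run while the GPU is busy:

```python
# Sketch: report the current PCIe link generation and width.
# Run this while the benchmark is active, not at idle.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
print(f"PCIe link: Gen{gen} x{width}")
pynvml.nvmlShutdown()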
Can anyone hazard a guess as to why this happens? It would be interesting to see whether I can get it back to a constant 99% or not.