Contrary to some of the comments above, hyper-threading makes very little to no difference in the vast majority of games, at least according to the AT CPU bench and virtually every reputable benchmark I've seen.
I'm not convinced this applies anymore for modern games. Back in the PS3/360 days, games were limited by the CPUs in those consoles, and dual cores were still fairly common even in gaming PCs, so a quad core and/or HT offered few gains. Now, with octo-core CPUs in the consoles, modern games are much better at scaling across multiple threads, and given how long CPUs last I'd question why not go with an i7 for gaming as long as the budget allows. One other argument: if you're GPU-limited, there are plenty of settings to turn down to overcome that, while being CPU-limited is going to be painful even with a monster GPU, and there's little that can be done about it.
Here's DigitalFoundry on the 6700k vs 6600k:
There was a time when games only utilised one or two cores - and for those titles, an
overclocked Pentium G3258 remains the best price vs performance processor on the market. Then gradually, we saw a migration across to titles using four threads - good for the Core i3 line (two cores/four threads), great for the i5 (four full cores). Throughout this time, an i7 offered virtually nothing extra for gamers, but times have changed.
The new wave of consoles has moved us into the many-core era; out of all the games we tested here, all of them - bar Shadow of Mordor - appear to utilise all eight threads available to an i7.
However, the average frame-rate results suggest that the advantages of the i7's hyper-threading are minimal, its stock performance often overcome with an i5 overclock - but it's a different situation when we look at
the lowest recorded frame-rates, where the i5 is disadvantaged in several titles, and there are occasions where even 4.5GHz performance can't match the i7's stock stability. We should remember that our tests here are designed to propel CPU limitations to the forefront, and our contention is that in most titles where GPU is the bottleneck, the difference will be harder to detect.
But the bottom line is this - in many-core games that hit CPU hard, the i7 6700K offers a level of stability in excess of what the equivalent i5 is capable of.
The above quote really gets to the heart of the matter - an i5 is adequate for modern games, and in many situations the performance is going to be more or less the same as an i7, but when a game starts to hammer the CPU (and more and more games are) the i7 can hold the framerates up while the i5 struggles under the load. Since stutter and frame-rate dips are so noticeable, that secures the i7 as the best choice for modern games.
The gains going from a 2500k to a 2600k (especially in gaming) would be far less impressive and in many cases close to non-existent.
The OP mentions Witcher 3, so I went searching for CPU benchmarks for the game. I've found some which show very little difference across widely varied CPUs, and others which show the opposite. I would think this is down to where those benchmarks took place; Novigrad in particular is a known CPU killer, and DigitalFoundry often tests CPUs by riding through the town for that reason. Here's an interesting one:
This is exactly what I'd expect to see from a modern game: it's happy to scale to as many threads as you can throw at it. Obviously there are diminishing returns; octocores don't exactly justify themselves for gaming yet. But the HT on the 2600k propels it noticeably above the 2500k. Even the FX-8 series puts in a pretty great showing considering how old it is.
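Those diminishing returns can be illustrated with Amdahl's law. A minimal sketch, assuming a hypothetical game where 80% of per-frame work parallelises (the 80% figure is an illustrative assumption, not measured from any of the games above):

```python
# Amdahl's law: speedup from n threads when a fraction p of the
# work is parallelisable; the serial remainder caps the gains.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical game where 80% of per-frame work parallelises.
for n in (2, 4, 8, 16):
    print(f"{n:2d} threads -> {speedup(0.8, n):.2f}x")
# ->  2 threads -> 1.67x
# ->  4 threads -> 2.50x
# ->  8 threads -> 3.33x
# -> 16 threads -> 4.00x
```

Going from 4 to 8 threads still gains a fair bit here, but doubling again to 16 adds much less, which matches why octocores don't yet pay off for gaming while HT on a quad core still can.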