Think about the i5 2500K, it's quite old
How about for guys like me with my CPU? I was gonna buy a used 7970 for $150, but it might be better for me to buy the Nvidia equivalent.
Think about the i5 2500K, it's quite old
I agree, but the i3 has 2 cores and 4 threads, and that seems to help a lot.
How about for guys like me with my CPU? I was gonna buy a used 7970 for $150, but it might be better for me to buy the Nvidia equivalent.
Don't bother getting anything above 750 Ti performance, because you will be CPU-bottlenecked.
My CPU can push a GTX 680/GTX 770/GTX 960/R9 280X/7970, per the research I've done.
Not saying that's the route I'm going.
The i3 is bottlenecked in GTA V at ~30fps minimum and ~42fps average. Not sure where an older Q series lands on that scale.
The i3 just sucks for modern gaming with any GPU that is stronger than its CPU bottleneck.
There's no magic; check out NV's "driver overhead" with a Titan X paired with an i3.
http://www.eurogamer.net/articles/digitalfoundry-2015-the-best-pc-hardware-for-grand-theft-auto-5
GTA V, the most CPU-limited game of late:
https://www.youtube.com/watch?v=SLs-sMteggg
I wouldn't say it's a driver overhead issue; it's simply a case of major CPU bottlenecks, with the CPU unable to deliver the fps potential that the stronger GPU can handle.
The 750 Ti or R9 260X is maxed out on fps potential even with the lowly i3, but the R9 270X & R9 280X have GPU grunt left on the table, similar to the Titan X being held back by the i3.
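The bottleneck logic above is just "slowest stage wins": your frame rate is capped by whichever of the CPU or GPU runs out of headroom first. A toy sketch of that idea (the fps caps below are invented for illustration, not measured benchmark numbers):

```python
# Toy model of a CPU/GPU bottleneck. The caps are made-up numbers,
# purely to illustrate the "slowest stage wins" argument.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """The frame rate you actually see is limited by the slower stage."""
    return min(cpu_fps_cap, gpu_fps_cap)

# A CPU capped at ~45fps wastes a big GPU's headroom (Titan X + i3 case)...
print(effective_fps(45, 120))  # CPU-bound: 45

# ...while a 750 Ti-class GPU capped at ~40fps is already maxed out on it.
print(effective_fps(45, 40))   # GPU-bound: 40
```

Under this model, upgrading the GPU past the CPU's cap buys you nothing, which is exactly why the stronger AMD cards leave "grunt on the table" with an i3.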
How about 55fps average, 26fps minimum? See 18:21 of the first video.
http://youtubedoubler.com/?video1=4vnvt6KpODw&video2=IZ0EqIZmYkI
My guess is the GTX 570 was holding the old Q9550 @ 4GHz back a bit.
I hate having to recommend the GTX 960, or even worse the GTX 750 Ti, over the R9 280 to budget builders, but god, look at how the R9 280 tanks against the GTX 750 Ti in the driving section at the end of the GTA V benchmark when both are paired with an i3-4130.
https://www.youtube.com/watch?v=9pxeF08qmtg
Pre-scripted GPU benchmarks... they have nothing to do with actual gameplay. It's like showing Cinebench scores and claiming you will see the same difference in games that you see in Cinebench.
It just doesn't work like that.
Even a Celeron gets 25fps in GTA V, with no drops, while recording, so it's definitely not a CPU problem; it's just not a real-life benchmark.
See a Celeron run GTA V
As for the second part, about the i3 being close to the Pentium: again, people look at benches like Cinebench and draw their conclusions from that.
Look at some real-life examples; YouTube is full of them. In games that actually use 4 or more threads, Hyper-Threading works wonders, giving double the fps of a same-GHz Pentium.
https://www.youtube.com/watch?v=nIcVetS92ic
Again, the moral of the story: powerful AMD GPUs need a more powerful CPU to drive them than their Nvidia counterparts do (except if you pair a $1000 Titan X with a $140 CPU), especially in games that utilize lots of cores.
I fixed that for you.
Here's a recent article from computerbase.de where they analyzed CPU scaling on the R9 290X & GTX 980 in a bunch of games:
http://www.computerbase.de/2014-12/early-access-spiele-benchmarks-test/2/#abschnitt_dayz
No difference. A few games lost performance going to dual cores, but the % loss is the same for AMD/NV.
The only benches I've seen where AMD GPUs were negatively affected by CPU cores/speed fall under two scenarios:
1. NV-sponsored games.
2. AMD FX CPUs. This one is easy: the CPU arch really needs multi-threaded drivers to take advantage of the architecture, but AMD doesn't have them. Whereas on an Intel SMT CPU, even a lowly i3, AMD performs just fine compared to a similar NV GPU.
So it's kinda funny that AMD GPUs work poorly with AMD CPUs, but pair them with a cheap Intel and they do just fine (outside of NV games).
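The point about multi-threaded drivers can be sketched as a toy frame-cost model: with a single-threaded driver, game logic and draw-call submission are serialized on one core, while a multi-threaded driver can overlap them on spare cores. All numbers and the `cpu_frame_ms` helper below are invented for illustration, not real driver measurements:

```python
# Toy model of per-frame CPU cost (all numbers invented for illustration).
# Single-threaded driver: game work and draw-call submission share one core.
# Multi-threaded driver: submission overlaps with game work on spare cores.
def cpu_frame_ms(game_ms: float, driver_ms: float, multithreaded_driver: bool) -> float:
    if multithreaded_driver:
        # driver work runs alongside game work on another core
        return max(game_ms, driver_ms)
    # everything serialized on a single core
    return game_ms + driver_ms

# A CPU with weak per-core speed: 20ms of game work, 10ms of driver work.
single = cpu_frame_ms(20, 10, multithreaded_driver=False)  # 30ms per frame
multi = cpu_frame_ms(20, 10, multithreaded_driver=True)    # 20ms per frame
print(round(1000 / single), round(1000 / multi))
```

In this sketch the multi-threaded driver turns ~33fps into 50fps on the same hardware, which matches the argument that FX chips (and cheap Intels) suffer most when the driver piles everything onto one core.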
Interestingly, it looks like AMD does better in 2C setups; add HT or more cores and Nvidia tends to scale better (alternatively, you could think of it as Nvidia tanking on 2C CPUs). Nvidia also seems to have problems with HT on 4C CPUs.
Yes, this is confirmed by pclab.pl:
http://pclab.pl/art55238-3.html
This video sheds light on the situation I'm concerned with:
https://www.youtube.com/watch?v=lQzLU4HWw2U
Some gamers on a budget will choose to spend more on a graphics card than on a CPU. That makes perfect sense, since even i3s don't bottleneck most games below 60fps yet. The problem is that those graphics cards are usually tested with high-end CPUs, which does not tell the whole story.
The current batch of AMD cards lose a bigger chunk of their performance than Nvidia's do when paired with a low-end CPU, even if the AMD cards are more powerful when paired with high-end CPUs.
This is something reviews have not addressed, and it's quite important. If you were looking for a build under, say, $700, you'd probably go with an i3 and try to put $200 into the graphics card, just for example. If you look at AMD card reviews, they may do better than an Nvidia card, but the problem is they'd lose more performance with the low-end CPU than the Nvidia card does.
This was great info. The money you save going AMD, you lose in performance if you gimp the CPU too.
Uhhhh, have you read any of the other posts in this thread, dude?
Pre-scripted benchmarks are gameplay benchmarks at a repeatable location in the game. They are gameplay benchmarks, so I don't know how you can say they are not accurate. They may not tell the whole story if there are other areas of the game that are easier and/or harder on your system, but that is no better than you remembering an easy spot of the game that made things look good.
Yes, and I also watched how that 280 dipped down every time the CPU struggled, unlike the Nvidia cards. That's all I needed to know; the rest of you can continue bickering about whatever you want for all I care.