Is the 7850 worth it?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

gammaray

Senior member
Jul 30, 2006
859
17
81
Yeah, I am pretty much riding on the fact that crossfired 7850s will be powerful enough to keep up. The reason I did not choose the cheaper 6870 or 560 Ti was that I think 2GB of RAM will be needed to keep frame rates up at high resolutions. Skyrim still runs fine at medium settings on the 4770, so I don't expect the 7850 to fall behind too quickly. By the way, which version of the card would you choose? Is it worth buying the custom-cooler versions, or would I be better off with the stock PowerColor?
No everything was stock. The fan started buzzing loudly so I sent it to NCIX for RMA. It took two months for Sapphire to tell me I have two options, accept a 6770 or take $100 in cash. I think we all know which option I took.

My Sapphire 5850 fan did that as well, except I replaced the fan myself without RMAing it...

(Note to self: don't buy Sapphire cards with crappy fans.)
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,883
1,096
126
Wtf? A 7850 can definitely take advantage of the 2GB of RAM.

Add in the fact that in a year's time you can pick one up for el-cheapo and have crossfired 7850s doing serious damage to games, all with their 2GB of VRAM.
 

jmgamer

Member
Mar 16, 2012
36
0
0
Get a 7850, splurge a little more on a 7870, or wait for the Kepler competition.

Damn, don't know what to do.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Get a 7850, splurge a little more on a 7870, or wait for the Kepler competition.

Damn, don't know what to do.
I WANT to wait for the Kepler competition, but I've already been waiting since December and I have no idea when NV will launch it. This is one of those times where, if NV doesn't say anything, it will lose out on a sale.
 

rdsn

Junior Member
Mar 1, 2012
20
0
0
I do have a very limited budget meaning I can only buy cards once every three or four years, so I think I will go with the Sapphire 7850. Thanks for your help everyone!
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The same principle applies to AMD too. I'm not disagreeing with you, just saying that CPU bottlenecks are sometimes overblown, because few games can really take advantage of multiple cores. See, e.g.: http://www.techspot.com/review/458-battlefield-3-performance/page7.html An AMD Phenom II X2 560 (3.3GHz) was enough to keep up with much costlier CPUs. And this is not a wimpy game, either.

His X3 CPU (NOT yet unlocked) was tested here: http://www.guru3d.com/article/athlon-ii-x3-435-processor-review-test/14 It's not so bad a CPU that it requires upgrading, imho, unless he games at lower resolutions.

CPU bottlenecks are NOT overblown. My 4.1GHz 965BE is a pretty big bottleneck for me in several games, including BF3. Your link even shows the 560 is a huge bottleneck; the CPU is at 96%! That means the CPU is pegged half the time.

For newer games (like BF3), a quad core is required to get decent performance in multiplayer. Anything less and the GPU (provided it's half decent) is going to sit there twiddling its thumbs.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Yes, my cost analysis says the 7850 is the best cost-for-performance card (when electricity costs are factored in, as well as the cost of the card vs. AT FPS @ 1920x1200) of any card other than a 6870 at a 4-year change frequency, but it's a pretty close race. When all are OCed, the 7850 wins hands down.

2 years is a different story, but at 4 years, the power consumption advantage makes a real cost difference for the 7850 vs. deeply discounted older cards. At my power costs (which are probably higher than anyone else's in the world), it ends up at a $100 advantage over a 570, assuming 2 hours of gameplay and 4 hours idle per day; even at more modest power costs of around 10 cents/kWh, it still holds an advantage over the 570.
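That style of cost analysis is easy to sketch in a few lines. The prices, wattages, and usage hours below are hypothetical placeholders, not figures from this thread; only the 10 cents/kWh rate and the 2-hours-gaming/4-hours-idle pattern come from the post above.

```python
# Rough total-cost-of-ownership sketch for comparing cards over a multi-year
# ownership period. All card prices and power draws are made-up examples.

def total_cost(card_price, gaming_watts, idle_watts,
               kwh_price=0.10, gaming_hrs=2, idle_hrs=4, years=4):
    """Card price plus electricity cost over the ownership period."""
    daily_kwh = (gaming_watts * gaming_hrs + idle_watts * idle_hrs) / 1000
    return card_price + daily_kwh * 365 * years * kwh_price

# Illustrative numbers only: a 7850-class card vs. a discounted 570-class card.
cost_7850 = total_cost(card_price=250, gaming_watts=130, idle_watts=10)
cost_570  = total_cost(card_price=230, gaming_watts=220, idle_watts=30)
print(f"7850-class: ${cost_7850:.0f}, 570-class: ${cost_570:.0f}")
```

Even with a higher sticker price in this made-up scenario, the lower power draw narrows or reverses the gap over four years, which is the shape of the argument above.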


It will be interesting to see how the GK106 stacks up against the 7850. The 7850 is significantly better in terms of power usage per FPS than a 7900 card, even OCed, and GK106 (from the leaks) looks to have 1/2 the shaders and 3/4 the ROPs of the GTX 680.

Midrange may be even more competitive than the high end this generation.

I'm wanting to favor nVidia right now due to the adaptive vsync, but mostly because of these annoying issues with the AMD drivers where, anytime I have a Flash webpage open (YouTube, FB games, etc...) and alt-tab into a game, the FPS drops to a fifth of what it should be. It seems like it should be a minor issue, but it's been around for months now. You'd think it would be fixed quickly, but no. Hopefully it actually makes (financial) sense to favor nVidia too. Here's to more waiting.... c'mon GK106. Hopefully it's out by the time IB is out and I'm ready to upgrade my CPU too.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
CPU bottlenecks are NOT overblown. My 4.1GHz 965BE is a pretty big bottleneck for me in several games, including BF3. Your link even shows the 560 is a huge bottleneck; the CPU is at 96%! That means the CPU is pegged half the time.

For newer games (like BF3), a quad core is required to get decent performance in multiplayer. Anything less and the GPU (provided it's half decent) is going to sit there twiddling its thumbs.

Look at how few frames you're losing with the 560, in the third picture in the TS post, relative to much more powerful CPUs: http://www.techspot.com/review/458-battlefield-3-performance/page7.html

And that's at 16x10. Look at the 1200p results and the big cluster of minimum framerates at 51 to 54 fps. The average framerate for almost every CPU they tested was exactly 67 fps.

At 1600p, or if you turned up graphics settings even higher (more AA or whatever), any guesses as to how much more compressed the results would be? Their own conclusion is that "[o]n the CPU side of things, we found that Battlefield 3 is not nearly as CPU demanding as many have made it out to be."

They do note that some games do a lot better on quad cores, but the poster above had a tri-core, which is usually closer to quad-core than dual-core performance... plus he unlocked and overclocked it, so he basically does have a quad core anyway. And for some of the games where dual cores lag behind, who cares if you get 65fps on a slow dual core vs. 120fps on a fast quad core? 65fps is still faster than most people's monitors' refresh rates. Until next-gen consoles come out, I doubt we'll see that many games really kill CPUs... welcome to consolification, where even semi-ancient hardware (the high end 5 years ago, which would be something like an oc'd Core 2 Duo [probably at about 3.6+ GHz] with an oc'd 8800 GTX [roughly equivalent to a stock 6750 unless you hit a VRAM wall]) can STILL give you playable framerates. Not 60+ average fps, more like 30+, but that's playable in my book.

So yes, CPU bottlenecks are real, but they are not necessarily crippling, and in some non-action games low framerates aren't a big deal in the first place. Or if you are already at 60fps+.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I'm wanting to favor nVidia right now due to the adaptive vsync, but mostly because of these annoying issues with the AMD drivers where, anytime I have a Flash webpage open (YouTube, FB games, etc...) and alt-tab into a game, the FPS drops to a fifth of what it should be. It seems like it should be a minor issue, but it's been around for months now. You'd think it would be fixed quickly, but no. Hopefully it actually makes (financial) sense to favor nVidia too. Here's to more waiting.... c'mon GK106. Hopefully it's out by the time IB is out and I'm ready to upgrade my CPU too.

You can right click on any flash object (like a Youtube video) and go to Settings. Uncheck the box asking if you want hardware acceleration. Do this for each browser you use. It should save the settings automatically so you don't need to do it again until the next Flash update, in which case you need to do it for each browser again, but thankfully Flash updates rarely happen. That should take care of the problem, assuming your CPU is strong enough to render the Flash video all by itself, which it probably is. What is causing the problem is AMD's UVD kicking in and resetting your clocks, so just disable that stuff.

You can also disable hardware acceleration in your media players as well to prevent that from happening... actually I can go both ways on that: I use VLC without hardware acceleration, but when I *do* want hardware acceleration I have MPC w/ hardware acceleration active.
 
Last edited:

Concillian

Diamond Member
May 26, 2004
3,751
8
81
This is the way CPUs have been for a LONG time. I remember it being that way even in the AthlonXP days. CPU was important to a point, but most of the low minimum FPS dips that caused slowdowns you actually noticed were GPU related.

I actually FRAPSed a couple games doing CPU, memory and GPU @ underclocked, stock and overclocked and for the games I tested at settings I usually played at the CPU and memory speed was virtually irrelevant. Your average FPS would get lower when these were underclocked, but this would primarily come from high FPS locations. Dips / minimums were almost exclusively GPU based.

I haven't done this kind of thing with any modern game (it takes a LOT of time), but I doubt it has really changed much.
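The kind of FRAPS comparison described above can be summarized from a run's frametime log; a minimal sketch, where the frametime lists are made-up sample data (not real benchmark numbers):

```python
# Compute average FPS and worst-case (minimum instantaneous) FPS from a list
# of per-frame render times in milliseconds, as frametime logs record them.

def fps_stats(frametimes_ms):
    """Return (average FPS, minimum instantaneous FPS) for one run."""
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    min_fps = 1000 / max(frametimes_ms)   # slowest single frame = the "dip"
    return avg_fps, min_fps

# Two made-up runs: underclocking the CPU slows the fast sections (raising
# the small frametimes) but barely moves the one big GPU-limited spike, so
# average FPS drops while minimum FPS stays almost unchanged.
stock        = [10, 12, 11, 33, 12, 10]   # ms per frame
underclocked = [14, 15, 14, 34, 15, 14]
print(fps_stats(stock))
print(fps_stats(underclocked))
```

The sample data is constructed to mirror the observation above: averages diverge, minimums do not.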
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
You can right click on any flash object (like a Youtube video) and go to Settings. Uncheck the box asking if you want hardware acceleration. Do this for each browser you use. It should save the settings automatically so you don't need to do it again until the next Flash update, in which case you need to do it for each browser again, but thankfully Flash updates rarely happen. That should take care of the problem, assuming your CPU is strong enough to render the Flash video all by itself, which it probably is. What is causing the problem is AMD's UVD kicking in and resetting your clocks, so just disable that stuff.

You can also disable hardware acceleration in your media players as well to prevent that from happening... actually I can go both ways on that: I use VLC without hardware acceleration, but when I *do* want hardware acceleration I have MPC w/ hardware acceleration active.

Hmm... I didn't know this was the issue.

I wonder if I can find the UVD clocks in BIOS and set them the same as 3D clocks. Do you know if the UVD clocks are different from the 2D clocks in BIOS? If there are separate UVD and 2D clocks, this sounds like a relatively easy thing to fix to make the next couple months more bearable until this card gets relegated to my wife's machine.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Hmm... I didn't know this was the issue.

I wonder if I can find the UVD clocks in BIOS and set them the same as 3D clocks. Do you know if the UVD clocks are different from the 2D clocks in BIOS? If there are separate UVD and 2D clocks, this sounds like a relatively easy thing to fix to make the next couple months more bearable until this card gets relegated to my wife's machine.

They are different (unless you are multi-monitor, in which case I think they are close to the same, if not the same), and I wouldn't mess with the BIOS just for that. If I were you, I'd just disable hardware acceleration in the relevant programs you use.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
I OC via BIOS flashing rather than Overdrive anyway (lets me muck with voltage, etc...)

Something fun to do when I get a bit of spare time. Not really afraid of flashing. I think I've flashed every card I've had since my Radeon 9500.
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Look at how few frames you're losing with the 560, in the third picture in the TS post, relative to much more powerful CPUs: http://www.techspot.com/review/458-battlefield-3-performance/page7.html

And that's at 16x10. Look at the 1200p results and the big cluster of minimum framerates at 51 to 54 fps. The average framerate for almost every CPU they tested was exactly 67 fps.

At 1600p, or if you turned up graphics settings even higher (more AA or whatever), any guesses as to how much more compressed the results would be? Their own conclusion is that "[o]n the CPU side of things, we found that Battlefield 3 is not nearly as CPU demanding as many have made it out to be."

They do note that some games do a lot better on quad cores, but the poster above had a tri-core, which is usually closer to quad-core than dual-core performance... plus he unlocked and overclocked it, so he basically does have a quad core anyway. And for some of the games where dual cores lag behind, who cares if you get 65fps on a slow dual core vs. 120fps on a fast quad core? 65fps is still faster than most people's monitors' refresh rates. Until next-gen consoles come out, I doubt we'll see that many games really kill CPUs... welcome to consolification, where even semi-ancient hardware (the high end 5 years ago, which would be something like an oc'd Core 2 Duo [probably at about 3.6+ GHz] with an oc'd 8800 GTX [roughly equivalent to a stock 6750 unless you hit a VRAM wall]) can STILL give you playable framerates. Not 60+ average fps, more like 30+, but that's playable in my book.

So yes, CPU bottlenecks are real, but they are not necessarily crippling, and in some non-action games low framerates aren't a big deal in the first place. Or if you are already at 60fps+.

With a 560 you are GPU bound, which is why all CPUs get the same score. Throw a faster card in there and your CPU usage will climb much higher.

For instance, when I had the same CPU I have now, but with a 5750, my CPU usage in BF3 was about 40-45% on average.

Then I put in the 7950, and my CPU usage shot up to 90%. This is because the new card is far faster, so it removes the GPU limitation.
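The effect described above fits a simple toy model: delivered framerate is roughly the minimum of what the CPU and GPU can each sustain, so a faster GPU exposes differences between CPUs that a slower GPU hides. All the framerate numbers below are hypothetical, chosen only to illustrate the shape of the argument:

```python
# Toy bottleneck model: the slower of the two components caps the framerate.

def delivered_fps(cpu_fps, gpu_fps):
    """Framerate delivered when the CPU and GPU can each sustain the given FPS."""
    return min(cpu_fps, gpu_fps)

cpus = (60, 90, 140)  # hypothetical CPU-limited framerates, slow to fast

# With a modest GPU, every CPU lands on the same GPU-limited score...
print([delivered_fps(cpu, 55) for cpu in cpus])

# ...but a much faster GPU separates the CPUs again, and the slower
# chips become the bottleneck.
print([delivered_fps(cpu, 130) for cpu in cpus])
```

This is also why the benchmark linked earlier in the thread shows nearly identical results across CPUs once the settings are GPU-heavy enough.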
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
With a 560 you are GPU bound, which is why all CPUs get the same score. Throw a faster card in there and your CPU usage will climb much higher.

For instance, when I had the same CPU I have now, but with a 5750, my CPU usage in BF3 was about 40-45% on average.

Then I put in the 7950, and my CPU usage shot up to 90%. This is because the new card is far faster, so it removes the GPU limitation.

If you actually read the review, you'd see that the "560" he talked about refers to the CPU (the AMD Phenom II X2 560), not the GPU. They used a GTX 580 to test with, which is faster than the card we're talking about in this thread (the 7850). Please read the TechSpot review for more information. Once again, I'm not saying CPU bottlenecks don't happen, just that they are blown out of proportion to their real-life impact and prevalence.
 
Last edited:

rdsn

Junior Member
Mar 1, 2012
20
0
0
I used to have an AMD Athlon 64 X2 4800+, and I used both the ATI 4770 and the 5850 with it playing Skyrim. Between the two there was no noticeable difference in framerates whatsoever; I'd say that is a pretty bad CPU bottleneck.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I used to have an AMD Athlon 64 X2 4800+, and I used both the ATI 4770 and the 5850 with it playing Skyrim. Between the two there was no noticeable difference in framerates whatsoever; I'd say that is a pretty bad CPU bottleneck.

Skyrim used to be ridiculously CPU limited pre-1.4 patch. Bethesda didn't even bother to enable SSE instructions for the PC version. It goes to show how much they care about PC customers.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I used to have an AMD Athlon 64 X2 4800+, and I used both the ATI 4770 and the 5850 with it playing Skyrim. Between the two there was no noticeable difference in framerates whatsoever; I'd say that is a pretty bad CPU bottleneck.

You might also have other system bottlenecks like RAM or whatever, but it's probably the CPU.

Anyway, that CPU came out SEVEN years ago and is slower than an E6400, and about on par with an E6300. All stock vs. stock. And if you actually can't notice a framerate difference, then perhaps you were already over 30 frames per second in either case, which would actually prove my point even more, because that CPU is another 2 years older than the hypothetical Core 2 Duo @ 3.6+ with oc'd 8800 GTX setup I gave as an example. If you think a CPU from 7 years ago won't bottleneck Skyrim, especially at stock, good luck. But I bet a FIVE year old Core 2 Duo @ 3.6+ wouldn't bottleneck that 5850.
 
Last edited:

deadken

Diamond Member
Aug 8, 2004
3,196
4
81
Admittedly I haven't read the entire thread. But, having skimmed over the parts about dual vs. tri vs. quad cores, I figured I'd mention my recent experience.

I have a PII dual core BE CPU. I am able to unlock it to a quad core and overclock it to 3.6 without issues (haven't ever tried higher). I just had a problem with my system last night and somehow my BIOS settings got reset. I corrected what I remembered and then rebooted the system. During boot up I noticed that the 'extra' 2 cores weren't unlocked, and I missed the opportunity to press the appropriate button to enable the motherboard to unlock them. Just for sh!ts and giggles, I loaded up MSI Afterburner and BF3 to see what difference a quad core vs. a dual core made. All in all, I got 20FPS on average vs. 30FPS on average (Caspian Border, multiplayer). I saw that instead of my 8800GTS (640MB) being pegged at 98%+, it was only utilized around 70% (can't remember if it was 60% or 80%).

I rebooted and pressed the 'unlock' key and was back up to 30FPS, so the number of cores was the only real variable. Please understand that I REALLY know it is time for me to upgrade my video card. It's the only reason I've been spending so much time in the 'Video Cards and Graphics' section of AT. I've been itching to buy since December and wanted a 7870 until the Feb launch date was missed, and then Nvidia released the 680 so close to the 7870 release date.

For now I am banished to 'wait and see' purgatory.

BTW: I know that my video card is the weak link of my system. It's also the oldest part. The rest of my system specs are:
AMD Phenom II 555 BE
Asus M4A79XTD EVO mobo
96GB Kingston SSD
640GB WD Black HD
2x4GB Ram
BFG OC 8800GTS 640MB
 