[techpowerup] Radeon HD 7970 FOB Price Cut to $475


railven

Diamond Member
Mar 25, 2010
6,604
561
126
Are we really going down this road about what is "enough," a topic that is subjective by nature? John A uses 8xAA and John B uses 32xSSGSAAwithJelly. Both John A and John B have the exact same experience, except John A only spends $300 on his video card and John B spent $600.

I get that people have to justify their expenses ("well mine can do all that AND FOLDING@HOME, NEWB"), but holy crap, the day we start telling someone "that isn't GOOD ENOUGH FOR YOU" is the day we've all just become corporate monkeys.

There hasn't been a game since 3/2010 I couldn't play at settings I was satisfied with (i.e. I don't crank up the AA; I actually don't see a huge difference from 4x to 8x in the majority of games), minus those oddballs (BF3 with MSAA or Batman: AC in DX11 mode).

I guess my HD 5870 wasn't good enough for the last 2-3 years. I don't see any games on the horizon that will break my card, but like someone who said they got Amazon gift cards, I'm at my upgrade period.

I'll keep in mind that the HD 7970 isn't good enough for me based on John B's opinions.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I don't use MSAA in Skyrim, but I do use FXAA with AO, and my fps teeters on the edge of 60 at times.

With a 7970 I'd have to reduce my settings, or buy a second one, when the first already costs more than what I currently have.

It's clear we're going down a subjective path of argument; "me," "my," and "I" aren't really things people can, or should, argue with unless another person's beliefs are blatantly wrong.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I don't use MSAA in Skyrim, but I do use FXAA with AO, and my fps teeters on the edge of 60 at times.

With a 7970 I'd have to reduce my settings, or buy a second one, when the first already costs more than what I currently have.

It's clear we're going down a subjective path of argument; "me," "my," and "I" aren't really things people can, or should, argue with unless another person's beliefs are blatantly wrong.


1 x 7970 should be enough.


Tom's Hardware:

The close results suggest a CPU bottleneck. Skyrim didn’t present much of a challenge for these cards at its maximum settings with 4x MSAA, so we’ll enable transparency anti-aliasing and see if anything changes.



^ That's about as much eye candy as it's possible to turn on, at 1920x1080.

And 1 x 7970 is doing it fine; it's beating the 590/6990 and 580/6970, etc. Yet notice how close they all are? The bottleneck is still not the GPU but the CPU.

At any rate, the 7970 is more GPU than Skyrim needs at 1920x1080.
Unless you don't think 70+ fps is enough... you could probably overclock your CPU a bit if it's not.

The above is done with an Intel Sandy Bridge 2500K @ 4.0 GHz, and the thing holding the fps back is clearly the CPU.
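
To make the bottleneck point concrete (my own toy model, not from any review): a frame can't finish faster than the slower of the CPU and GPU work it needs, so once the CPU side dominates, swapping in a faster GPU barely moves the average fps. The per-frame millisecond figures below are invented purely for illustration.

```python
# Toy model: a frame takes at least as long as the slower of the CPU work and
# GPU work it needs, so the fps ceiling is set by whichever side is slower.
# All per-frame millisecond figures are made up purely for illustration.

def avg_fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 13.5  # hypothetical per-frame CPU cost at these settings (~74 fps cap)

for card, gpu_ms in [("GTX 580", 12.0), ("HD 6990", 9.5), ("HD 7970", 8.0)]:
    # All three land near the same number because the CPU, not the GPU, is the limiter.
    print(f"{card}: {avg_fps(cpu_ms, gpu_ms):.0f} fps")
```

That is why the benchmark bars sit almost on top of each other: the faster cards just spend more of each frame waiting on the CPU.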
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Are we really going down this road about what is "enough," a topic that is subjective by nature? John A uses 8xAA and John B uses 32xSSGSAAwithJelly. Both John A and John B have the exact same experience, except John A only spends $300 on his video card and John B spent $600.

I get that people have to justify their expenses ("well mine can do all that AND FOLDING@HOME, NEWB"), but holy crap, the day we start telling someone "that isn't GOOD ENOUGH FOR YOU" is the day we've all just become corporate monkeys.

There hasn't been a game since 3/2010 I couldn't play at settings I was satisfied with (i.e. I don't crank up the AA; I actually don't see a huge difference from 4x to 8x in the majority of games), minus those oddballs (BF3 with MSAA or Batman: AC in DX11 mode).

I guess my HD 5870 wasn't good enough for the last 2-3 years. I don't see any games on the horizon that will break my card, but like someone who said they got Amazon gift cards, I'm at my upgrade period.

I'll keep in mind that the HD 7970 isn't good enough for me based on John B's opinions.

Sure, a single HD 5870 may have been good enough for you, but there are some who moved beyond that, have gone multi-GPU, and may be looking at the HD 7970 now, maybe for CrossFire. What if you decided to multi-monitor game, or try stereo 3D, or you're one of the gamers who desires 120 Hz gaming?

Not all gamers who spend 550 dollars on a GPU game at 1920x1080 with 4x AA. Some higher price-point customers may look beyond just sweet-spot or the most popular settings. That is why the card comes with 3 GB of RAM.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Or at least a faster CPU.

Either way, I use AO (does AMD even support AO in Skyrim?); without it I get fps exceeding 120 avg.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Skyrim needs to be benched at 2560 for high-end cards, quite obviously.

I won't bore you with details, but the programmers at Bethesda are barely competent, to say the least. Anyway, the game is CPU bound, and as such it is a horrible benchmark to use for any type of GPU comparison.

It still has a 2 GB address limit too, IIRC, lmao. Bethesda sucks. Why are their games so great, yet such bug-ridden, poorly optimized pieces of crap? They should hire some real programming talent someday.

PS: anyone try the PS3 version of Skyrim? Amazingly, it is buggier than any bad PC port I've played. My buddies say the game is unplayable; it becomes a slide show and crashes after 10-30 hours of play. It also apparently destroys PS3 hard drives. More proof that the programmers at Bethesda are incompetent.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Either way, I use AO (does AMD even support AO in Skyrim?); without it I get fps exceeding 120 avg.
So you go from 120 fps avg with this off, to turning it on and just barely teetering on 60 fps avg?
Holy cow... must be some HUUUUUGE image quality you get from that.

(looks below at image quality differences)


With AO on: [screenshot]

Without AO: [screenshot]
To me it just looks like things go a bit darker in areas where there should be shadows, but you lose a lot of detail in those areas.

Honestly, I'm not sure forcing AO in a game that doesn't come with it, via driver hacks from old Fallout 3 profiles, is worth it. However, it looks like it's possible for NVIDIA users; I'm still not sure if AMD users can do it.
 
Last edited:

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
I won't bore you with details but the programmers at Bethesda are barely competant to say the least. Anyway the game is CPU bound--- As such it is a horrible benchmark to use for any type of GPU comparison.

Still has a 2gb address limit too IIRC, lmao. Bethesda sucks. Why are their games so great but bug ridden, poorly optimized pieces of crap? They should hire some real programming talent someday.

PS: anyone try the PS3 version of Skyrim? Amazingly , it is buggier than any bad PC port i've played. My buddies say the game is unplayable, the game becomes a slide show and crashes after 10-30 hours of play. It also apparently
destroys PS3 hard drives. More proof that the programmers at bethesda are incompetant.

They can't even get basics right, like properly coding the ability to change controls.
Not only that, but they failed at it in Fallout 3, patched it eventually, and they STILL failed at it again in Skyrim.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
To me it just looks like things go a bit darker in areas where there should be shadows, but you lose a bit of quality too; things go so dark it's like you're not able to see the shadowed things anymore.


With AO on: [screenshot]

Without AO: [screenshot]

http://www.bestgaminglaptop.net/new...d-to-skyrim-cod-mw3-with-nvidia-beta-drivers/


Top picture = without AO (ambient occlusion).
Bottom picture = with AO (ambient occlusion).

Checking Google to find out if AMD cards support it.


The part that irks me about driver-level ambient occlusion is that sometimes it's just flat-out wrong. For example, the fireplace in the foreground looks fantastic with AO on, as do some of the wooden poles towards the ceiling. However, there are odd shadows that encircle the entire pole when AO is enabled, which, to me, don't look accurate. As I understand it, this type of AO is just a quick shader hack (hence the performance hit), but it seems like it's overzealous in places. I'd be interested to see more development in it, though. NVIDIA had driver-level AO back when I had my GTX 295; it just seems like it hasn't improved. I don't know what's taking AMD so long to get on that bandwagon, and maybe someone with more knowledge can chime in, but would OpenCL make something like this easier to do?
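
For anyone curious what that "quick shader hack" amounts to: below is a minimal, CPU-side sketch of the screen-space AO idea in Python/NumPy. It is my own illustration, not NVIDIA's or AMD's implementation; each pixel is darkened in proportion to how much nearby geometry sits in front of it in the depth buffer, and the halo it produces around the thin "pole" is exactly the kind of over-eager artifact described above.

```python
import numpy as np

def ssao_factor(depth: np.ndarray, radius: int = 2, strength: float = 1.5) -> np.ndarray:
    """Return a per-pixel shading multiplier in [0, 1] (1.0 = fully lit)."""
    occlusion = np.zeros_like(depth)
    samples = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            # np.roll wraps at the borders; good enough for a toy example.
            neighbor = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
            # A neighbor closer to the camera (smaller depth) counts as an occluder.
            occlusion += np.clip(depth - neighbor, 0.0, 1.0)
            samples += 1
    return np.clip(1.0 - strength * occlusion / samples, 0.0, 1.0)

# Tiny synthetic depth buffer: a near "pole" (depth 0.2) in front of a far wall (depth 0.9).
depth = np.full((8, 8), 0.9)
depth[:, 3] = 0.2
print(np.round(ssao_factor(depth), 2))
# Note the darkened ring of wall pixels surrounding the pole -- the same kind
# of shadow encircling thin objects that looks wrong in-game.
```

Because it only ever looks at the depth buffer, the effect is cheap, but it has no idea whether a nearby occluder should actually cast contact shadow, which is why it over-darkens around poles and fences.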
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
According to Steam, Skyrim is the fastest-selling title (most popular?) EVER?

Earlier this month, Bethesda reported that it has shipped 10 million copies of Skyrim, and Steam reported the game to be the “fastest selling title” in its history.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
According to Steam, Skyrim is the fastest-selling title (most popular?) EVER?

It's not surprising. Skyrim is a fantastic game marred by a console-centric UI and tons of bugs. That is par for the course at Bethesda; the same thing happened with Morrowind and Oblivion.

The level of incompetence in the programming is just amazing, though. You should read some of the horror stories at amazon.com or Bethesda's forums about the PS3 version. Apparently it crashes nonstop and becomes a slide show after 20-30 hours of play.

Also, the PS3 version destroys the PS3 HDD after extended use. Amazing.
 

Oyster

Member
Nov 20, 2008
151
0
0
It's not surprising. Skyrim is a fantastic game marred by a console-centric UI and tons of bugs. That is par for the course at Bethesda; the same thing happened with Morrowind and Oblivion.

The level of incompetence in the programming is just amazing, though. You should read some of the horror stories at amazon.com or Bethesda's forums about the PS3 version. Apparently it crashes nonstop and becomes a slide show after 20-30 hours of play.

Also, the PS3 version destroys the PS3 HDD after extended use. Amazing.

You're playing it wrong.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
JAG87 wrote:
The 7970, being barely faster than a Fermi card whose architecture is almost 2 years old, doesn't give any indication of being able to stay on top for 2 years (actually it may be just a few months before it gets obliterated), hence it's a bad investment. Of course it may overclock by 30% like some claim, but this is all garbage until you see with your own eyes that it overclocks and is actually stable.
Yeah... those dumb guys setting all the overclocking records with it must be delusional... :whistle:
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Sure, a single HD 5870 may have been good enough for you, but there are some who moved beyond that, have gone multi-GPU, and may be looking at the HD 7970 now, maybe for CrossFire. What if you decided to multi-monitor game, or try stereo 3D, or you're one of the gamers who desires 120 Hz gaming?

Not all gamers who spend 550 dollars on a GPU game at 1920x1080 with 4x AA. Some higher price-point customers may look beyond just sweet-spot or the most popular settings. That is why the card comes with 3 GB of RAM.

I've already tried all of that minus adding a second card. Personally, I'm not sold on 3D gaming, and I can run the games I need two monitors for just fine.

That doesn't change my post. Why are people telling others what is fine for them? One person said his HD 5870 was enough for him and that he was ready to upgrade now; he was berated for not waiting for Kepler and told the HD 7970 wouldn't be enough for him. Do you not see how these posts are nothing but corporate babble?

Are we so invested in one camp that we'd tell people to wait when they are in the market to buy, and then berate them because we know more about their wants/needs? Glad I'm still not an everyday poster here. You guys act more like children at times.
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
IMO this is literally nothing more than sound supply/demand inventory management.

The $525-$550 price-point was based on expectations of the supply being S0 units/day and the demand being D0 units per day where S0 was expected to be nearly equal to D0 (i.e. balanced supply/demand because of the price target).

What I read into this shift is that AMD has either internally revised their predictions of consumer demand downward (D0 was too high, so now they have D1 where D1 < D0), or they are seeing better than anticipated 28nm yields and are realizing that supply needs to be revised upwards and in response to this excess supply they need to lower prices and increase demand.


[Supply/demand diagram omitted - image source]

Personally I would bet that they are seeing better than expected 28nm yields and have adjusted price down slightly so as to promote a slight increase in demand to avoid needless inventory buildup.

Seriously, thank you for posting this! I was reading some of the earlier theories thinking 'I wish Idontcare would lay it straight like usual,' and you did hehe. Kudos from an econ/stats nerd.
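
For anyone who wants to see that reasoning with numbers: below is a toy linear supply/demand model with made-up coefficients (nothing here is AMD data), showing how either a downward demand revision or an outward supply shift from better 28nm yields pulls the market-clearing price below the original target.

```python
# Toy linear supply/demand model of the price-cut argument above.
# D(p) = a - b*p (units/day demanded), S(p) = c + d*p (units/day supplied).
# Every coefficient is invented for illustration; nothing here is AMD data.

def clearing_price(a: float, b: float, c: float, d: float) -> float:
    """Price where demand equals supply: a - b*p = c + d*p."""
    return (a - c) / (b + d)

a, b = 1500.0, 2.0   # baseline demand curve, tuned to clear near the $550 launch price
c, d = 50.0, 0.6     # baseline supply curve

print(f"baseline:            ${clearing_price(a, b, c, d):.0f}")
print(f"demand revised down: ${clearing_price(a - 150, b, c, d):.0f}")  # D1 < D0
print(f"better 28nm yields:  ${clearing_price(a, b, c + 150, d):.0f}")  # supply shifts out
```

Either revision moves the clearing price down by roughly the same amount, which is consistent with a modest $475 FOB cut rather than a fire sale.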
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Will Robinson wrote:
Yeah... those dumb guys setting all the overclocking records with it must be delusional... :whistle:


WTF does this have to do with anything? Overclockers will use anything they have at their disposal at the present time to set the highest possible record. How does this translate to the card being good for a gamer who doesn't have an LN2 pot on his components?


An 8800GTX from 2006 can still run any game with better visuals than any console can. I see the point you're trying to make, but it's invalid because it's wrong.

You're using unnecessary amounts of IQ enhancements to justify needing more performance. If you enjoy 8xAA for its huge performance hit and negligible gain in IQ over 4xAA, then by all means go ahead and use that as a way to say that a 5870 won't cut it for 2-3 more years. While you're at it, you can also say that an 8800GTX is unable to run games at console quality levels. I'm sure anyone still using an 8800GTX will tell you that it's still going strong even though it's 6 years old. A guy on my COD team is still using one to run BF3 fairly well.


I see, so now we're buying top-of-the-line GPUs to play at mediocre settings? Necessary amounts of IQ enhancements are what justify spending $550 on a GPU. Review numbers mean very little; they are just a way to compare apples to apples. There are settings such as TrSSAA that make a huge difference to IQ. MLAA and FXAA are no substitute for it. Too bad AMD doesn't even support TrSSAA in DX10/11. Their product is literally unfit to be called high end, because it doesn't produce high-end IQ.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I see, so now we're buying top-of-the-line GPUs to play at mediocre settings? Necessary amounts of IQ enhancements are what justify spending $550 on a GPU. Review numbers mean very little; they are just a way to compare apples to apples. There are settings such as TrSSAA that make a huge difference to IQ. MLAA and FXAA are no substitute for it. Too bad AMD doesn't even support TrSSAA in DX10/11. Their product is literally unfit to be called high end, because it doesn't produce high-end IQ.

Did you really just use this argument? It boggles my mind that you bring this up. AMD has better AA than NVIDIA. Transparency SS is not supersampled AA, and full-scene SSAA is not in the NVIDIA driver (AMD does include SSAA).
True SSAA has a *huge* performance hit and NVIDIA doesn't include it. TrSS is a cheap form of AA, just like FXAA.

Supersampling renders the scene at a considerably higher resolution and then down-samples those pixel samples to the required resolution. This costs a lot of performance and isn't practical in most scenarios. However, one benefit of supersampling is that it does collect (and sample) every single pixel in a scene, allowing for optimum image quality. SSAA can't be used in 95% of games because it comes at a considerable performance cost. AMD offers it and NVIDIA doesn't, unless you want to muck around in Nvidia Inspector. Even then, it only works about 20-30% of the time and with ridiculously low framerates.

SS transparency is essentially a complement to conventional sampling patterns. It works by anti-aliasing samples taken from within polygons where transparent textures are used to create effects that would cost too much if replicated with raw geometry. These transparent textures include objects like trees, grass, and chain-link fences. The nice thing about SS transparency is that there is little performance cost, but the downside is that it looks like garbage compared to SSAA. For comparison's sake, even regular AA looks better than transparency anti-aliasing. But the draw of SS transparency is the low performance hit, which makes complete sense since it doesn't really improve image quality much at all.

review / TLDR

NVIDIA doesn't support SSAA in the driver (TrSS is substantially worse); AMD does include SSAA. You talk down FXAA, but transparency SS is a cheap form of AA just like FXAA is; thus your assertion that NVIDIA has better AA within their drivers than AMD does is incorrect.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Another thing about SSAA: what it does is take the image internally and process it at a substantially higher resolution. Say you have an image that is 1920x1080 on screen; SSAA will render it internally at 8 times that if you're using 8x SSAA, so you're basically using an effective resolution in excess of 15000x8000. Then it scales it back down and uses the excess pixels in the calculation, smoothing over the colors and aliasing.

Obviously, when the card is rendering internally at 2x-8x higher resolution than what appears on screen, the performance hit is substantial. This is why NVIDIA does not include it in the driver. Transparency SS does not do ANY of this; it only takes bits and pieces of the image and covers polygons with transparent textures. Thus the performance hit is substantially lower than SSAA, but it also looks like crap compared to SSAA.
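
To make that mechanism concrete, here is a minimal sketch of the supersample-then-downsample step (my own illustration, not any driver's code, using a per-axis scale factor as an assumption): render at a higher internal resolution, then average each block of samples back down to one screen pixel. The "renderer" here is just a synthetic hard edge.

```python
import numpy as np

def render(width: int, height: int) -> np.ndarray:
    """Stand-in 'renderer': a grayscale image with one hard diagonal edge."""
    y, x = np.mgrid[0:height, 0:width]
    return (x * height > y * width).astype(np.float32)

def ssaa(width: int, height: int, scale: int = 4) -> np.ndarray:
    """Render at scale x the resolution per axis, then box-filter each
    scale x scale block of samples back down to one screen pixel."""
    hi_res = render(width * scale, height * scale)
    return hi_res.reshape(height, scale, width, scale).mean(axis=(1, 3))

aliased = render(8, 8)           # hard 0/1 edge, visible stair-stepping
smoothed = ssaa(8, 8, scale=4)   # 4x4 samples per pixel -> gray transition pixels
print(aliased)                   # before: only 0s and 1s along the edge
print(np.round(smoothed, 2))     # after: fractional values smooth the edge
```

The cost shows up directly in the first line of ssaa(): the scene is rendered at scale^2 times as many pixels before anything is averaged down, which is why full-scene SSAA is so expensive compared to transparency-only AA.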

So, bottom line, that entire thing you wrote is ridiculous. AMD has better AA than NVIDIA, end of.

mini frustration / tangent

The other frustration with the NVIDIA CP is that for roughly 60% of games, the override setting does nothing. I'll throw an example out: Dead Space 1, Dead Space 2, Dead Island, among others. It's completely frustrating that trying to manually enable AA in an older game results in nothing. I'm not a huge fan of AMD these days, but CCC always obeys 99% of what you override in their control panel applet. The NVIDIA control panel generally ignores what you put in, especially for newer games. It's EXTREMELY annoying. For example, with CCC / AMD you can enable SSAA in Dead Space 2, which plays at great frame rates on CrossFire 6970s. With the NVIDIA CP, you cannot under any circumstance enable SSAA. You can enable regular AA with Nvidia Inspector, and anything higher than 8x gets you a choppy framerate. And weird shadows sometimes.

Say what you will about AMD, but at least their AA override works the great majority of the time. With the NV control panel it's loads of fun trying to get override AA in Dead Space 2 and finding it does nothing, yet with AMD CCC you can simply select SSAA and that's it; it will obey what you put in the control panel. NVIDIA doesn't have SSAA, and even if it did, override doesn't work for Dead Space 2. Override settings also work most of the time in CCC; the same cannot be said of NVIDIA. Look for yourself in Nvidia Inspector: NVIDIA has game profiles for all game exes in the registry, and you can view them in Nvidia Inspector. Most games (including Dead Space and Dead Island, like I mentioned above) have the setting "treat override as use application preference" flagged. Thus for the great majority of games, override does nothing. That's fun stuff, isn't it? That's my biggest annoyance with my 580s. Here, let me show you what I'm talking about on my system:

 
Last edited: