I don't use MSAA in Skyrim, but I do use FXAA with AO, and my framerate teeters on the edge of 60 fps at times.
With a 7970 I'd have to reduce my settings, or buy a second, when the first one already costs more than what I currently have.
It's clear we're going down a subjective path of argument. "Me, my, I" statements aren't really things people can, or should, argue with unless another person's beliefs are blatantly wrong.
The close results suggest a CPU bottleneck. Skyrim didn’t present much of a challenge for these cards at its maximum settings with 4x MSAA, so we’ll enable transparency anti-aliasing and see if anything changes.
Are we really going down this road about what is "enough", a topic that's subjective by nature? John A uses 8xAA and John B uses 32xSSGSAAwithJelly. Both John A and John B have the exact same experience, except John A only spends $300 for his video card and John B spent $600.
I get that people have to justify their expenses ("well mine can do all that AND FOLDING@HOME, NEWB"), but holy crap, the day we start telling someone "that isn't GOOD ENOUGH FOR YOU" is the day we've all just become corporate monkeys.
There hasn't been a game since 3/2010 that I couldn't play at settings I was satisfied with (i.e. I don't crank up the AA; I actually don't see a huge difference from 4x to 8x in the majority of games), minus the odd exceptions (BF3 with MSAA or Batman: AC in DX11 mode).
I guess my HD 5870 wasn't good enough for the last 2-3 years. I don't see any games on the horizon that will break my card, but like someone who said they got Amazon gift cards - I'm at my upgrade period.
I'll keep in mind that the HD 7970 isn't good enough for me based on John B's opinions.
Skyrim needs to be benched at 2560 for high end cards quite obviously.
So you go from 120 fps average with this off to just barely teetering on 60 fps average with it on?

Either way, I use AO (does AMD even support AO in Skyrim?); without it I get framerates exceeding 120 average.
I won't bore you with details, but the programmers at Bethesda are barely competent, to say the least. Anyway, the game is CPU-bound; as such it is a horrible benchmark to use for any type of GPU comparison.
Still has a 2 GB address limit too IIRC (the 32-bit executable apparently shipped without the Large Address Aware flag), lmao. Bethesda sucks. Why are their games so great but bug-ridden, poorly optimized pieces of crap? They should hire some real programming talent someday.
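If you want to check for yourself whether an exe is flagged Large Address Aware, here's a quick Python sketch that reads the flag straight out of the PE header (the Skyrim path in the comment is just an example):

```python
import struct

def is_large_address_aware(path):
    """Check whether a Windows executable has the LARGE_ADDRESS_AWARE
    flag set (without it, a 32-bit process is capped at 2 GB)."""
    with open(path, "rb") as f:
        data = f.read(4096)  # headers live at the start of the file
    if data[:2] != b"MZ":
        raise ValueError("not a PE executable")
    # Offset 0x3C of the DOS header points at the PE signature.
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # The Characteristics field sits 22 bytes past the signature
    # (4-byte "PE\0\0" + 18 bytes of COFF header fields before it).
    characteristics = struct.unpack_from("<H", data, pe_offset + 22)[0]
    return bool(characteristics & 0x0020)  # IMAGE_FILE_LARGE_ADDRESS_AWARE

# e.g. is_large_address_aware(r"C:\Games\Skyrim\TESV.exe")
```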
PS: anyone try the PS3 version of Skyrim? Amazingly, it is buggier than any bad PC port I've played. My buddies say the game is unplayable; it becomes a slideshow and crashes after 10-30 hours of play. It also apparently destroys PS3 hard drives. More proof that the programmers at Bethesda are incompetent.
The part that irks me about driver-level ambient occlusion is that sometimes it's just flat-out wrong. For example, the fireplace in the foreground looks fantastic with AO on, as do some of the wooden poles towards the ceiling. However, there are odd shadows that encircle the entire pole when AO is enabled, which, to me, don't look accurate. As I understand it, this type of AO is just a quick shader hack (hence the performance hit), but it seems like it's over-zealous in places. I'd be interested to see more development in it, though. NVIDIA had driver-level AO back when I had my GTX 295; it just seems like it hasn't improved. I don't know what's taking AMD so long to get on that bandwagon, and maybe someone with more knowledge can chime in, but would OpenCL make something like this easier to do?
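To illustrate what I mean by a "quick shader hack", here's a rough Python/NumPy sketch of depth-only screen-space AO (all parameter names and numbers are made up; this is obviously not NVIDIA's actual driver implementation). Notice it only compares depth values, so it can't tell whether an "occluder" actually touches the surface, which is exactly where those halos around thin poles come from:

```python
import numpy as np

def ssao(depth, radius=4, samples=16, bias=0.01, strength=1.0, seed=0):
    """Crude depth-only SSAO over a float depth buffer (larger = farther).
    Returns a per-pixel lighting factor in [0, 1] (1 = fully lit)."""
    rng = np.random.default_rng(seed)
    occlusion = np.zeros(depth.shape)
    # Random screen-space offsets within `radius` pixels of each pixel.
    offsets = rng.integers(-radius, radius + 1, size=(samples, 2))
    for dy, dx in offsets:
        # Look up each neighbour's depth by shifting the whole buffer.
        neighbour = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
        # A neighbour closer to the camera than the centre pixel
        # (by more than `bias`) counts as an occluder...
        closer = (depth - neighbour) > bias
        # ...attenuated by depth difference so far-away geometry
        # doesn't darken everything (the "range check").
        falloff = 1.0 / (1.0 + np.abs(depth - neighbour))
        occlusion += closer * falloff
    return np.clip(1.0 - strength * occlusion / samples, 0.0, 1.0)
```

A real driver-level pass is roughly this logic running as a pixel shader on the GPU, which is why it darkens anything that merely sits in front of something else in screen space, accurate or not.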
To me it just looks like things get a bit darker in areas where there should be shadows, but you lose a bit of quality too; things get so dark it's like you're not able to see the shadowed things anymore.

With AO on:

Without:
http://www.bestgaminglaptop.net/new...d-to-skyrim-cod-mw3-with-nvidia-beta-drivers/
Top picture = without AO (ambient occlusion)
Bottom picture = with AO (ambient occlusion).
Checking Google to find out if AMD cards support it.
According to Steam, Skyrim is the fastest-selling title (most popular?) EVER?
It's not surprising. Skyrim is a fantastic game marred by a console-centric UI and tons of bugs. That's par for the course at Bethesda; the same thing happened with Morrowind and Oblivion.
The level of incompetence in the programming is just amazing, though. You should read some of the horror stories at Amazon.com or Bethesda's forums about the PS3 version. Apparently it crashes nonstop and becomes a slideshow after 20-30 hours of play.
Also, the PS3 version destroys the PS3 HDD after extended use. Amazing.
JAG87 wrote:
Yeah...those dumb guys setting all the overclocking records with it must be delusional....:whiste:

The 7970, being barely faster than a Fermi card whose architecture is almost 2 years old, doesn't give any indication of being able to stay on top for 2 years (actually it may be just a few months before it gets obliterated), hence it's a bad investment. Of course it may overclock by 30% like some claim, but this is all garbage until you see with your own eyes that it overclocks and is actually stable.
JAG87 wrote:
Yeah...those dumb guys setting all the overclocking records with it must be delusional....:whiste:
Sure, a single HD 5870 may have been good enough for you, but there are some who moved beyond that, have gone multi-GPU, and may be looking at the HD 7970 now, maybe for CrossFire. What if you decided to go multi-monitor gaming, or try Stereo 3D, or you're one of the gamers who want 120 Hz gaming?
Not all gamers who spend 550 dollars on a GPU game at 1920x1080 and 4x AA. Among higher price-point customers, some may look beyond just sweet-spot settings or the most popular settings. That is why the card comes with 3 GB of RAM.
IMO this is literally nothing more than sound supply/demand inventory management.
The $525-$550 price-point was based on expectations of the supply being S0 units/day and the demand being D0 units per day where S0 was expected to be nearly equal to D0 (i.e. balanced supply/demand because of the price target).
What I read into this shift is that AMD has either internally revised its predictions of consumer demand downward (D0 was too high, so now they have D1 where D1 < D0), or they are seeing better-than-anticipated 28nm yields and are realizing that supply needs to be revised upward; in response to this excess supply, they need to lower prices to increase demand.
Personally I would bet that they are seeing better than expected 28nm yields and have adjusted price down slightly so as to promote a slight increase in demand to avoid needless inventory buildup.
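As a toy illustration of that logic (all numbers invented, a simple linear demand curve, nothing to do with AMD's actual figures): if yields push supply above S0, the market-clearing price has to come down to keep inventory from piling up.

```python
# Toy supply/demand sketch -- every number here is made up.
def demand(price, d0=1000.0, slope=2.0, p0=550.0):
    """Units/day demanded at `price`, anchored so demand(p0) == d0."""
    return d0 - slope * (price - p0)

def clearing_price(supply, d0=1000.0, slope=2.0, p0=550.0):
    """Price at which demand exactly absorbs `supply` units/day."""
    return p0 + (d0 - supply) / slope

# Planned case: supply S0 matches demand D0 at the $550 target.
print(clearing_price(1000))   # 550.0 -- balanced, no inventory buildup
# Better 28nm yields: supply revised up to 1100 units/day.
print(clearing_price(1100))   # 500.0 -- price drops to move the extra units
```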
JAG87 wrote:
Yeah...those dumb guys setting all the overclocking records with it must be delusional....:whiste:
An 8800GTX from 2006 can still run any game with better visuals than any console can. I see the point you're trying to make, but it's invalid because it's wrong.
You're using unnecessary amounts of IQ enhancements to justify needing more performance. So while I'm sure you enjoy 8xAA for its huge performance hit and negligible gain in IQ over 4xAA, by all means go ahead and use that as a way to say that a 5870 won't cut it for 2-3 more years. While you're at it, you can also say that an 8800 GTX is unable to run games at console-quality levels. I'm sure anyone still using an 8800 GTX will tell you that it's still going strong even though it's 6 years old; a guy on my COD team is still using one to run BF3 fairly well.
I see, so now we're buying top-of-the-line GPUs to play at mediocre settings? Necessary amounts of IQ enhancements are what justify spending $550 on a GPU. Review numbers mean very little; they are just a way to compare apples to apples. There are settings such as TRSSAA that make a huge difference to IQ, and MLAA and FXAA are no substitute for it. Too bad AMD doesn't even support TRSSAA in DX10/11. Their product is unfit to be called high-end, because it doesn't produce high-end IQ.