If NVIDIA and AMD ran at exactly the same price/performance/heat/RAM ratios...

Silverforce11 · Joined Feb 19, 2009
Well, CF still doesn't work in FC4 and it's been a while since release. NV GameWorks + Ubifail, working as intended.
 

cmdrdredd (Lifer) · Joined Dec 12, 2001
Silverforce11 said:
Well, CF still doesn't work in FC4 and it's been a while since release. NV GameWorks + Ubifail, working as intended.

I don't know... Nvidia can release a driver for a game that is not in the GameWorks program and fix SLI. Why AMD can't fix CrossFire is beyond me. They can complain that they need game code changes or some such, but I've read other complaints from AMD claiming they need access to source code to optimize their driver for a title, while Nvidia claims they do it without source code. So I get the impression that AMD is shifting blame a lot of the time.

Edit: Ah, I found what I was misremembering. The 7970 and 7950 had broken CrossFire for a while. Performance with two cards was worse than a single card at times. Not the 290 or 280. It has been a couple of years (almost).
 

Vesku (Diamond Member) · Joined Aug 25, 2005
It would take trawling through old forum threads, as I don't run multi-GPU, but I recall AMD was slow to produce CrossFire profiles for a while after the 7970 launched. Then again, I've seen complaints about waiting for SLI profiles too. My impression is that the third-party Nvidia Inspector has better multi-GPU tweaking than RadeonPro, so it is a bit less frustrating for the technically inclined who might complain in a forum.
 

cmdrdredd (Lifer) · Joined Dec 12, 2001
Vesku said:
It would take trawling through old forum threads, as I don't run multi-GPU, but I recall AMD was slow to produce CrossFire profiles for a while after the 7970 launched. Then again, I've seen complaints about waiting for SLI profiles too. My impression is that the third-party Nvidia Inspector has better multi-GPU tweaking than RadeonPro, so it is a bit less frustrating for the technically inclined who might complain in a forum.

Sometimes a game using the same engine as another can use the same SLI bits in Nvidia Inspector to get SLI working. It's not always ideal, though, and new drivers are needed. Does RadeonPro work this way?
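
For the conceptually minded, a minimal sketch of that profile-reuse idea, assuming a simple key/value profile store; the executable names and bit value below are made up for illustration, not real Nvidia Inspector data:

profiles = {
    # Known-good profile for a shipped game (hypothetical bit value).
    "engine_x_game_a.exe": {"sli_compat_bits": 0x02D00005},
}

# A new title on the same engine often works with the same flags,
# so you copy the proven profile onto the new game's executable.
profiles["engine_x_game_b.exe"] = dict(profiles["engine_x_game_a.exe"])

print(profiles["engine_x_game_b.exe"])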
 
Silverforce11 · Joined Feb 19, 2009
cmdrdredd said:
I don't know... Nvidia can release a driver for a game that is not in the GameWorks program and fix SLI. Why AMD can't fix CrossFire is beyond me. They can complain that they need game code changes or some such, but I've read other complaints from AMD claiming they need access to source code to optimize their driver for a title, while Nvidia claims they do it without source code. So I get the impression that AMD is shifting blame a lot of the time.

You just need to see Ubifail disable HBAO+ on AMD hardware in The Crew, due to GameWorks (check out the review on [H]).

http://hardocp.com/article/2014/12/15/crew_performance_video_card_review/1#.VMG6eEeUefU

Then you dig deeper and you find statements from devs saying that any feature that goes through the GameWorks API cannot be shown to the competitor to optimize for or support; they have to do it on their own.

http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd
"According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes."

So there's more proof of Ubifail + GameWorks artificially limiting AMD hardware just because.

For more info, directly from the AMD, NV and developers' PoV:
http://www.extremetech.com/gaming/1...opers-weigh-in-on-the-gameworks-controversy/1

Now, I'm not saying what NV does is wrong; heck, they are allowed to compete how they wish with their money via GameWorks R&D and support. But to deny that GameWorks is there not just to enhance visuals/performance on NV, but also to cripple AMD where possible, is to be ignorant.

AMD have also done this, once: Company of Heroes 2, rendering snow physics using DX11 compute through a path that runs a lot more efficiently on GCN hardware.
 
Last edited:

cmdrdredd (Lifer) · Joined Dec 12, 2001
I'm not talking about GameWorks specifically. To avoid dragging this further off topic, I will make it clear.

In the past, AMD has complained they didn't see source code for a game, so they could not properly optimize their driver for that game, and that was why they had lower performance. Nvidia said they don't always get source code either, but they don't need it; they can optimize their drivers without actual game source code.
 
Last edited:
Silverforce11 · Joined Feb 19, 2009
cmdrdredd said:
Nvidia said they didn't get source code either but they don't need it; they can optimize their drivers without actual game source code. The game was not part of Gaming Evolved.

I call [redacted] on this (not you; whoever at NV said such things). Anyone remember Dragon Age running like a dog on NV hardware? Texture bugs and crippled SLI. Took about two months to be fixed. NV's excuse? They didn't get access to the BioWare build during development, so they could only optimize post-release.

Alan Wake? Same deal.

GameWorks takes it to another level because it's an encrypted black box. AMD can't do jack about features that utilize it; this comes from developers themselves, who are told not to share specific GameWorks optimizations with AMD.


Profanity isn't allowed in VC&G. Come on, you know that, Silverforce11.

-Elfear
 
Last edited by a moderator:

Abwx (Lifer) · Joined Apr 2, 2011
cmdrdredd said:
Nvidia said they don't always get source code either, but they don't need it; they can optimize their drivers without actual game source code.

If that were true, they wouldn't have been whining that they didn't have the full code for Tomb Raider prior to the game being launched, and that full optimisations would be available only later, after they had the game code...
 

cmdrdredd (Lifer) · Joined Dec 12, 2001
Abwx said:
If that were true, they wouldn't have been whining that they didn't have the full code for Tomb Raider prior to the game being launched, and that full optimisations would be available only later, after they had the game code...

They don't need the source code, which might not be the same as what they were asking for. Who knows.
 

cmdrdredd (Lifer) · Joined Dec 12, 2001
Silverforce11 said:
GameWorks takes it to another level because it's an encrypted black box. AMD can't do jack about features that utilize it; this comes from developers themselves, who are told not to share specific GameWorks optimizations with AMD.

This isn't true. Taken from an article about GameWorks in Watch Dogs: they don't hinder performance, they enable extra features for their cards and specific optimizations. They don't prevent anyone from optimizing for AMD hardware too.

As for the Nvidia-specific source code: “The way that it works is we provide separate levels of licensing,” Cebenoyan explains. “We offer game developers source licensing, and it varies whether or not game developers are interested in that. Now, like any other middleware on earth, if you grant someone a source license, you grant it to them. We don’t preclude them from changing anything and making it run better on AMD.”

To put this particular argument to bed, I told Cebenoyan I wanted crystal clear clarification, asking “If AMD approached Ubisoft and said ‘We have ideas to make Watch Dogs run better on our hardware,’ then Ubisoft is free to do that?”

“Yes,” he answered. “They’re absolutely free to.”

And there’s nothing built in to GameWorks that disables AMD performance? “No, never.”
 
Silverforce11 · Joined Feb 19, 2009
cmdrdredd said:
This isn't true. Taken from an article about GameWorks in Watch Dogs: they don't hinder performance, they enable extra features for their cards and specific optimizations. They don't prevent anyone from optimizing for AMD hardware too.

You forgot this important point:

"According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes."

So while AMD can approach developers with optimizations that allow their hardware to perform better... it's a no-go area when it concerns GameWorks-specific features.

Do you really believe AMD would not support CF in FC4 if they could? You must think very poorly of them to believe that.

Ultimately, reality speaks louder than propaganda. How many NV-sponsored titles or GameWorks games run like crap on AMD? A lot.
How many AMD GE games run like crap on NV? Not many.

The one time AMD played NV's game with GE was in COH2, with snow physics via DX11 compute. Could NV optimize that? Nope; even now it runs like crap on NV hardware.
 
Last edited:

RussianSensation (Elite Member) · Joined Sep 5, 2003
cmdrdredd said:
Why AMD can't fix CrossFire is beyond me.

Same reason NV's drivers have broken SLI in Watch Dogs and shadow/lighting bugs in SLI in FC4 -- there are always quirks with multi-GPU solutions for both NV/AMD.



cmdrdredd said:
Edit: Ah, I found what I was misremembering. The 7970 and 7950 had broken CrossFire for a while. Performance with two cards was worse than a single card at times. Not the 290 or 280. It has been a couple of years (almost).

Ya, and today HD7950/7970/7990/280/280X CF flies.

280X CF has lower frame times than a single 980.
http://www.techspot.com/review/948-geforce-gtx-960-sli-performance/page3.html

HardOCP and PCPer have noted that R9 290/290X CF is super smooth, smoother than 780/780Ti SLI.

If you happen to play less popular PC games, NV's superior drivers are a total myth.



Not to mention that in the last 5 months, NV's driver support for Kepler has been MIA.

Right now, 780 is only 9% faster than an HD7970Ghz, R9 290X > 970 or 780Ti at 1440P, while a 980 is just 10% faster than an R9 290X.



Ironically, you also mention DSR, but the R9 290 series has VSR.

VSR shows better IQ than NV's DSR:

1. "In Assassin's Creed 4: Black Flag, the result falls surprisingly clearly in favor of AMD's VSR. For VSR manages the feat visibly better to address not only the flicker in 2,560 × 1,440. In the same breath, the game will also be displayed more sharply than the Nvidia solution.

2. In Bioshock: Infinite, there is a tie between DSR and VSR with minimal differences. So the image at VSR is slightly sharper.

3. In The Walking Dead, the result turns in 2,560 × 1,440. Anti-aliasing is a little better at DSR, otherwise there is minimal differences. While VSR provides the minimal sharper image"
http://www.overclock.net/t/1529509/computerbase-de-amd-vsr-against-nvidia-dsr-review

Not sure then how DSR is some selling feature for NV.
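
To make the comparison concrete: both DSR and VSR render the frame at a higher internal resolution and then filter it down to the display resolution. Here is a minimal sketch of that downsampling step, assuming a plain 2x2 box filter on a grayscale image (the real drivers use fancier filters; the exact filter choice is vendor-specific):

def downsample_2x(image):
    """Average each 2x2 block of a 2H x 2W grayscale image into H x W."""
    h, w = len(image) // 2, len(image[0]) // 2
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4.0
             for x in range(w)]
            for y in range(h)]

hires = [[0, 255, 0, 255],
         [255, 0, 255, 0],
         [10, 20, 30, 40],
         [50, 60, 70, 80]]           # 4x4 internally rendered frame
print(downsample_2x(hires))          # 2x2 frame sent to the display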

I have been very happy with my NVIDIA cards.

With an admitted bias I'm looking for a reason to consider AMD other than price/performance.

Well, on the NV side, the only good cards worth buying today are the GTX970/980. If you cannot afford those, try to find a used GTX780/780Ti, or wait for the GTX960Ti.

It's very hard to NOT consider price/performance because it affects IQ and FPS in games. For example, the R9 290's 45%+ faster performance will directly result in a better experience for you, since you'll have more options to raise graphical settings. If this is not important, then why even upgrade from a GTX560Ti in the first place, as it can still play games well at 1080p with medium settings?

Right now NV has HDMI 2.0, but unless you have a 4K HDMI TV, this doesn't matter. NV also has full DX12 support, but you don't have Windows 10, and by the time DX12 games come out, most cards today will be outdated unless you are running GTX970 SLI or faster. NV has TXAA, but it has a huge performance hit and poor IQ most of the time. If you can't live without PhysX, I guess it's a factor. If you love 3D gaming, then 3D Vision is a big bonus for NV.
 

Vesku (Diamond Member) · Joined Aug 25, 2005
cmdrdredd said:
This isn't true. Taken from an article about GameWorks in Watch Dogs: they don't hinder performance, they enable extra features for their cards and specific optimizations. They don't prevent anyone from optimizing for AMD hardware too.

Sounds like the publisher/studio has to pay if they want the ability to access and modify the GameWorks source code. That explains these tweets I found when the GameWorks discussion started with Watch Dogs:



https://twitter.com/repi/status/452812842132332544

How many publishers and studios are going to accept an Nvidia TWIMTBP deal for support and most likely $$$, and then send some $$ back to Nvidia, or possibly pay it all back and then some depending on where Nvidia sets the fees, just to be able to modify GameWorks? Heck, sometimes it will be the publisher who accepted the Nvidia branding deal, and a third-party studio might just be contracted to get the game made for $X.
 
Last edited:

digitaldurandal (Golden Member) · Joined Dec 3, 2009
If FreeSync turns out to be as good as or better than G-Sync, I would go with AMD.

If not, I would go with Nvidia for the built-in hardware codecs for ShadowPlay.
 

5150Joker (Diamond Member) · Joined Feb 6, 2002 · www.techinferno.com
Vesku said:
Sounds like the publisher/studio has to pay if they want the ability to access and modify the GameWorks source code. That explains these tweets I found when the GameWorks discussion started with Watch Dogs:

https://twitter.com/repi/status/452812842132332544

How many publishers and studios are going to accept an Nvidia TWIMTBP deal for support and most likely $$$, and then send some $$ back to Nvidia, or possibly pay it all back and then some depending on where Nvidia sets the fees, just to be able to modify GameWorks? Heck, sometimes it will be the publisher who accepted the Nvidia branding deal, and a third-party studio might just be contracted to get the game made for $X.


Johan is a hypocrite for saying anything, seeing as how he is pushing Mantle (not quite as open as first touted) and has a closed in-house engine not available for competitors to license. I view what NVIDIA is doing kinda like what console makers do to differentiate themselves. For example, Sony might give extra support to a company for a PS4-exclusive feature or timed exclusivity. NVIDIA does this through GameWorks to present a value to GeForce customers not found anywhere else.

Look at enhanced god rays, for example: they bring a very tangible visual difference to the game, and if GameWorks can help a 980 run them without a performance hit while AMD takes one, then the consumer will grab the NVIDIA card, and that will result in higher sales for NVIDIA. It's simple and good business sense at the end of the day.

To the OP's question: I'd pick NVIDIA because I had three generations of AMD cards (mobile parts, but removable MXM) that had various driver and hardware issues. Basically, it was three strikes and you're out for me. Simple things like putting the laptop to sleep and waking it properly didn't work across three generations of AMD drivers in CrossFire. Games stuttered in CrossFire (this was before they fixed their frame latency), and when I finally got tired of it all and convinced Dell to replace my Alienware laptop's CrossFire setup with an NVIDIA equivalent, all of a sudden all those issues went away, my system was 100% stable, and games were smoother. Never looked back after that.
 
Last edited:

Erenhardt (Diamond Member) · Joined Dec 1, 2012
I would take 1 NV and 1 AMD card and run a hybrid franken Scalable Fire Link Interface (xfire SLI).

Would go NV for bragging, but what is there to brag about if you didn't pay any extra for the same?
 

biostud (Lifer) · Joined Feb 27, 2003
I would probably go for AMD because FreeSync seems to have better potential than G-Sync. But since I don't plan on changing my monitor any time soon, NVIDIA would be fine too.
 

5150Joker (Diamond Member) · Joined Feb 6, 2002 · www.techinferno.com
Erenhardt said:
I would take 1 NV and 1 AMD card and run a hybrid franken Scalable Fire Link Interface (xfire SLI).

Would go NV for bragging, but what is there to brag about if you didn't pay any extra for the same?

You're seriously quoting, out of context, in your sig, an old post from 2007 where I was making a sarcastic, tongue-in-cheek comment about shills?
 

Sunaiac (Member) · Joined Dec 17, 2014
I'd go AMD:
- better drivers
- cards still supported after the all-new, shiny and expensive one comes out
- better multi-GPU scaling
- better multi-GPU technology with XDMA
- moving the APIs forward with Mantle
- new cards tend to bring something over previous ones (the exception is the 285); Maxwell made not bringing anything a rule for a whole range of cards
- general openness
- cards DO survive 3 years of service
 

Enigmoid (Platinum Member) · Joined Sep 27, 2012
RussianSensation said:
280X CF has lower frame times than a single 980.
http://www.techspot.com/review/948-geforce-gtx-960-sli-performance/page3.html

HardOCP and PCPer have noted that R9 290/290X CF is super smooth, smoother than 780/780Ti SLI.

You realize that frametimes are inversely proportional to framerate, all other things being equal.

Higher avg fps = generally lower frametimes.

280X CF is faster than a single 980 by a tiny bit, so it has lower frametimes. Though it is quite nice that 280X CF seems to scale so well.
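
To spell out the arithmetic: with perfectly even frame pacing, frametime is just the reciprocal of framerate, so "higher average FPS" and "lower average frametime" are the same statement. A quick sketch:

def frametime_ms(fps):
    """Average frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

for fps in (45, 60, 75, 144):
    print(f"{fps:>3} fps -> {frametime_ms(fps):5.1f} ms per frame")

# 60 fps -> 16.7 ms, 144 fps -> 6.9 ms. Average frametime adds nothing
# beyond average FPS; what matters is the distribution (spikes/variance).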

More data is needed, mainly if there are any spikes in frametime.
 

exar333 (Diamond Member) · Joined Feb 7, 2004
I still hold a grudge from the absolutely horrible drivers that AMD has had in the past... especially the drivers for their All-In-Wonder cards.

I've had a few flaky Catalyst driver versions recently as well, although some of that might come from me overclocking the card.

Honestly, this is pretty silly. That's like not buying an Intel CPU today because NetBurst was so bad, or not buying NV because you got tricked into purchasing an FX card 10 years ago...

Let it go!
 

smackababy (Lifer) · Joined Oct 30, 2008
Personally, I would pick Nvidia simply because their driver installer can do a clean install (remove the old driver before installing the new one) and AMD's doesn't do that. Plus, CCC is terrible.

Currently, I buy whichever card performs the best, is available, and has the best features.
 

sm625 (Diamond Member) · Joined May 6, 2011
If NVIDIA and AMD ran at exactly the same price/performance/heat/RAM ratios? I would go with Nvidia because they have the better memory/texture compression. If you can compress textures, leading to more effective bandwidth, with no loss of image quality, then that's a big bonus.
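
As a toy illustration of why lossless compression buys effective bandwidth (this is a generic delta-coding sketch, not NVIDIA's actual color-compression algorithm): neighboring pixels are usually similar, so storing small differences from an anchor value takes fewer bits than storing every raw value, and decoding reproduces the data bit-exactly.

def compress_tile(pixels):
    """Delta-encode a tile: anchor value plus per-pixel differences."""
    anchor = pixels[0]
    return anchor, [p - anchor for p in pixels[1:]]

def decompress_tile(anchor, deltas):
    return [anchor] + [anchor + d for d in deltas]

tile = [200, 201, 199, 200, 202, 201, 200, 198]  # smooth gradient
anchor, deltas = compress_tile(tile)
assert decompress_tile(anchor, deltas) == tile   # bit-exact: no IQ loss
# The deltas span -2..2, so a few bits each instead of 8: less traffic.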

I won't take a side on drivers since I've had problems with both. I must say I do not like the way Nvidia creates its own entire user account.
 