Has Maxwell Killed SLI?

Pandamonia

Senior member
Jun 13, 2013
433
49
91
Looking at SLI scaling for the 980 Ti, all I can see is around 50% +/-. THIS SUCKS.

I refuse to believe that it's bottlenecked when Tomb Raider still hits 90%+.

Ever since my 780 SLI setup I have noticed that SLI has gotten worse in both performance and bugs. Seems like even the new GPUs are continuing that trend, but the scaling is back to 4870X2 days. You are losing 50% of the power of the second card!??!?
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Maybe they are starting to move their focus to NVLink stuff that will replace SLI in the next pascal chips.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
nVidia's piss-poor scaling in SLI (or maybe it's simply AMD's vast improvement in it) has had me seriously considering switching back to Team Red lately. If I were using a 4K display it would probably be enough to do the trick, but from what I've seen, 980 Ti SLI is still superior in most cases at 1080p, and it looks like pretty much a wash at 1440p.

Even at that, let's say single-card performance were a dead heat... You still have to weigh AMD's superior scaling against the fact that they are historically much slower than nV to support new releases with working profiles. It makes it very frustrating for multi-GPU users to come to a clear-cut decision on which way to go.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
People probably won't believe me, but I think their drivers are the culprit here, since multi-GPU scaling mostly depends on exactly that ...

Despite Nvidia having purchased the creator of the technology, their competition does it best: they are the ones innovating the most in this area, with "Super Tiling" and "XDMA" putting tons of multi-GPU scaling issues to rest, and they likely have the better drivers for it too when it comes to performance ...

SLI hasn't changed in god knows how long ...
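
To put rough numbers on why driver overhead caps scaling, here's a toy AFR model; the 20 ms render time and the sync costs are invented for illustration, not taken from any real driver:

```python
# Toy model of alternate-frame rendering (AFR) scaling.
# Assumption (illustrative only): each frame needs `render_ms` of GPU work
# that splits evenly across GPUs, plus `sync_ms` of serial driver/sync
# overhead per frame that does not parallelize.

def afr_fps(render_ms: float, sync_ms: float, gpus: int) -> float:
    frame_ms = render_ms / gpus + sync_ms
    return 1000.0 / frame_ms

single = afr_fps(20.0, 0.0, 1)  # 50 fps baseline, hypothetical workload
for sync_ms in (0.0, 2.0, 4.0):
    dual = afr_fps(20.0, sync_ms, 2)
    print(f"{sync_ms:.0f} ms sync overhead -> +{dual / single - 1:.0%} from the second GPU")
```

A couple of milliseconds of serial overhead per frame is enough to drag the second card's contribution from +100% down toward the ~50% the OP is seeing, which is why profiles and driver work matter so much ...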
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
People probably won't believe me, but I think their drivers are the culprit here, since multi-GPU scaling mostly depends on exactly that ...

Despite Nvidia having purchased the creator of the technology, their competition does it best: they are the ones innovating the most in this area, with "Super Tiling" and "XDMA" putting tons of multi-GPU scaling issues to rest, and they likely have the better drivers for it too when it comes to performance ...

SLI hasn't changed in god knows how long ...

Slight correction: 3dfx SLI is completely different from nVIDIA's. Only the acronym is the same.

And you're right. CF has come a LONG way. Remember those master/slave configs with external dongles? I remember ATI being caught with their pants down by the NV45 and that mysterious side connector on the PCB. But after all these years the tables have turned, with nVIDIA getting caught out by XDMA technology.

Simply put, a solution that was "good enough" on all fronts (scaling, frame times, etc.) against its competition has now been surpassed by a superior one that makes it no longer good enough.

That means we should get to see nVIDIA release something new on that front! Hopefully with the Pascal generation...
 

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
Memory bandwidth is the issue for Nvidia: in SLI, if you boost your memory clock substantially, the improvements in performance are linear. Maxwell would benefit greatly from HBM.
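
For reference, peak bandwidth is just bus width times per-pin data rate, which is why a memory OC moves it directly. A quick sketch using public spec-sheet numbers:

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * data rate
# per pin. Spec-sheet figures; a memory overclock raises the per-pin rate,
# so bandwidth (and anything bandwidth-bound) moves with it directly.

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"GTX 980 Ti, GDDR5 stock:    {bandwidth_gb_s(384, 7.0):.0f} GB/s")
print(f"GTX 980 Ti, +1 Gbps mem OC: {bandwidth_gb_s(384, 8.0):.0f} GB/s")
print(f"Fury X, HBM1:               {bandwidth_gb_s(4096, 1.0):.0f} GB/s")
```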
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Between potential new interfaces (both to GPUs and between GPUs), DX12, and AMD's very successful CrossFire scaling efforts, multi-GPU has quite an interesting future ahead of it. Heterogeneous iGPU + dGPU scaling will be very interesting too. Finally get some value out of that iGPU silicon you're sitting on.
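
As a sketch of what explicit heterogeneous scaling could look like: a throughput-proportional work split decided by the engine rather than the driver. The device numbers below are made up, and this isn't any engine's actual scheduler:

```python
# Hypothetical load split for explicit multi-adapter rendering: the engine
# assigns each device a share of the frame proportional to its throughput.
# All figures below are invented for illustration.

devices = {"dGPU": 6.1e12, "iGPU": 0.8e12}  # assumed peak FLOPS
total = sum(devices.values())

for name, flops in devices.items():
    print(f"{name}: {flops / total:.0%} of the frame's work")
# Even a ~12% share could offload post-processing to the iGPU while the
# dGPU renders the scene: value from silicon already in the box.
```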
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Between potential new interfaces (both to GPUs and between GPUs), DX12, and AMD's very successful CrossFire scaling efforts, multi-GPU has quite an interesting future ahead of it. Heterogeneous iGPU + dGPU scaling will be very interesting too. Finally get some value out of that iGPU silicon you're sitting on.

Assuming game devs put in the elbow grease to make use of it.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Yeah... money talks for sure. I'm hoping that the major engine devs (Unity, Unreal, CryEngine) go all out on DX12 features so that a good portion of games can inherit the functionality without much/any additional work
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Yeah... money talks for sure. I'm hoping that the major engine devs (Unity, Unreal, CryEngine) go all out on DX12 features so that a good portion of games can inherit the functionality without much/any additional work

Hopefully so. My favorite indie games will surely benefit most from this. And CIG has to put SC on DX12; there's no other way they can achieve goals of that scope.
 

Squeetard

Senior member
Nov 13, 2004
815
7
76
I'm done with SLI and Xfire. Pain in the butt, and microstutter ruins whatever framerate advantage you think you have.

I recently disabled one of my 7970s because one of the games I was playing (Skyforge) did not like it. I forgot to re-enable it, and when I went into my other games (ArcheAge, Witcher 3, SWTOR, DA:I, FC4) I noticed that gameplay and graphics were way smoother, even at half the FPS.

SLI and Xfire are snake oil, I tell you.

BTW, before that I had a 590 Hydro Copper and it was the same thing.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
It's always been gimmicky to me. I'd rather just buy one big GPU than deal with the issues and inconsistency of linked GPUs.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
Well, nowadays a single GPU is powerful enough.

I think this is Nvidia's strategy in a nutshell. SLI / CF has forever and always been buggy, even going back to 3dfx.

Getting to smaller nodes with more efficient architectures is the way to go. Too many people think efficient means weak, but when you have that efficiency and build something big (a GPU, car engine, steam turbine, CPU, anything that takes power and does work), you get loads of output that less efficient designs can't touch.

I think this is why you don't see any dual GPU Nvidia cards now.
 

moonbogg

Lifer
Jan 8, 2011
10,727
3,416
136
Without SLI my 144hz panel would be kind of lame. I like SLI. I'd rather have it than not.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
The next generation of APIs will hopefully usher in a new era for SLI/CF by significantly reducing CPU overhead and making it easier for devs to program for it... but we'll see.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
SLI with 144Hz G-Sync is perfect (as it should be for the cost) when SLI works. That is, if SLI works and the only issues are microstutter and hitches, then G-Sync at 144Hz seems to eliminate them (exceptions to that are Bioshock Infinite and Wolfenstein). Either 144Hz and/or G-Sync helps with SLI frame pacing.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
The next generation of APIs will hopefully usher in a new era for SLI/CF by significantly reducing CPU overhead and making it easier for devs to program for it... but we'll see.

Hopefully. I can't see SLI carrying on in its current state for much longer. AMD's XDMA engine scales much better, especially at higher resolutions. Either Nvidia needs to improve their performance and scaling or they need to drop it altogether. DX12 could stave off the inevitable for another year or two, but it's not going to solve all of their problems.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
I've run SLI/CF in various configurations. In general Nvidia seemed quicker to support games with a profile; however, AMD sure seems quicker now than even a year ago in releasing CF profiles (I think Dr. Lisa Su cleaned house and told people to shape up).

moonbogg just jumped from 680 SLI to 980 Ti SLI, and with good reason: he wanted a big jump, a new core, increased RAM, etc. Good move. I replaced two 290s in CF (actually put them in my 3770K rig) with a single GTX 980 Ti, hardly an upgrade. I did it for a different reason: I wanted to go single GPU instead of dual GPU, with a card strong enough to just about equal the two it was replacing, and one with more VRAM.

The Titan X fit the bill but was simply too salty. Shortly after that the GTX 980 Ti appeared and I was tempted, but the Fury X was due to arrive, so I waited.

Since I custom water cool, the quick adoption of water blocks for the 980 Ti helped its cause, but the announcement that the Fury would have a block made me wait.

I ultimately purchased the EVGA GTX 980 Ti SC because of the bumped-up stock core (1102 MHz vs 1000 MHz), the increased VRAM (6GB GDDR5 vs 4GB HBM), and the better OCing capacity (mine can run a +150 core offset, 1252 MHz from the 1102 base, and +300 on the memory all day, versus the Fury X/Fury's present apparent inability to OC well). I understand the author of MSI Afterburner is working on an OCing tool, but I still thought the 980 Ti made more sense.

To each his own!
 

Jumpem

Lifer
Sep 21, 2000
10,757
3
81
I wanted to try 4K and was thinking about 980 Ti SLI. Now I am reading about these issues, but I am not sure a single 980 Ti would cut it for some games.
 

Berryracer

Platinum Member
Oct 4, 2006
2,779
1
81
I wanted to try 4K and was thinking about 980 Ti SLI. Now I am reading about these issues, but I am not sure a single 980 Ti would cut it for some games.
Yeah, on my single 980 Ti I get around 153,000 in Ice Storm, but once I enable SLI that score drops to 73,000.

All other benchmarks are fine, but it's kinda buggy, you see.
 

kasakka

Senior member
Mar 16, 2013
334
1
81
I find that amazing. Is it the games, or going to higher framerates, 144Hz, or 4K?

There's simply still no single card capable of running games at 4K with full detail settings. Wait a few years and the problem should go away, and hopefully by then we'll have 120+ Hz 4K screens too.
 

moonbogg

Lifer
Jan 8, 2011
10,727
3,416
136
There's simply still no single card capable of running games at 4K with full detail settings. Wait a few years and the problem should go away, and hopefully by then we'll have 120+ Hz 4K screens too.

There should never be a card capable of running games maxed at 4K, ever. If one comes along, you have to wonder why it's suddenly possible. GPU manufacturers are limited by the manufacturing node to a large degree, aren't they?
There is a practical wall that they will hit. So, with that wall having always been in place, and knowing it will always be there, why would a card suddenly get magically faster than the manufacturing process allows?
The only way it happens is if game developers and GPU manufacturers establish a new target and optimize the games and hardware to run well at the higher resolution, simply because people will be buying those higher-res screens and that's where the money is.
If all people have are 1080p monitors, then stressing a GPU at 1080p will be the target and the goal. If everyone has 4K monitors, then they will make games run well at that res. They could do it now, but they don't.
So, if games run well at 4K in a few years, then the graphics of those games probably won't be much better than today's, because the extra GPU muscle will just be spent on resolution. Developers will spend their GPU power budget on resolution rather than graphical improvements. What other choice would they have? If the target stays at 1080p/1440p, they can spend the budget on graphical fanciness, but you can't have both with a hard limit on GPU power increases due to manufacturing limitations.

I think it will be a few years before GPUs can play today's latest games well at 4K, and several more years before 4K games both run well and look better than today's games. So, six years from now, when you buy a new game and it runs at 60fps on your 4K screen while it gets 150fps on an old 1440p screen, just realize how much better that game would have looked if they had designed it for 1440p and spent the GPU budget on graphics rather than resolution.
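
The pixel math backs this up. A quick check (assuming frame rate scales roughly with pixel count, which is a simplification but in the right ballpark):

```python
# Pixels per frame at common resolutions. To a first approximation,
# fill-rate/shading cost scales with pixel count, so resolution alone
# eats GPU budget before graphics quality can improve.

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 2560 * 1440

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1440p")
# 4K is 2.25x the pixels of 1440p, which is roughly the gap between that
# hypothetical 150 fps at 1440p and 60-ish fps at 4K.
```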
 

Pwndenburg

Member
Mar 2, 2012
172
0
76
The news never seems to be good on either side of the fence: Nvidia seems to scale like crud, and AMD gets support late. That's why I have always ultimately decided just to get the fastest single card I can afford at the time. But I certainly understand why you would want SLI if you need 144Hz or high res. Glad I'm not as sensitive to these things as some users are. I guess I'm lucky that I'm easy to please.
 