GameGPU: Killer Instinct DX12 Benchmarks (980 Ti vs 290X)


thesmokingman

Platinum Member
May 6, 2010
2,307
231
106
That's not at all what it says. The editor states that, due to circumstances related to AMD, Fury-based video cards will be absent from future testing for several weeks. The most logical conclusion is that GameGPU received hardware review samples directly from AMD (which is why they had no Fury cards in their charts for months after launch), and now AMD wants to rotate those cards by giving them to another reviewer in Russia. It also explains the lack of the i5 6600K, i7 6700K, R9 390, 390X, etc., since those products were never sent to them.


Driver issue sounds more plausible, am I right? :sneaky:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Thanks RS for the translation.

Any time!

It does highlight a point I have not been happy with for years -- reviews are written based on free samples provided to the reviewer, as opposed to the reviewer purchasing the product as we would. Inherently that makes it more difficult to provide an objective opinion on the purchase. GameGPU at least sidesteps this since they almost never recommend cards; they test game performance rather than doing GPU hardware reviews in the traditional sense. On the flip side, if Intel/AMD/NV don't like the showings in reviews, they can stop sending future samples, thus influencing the reviewer. It also means you may get early launch samples the minute the latest hardware comes out if you are on their good side. Unfortunate how this industry operates.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Any time!

It does highlight a point I have not been happy with for years -- reviews are written based on free samples provided to the reviewer, as opposed to the reviewer purchasing the product as we would. Inherently that makes it more difficult to provide an objective opinion on the purchase. GameGPU at least sidesteps this since they almost never recommend cards; they test game performance rather than doing GPU hardware reviews in the traditional sense. On the flip side, if Intel/AMD/NV don't like the showings in reviews, they can stop sending future samples, thus influencing the reviewer. It also means you may get early launch samples the minute the latest hardware comes out if you are on their good side. Unfortunate how this industry operates.

That is why I appreciate Termie's articles.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
That's not at all what it says. The editor states that, due to circumstances related to AMD, Fury-based video cards will be absent from future testing for several weeks.

In other words, your guess is as good as his on why the Fiji cards are missing.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,450
10,119
126

Well, the CPU requirements, or lack thereof, are going to be good news for my friend who wants to play KI but doesn't have an XBO. His PC has an Athlon II X4 at 3.0 GHz, which we could probably OC a little bit if needed. His current GPU is a GT 610, but he already knows he needs to replace it with something more powerful to actually play games.

Looks like a 270X would be the minimum to play locked at 60 FPS at 1080p?
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Well, the CPU requirements, or lack thereof, are going to be good news for my friend who wants to play KI but doesn't have an XBO. His PC has an Athlon II X4 at 3.0 GHz, which we could probably OC a little bit if needed. His current GPU is a GT 610, but he already knows he needs to replace it with something more powerful to actually play games.

Looks like a 270X would be the minimum to play locked at 60 FPS at 1080p?

A 370 can do it.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
You get a better translation from Russian to Slovenian. Slovenian I understand.

They say that the graphics cards from AMD's Fury family weren't tested, because if they had been, the testing would have taken forever.

The translation from the Russian:
Fury is out of the tests for an unknown period, due to some reasons on AMD's side.

I'm thinking it might be that they don't have a Fury sample anymore, so they can't test? Not sure, the translation is hard to understand.


It's now capped at 60 fps as of 2 days ago:


KI for Windows 10 now has improved support for monitors with refresh rates higher than 60hz.

http://shoryuken.com/2016/04/11/kil...h-addresses-kan-ra-and-glacius-glitches-more/
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Any time!

It does highlight a point I have not been happy with for years -- reviews are written based on free samples provided to the reviewer, as opposed to the reviewer purchasing the product as we would. Inherently that makes it more difficult to provide an objective opinion on the purchase. GameGPU at least sidesteps this since they almost never recommend cards; they test game performance rather than doing GPU hardware reviews in the traditional sense. On the flip side, if Intel/AMD/NV don't like the showings in reviews, they can stop sending future samples, thus influencing the reviewer. It also means you may get early launch samples the minute the latest hardware comes out if you are on their good side. Unfortunate how this industry operates.

Ah, I see you thought the same thing: they lost their samples.

I do agree, and you have the same issue on the flip side: if you've paid a lot of money for your cards, you'll want them to do better to prove to yourself that you made a good purchase. You see this reasoning in [H] reviews.
 

TheELF

Diamond Member
Dec 22, 2012
3,993
744
126
The 980 Ti continues to perform really well. It's the rest of the Maxwell stack that's falling apart, while Kepler went on vacation a long time ago. The 680 can barely keep up with a 7870/800 MHz 7950 in some games. The 780/OG Titan are utter failures.

Well, the 680 was a lot faster in the games of the day, but it has similar compute power to the 7870, so I don't get how anybody could be confused about why it's performing on the same level now.
http://gpuboss.com/gpus/Radeon-HD-7870-vs-GeForce-GTX-680

The 980 Ti has ~double the compute of the 680 and gets ~double the framerate; NV's 1080 is supposed to have double the compute of the 980...
http://gpuboss.com/gpus/GeForce-GTX-980-Ti-vs-GeForce-GTX-680
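
To put rough numbers on that, here's a quick back-of-the-envelope Python sketch (theoretical FP32 rate = shaders x 2 ops/clock x clock, using the published reference base clocks and ignoring boost and architectural efficiency, so treat it as illustration only):

# Rough theoretical FP32 throughput: shaders * 2 ops/clock (FMA) * clock.
def fp32_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

cards = {
    "HD 7870":    (1280, 1000),  # GCN
    "GTX 680":    (1536, 1006),  # Kepler, base clock
    "GTX 980 Ti": (2816, 1000),  # Maxwell, base clock
}

for name, (shaders, clock) in cards.items():
    print(f"{name:>10}: ~{fp32_gflops(shaders, clock):,.0f} GFLOPS")

# Prints roughly 2,560, 3,090 and 5,632 GFLOPS -- the 980 Ti lands at ~1.8x the 680,
# which lines up with the ~double framerate observation above.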
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
The 980 Ti continues to perform really well. It's the rest of the Maxwell stack that's falling apart, while Kepler went on vacation a long time ago. The 680 can barely keep up with a 7870/800 MHz 7950 in some games. The 780/OG Titan are utter failures.

If I had a 980 Ti though, I would be strongly considering selling it the 1st week of June just to preserve most of the capital and transfer the resale value into GP104. That's better than watching a $650-700 780 Ti plummet to $350-400 almost overnight, as happened when the 970/980 launched.

Thanks RS, good advice as always. I do have a few months to think it over and see what other rumors come up about GP104 and GP100/102 before I decide to sell.

I ended up getting a great deal on 980 Tis: $529 shipped each, and these are pretty nice ones. They are Gigabyte G1 Gaming cards with a custom PCB and triple-fan cooler, and since I got them for less than the 980 cost at launch, I am pretty happy with the performance for the money. Thanks to Maxwell's generous OC capability I am already 30% quicker than a stock 980 Ti too.

My assumption at this time is that GP104 will offer the same cost/performance ratio as what I got with my 980 Tis. A high-end GP104 with GDDR5X might be slightly faster, but I expect it to cost slightly more too ($600?). A lower-end GP104 with just standard GDDR5 and a 256-bit bus will probably be slower, but at a cheaper price too.
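
Just to make that cost/performance hand-waving concrete, here's the quick math I'm doing in my head (the GP104 price and the relative-performance figure are guesses, not leaks):

# Hypothetical perf-per-dollar comparison; GP104 numbers are pure guesses.
cards = {
    "980 Ti G1 (as bought)":  {"price": 529, "rel_perf": 1.00},
    "GP104 + GDDR5X (guess)": {"price": 600, "rel_perf": 1.15},  # assumed ~15% faster
}

for name, c in cards.items():
    print(f"{name}: {c['rel_perf'] / c['price'] * 1000:.2f} perf per $1000")

# ~1.89 vs ~1.92 -- essentially the same cost/performance ratio under those assumptions.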

Now, that doesn't account for any potential driver issues like what we saw with Kepler, and GP104 will have more memory, but 6GB seems to be fine for 4K for now. I am hoping that Maxwell won't fall off a cliff like Kepler did, thanks to enough architectural similarities with Pascal, but that is TBD.

Anyway, I'll wait and see what happens with the GP104 rumors. More info is being leaked by the day, so I should know a lot more by June to decide whether the performance benefit is worth the trouble of reselling. Barring something amazing from Pascal, I am looking ahead towards Volta/Vega at this point; I think those will be the cards that represent a meaningful upgrade for me. Pascal will be a great upgrade for anyone who isn't on Maxwell and wants to stick with team Green.

Thanks again.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Seems like the GCN effect is in full force. AMD this gen is it, as consoles squeeze as much as they can out of their puny APUs. I was going to buy a 390 for Doom 4, but now I just might wait for the new GPUs.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Seems like the GCN effect is in full force. AMD this gen is it, as consoles squeeze as much as they can out of their puny APUs. I was going to buy a 390 for Doom 4, but now I just might wait for the new GPUs.

I'm a little tired of people just matter-of-factly stating this. No one knows that this is the reason AMD's architecture is excelling in newer games. There have been plenty of pure console ports that did comparatively better on NV's hardware.

Is it so crazy to think at least some of the residual performance improvements are due to GCN being a good architecture on its own?

I believe that had the consoles not even existed, GCN would be doing just as well compared to Nvidia's Kepler and Maxwell architectures.
 

Adored

Senior member
Mar 24, 2016
256
1
16
I believe that had the consoles not even existed, GCN would be doing just as well compared to Nvidia's Kepler and Maxwell architectures.

If this is true, what changed? The DX12 results are even ahead of what I believed they would be - mostly because I had little faith in AMD being able to push their agenda in the way that Nvidia can.

The real surprise is that many games and devs that wouldn't normally be friendly to AMD hardware are showing unbelievable reversals.

There is a lot going on though. It's the console effect, DX12 multithreading, a little bit of async and better dev relations. Possible that AMD has improved DX11 drivers too.

We're seeing the true performance of GCN now yes, but it's been too late for 28nm.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
The DX12 results are even ahead of what I believed they would be - mostly because I had little faith in AMD being able to push their agenda in the way that Nvidia can.

I think this is again the wrong way to look at it. GCN is factually proven to be the more forward-looking architecture. It's merely that its "time has come", so to speak. You build an architecture that looks to the future... the future has to occur eventually, haha.

There is a lot going on though. It's the "console effect", DX12 multithreading, computationally heavy techniques such as async that AMD was prepared for, and better game-optimized drivers.

Cleaned that up a bit, but I agree that it's a combination of all these things.

We're seeing the true performance of GCN now yes, but it's been too late for 28nm.

Actually it's right on time! Think about it this way: those who bought a 7970 or a 680 way back at the dawn of 28nm both got reasonable value for their purchases over the life of 28nm. Now that 28nm is being phased out, the 7970 GCN buyer can easily stretch to Vega 10/GP100 (or whatever their chosen upgrade path is) for another 12 months. The 680 owner will be struggling with many titles even at lowered settings. That last 6 or 12 months of ownership is a huge difference for the consumer who wants 3+ years of use. In my opinion that is make or break for the value shopper.

^In my book that is perfect timing for us consumers. I don't frequent these forums as a financial analyst, business analyst, or marketing manager; I just look at what's best for me and other enthusiasts. In THAT light, AMD's decision to build GCN the way it did was perfect timing.
 
Last edited:
Feb 19, 2009
10,457
10
76
If this is true, what changed? The DX12 results are even ahead of what I believed they would be - mostly because I had little faith in AMD being able to push their agenda in the way that Nvidia can.

The real surprise is that many games and devs that wouldn't normally be friendly to AMD hardware are showing unbelievable reversals.

There is a lot going on though. It's the console effect, DX12 multithreading, a little bit of async and better dev relations. Possible that AMD has improved DX11 drivers too.

We're seeing the true performance of GCN now yes, but it's been too late for 28nm.

You're right, it is a combination effect. But it's not just about AMD finally extracting performance out of it. GCN as a uarch has been a compute and graphics beast from the start. Its compute latency at the ALU level is much lower than Kepler's, and lower than Maxwell's. It has real priority context preemption, something Pascal has only just finally introduced. Its shaders can switch instantly between graphics and compute work, with no slow context switch required. This advantage alone means that as games become more compute-heavy, Kepler and Maxwell will stall more, waiting on the context switch, while GCN just powers on. This is a very forward-looking uarch, made for an era where there are more compute effects and more processing in games.

However, the AMD ecosystem that developers of next-gen games are using and have to optimize for also gives them an advantage. This is AMD's long-term chess play. It's going to bear fruit for them for the next few years at least, and if they manage the APU win for the next consoles, then that's another era in the bag. This is obvious even in cross-platform DX11 games; GCN has been performing above its tier lately.

DX12 adds to this with multithreaded rendering (MTR) and async compute (AC), both features that extract performance from the silicon that was left untapped in DX11.

GCN remains their uarch, just iterated and improved, so Polaris & Vega will benefit from these effects. The improvements in GCN 4's hardware scheduler will help ensure this happens, as it is dynamic enough to utilize wavefronts of 4 to 64 threads at peak and distribute work optimally to SIMDs/ALUs on a per-thread basis. They are designing the hardware to be robust and flexible enough that it needs less driver intervention to hit peak performance.
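
If it helps to picture the context-switch argument, here's a toy frame-time model (not a simulation of any real GPU or driver -- the millisecond figures and switch penalty are made-up illustrative values) comparing serialized graphics+compute that pays a switch cost against compute that overlaps asynchronously:

# Toy frame-time model; all numbers are illustrative, not measured.
GRAPHICS_MS = 10.0       # graphics work per frame
COMPUTE_MS = 4.0         # compute passes per frame
SWITCH_MS = 0.5          # assumed cost per graphics<->compute context switch
SWITCHES_PER_FRAME = 6   # how often the frame alternates between the two

def serialized_ms():
    """Graphics and compute run back-to-back, paying the switch cost each time."""
    return GRAPHICS_MS + COMPUTE_MS + SWITCH_MS * SWITCHES_PER_FRAME

def overlapped_ms(overlap=0.75):
    """A fraction of the compute work hides behind graphics (async-style overlap)."""
    return GRAPHICS_MS + COMPUTE_MS * (1.0 - overlap)

for label, ms in (("serialized", serialized_ms()), ("overlapped", overlapped_ms())):
    print(f"{label}: {ms:.1f} ms per frame ({1000.0 / ms:.0f} fps)")

# ~17 ms (59 fps) vs ~11 ms (91 fps) here; the more compute per frame,
# the bigger the gap between the two models grows.

Obviously real GPUs are far more complicated than this, but it shows why extra compute per frame hurts more when it can't be overlapped.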
 
Last edited:

thesmokingman

Platinum Member
May 6, 2010
2,307
231
106
We're seeing the true performance of GCN now yes, but it's been too late for 28nm.


What? What do you mean "seeing" as in now, the present? 7970s have always been faster than their Nvidia counterparts, sometimes a lot faster. Nvidia played the whole boost-clock thing to win perfect-scenario benchmarks, winning the war of perception.
 

Adored

Senior member
Mar 24, 2016
256
1
16
What? What do you mean "seeing" as in now, the present? 7970s have always been faster than their Nvidia counterparts, sometimes a lot faster. Nvidia played the whole boost-clock thing to win perfect-scenario benchmarks, winning the war of perception.

I mean now, yes, with the 390X regularly beating the 980; the 390 mostly thrashes the 970 now, ditto the 380 vs the 960.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
I think this is again the wrong way to look at it. GCN is factually proven to be the more forward-looking architecture. It's merely that its "time has come", so to speak. You build an architecture that looks to the future... the future has to occur eventually, haha.

And what happens if the future isn't the same as you predicted? It becomes a waste, just like OpenGL ...

OpenGL was once forward-looking, but you don't see many games using it because of the mistakes that compounded in its design ...

Predicting the future is a VERY dangerous thing to do, and look where it's gotten AMD with their market-share predicament; they should learn to capitalize more aggressively on what they have right now ...

Had Killer Instinct been a DX12 game, AMD would have had another Quantum Break style domination on their hands, a la 390 = 980 Ti OC ...
 

Durp

Member
Jan 29, 2013
132
0
0
FreeSync monitors are ~$150 cheaper than their G-Sync equivalents, and now many newer games are showing a strong performance advantage on the AMD side. At this moment it's easy to choose red over green.

You might want to look into that, Nvidia.
 
Feb 19, 2009
10,457
10
76
I mostly mean with the 390x regularly beating the 980, the 390 mostly thrashes the 970 now, ditto 380 vs 960.

Of course, the silicon is now performing where it was originally designed to be; before, it was crippled by an API that it was NOT made for.

GCN was always made for a console-like API because of how closely AMD worked with Sony and MS on its design. This goes way back to 2009, when both Sony and MS engineers collaborated on GCN.

But, despite running crippled in DX11, we see that these SKUs hold their own for many years.

Just go back to the start: 7970 ~= 680. 7970 GHz > 680.

R9 290 ~= 780; the 290 was $400 vs the $500 780.
R9 290X < 780 Ti; the 290X was $549 (or $500) vs the $699 780 Ti.

It wasn't as if the 290/X weren't competitive. Even in the Kepler era, AMD managed to rise to 37% market share and was on an upward trajectory until Maxwell shut it down.

Because next-gen games are more compute-heavy and more GCN-optimized, AMD didn't need anything new when Maxwell rolled out; they just rebadged Hawaii, and suddenly the 390/X were competitive vs the 970/980 at their prices at the time, and now they leave them in the dust.

Hawaii has stood the test of time: it beats Kepler and Maxwell, and it's going to be routine for it to rival the Titan X/980 Ti in the DX12 era.

Polaris is the big leap for GCN: everything is improved, plus the node change. It will take the strong GCN foundation to the next level. Basically, in the DX12 era, Polaris will stomp on its competitors.
 

Adored

Senior member
Mar 24, 2016
256
1
16
Well, I look at it overall, not just at performance. Yes, Hawaii is putting Maxwell to the sword on performance, but in perf/Watt it still only looks about as good, and sometimes it's still behind.

I feel more like AMD has corrected a wrong rather than made a great leap forward.
 