AMD's Tonga - R9 285 (Specs) and R9 285X (Partial Specs)


FatherMurphy

Senior member
Mar 27, 2014
229
18
81
Seriously? I'm not entirely sure what you are saying, but I think the gist is that you dispute my assertion that the Kepler architecture scaled nearly linearly in performance/watt.

Go look at the performance and average power numbers for GK107, GK104, and GK110. It's nearly linear, which is surprising, awesome, and very different from the Fermi architecture.

If I'm wrong, someone please point that out. I thought this was widely known. Facts are facts, no need to get your dander up over it.

If you have a coherent reason for doubting that Maxwell will do the same... let's hear it.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
Was hoping for $199 on the 4GB flavor. It would make a really nice, inexpensive, low-power CrossFire setup. I am disappointed.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
Yup, Kepler had great scaling: http://tpucdn.com/reviews/Sapphire/R9_290_Vapor-X/images/perfwatt_1920.gif

That is from the latest TPU review; at 1080p, GK110 is actually a bit better in performance/watt than the smaller chips. Actually, it almost looks as though it goes GK110 > GK104 > GK106 > GK107. Never noticed that before...

Kepler has great scaling because the ratio of ROPs/shaders/TMUs etc. is almost the same across every chip. And because the DP shaders are separate units from the SP shaders, and are turned off in GeForces while gaming (I don't really know how this works, but I remember reading something about it). GCN is different: I think the same SP shaders are used for DP, and having more complex units in the high-end chips makes them lose efficiency.

About Tonga: its front end is much bigger than Tahiti's, so it can feed its shaders much better.
Looking only at its 190W TDP and its Firestrike performance, it looks much more efficient than all other GCN and all Kepler chips (it's only 10W more than the 270X, with a much, much higher score). I expect about a 25-30% efficiency uplift over any Kepler-based GPU. That's not enough to catch little Maxwell, but it looks good enough to kill any GK104- and Tahiti-based cards. If the chip is small enough, AMD can adjust its price and still score a win when GM104 arrives (I expect GM104 to be considerably larger because of its big caches).
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
AMD's very own GK104, 2.5 years late. It does almost great (price-wise) against the GTX 760.
Sure, it can surprise us with a few % here and there, but all in all this won't do against Maxwell, so AMD will once again have to play the low-price game.
But at least this time around they have a cheap and easy-to-manufacture chip/graphics card.

Then again, it can't be this bad, so my guess is IT WON'T be.
But right now, mm2 is the only saving grace I can think of.
#confused :|
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
AMD's very own GK104, 2.5 years late. It does almost great (price-wise) against the GTX 760.
Sure, it can surprise us with a few % here and there, but all in all this won't do against Maxwell, so AMD will once again have to play the low-price game.
But at least this time around they have a cheap and easy-to-manufacture chip/graphics card.

Then again, it can't be this bad, so my guess is IT WON'T be.
But right now, mm2 is the only saving grace I can think of.
#confused :|

Hopefully size doesn't dictate performance



Guess maybe they could use it for an SFF Steam box.
 
Last edited:

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Looking only at its 190W TDP and its Firestrike performance, it looks much more efficient than all other GCN and all Kepler chips (it's only 10W more than the 270X, with a much, much higher score). I expect about a 25-30% efficiency uplift over any Kepler-based GPU.

Doesn't the GTX 680 do something like 6600 in Firestrike with a 195W TDP? Seems more like 10%.

We'll see in the reviews; TDP and Firestrike aren't power consumption and gaming performance, of course.
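For what it's worth, here's that back-of-envelope in Python, using only the numbers mentioned in this exchange (roughly 6600 Firestrike at a 195W TDP for the GTX 680, and the 285's 190W TDP). Since the 285's actual score and real power draw aren't known yet, this only shows what score each claimed uplift would imply:

```python
# Back-of-envelope Firestrike points per TDP-watt, using only figures from this thread.
gtx680_score, gtx680_tdp = 6600, 195   # GTX 680: ~6600 Firestrike at a 195 W TDP (as quoted above)
r9_285_tdp = 190                       # R9 285: 190 W stated TDP

gtx680_ppw = gtx680_score / gtx680_tdp  # ~33.8 points per TDP-watt

for uplift in (0.10, 0.25, 0.30):
    needed = gtx680_ppw * (1 + uplift) * r9_285_tdp
    print(f"a {uplift:.0%} perf/W uplift at 190 W implies ~{needed:.0f} Firestrike points")
```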
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
But right now, mm2 is the only saving grace I can think of.
#confused :|

mm2 size doesn't really matter when the process is very mature and cheap. Whether the chip is $10 or $15 in a $250 card is more or less irrelevant. Having 1GB less VRAM will save them more.

But it's getting quite clear that Maxwell is a "Conroe".
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
mm2 size doesn't really matter when the process is very mature and cheap. Whether the chip is $10 or $15 in a $250 card is more or less irrelevant. Having 1GB less VRAM will save them more.

But it's getting quite clear that Maxwell is a "Conroe".

Of course it matters... A LOT!
First of all, the chip is more like $30 or $40, so it's a fixed cost carrying a chunky piece of the total graphics card value.
Then, the smaller the die, the more usable parts you can get per wafer.
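A rough sketch of that wafer math in Python; the formula is the standard first-order approximation, and the die areas and defect density below are made-up round numbers for illustration, not actual Tonga/Tahiti figures:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Approximate candidate dies per wafer: wafer area / die area, minus edge losses."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def yielded_dies_per_wafer(die_area_mm2, defects_per_cm2=0.2):
    """Poisson yield model: smaller dies catch fewer defects, so more of them work."""
    die_yield = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)
    return gross_dies_per_wafer(die_area_mm2) * die_yield

# e.g. a Tahiti-sized die vs. a hypothetical smaller Tonga-sized die (illustrative areas)
for area in (360, 300):
    print(f"{area} mm2: {gross_dies_per_wafer(area)} candidates, "
          f"~{yielded_dies_per_wafer(area):.0f} good dies per 300 mm wafer")
```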

And finally, it's an engineering feat, with AMD being TDP-bound instead of mm2-bound like NV.

mm2 matters a lot; it's how AMD used to fight both NV and Intel.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
mm2 size doesn't really matter when the process is very mature and cheap. Whether the chip is $10 or $15 in a $250 card is more or less irrelevant. Having 1GB less VRAM will save them more.

Lower mm2 + less VRAM at $249. Considering their 350+mm2 R9 280 3GB sells for as low as $185 and there are at least 5 R9 280 cards on Newegg for $199, AMD is going to make a lot more $$ selling this 285. Once Maxwell launches, they can drop the price to $199 and still make more $ than on a $200 R9 280.

But it's getting quite clear that Maxwell is a "Conroe".

I sure hope a brand new architecture that took 4+ years to design mops the floor with GCN 1.0/1.1, which is nearly 3-year-old tech. Sure, you can say GCN 1.1 is 1 year old, but the improvements over GCN 1.0 are incremental, not fundamental, and it goes back to the nearly 3-year-old Tahiti XT.

Maxwell is a far more dramatic architectural change compared to Kepler. Just like GCN, the new darling architecture at the time, mopped the floor with Fermi in compute and performance/watt, Maxwell should wipe the floor with GCN 1.0/1.1. When AMD designed 285/285X, they surely didn't intend for these cards to compete against GTX860/860Ti. Because otherwise what would R9 370/370X compete with, Pascal? For the next round NV seems on top of their game based on the 750 Ti, and it looks like they will be able to launch a brand new architecture top-to-bottom first vs. the R9 300. Last time it took NV 6-9 months (!!) longer to release all of the desktop Kepler variants from the 660 Ti down to the 650, while AMD finished everything from the HD 7770 to the HD 7970 GE. Looks like NV is executing much better this time. Let's just hope AMD delivers something quite competitive, or NV can continue to price its cards very high, which isn't helping gamers.

----

AMD is slapping DX12 on the box of these cards. DX12 will be supported by all GCN cards and all NV GPUs starting with Fermi architecture. Wonder why NV didn't add DX12 to GTX750Ti spec sheet.

 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
, instead of mm2 like NV

mm2 matters a lot; it's how AMD used to fight both NV and Intel.

Lower mm2 + less VRAM at $249.

Has there been any indication or confirmation of Tonga's die size? Just curious, that's all. I fully expect it to be smaller and a solid 15% better in perf/watt than GK104's GTX 770...

AMD is slapping DX12 on the box of these cards. Wonder why NV didn't add DX12 to GTX750Ti spec sheet.

Because DX12 hadn't been announced when GM107 was released, and AIBs had already printed up a bazillion boxes that they don't want to throw away and eat the cost of.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Of course it matters... A LOT!
First of all, the chip is more like $30 or $40, so it's a fixed cost carrying a chunky piece of the total graphics card value.
Then, the smaller the die, the more usable parts you can get per wafer.

And finally, it's an engineering feat, with AMD being TDP-bound instead of mm2-bound like NV.

mm2 matters a lot; it's how AMD used to fight both NV and Intel.

It matters for desktop, but it is pretty meaningless if AMD wants to get their chips into more than desktops. NV is focusing on performance per watt because they're using Kepler and Maxwell technology for everything from Tegra to GeForce to Quadro. Same technology and same performance per watt for all of their product lines, which makes sense. That's why PPW matters so much; there's a much bigger world than just desktop.

AMD is doing the same, but I'm not sure that lowering the TDP by 10W compared to the R9 280 is all that impressive. IMO, it seems like the 285 will perform worse than the 280X and maybe a bit better than the 280, at a 10W lower TDP than the 280.

It won't win them a ton of top-tier ultrabook designs, because the PPW matches GK104, unless they sell at bottom dollar. But the card is definitely cheaper to produce and should be a great competitor to whatever low-end Maxwell cards are coming, such as the 860. I think that's AMD's play here, and why they're discontinuing the 280 cards. It's cheaper to produce than the 280, and they can be flexible with the pricing and adjust depending on how Maxwell's lower-tier cards (i.e. the GTX 860) perform.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
When AMD designed 285/285X, they surely didn't intend for these cards to compete against GTX860/860Ti. Because otherwise what would R9 370/370X compete with, Pascal?

Since they just released this, I don't think there is any GCN 2.0 coming anytime soon. So it may actually be a theoretical GCN 2.0 vs. Pascal.

GCN 1.1 seems like what will compete with Maxwell in its lifespan.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Maxwell will live at least 2 years. I certainly think AMD will have launched a new gen during that time. I believe their respective generations are just a bit overlapping like Kepler and GCN were, not launching at the exact same time (which is unrealistic anyway since for that to happen they would have to have insider knowledge about the other company).
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
TDP for the GTX 760 is 170W. +15% performance on average for the R9 285 sounds about in line with the stated 190W TDP. They caught up with Nvidia on performance per watt, but nothing stellar. If it overclocks well and the 4GB SKU doesn't command a $50 price premium, then it's not too bad, I guess. Maxwell is bound to shake things up a bit, which I am eager to see.
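Spelled out, that arithmetic looks like this (a quick Python check using only the TDP and performance figures in this post; TDP of course isn't measured power):

```python
# Quick sanity check of the "about in line" claim, numbers from this post only.
gtx760_tdp = 170      # W
perf_ratio = 1.15     # assume the R9 285 averages ~15% faster than the GTX 760
r9_285_tdp = 190      # W, as stated by AMD

# TDP the 285 could carry while merely matching the 760's perf-per-TDP-watt:
break_even_tdp = gtx760_tdp * perf_ratio
print(break_even_tdp)  # ~195.5 W, so a 190 W rating is roughly parity, nothing more
```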
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Since they just released this, I don't think there is any GCN 2.0 coming anytime soon. So it may actually be a theoretical GCN 2.0 vs. Pascal.

GCN 1.1 seems like what will compete with Maxwell in its lifespan.

There is no indication that 860/860Ti is launching soon either. The rumors are all talking about 870/880 and even that info is extremely vague, with some saying mid-September, others next year.

Saying GCN 1.1 will compete with Maxwell is similar to saying Fermi will compete with GCN 1.0 in its lifetime. Sure, there was overlap but 560Ti/570/580 weren't designed with 7850/7870/7950 in mind. Right now AMD is positioning 285 against 760 and keeping 280X against 770, at least for now. It's not realistic to expect AMD to use GCN 1.1 28nm from 2014 to 2016 until Pascal arrives. You honestly expect AMD to not release any update to GCN 1.1 in the next 2 years?

TDP for the GTX 760 is 170W. +15% performance on average for the R9 285 sounds about in line with the stated 190W TDP. They caught up with Nvidia on performance per watt, but nothing stellar. If it overclocks well and the 4GB SKU doesn't command a $50 price premium, then it's not too bad, I guess. Maxwell is bound to shake things up a bit, which I am eager to see.

I don't know why gamers keep using TDP to imply TDP = power consumption when almost every generation we have seen that a card's TDP does not need to match its power consumption. In particular, AMD's TDP ratings tend to be very pessimistic.

A 7950 with a 200W TDP uses about 145W, a 760 uses just 160W, a 250W TDP 7970 uses about 190W, and a reference 7970GHz with a 250W TDP used 240W:
http://www.techpowerup.com/mobile/reviews/Gigabyte/R9_280X_OC/24.html

Without checking real-world gaming power usage, the 190W TDP rating is a vague point of reference, considering how all over the place AMD's TDP vs. real-world power usage numbers have tended to fall.
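Putting those same cited figures side by side as measured-power-to-TDP ratios (a small Python sketch; the wattages are the approximate numbers quoted above, not new measurements):

```python
# Measured gaming power vs. rated TDP, using the figures cited above.
cards = {
    "HD 7950":            (200, 145),
    "GTX 760":            (170, 160),   # 170 W TDP per earlier posts in this thread
    "HD 7970":            (250, 190),
    "HD 7970 GHz (ref)":  (250, 240),
}

for name, (tdp_w, measured_w) in cards.items():
    print(f"{name}: ~{measured_w} W measured = {measured_w / tdp_w:.0%} of its {tdp_w} W TDP")
```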
 
Last edited:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Who was saying this? Because I can find someone that'll say just about anything if I look hard enough. Was this any kind of official statement or even reasonable industry leak? Or was it some guy on a forum somewhere?

Of course I am speaking of forum posters. I am not bashing AMD, nor am I one of the people who scream that AMD lies over and over. You may not see my POV, but often my posts are trying to push down pre-hype. Be it threads on Maxwell or Tonga, it doesn't matter. Even my posts in FreeSync threads: it is all about out-of-control pre-hype that just goes wild.

Especially in threads on upcoming graphics cards, I think that pre-hype more often than not ruins more than it helps. It can turn a great card into a dud, all because of unfounded expectations. They get out of control, and really fast. I just feel like it's a major mistake. With wild speculation, a great GPU gets bashed because it's not as good as people hyped it up to be. And even if the card meets those high marks, it's just meeting expectations.

It's like stacking the deck against the technology. It's not useful at all from my POV. There is a nasty feeling that sets in after disappointment, and the higher the hype, the less likely the card can hit all those marks, turning a great card into, at best, meh, and most likely a disappointment. If the speculation doesn't run wild and unchecked, the chance of disappointment is very low. And even if the card comes out stellar, it will earn its buzz anyway. It will actually be more exciting without the pre-hype. More gratifying launches.

I mean, I love PCs and watching the technology advance. It's one of my greatest joys. I grew up in the PC era, and I think it's a special time in human existence. A great time to be living in. And for me, I choose not to accept the pre-hype. I just find that no matter the outcome, having extreme expectations is the least fulfilling. It's a position I hold, and I often post in speculation threads to try to tone it down, because I don't see the benefit at all.
 
Last edited:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I don't know why gamers keep using TDP to imply TDP = power consumption when almost every generation we have seen that a card's TDP does not need to match its power consumption. In particular, AMD's TDP ratings tend to be very pessimistic.

A 7950 with a 200W TDP uses about 145W, a 760 uses just 160W, a 250W TDP 7970 uses about 190W, and a reference 7970GHz with a 250W TDP used 240W:
http://www.techpowerup.com/mobile/reviews/Gigabyte/R9_280X_OC/24.html

Without checking real-world gaming power usage, the 190W TDP rating is a vague point of reference, considering how all over the place AMD's TDP vs. real-world power usage numbers have tended to fall.

Wow, I guess if the only thing you've done with your card is play Crysis 2, then you actually would have a point. But realistically, power consumption varies from game to game, card to card, and can even change driver to driver, even if only in tiny amounts. I don't think there are many here who believe TDP is exactly the power consumption used in every game, any more than they believe Firestrike scores map directly to game performance.

You're trying to pass it off as completely useless; now that is interesting. But I am sure that almost everyone knows power consumption isn't static and isn't going to be exactly what the manufacturer lists as TDP.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
There is no indication that 860/860Ti is launching soon either. The rumors are all talking about 870/880 and even that info is extremely vague, with some saying mid-September, others next year.

Saying GCN 1.1 will compete with Maxwell is similar to saying Fermi will compete with GCN 1.0 in its lifetime. Sure, there was overlap but 560Ti/570/580 weren't designed with 7850/7870/7950 in mind. Right now AMD is positioning 285 against 760 and keeping 280X against 770, at least for now. It's not realistic to expect AMD to use GCN 1.1 28nm from 2014 to 2016 until Pascal arrives. You honestly expect AMD to not release any update to GCN 1.1 in the next 2 years?



I don't know why gamers keep using TDP to imply TDP = power consumption when almost every generation we have seen that a card's TDP does not need to match its power consumption. In particular, AMD's TDP ratings tend to be very pessimistic.

A 7950 with a 200W TDP uses about 145W, a 760 uses just 160W, a 250W TDP 7970 uses about 190W, and a reference 7970GHz with a 250W TDP used 240W:
http://www.techpowerup.com/mobile/reviews/Gigabyte/R9_280X_OC/24.html

Without checking real-world gaming power usage, the 190W TDP rating is a vague point of reference, considering how all over the place AMD's TDP vs. real-world power usage numbers have tended to fall.


So you are saying the 190W TDP is a worst-case scenario based on the highest-leakage chip? My PowerColor HD 7950 OC does not use 200W, so I would have to say that you are correct in this matter. It uses around 160W; 270W peak while playing BF4 on a stock 2700K ITX rig. If I overclock the GPU to 1000 MHz, the power consumption hits around 200W on the card alone.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Seems as though they used more accurate TDP numbers (as far as power consumption goes) for the R9 series cards. The 280X TDP is 250W, and that linked review shows it to be pretty close to that. While that is an overclocked card, the 7970GHz is not far behind @ 238W. Unfortunately, AMD did not specify TDP for the 290(X) cards, so we can't really compare those.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
TDP for the GTX 760 is 170W. +15% performance on average for the R9 285 sounds about in line with the stated 190W TDP. They caught up with Nvidia on performance per watt, but nothing stellar. If it overclocks well and the 4GB SKU doesn't command a $50 price premium, then it's not too bad, I guess. Maxwell is bound to shake things up a bit, which I am eager to see.

You cannot compare Nvidia's TDP with AMD's. The 270X is 180W TDP, and it still consumes something like 40W less than Nvidia's 760:
http://www.techpowerup.com/reviews/MSI/R9_270X_Gaming/22.html
The 285 consumes something like 50W less than a 770 and performs about the same.
I think we will have to wait for real reviews to confirm the efficiency gains.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
So you are saying the 190W TDP is a worst-case scenario based on the highest-leakage chip?

No, it's not chip by chip. 190W is the max you can make the card (any card of exactly the same model) consume under heavy loads.

Chip leakage only affects the power consumption of the 290 series cards (those cards use PowerTune to determine what voltage to apply so the card stays at the highest clock it can while remaining under the 95°C temperature limit).
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
You cannot compare Nvidia's TDP with AMD's. The 270X is 180W TDP, and it still consumes something like 40W less than Nvidia's 760:
http://www.techpowerup.com/reviews/MSI/R9_270X_Gaming/22.html
The 285 consumes something like 50W less than a 770 and performs about the same.
I think we will have to wait for real reviews to confirm the efficiency gains.

Your link is the power consumption for one game. You can't draw final conclusions from one game.
It doesn't work like that; it's a crazy assertion. Surely AMD's method for determining the 7970's TDP differed from Nvidia's for Kepler, but in no case would they base their figures on simply loading a save point in a single game (Crysis 2).
Seems as though they used more accurate TDP numbers (as far as power consumption goes) for the R9 series cards. The 280X TDP is 250W, and that linked review shows it to be pretty close to that. While that is an overclocked card, the 7970GHz is not far behind @ 238W. Unfortunately, AMD did not specify TDP for the 290(X) cards, so we can't really compare those.
With Kepler, Nvidia did something radically different: boost clocks. If you look at this from another angle, the reason for this radical change becomes clear. Boost has everything to do with GPUs and varying loads. There has to be a set TDP, but it ends up being a worst-case scenario, because there are wide variances in consumption that are very dependent on the task. Much of the time, typical gaming did not consume that much. The solution was boost.

In effect it is essentially boosting clocks up when there is TDP to spare; it's making use of untapped potential. Boost actually works through several mechanisms, but ultimately TDP is still somewhat rooted in thermal dissipation, so knowing the architecture they can make pretty good approximations with real-time monitoring of just a few data points.

Seeing Nvidia's boost naturally led AMD to follow suit. It's a great idea, because now the card boosts when it has headroom to spare. This will shrink the spread from game to game. It's still going to vary, but not as much. Games that allow it will remain in boost, and tasks that are intensive won't be able to as much.
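A minimal sketch of that feedback loop (Python), assuming a made-up 5% power guard band and a fixed clock step; it only illustrates the concept described here, not Nvidia's actual GPU Boost or AMD's PowerTune:

```python
def boost_step(clock_mhz, base_mhz, max_boost_mhz,
               power_w, tdp_w, temp_c, temp_limit_c, step_mhz=13):
    """One tick of a simplified boost loop: spend unused TDP/thermal headroom on clocks.

    Conceptual sketch only; every threshold here is invented for illustration.
    """
    headroom = power_w < 0.95 * tdp_w and temp_c < temp_limit_c
    if headroom and clock_mhz < max_boost_mhz:
        return min(clock_mhz + step_mhz, max_boost_mhz)    # light load: clock up
    if not headroom and clock_mhz > base_mhz:
        return max(clock_mhz - step_mhz, base_mhz)         # heavy load: back off toward base
    return clock_mhz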

No, it's not chip by chip. 190W is the max you can make the card (any card of exactly the same model) consume under heavy loads.

Chip leakage only affects the power consumption of the 290 series cards (those cards use PowerTune to determine what voltage to apply so the card stays at the highest clock it can while remaining under the 95°C temperature limit).
Not sure exactly what you're claiming, but there is a difference chip to chip. I have no idea where you get that there isn't. It's not only common knowledge, it is something that chip makers have learned to exploit, in a process called binning.

Chip to chip, there is a difference. Chip fabrication is not precise and the results aren't perfectly equal. They have to meet minimums, but not all are created equal.
 