AMD's Tonga - R9 285 (Specs) and R9 285X (Partial Specs)

Page 7 - AnandTech community discussion

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Tonga isn't going in an ultrabook period with a 190W TDP. Maybe a cut down version can fit in a beefcake notebook.

I know, I meant the architecture itself. I don't mean the exact shader configuration used in the 285; obviously a 190W TDP is complete overkill for an ultrabook.

But it performs as well as Tahiti at a lower TDP, and performs better than Pitcairn at the same TDP levels. It's not an amazing efficiency jump, but the architecture is better suited for portability. So I could see AMD using a cut-down version of Tonga in ultrabooks, similar to how NV uses cut-down GK104 parts with substantially lower TDPs in ultrabooks. I do think Tonga will have a tough time against Maxwell v2 in ultrabooks (efficiency-wise), but those parts aren't fully released yet.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
That is throttling. Stock 680 clocks are
Core Clock: 1006MHz
Boost Clock: 1058MHz

The real issue is the voltage, which should be at 1.175V. Furmark is not running in the boost state. If I were developing drivers, I'd throttle Furmark too. It kills cards.


When I get home I can run some Furmark on my HD 7950 to see if AMD throttles on Catalyst 14.7. I only paid $100 for the GPU, but running it for 10 minutes shouldn't kill the card.

We should probably create a new thread if we want to delve into Furmark further.

I don't think you understand how boost works.
No, not really. If the card is under max load, it will actually go above the rated boost clock. That card is only hitting base clock at full load, which clearly shows the clocks are being slightly throttled for one reason or another. In that screenshot, temps and the power limit seem mostly responsible for the lowered clock speeds.

Boost works on several mechanisms. By watching incoming power through the PWM circuitry and the temperature sensors, engineers can extract the unused potential in tasks that allow it. Depending on the demands of the load, the GPU boosts its clocks as the power limit allows.

Furmark, Metro, or mining, it doesn't matter. If your card is hitting the power limit, boosting suffers. So whether or not NVIDIA's drivers detect Furmark doesn't matter. The boost technology is sophisticated enough that it's going to respond differently according to the load. The power limit governs the boost clocks.
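To make the mechanism concrete, here's a toy sketch of a boost governor in Python. The clock numbers and limits are made-up illustrative values, and the real GPU Boost algorithm uses many more inputs (voltage bins, per-rail power readings, clock-step tables), so treat this as a simplified model of "boost only while under the power and temp limits," not NVIDIA's actual code:

```python
# Simplified, hypothetical boost governor. BASE_CLOCK matches the stock GTX 680
# base clock quoted above; BOOST_MAX, POWER_LIMIT and TEMP_LIMIT are invented
# for illustration.

BASE_CLOCK = 1006    # MHz, GTX 680 base clock
BOOST_MAX = 1110     # MHz, top boost bin (hypothetical)
POWER_LIMIT = 195.0  # watts, board power target (hypothetical)
TEMP_LIMIT = 80.0    # deg C, throttle threshold (hypothetical)

def pick_clock(board_power_w, temp_c):
    """Return a target clock: boost only while under the power and temp limits."""
    if board_power_w >= POWER_LIMIT or temp_c >= TEMP_LIMIT:
        # At or over a limit: hold base clock regardless of whether the load
        # is a game, Furmark, or a miner -- the governor only sees power/temp.
        return BASE_CLOCK
    # Headroom left: scale the boost bin with the unused power budget.
    headroom = 1.0 - board_power_w / POWER_LIMIT
    return round(BASE_CLOCK + (BOOST_MAX - BASE_CLOCK) * min(1.0, headroom * 4))

# A light load leaves headroom, so the card boosts past base clock;
# a power virus pegs the limit, so the governor holds it at base.
print(pick_clock(150.0, 70.0))  # -> 1102 (boosted)
print(pick_clock(196.0, 85.0))  # -> 1006 (power/temp limited)
```

This is why the same card shows different clocks in different apps: the load determines the power draw, and the power draw determines how much boost is available.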
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
No one said anything about using Crysis 2 as end all be all game for power usage. However, if you go browse 30 reviews of the cards outlined below in games, you will see that all of them consume different average and peak power.
So you know, here you are agreeing with me. Just as I said: different cards, different tasks, different loads. This makes it all the more strange that you would use the lone TPU chart in your power consumption rant. The information you listed is only applicable to the cards used in that review for a specific run in Crysis 2.

TDP actually has more to do with guidance for cooling/heatsink design due to heat dissipation. TDP != power consumption.

HD6970 = 250W
R9 280 = 250W
HD7970 = 250W
HD7970GE = 250W
780 = 250W
780Ti = 250W

All of these cards use a completely different amount of power but all have the same TDP rating.
Okay, is that a revolutionary discovery to you?
Of course they use different amounts. Not only is it task specific, it also varies between two different GPUs of the same model.

7970 and 6970 are 190-200W cards and 780 is a 220W not a 250W card, while 780Ti easily exceeds 250W:
http://www.techpowerup.com/mobile/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html
That's where you are wrong. You can't throw up a chart that recorded the power consumption of one game and claim that is the real TDP. That is simply what the cards used in that task, which is a specific load in a specific game for a specific amount of time.
This is not the TDP; it's simply what the cards are using in that case.

Stand back and just think about it: are you going to claim now that an R9 280 uses up to 250W but 7970Ghz also uses 250W? The TDP ratings for AMD and NV tell us little about the card's real world power usage in games.

I don't think you know what I am claiming actually.

R9 280 is barely higher clocked than 7950 but has a 250W TDP. The real world power usage of cards such as 280 or 7950 or 7970 is far below 250W.

Look at the TDP of 280X vs. 770. You would think that 770 uses way less power in games but it's not even remotely true.

http://techreport.com/review/25466/amd-radeon-r9-280x-and-270x-graphics-cards/10

Or

http://www.anandtech.com/show/6994/nvidia-geforce-gtx-770-review/16

In fact, the variance in power usage among 280X models alone shows how worthless the TDP is for gauging real world power consumption in games:

http://www.techspot.com/review/841-radeon-r9-280x-roundup/page11.html

There are countless cases of cards which use much less power than their TDP, about the same power and way more power. The fact that AMD and NV even define TDP differently makes it even more pointless to compare them.

Then we get to the part where after-market cards use better cooling, digital power delivery and overall more efficient components. As a result, you can have an after-market card that uses less power than the reference design. Alternatively, the board could be designed specifically to support high overclocking (Classified/Lightning/Matrix/Vapor-X) and these tend to use more power than the reference design. Unless you test a specific card in question, often times the TDP rating doesn't align at all.

It took gamers a long time to accept that Furmark was a waste of time for testing a GPU's real-world power usage; using TDP to mean power consumption is another one of those old errors/myths that needs to go away. We have tools that let us measure the card's real-world power usage, so it's no longer relevant to look at some arbitrary number on the box.

Again, I don't think anyone here believes a GPU always runs at its TDP. I actually believe a lot of people on this site understand that TDP has to do with heat dissipation. And most people know that gaming is not the most extreme load; there are other tasks that can stress GPUs even harder.

People no more think GPUs run at their max TDP all the time than they do CPUs. They understand stress testing and fully pegging all the cores.

But for GPUs, there are other tasks besides normal gaming, and these GPUs have to be able to dissipate the heat those tasks generate. TDP is not useless, as you seem to be suggesting.

No one is claiming that your tdp is exactly how much power your card will use in gaming. Most people know that gaming does not push your card as hard as something like mining or folding. But the GPU has to be able to dissipate the heat it can generate. Just like a PSU has to be able to sustain the max load even if that load is many times the normal load.

This is why GPU makers recommend larger PSUs than the cards need. They list higher GFLOPS than the cards will ever produce in real-world tasks for everyday users.

It varies from task to task. Typical gaming loads aren't as demanding as the most demanding tasks. But depending on the game, consumption varies. You can't list the power consumption from one scene in one game and claim that this is the real power consumption, as if that is all the card will ever use. That data is the real-world consumption of that particular card in that specific task.

Power consumption, when it comes to GPUs, is not simple. You can come up with a typical gaming usage figure, and that is fine, if that's what you want to focus on: such-and-such card has X typical gaming load. But that's not revolutionary here on ATF.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No one is claiming that your tdp is exactly how much power your card will use in gaming. Most people know that gaming does not push your card as hard as something like mining or folding. But the GPU has to be able to dissipate the heat it can generate. Just like a PSU has to be able to sustain the max load even if that load is many times the normal load.

....
It varies from task to task. Typical gaming loads aren't as demanding as the most demanding task. But depending on the game, consumption varies. You can't list the power consumption from one scene in one game and claim that this is the real power consumption, like that is all the card will ever use. That data is the real world consumption of that particular card in that specific task.
.

That's why TDP is worthless for enthusiast PC gamers:

1) AMD/Intel and NV define TDP differently. Therefore, TDP cannot be compared between these brands. If someone uses TDP to compare AMD and NV GPU, they are wasting their time most of the time.

2) Since you already admitted that TDP deals with heat dissipation, not power usage, we shouldn't care what the TDP of any component is. In fact, there is another key reason why TDP is meaningless -- We can measure the actual power consumption of a given component in many tasks from gaming to distributed computing to mining. I don't care if my GPU has a TDP of 1000W. If I look at the power usage of 10 most demanding PC games from Crysis 2/3 to Metro LL and use distributed computing/rendering workloads, I have measured my 99% percentile power usage for real world (non-theoretical/non-power virus) scenarios.

3) Since a GPU can actually exceed its rated TDP, TDP tells me little about my GPU's maximum power consumption. Since I know it's not possible for a stock R9 280 to use as much power as a stock HD7970Ghz and that it's not possible for a stock 780 to use as much power as a stock 780Ti, the fact that they are all rated at 250W TDP tells me little about their power usage in the real world and shows how out of touch the TDP rating has become since it's just randomly assigned to many SKUs.

4) If you have no idea what the TDP is of a CPU/GPU, it wouldn't change anything about your build because ultimately you'd simply read a review which measured the real world power usage. Since there are GPU coolers capable of dissipating 450-600W (Gigabyte Windforce, Asus Matrix, MSI Lightning), I really don't care about a 250W TDP rated card overheating. Similarly someone who is going to be overclocking an X99 CPU isn't going to use a $20 budget heatsink.

5) Since different tasks increase the load on the rest of system components, especially games, if I just look at the TDP, I do not know the overall performance/watt (i.e., efficiency) of a modern gaming rig. Ultimately, I have to look at reviews which find that out for me.

It's OEMs that need TDP because they don't have the time to test power usage in games/distributed computing and other such tasks. It's a shortcut for them to know that if a GPU is rated at 250W TDP, their 300W PSU won't cut it, and an SLI system with 2 of those GPUs crammed into a micro-ATX case with 1 fan won't work well.

TDP is just a general guidance but for anyone who cares about real world power usage in games, they should just measure it. We tend to use TPU since they use GPU heavy games such as Crysis 2 and Metro which are pretty good at representing 95% of GPU load in games.
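For anyone who wants to do that measurement themselves, a minimal sketch of the "99th percentile of logged power samples" idea might look like the following. The log values here are made up for illustration; a real log would come from a wall meter or the card's own telemetry:

```python
# Log board-power samples while running your heaviest games/compute workloads,
# then take the 99th percentile instead of trusting the TDP on the box.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of power samples (watts)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical 1 Hz power log (watts) from a gaming + compute session:
log = [145, 152, 160, 171, 168, 175, 181, 190, 177, 165,
       158, 172, 183, 188, 192, 179, 169, 161, 174, 186]

p99 = percentile(log, 99)
print(f"99th percentile draw: {p99} W")  # -> 192 W, the real-world ceiling
```

With a number like that in hand, the 250W figure on the box stops mattering: you size your PSU and cooling for what the card actually draws in your workloads.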

---

TL;DR

TDP is worthless for enthusiast PC gamers because:

1) We can measure real world power usage which makes TDP rating a theoretical number. Why do I care about a theoretical number when I can get actual real world results?

2) Since TDP is defined differently across AMD/Intel/NV, comparing TDPs is pointless.

3) Modern high-end CPU/GPU after-market cooling solutions have the capacity to easily dissipate 250W, which makes TDP heat dissipation ratings pointless for overclockers. Overclockers compare real world results, they don't care for theoretical numbers.

4) If I don't know the TDP rating of a particular PC component, I lose nothing at all, since in the end I only care about real world power usage in real world scenarios. Since a GPU or a CPU doesn't operate in a vacuum, I need to know the actual power usage of the total system when assessing if my PSU is sufficient.

5) Looking at TDP of individual components tells me little about their efficiency overall in a complete build under load because of CPU/GPU limited situations.
 
Last edited:

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
That's why TDP is worthless for enthusiast PC gamers:

1) AMD/Intel and NV define TDP differently. Therefore, TDP cannot be compared between these brands. If someone uses TDP to compare AMD and NV GPU, they are wasting their time most of the time.

2) Since you already admitted that TDP deals with heat dissipation, not power usage, we shouldn't care what the TDP of any component is. In fact, there is another key reason why TDP is meaningless -- We can measure the actual power consumption of a given component in many tasks from gaming to distributed computing to mining. I don't care if my GPU has a TDP of 1000W. If I look at the power usage of 10 most demanding PC games from Crysis 2/3 to Metro LL and use distributed computing/rendering workloads, I have measured my 99% percentile power usage for real world (non-theoretical/non-power virus) scenarios.

3) Since a GPU can actually exceed its rated TDP, TDP tells me little about my GPU's maximum power consumption. Since I know it's not possible for a stock R9 280 to use as much power as a stock HD7970Ghz and that it's not possible for a stock 780 to use as much power as a stock 780Ti, the fact that they are all rated at 250W TDP tells me little about their power usage in the real world and shows how out of touch the TDP rating has become since it's just randomly assigned to many SKUs.

4) If you have no idea what the TDP is of a CPU/GPU, it wouldn't change anything about your build because ultimately you'd simply read a review which measured the real world power usage. Since there are GPU coolers capable of dissipating 450-600W (Gigabyte Windforce, Asus Matrix, MSI Lightning), I really don't care about a 250W TDP rated card overheating. Similarly someone who is going to be overclocking an X99 CPU isn't going to use a $20 budget heatsink.

5) Since different tasks increase the load on the rest of system components, especially games, if I just look at the TDP, I do not know the overall performance/watt (i.e., efficiency) of a modern gaming rig. Ultimately, I have to look at reviews which find that out for me.

It's OEMs that need TDP because they don't have the time to test power usage of games/distributed computing and other such tasks. It's a shortcut for them to know that if a GPU is rated at 250W TDP, well their 300W PSU won't cut it and an SLI system with 2 of those GPUs cramped into a micro-ATX case with 1 fan won't work well.

TDP is just a general guidance but for anyone who cares about real world power usage in games, they should just measure it. We tend to use TPU since they use GPU heavy games such as Crysis 2 and Metro which are pretty good at representing 95% of GPU load in games.

---

TL; DR

TDP is worthless for enthusiast PC gamers because:

1) We can measure real world power usage which makes TDP rating a theoretical number. Why do I care about a theoretical number when I can get actual real world results?

2) Since TDP is defined differently across AMD/Intel/NV, comparing TDPs is pointless.

3) Modern high-end CPU/GPU after-market cooling solutions have the capacity to easily dissipate 250W, which makes TDP heat dissipation ratings pointless for overclockers. Overclockers compare real world results, they don't care for theoretical numbers.

4) If I don't know the TDP rating of a particular PC component, I lose nothing at all, since in the end I only care about real world power usage in real world scenarios. Since a GPU or a CPU doesn't operate in a vacuum, I need to know the actual power usage of the total system when assessing if my PSU is sufficient.

5) Looking at TDP of individual components tells me little about their efficiency overall in a complete build under load because of CPU/GPU limited situations.

Great post.

You highlight a number of key points. I would add that 'TDP' (or really we should just say 'actual power consumption' because it means something tangible) depends HIGHLY on the usage model. This is really important now because so many CPUs and GPUs are bundled with extensions and hardware that may only be used in very specific circumstances.

Example 1: A modern CPU's power usage will be very different between a 'standard 100% load' vs. a 100% load with full AVX and HT being taxed.

Example 2: A graphics card's measured power usage can be very different between a 100% 'gaming GPU load' vs. a 100% load during BTC mining.

Those situations, and what you are buying your hardware for, can make a big difference in what or how you measure efficiency for your purposes. A certain GPU may be more efficient for gaming vs. mining, and a competing GPU is exactly the opposite.

We should be careful that we not only state the measured power usage, but also reference HOW it was measured.

Just my $0.02...
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I think the situation is getting much better with the new direction things are going. Boost and AMD's variable clocks are ways to extract more performance when the load permits. As these get more sophisticated, the difference between apps could become very minor.

It's a great direction in my opinion.
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
I don't know about the "overhyped" aspect of these products. If anything, my impression is that AMD is keeping things low key. Just looking at the model numbers (275/285), it's rather clear to me that these cards are not for those who already own 270/280 cards and are looking to upgrade.

If they named the cards as R9 370/380 then yes that would be problematic.
 

tollingalong

Member
Jun 26, 2014
101
0
0
If they named the cards as R9 370/380 then yes that would be problematic.

"A rose by any other name would smell as sweet." -Shakespeare

Names are for the uninformed. What matters is the underlying technology. I don't think anyone is impressed with the initial 285 findings.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
"A rose by any other name would smell as sweet." -Shakespeare

Names are for the uninformed. What matters is the underlying technology. I don't think anyone is impressed with the initial 285 findings.

And what are those findings ??
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
that it's going up against the GTX760.

Because one slide said Tonga is 15% faster than 760 in BF4 at 1440P ???

That was to highlight the performance at the same price as GTX-760. Tonga will be closer to 770 in performance but at lower TDP and lower price.
 

FatherMurphy

Senior member
Mar 27, 2014
229
18
81
Regardless, the "15% faster than GTX 760" slide gives you a good idea about its performance. The given TDP gives you a good idea about how much energy it will use to achieve that performance (unless AMD has radically departed from the way it gauges TDP). In other words, roughly equivalent to the PPW efficiency of a full GK104, or somewhat better.
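As a back-of-the-envelope check on that claim, here is a sketch using AMD's slide figure (15% faster than a GTX 760) and the published board powers (170W for the GTX 760, 190W for the 285). As discussed at length above, TDP is only a loose proxy for real draw, so this is a ballpark, not a measurement:

```python
# Rough perf-per-watt estimate from the "15% faster than GTX 760" slide.
# Relative performance is normalized to GTX 760 = 1.0; the TDPs are the
# vendors' published board powers, which are only a loose proxy for real
# power draw, so treat the result as a ballpark figure.

cards = {
    "GTX 760 (GK104)": {"rel_perf": 1.00, "tdp_w": 170},
    "R9 285 (Tonga)":  {"rel_perf": 1.15, "tdp_w": 190},
}

for name, c in cards.items():
    ppw = c["rel_perf"] / c["tdp_w"] * 1000  # perf per kW, arbitrary units
    print(f"{name}: {ppw:.2f} perf/kW")
# GTX 760 works out to about 5.88, the 285 to about 6.05 -- i.e. roughly
# GK104-level efficiency or slightly better, matching the estimate above.
```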

We'll know for sure on September 2.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
LOL
I take it they should advertise it as "15% slower than GTX 780," so some people here can have a good time critiquing the AMD marketing team's competence.

yes..."15% slower than gtx780". that should be a pure winner
lets see how far that takes you

No idea what you are LOL-ing about. AMD is going against the 760 in their slides, as I predicted some time ago.

BTW... what happened with HBM and 20nm?? Pinging NostaSeronx!!
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
yes..."15% slower than gtx780". that should be a pure winner
lets see how far that takes you

no idea what you are LOL-ing about. AMD is going against 760 in their slides as I have predicted some time ago

I came up with a rough placeholder number since there are no reviews of this card yet! Way to nitpick... The point still stands: you never compare your products to something faster, showing the advantages of a competitor's products.

BTW... what happened with HBM and 20nm?? pinnging NostaSeronx!!
How are HBM and 20nm relevant to this thread, by the way?
 

FatherMurphy

Senior member
Mar 27, 2014
229
18
81
Fellas, we are chasing our tails here. Let's address this question: will Tonga, which is replacing Tahiti, be around as long as Tahiti has been (2+ years)? I can't imagine so. I think Tonga is a stopgap until the 16nm (20nm + FinFET) node in the summer/fall of 2015.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I think the goal is twofold. I've mentioned this before, but my theory:

1) A better replacement for Pitcairn in portable form factors. Obviously they won't use the 190W TDP configuration, but it presumably performs better than Pitcairn at the same TDP levels. So they will use cut-down versions of Tonga for their high-end mobile dGPUs, as opposed to Pitcairn, which has been used for years.

2) Better positioned to compete with the lower-tier 2nd-gen Maxwell cards soon. Cheaper to produce, higher margins, whereas Tahiti would be cutting it tight margin-wise since it's more expensive to produce. So Tahiti (280/280X) is apparently going EOL once the 285/285X are on the market.

The main thing is that it's cheaper to produce, so they won't bleed money, and they can be flexible with pricing as a response to lower-tier 2nd-gen Maxwell cards (i.e., GTX 860). Just my theory. I don't think the $249.99 MSRP will last long at all on the 285.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
As others have said, let's wait and see.
Reviewers should already be working hard benchmarking these puppies, so we will know soon.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
F1sh, I was responding with AtenRa's post (#160) in mind. I have to agree my post was not clear. Anyway....

Will that card put the final nail in the GCN 1.0 coffin? Is AMD going to EOL the 270/X and 280/X and leave Bonaire, Tonga and Hawaii on the shelves? Is it GF's 28nm? Would love to see cross-fab competition.
 

DiogoDX

Senior member
Oct 11, 2012
746
277
136
yes..."15% slower than gtx780". that should be a pure winner
lets see how far that takes you

no idea what you are LOL-ing about. AMD is going against 760 in their slides as I have predicted some time ago

BTW... what happened with HBM and 20nm?? pinnging NostaSeronx!!
No one in their right mind would believe Tonga would bring 20nm and HBM.

I'm hoping that sushiwarrior comes back with more rumors about Fiji. The guy just disappeared from the forum. :hmm:
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
No one in their right mind would believe Tonga would bring 20nm and HBM.

I'm hoping that sushiwarrior comes back with more rumors about Fiji. The guy just disappeared from the forum. :hmm:

Yeah, people with real insider info (be it people close to the production channel, working for the companies discussed, or game devs that have a clearer picture of the underlying hardware used) leave when they realize this forum is plagued with fanboys and PR people.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
I can't believe how many people are saying it should be called the 275 before benchmarks are released. Something like 280E or 280L, sure. But if it's slightly faster than the 280 when the 2GB doesn't bottleneck it, 275 would end up being just as bad a name as 285. After that, we'd end up with a 275X which is definitely faster than the 280 and possibly faster than the 280X.

The reason the name sucks is that there's no good name for it other than R9 370.
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
Well, remember the 290 and 290X release? They put the 290X up against the 780 and not the Titan. It ended up being slightly faster than the Titan. They might be playing it down so reviewers will be surprised by the gains of Tonga vs. Tahiti. Who knows until we find out.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Well, remember the 290 and 290x release? They put the 290x up against the 780 and not the Titan. It ended up being slightly faster than the Titan. They might be playing it down so the reviewers will be surprised of the gains on this Tonga vs Tahiti. Who knows until we find out.

The untouchable Titan if I remember correctly.

Not really in the market but I do look forward to the reviews at least.
 