OCUK: 290X "Slightly faster than GTX 780"


AdamK47

Lifer
Oct 9, 1999
15,322
2,928
126
Whenever the next gen from each manufacturer is about to be released, there will be those who will always be gunning for the highest end of the opposing team. In this case it's the nVidia Titan. It's amusing when I scroll through posts looking for these oddities.

It's been this way since the 3Dfx days, and I hope it never ends. PC gaming needs it. Fanboyism and competition are great.

The truth is, ATI will not match nVidia's Titan. No matter how hard you wish to believe it.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136

So you do not believe what an AMD employee has to say about their new products? Fine, wait till reviews. Surely there is at least a newer stepping involved here. Incorporating Bonaire-style power management into Tahiti and Pitcairn would improve perf/watt, and that is exactly what seems to be the case.

BTW, Gibbo has got it all wrong. The R9 270X is a Pitcairn clocked at 1.1 GHz, and the R9 280 is a HD 7950 with clocks not yet revealed.

http://translate.google.com/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.chiphell.com%2Fthread-864727-1-1.html

Average performance:
GTX 760 →100%
GTX 660Ti →94.5%
R7 270 →89.7%
HD 7870 →82.4%
Performance per Watt:
GTX 760 →100%
GTX 660Ti →108%
R7 270 →119%
HD 7870 →114%

So you see, with the R9 270X perf has gone up and so has perf/watt. By just increasing clocks on the same chip, that's not possible.
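
As a quick sanity check, the implied power draw can be backed out from those leaked percentages (a rough Python sketch; the figures come from the chiphell link above, not a confirmed review):

```python
# Back out relative power draw from the leaked figures quoted above
# (all values normalized to GTX 760 = 100%). Rough sketch only: the
# percentages come from the chiphell leak, not a confirmed review.
cards = {
    "GTX 760":   {"perf": 1.000, "perf_per_watt": 1.00},
    "GTX 660Ti": {"perf": 0.945, "perf_per_watt": 1.08},
    "R7 270":    {"perf": 0.897, "perf_per_watt": 1.19},
    "HD 7870":   {"perf": 0.824, "perf_per_watt": 1.14},
}

for name, c in cards.items():
    power = c["perf"] / c["perf_per_watt"]  # power = perf / (perf per watt)
    print(f"{name:10s} implied relative power: {power:.3f}")

# R7 270 vs HD 7870: roughly 8.9% more performance for 4.3% more power.
perf_gain = cards["R7 270"]["perf"] / cards["HD 7870"]["perf"] - 1
power_a = cards["R7 270"]["perf"] / cards["R7 270"]["perf_per_watt"]
power_b = cards["HD 7870"]["perf"] / cards["HD 7870"]["perf_per_watt"]
print(f"perf gain: {perf_gain:+.1%}, power gain: {power_a / power_b - 1:+.1%}")
# A pure clock bump raises power at least linearly with frequency (and
# superlinearly once voltage rises with it), so perf/watt would fall,
# not climb; hence the argument that more than clocks must have changed.
```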
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
So you do not believe what an AMD employee has to say about their new products? Fine, wait till reviews. Surely there is at least a newer stepping involved here. Incorporating Bonaire-style power management into Tahiti and Pitcairn would improve perf/watt, and that is exactly what seems to be the case.

BTW, Gibbo has got it all wrong. The R9 270X is a Pitcairn clocked at 1.1 GHz, and the R9 280 is a HD 7950 with clocks not yet revealed.

http://translate.google.com/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.chiphell.com%2Fthread-864727-1-1.html

Average performance:
GTX 760 →100%
GTX 660Ti →94.5%
R7 270 →89.7%
HD 7870 →82.4%
Performance per Watt:
GTX 760 →100%
GTX 660Ti →108%
R7 270 →119%
HD 7870 →114%

So you see, with the R9 270X perf has gone up and so has perf/watt. By just increasing clocks on the same chip, that's not possible.

Wouldn't the r9 280 be the 7970?

Or are yields so good they won't lose any chips? Wouldn't it be the 290x as the full die, the 290 for chips that aren't able to hit the 290x targets, and then 280 as 7970?
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Wouldn't the r9 280 be the 7970?

Or are yields so good they won't lose any chips? Wouldn't it be the 290x as the full die, the 290 for chips that aren't able to hit the 290x targets, and then 280 as 7970?

R9 290X - Hawaii XT
R9 290 - Hawaii Pro
R9 280X - HD 7970 at USD 300. Clocks seem to be around 1 GHz.
R9 280 - HD 7950
R9 270X - HD 7870 (1.1 GHz) at USD 200
R9 270 - HD 7850

http://www.techpowerup.com/191453/amd-gpu14-event-detailed-announces-radeon-r9-290x.html
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
R9 290X - Hawaii XT
R9 290 - Hawaii Pro
R9 280X - HD 7970 at USD 300. Clocks seem to be around 1 GHz.
R9 280 - HD 7950
R9 270X - HD 7870 (1.1 GHz) at USD 200
R9 270 - HD 7850

http://www.techpowerup.com/191453/amd-gpu14-event-detailed-announces-radeon-r9-290x.html

Ah, didn't realize there was a 280X. This is by far one of the worst naming schemes in existence. They should have just done it like this:

R913x
R913
R813x
R813
R713x
R713
etc.

So, R(performance level)(year released)(x or not). Maybe that's dumb too, but what they have now is kind of needlessly complicated.
 

Saylick

Diamond Member
Sep 10, 2012
3,389
7,154
136
They are basically following Intel's CPU naming scheme. To the general populace, the i3, i5, and i7 monikers are the easiest way to distinguish performance brackets from one another.

AMD is using R7 and R9 instead of i3 and i5/i7, respectively.

In the same vein as how we refer to Intel CPUs by their product name (e.g. 2600K, 3570K, 4770, etc), AMD GPUs should be referred to by their 3 digit code (e.g. 290X, 290, 280X, etc) where the first digit refers to the generation and the second digit refers to the family. The addition of an X at the end signals a fully enabled chip, i.e. XT, and models without an X refer to a cut-down model, i.e. PRO.

x90X and x90 cards will still represent the high end, much like how x970 and x950 represent the current high end. After the 9 series comes the 8 series, i.e. x80X and x80, followed by the 7 series, etc.
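
Written out mechanically, the decode rule described above looks something like this (a hypothetical Python helper based on this post's reading of the scheme, not any official AMD documentation):

```python
import re

def decode_radeon_name(name: str) -> dict:
    # Matches names like "R9 290X" or "R7 260": a tier digit, then a
    # three-digit code ending in 0, then an optional X suffix.
    m = re.fullmatch(r"R(\d)(\d)(\d)0(X?)", name.replace(" ", ""))
    if not m:
        raise ValueError(f"unrecognized model name: {name}")
    tier, generation, family, suffix = m.groups()
    return {
        "tier": f"R{tier}",              # R7 ~ i3, R9 ~ i5/i7
        "generation": int(generation),   # first digit of the 3-digit code
        "family": int(family),           # second digit: 9 = high end, then 8, 7...
        "fully_enabled": suffix == "X",  # X = XT (full chip); no X = PRO (cut down)
    }

print(decode_radeon_name("R9 290X"))
# {'tier': 'R9', 'generation': 2, 'family': 9, 'fully_enabled': True}
print(decode_radeon_name("R9 280"))
# {'tier': 'R9', 'generation': 2, 'family': 8, 'fully_enabled': False}
```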
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Problem is, a lot of us already have enough performance; what we want are cool features and more immersion.

AMD's answer to slower performance is Mantle; their answer to PhysX is sound.

If you improve overall performance it allows you to increase fidelity in any or all areas. Typically when features are added they come with a performance cost. AMD is adding features that improve performance. Don't worry, they will immediately add "immersion" to the games that we didn't have enough power to do before.

If you are going to try and disagree with a post, actually disagree with what's in the post. I still say more games will use Mantle than PhysX. Do you disagree with that?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I predict neither will be used as much as DX11+ so what does it even matter? I also predict Nvidia will still sell more cards.

I see that you are like Balla: you can't refute the post, so you twist it. Do you think there will be more Mantle games or PhysX?

Mantle doesn't preclude Dx11+. BF4 is going to still be a Dx11+ game. Since you mention Dx11+, when is nVidia going to advance beyond the Dx11.0 feature level?

Your 2nd part... nVidia is very likely to sell fewer and fewer cards as time goes on. They are definitely falling behind. I hope they are an excellent cellphone company. They'll need to be.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Mantle doesn't preclude Dx11+. BF4 is going to still be a Dx11+ game. Since you mention Dx11+, when is nVidia going to advance beyond the Dx11.0 feature level?

Kepler and Fermi support DX11.1 gaming-related features through hardware. Only the non-gaming features are unsupported.

Source

Your 2nd part... nVidia is very likely to sell fewer and fewer cards as time goes on. They are definitely falling behind. I hope they are an excellent cellphone company. They'll need to be.

Honestly, where do you dream up this nonsense? NVidia is in a much stronger position than AMD, and is far more profitable as a company. They have the majority of the discrete GPU market share, and they have multiple income streams that have nothing to do with PC gaming.

Even if Mantle becomes a success, do you think NVidia won't be able to put their own spin on it and use it? According to AMD, it's non-proprietary, so there is no licensing fee.

Maxwell GPUs are also rumored to have ARM CPUs integrated onto the cards themselves to reduce CPU overhead associated with abstraction and increase performance.

Anyway, why would you even want NVidia to go down? Do you love AMD so much that you want to see them be the sole discrete GPU manufacturer?
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Ah, didn't realize there was a 280X. This is by far one of the worst naming schemes in existence. They should have just done it like this:

R913x
R913
R813x
R813
R713x
R713
etc.

So, R(performance level)(year released)(x or not). Maybe that's dumb too, but what they have now is kind of needlessly complicated.

OEMs would hate that naming scheme because it conveys some actual information, and OEMs are a big consideration for the big two GPU companies. Especially the year the die was first released.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Kepler and Fermi support DX11.1 gaming-related features through hardware. Only the non-gaming features are unsupported.

Source



Honestly, where do you dream up this nonsense? NVidia is in a much stronger position than AMD, and is far more profitable as a company. They have the majority of the discrete GPU market share, and they have multiple income streams that have nothing to do with PC gaming.

Even if Mantle becomes a success, do you think NVidia won't be able to put their own spin on it and use it? According to AMD, it's non-proprietary, so there is no licensing fee.

Maxwell GPUs are also rumored to have ARM CPUs integrated onto the cards themselves to reduce CPU overhead associated with abstraction and increase performance.

Anyway, why would you even want NVidia to go down? Do you love AMD so much that you want to see them be the sole discrete GPU manufacturer?

nVidia does not fully support all of the Dx11+ features. Fact supported by your own link. Don't give me the typical nVidia response that it doesn't matter. They are, almost without exception, later than AMD in supporting the latest Dx features. You'd have to go back many years to find otherwise.

Mantle needs to be more than open for nVidia to use it. It has to be compatible with their hardware, which, as far as I know, it's not. nVidia can try to release their own version, but good luck with that. AMD is not introducing Mantle for the PC. It's derived from functions that are used in all of the consoles. They are merely making use of code that is already there.

I never said I want nVidia to go down. I'm just stating my opinion of just how powerful Mantle is going to be. Again, if you are going to refute a post you need to actually refute it. Instead you confirm it with your link while trying to downplay it, and then claim I said something I never did, so you don't have to try and address what I really said.

I hope AMD makes some money and I hope nVidia has a counter that also improves performance. Too bad nVidia took advantage of their market position and released their top card this time around at $1000, and their 2nd tier salvaged chips for $650. We're all going to pay more now. Hopefully, nVidia cuts prices, and then AMD responds, etc... We're seriously screwed at the moment with regards to pricing.

On a side note, I also hope that Mantle removes the CPU bottleneck so we can buy cheaper CPUs and still get the job done.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,634
180
106
Let's see,

Should one look at Anandtech, which uses different hardware in different machines and posts system power consumption,

Or

Should one look at TechPowerUp, which uses an Integra multimeter that measures the GPU's exact power consumption?


So you are using that as proof that another website posting total system power is wrong, because the system with the 7970GHz consumes less?

And we're talking about a 6-10W difference?


I also like how the average and peak power consumption are measured in Crysis 2, which then has no performance numbers available.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
So you are using that as proof that another website posting total system power is wrong, because the system with the 7970GHz consumes less?

And we're talking about a 6-10W difference?


I also like how the average and peak power consumption are measured in Crysis 2, which then has no performance numbers available.

Let me put it in context for you:

A user posted this picture. It said that the 7970GHz consumes 9W less than the GTX 780, and that the 7970GHz draws 39W less than Titan.

Then I posted TechPowerUp, which is able to measure the exact power draw of the GPU alone using a really expensive digital meter that measures the PCI-e ports, stating that the GTX 780 draws 16W less than the 7970GHz and that Titan draws the same as the 7970GHz.

It means that udteam is off on the correct draw by a massive 39W compared to Titan and 25W compared to the GTX 780.

Because they have hooked a digital meter to the power outlet the computer is connected to, which measures the complete PC's power consumption. That is highly inaccurate, especially if they use different hardware (like the CPU) with the GPUs.
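
To see why at-the-wall and at-the-card numbers can disagree by tens of watts, consider a toy model (every number below is invented purely to illustrate the mechanism):

```python
# Toy model of wall-socket vs card-only power measurement. Every number
# here is invented for illustration; only the mechanism is the point.

def wall_draw(gpu_w: float, rest_of_system_w: float, psu_efficiency: float) -> float:
    """AC draw at the outlet for a given DC load and PSU efficiency."""
    return (gpu_w + rest_of_system_w) / psu_efficiency

# Card-only measurement (PCI-e slot + power connectors): a clean 16 W gap.
card_a, card_b = 250.0, 234.0
print(f"card-only delta: {card_a - card_b:+.0f} W")

# At the wall, the rest of the system is in the reading too, and it is
# not constant: a faster card pushes the CPU harder (more frames per
# second means more work per second), and PSU efficiency shifts with load.
wall_a = wall_draw(card_a, rest_of_system_w=120.0, psu_efficiency=0.87)
wall_b = wall_draw(card_b, rest_of_system_w=135.0, psu_efficiency=0.88)
print(f"at-wall delta:   {wall_a - wall_b:+.0f} W")  # shrinks to about +6 W
```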


When did Gibbo go to work for AMD?

No, but he is the admin there, and he said he "checked", which I take to mean he knows what he is talking about.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Then I posted TechPowerUp, which is able to measure the exact power draw of the GPU alone using a really expensive digital meter that measures the PCI-e ports, stating that the GTX 780 draws 16W less than the 7970GHz and that Titan draws the same as the 7970GHz.

It means that udteam is off on the correct draw by a massive 39W compared to Titan and 25W compared to the GTX 780.

Because they have hooked a digital meter to the power outlet the computer is connected to, which measures the complete PC's power consumption. That is highly inaccurate, especially if they use different hardware (like the CPU) with the GPUs.
If the card worked on its own, independent of the rest of the system, then measuring power consumption at the card would mean something.
Clearly, however, we know that a system with a GeForce card in it consumes more power than one without; GeForces seem to be more CPU dependent.
So if measuring system consumption at the wall is highly inaccurate, then I guess measuring CPU power consumption with a GeForce installed is also inaccurate.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Let me put it in context for you:
No, but he is the admin there, and he said he "checked", which I take to mean he knows what he is talking about.

Gibbo could not even tell that the R9 270X is a Pitcairn chip with higher clocks (1.1 GHz). The R9 280X and R9 280 use the Tahiti chip. Both these chips seem to have undergone slight tweaks for better perf/watt. The R9 290X and R9 290 use the Hawaii chip. Gibbo has got his info wrong.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I see that you are like Balla: you can't refute the post, so you twist it. Do you think there will be more Mantle games or PhysX?

Mantle doesn't preclude Dx11+. BF4 is going to still be a Dx11+ game. Since you mention Dx11+, when is nVidia going to advance beyond the Dx11.0 feature level?

Your 2nd part... nVidia is very likely to sell fewer and fewer cards as time goes on. They are definitely falling behind. I hope they are an excellent cellphone company. They'll need to be.

PhysX, because it's already been used in UE3, is tied into UE4, and The Witcher is using it. That's more games than Mantle already.

Nvidia falling behind? Their market share in the dGPU category has only increased since last year. http://www.tekrevue.com/2013/08/19/nvidia-takes-62-of-discrete-gpu-market-share-in-q2-2013/

nVidia does not fully support all of the Dx11+ features. Fact supported by your own link. Don't give me the typical nVidia response that it doesn't matter. They are, almost without exception, later than AMD in supporting the latest Dx features. You'd have to go back many years to find otherwise.

DX11.1 features can be used through software. There is no hardware requirement.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
If the card worked on its own, independent of the rest of the system, then measuring power consumption at the card would mean something.
Clearly, however, we know that a system with a GeForce card in it consumes more power than one without; GeForces seem to be more CPU dependent.
So if measuring system consumption at the wall is highly inaccurate, then I guess measuring CPU power consumption with a GeForce installed is also inaccurate.

Are you serious? Seriously?

Anandtech posted total power consumption during Battlefield 3, meaning CPU/RAM/display/fans, everything running.

UDteam posted total power consumption during the Firestrike benchmark, meaning everything runs there as well.

Compare TechPowerUp and the rest. Having over 100W more should be a dead giveaway. That is why I say that TechPowerUp is the site to go to if you want accurate power consumption information about GPUs.

As for CPU measurements, I have no idea how they measure that.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
So is it safe to assume you're intentionally thread crapping? Any talk of the power consumption of the R9 290X is mere speculation, because we only have leaks (which, oddly enough, suggest better-than-Titan power consumption), but those are leaks.

Efficiency generally goes up with every generation, just as with the GTX 480 to 580, despite the 580 having more CUDA cores and shaders. Since the 290X is using a new, updated architecture, we can only speculate, not extrapolate with any accuracy. If you're wanting to extrapolate efficiency from prior products, I think it's safe to assume that you are making a concerted effort to thread crap.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
nVidia does not fully support all of the Dx11+ features. Fact supported by your own link. Don't give me the typical nVidia response that it doesn't matter. They are, almost without exception, later than AMD in supporting the latest Dx features. You'd have to go back many years to find otherwise.

They don't support all of DX11.1, but they support the features that matter through hardware.

Anyway, it's not as though it matters.

Mantle needs to be more than open for nVidia to use it. It has to be compatible with their hardware, which, as far as I know, it's not. nVidia can try to release their own version, but good luck with that. AMD is not introducing Mantle for the PC. It's derived from functions that are used in all of the consoles. They are merely making use of code that is already there.

As far as I can tell, Mantle is just like DirectX. It sits between the 3D application and the device driver, and as such, NVidia could use it by implementing their own front end and developing their own driver for it.

It is NOT low level. It's just a lot more efficient than DirectX due to fewer layers of abstraction, as it's only compatible with GCN. True low-level access requires fixed hardware, which is something PCs do not have. With consoles, developers can bypass an API completely and talk directly to the hardware if they want.
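
The CPU-overhead argument is easy to sketch numerically (the per-call costs below are invented; the point is only how a fixed per-call cost scales with draw-call count):

```python
# Invented per-call costs, purely to show how a fixed per-call overhead
# scales with draw-call count. An analogy, not measured Mantle/DirectX data.
THICK_API_US_PER_CALL = 10.0  # hypothetical cost through many abstraction layers
THIN_API_US_PER_CALL = 2.0    # hypothetical cost through a thinner API
DRAW_CALLS_PER_FRAME = 5_000

def cpu_ms_per_frame(us_per_call: float) -> float:
    return DRAW_CALLS_PER_FRAME * us_per_call / 1000.0

print(f"thick API: {cpu_ms_per_frame(THICK_API_US_PER_CALL):.0f} ms of CPU per frame")
print(f"thin API:  {cpu_ms_per_frame(THIN_API_US_PER_CALL):.0f} ms of CPU per frame")
# 50 ms vs 10 ms: at 5,000 draw calls the thick API alone would cap a
# single submission thread near 20 fps, which is why lower per-call
# overhead helps CPU-bound scenes regardless of GPU speed.
```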

I hope AMD makes some money and I hope nVidia has a counter that also improves performance. Too bad nVidia took advantage of their market position and released their top card this time around at $1000, and their 2nd tier salvaged chips for $650. We're all going to pay more now. Hopefully, nVidia cuts prices, and then AMD responds, etc... We're seriously screwed at the moment with regards to pricing.

AMD as a strong competitor will be good for consumers. But you can't blame NVidia for charging such prices if people are willing to pay them.

The Titan still sold like hotcakes despite its ridiculous price.

On a side note, I also hope that Mantle removes the CPU bottleneck so we can buy cheaper CPUs and still get the job done.

That would be good yes.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Are you serious? Seriously?

Anandtech posted total power consumption during Battlefield 3, meaning CPU/RAM/display/fans, everything running.

UDteam posted total power consumption during the Firestrike benchmark, meaning everything runs there as well.

Compare TechPowerUp and the rest. Having over 100W more should be a dead giveaway. That is why I say that TechPowerUp is the site to go to if you want accurate power consumption information about GPUs.

As for CPU measurements, I have no idea how they measure that.

A dead giveaway that GeForce cards require more system resources? Yes, that's what you can take away from TPU's at-the-card power measurements.
 