[Rumor, Tweaktown] AMD to launch next-gen Navi graphics cards at E3

Status
Not open for further replies.

RaV666

Member
Jan 26, 2004
76
34
91
If we go by this, a GPU with a 180W TDP and a 225W TBP will draw around 200-225W of power, because that is its board power draw.

Nvidia also uses the TDP measure only for the GPU, without the board. The RTX 2070 is a 180W TDP GPU with 215W of power draw for the whole board.
AMD Navi is touted to be a 180W TDP GPU with 225W of power draw.

Anyways, we're grasping at straws.
This is something that you fail to grasp.
Nvidia's TDP refers to the BOARD POWER; go to the Nvidia store and look at the specs.
And this is confirmed by all the tests.
AMD also uses BOARD POWER on their website.
They both tend to lie to some degree; there are different samples, some better, some worse, and there are different testing methodologies.
BUT, however you want to spin it, the 5700 is going to draw more power than the 2070, and depending on your cherry-picking, you're going to get between 20-40W less. There is absolutely no doubt about that, of course IF the TBP we have for the 5700 is correct.
I am pretty sure AMD gave out these GPU TDP / board power TDP numbers to muddy the waters, and that is exactly the thing we have here.
AND Nvidia did a similar thing: they launched the 2070, gave out the MSRP and TDP for the non-OC versions, and then sent reviewers FE models, which have higher TDP, price and performance. And almost nobody talked about this.
The Turing launch was pretty disgraceful for most review sites, with Tom's Hardware's "JUST BUY IT" as the pinnacle of selling out.
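The board-power gap being argued here is simple arithmetic. A minimal sketch in Python, assuming the rumored and advertised figures quoted in this thread (none of which are confirmed specs):

```python
# TDP (chip-only) vs. TBP (total board power), using the rumored and
# advertised numbers quoted in this thread -- not confirmed specs.
cards = {
    "RTX 2070": {"tdp_w": 180, "tbp_w": 215},        # chip TDP vs. whole-board draw
    "Navi (rumored)": {"tdp_w": 180, "tbp_w": 225},
}

for name, p in cards.items():
    overhead = p["tbp_w"] - p["tdp_w"]  # VRAM, VRM losses, fans, etc.
    print(f"{name}: chip {p['tdp_w']}W, board {p['tbp_w']}W, overhead {overhead}W")
```

On these numbers both chips carry the same 180W rating, and the whole 10W difference sits in the rest of the board.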

@lifeblood

I can confirm the undervolting: my Vega 56 gets 15% higher Firestrike scores vs. a reference V56 at the stock TDP of ~210W (that's 180W GPU package power). This thing throttled to hell without the undervolt.
 
Reactions: ZipSpeed

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
This is something that you fail to grasp.
Nvidia's TDP refers to the BOARD POWER; go to the Nvidia store and look at the specs.
And this is confirmed by all the tests.
A 175W TDP is the board power draw, confirmed by tests, you say?




Yeah.

If Nvidia states a 175W TDP for the RTX 2070, and then the GPU on average draws 200-215W depending on the model, they are blatantly lying, and they are getting away with it. Or maybe it just shows what perception Nvidia has. Perception beats REALITY!

I'm not spinning anything anywhere. You are grasping at straws that the RTX 2070 is a 175W card, when in reality that GPU draws 200-215W, depending on the model. I could also say that my ass has level-7 hurricane strength when I blow farts, but it would be a blatant lie when it cannot move a toy windmill.

Gross.
Please leave flatulence tales out of tech forum debates. If you have health concerns about gas, post in the health and fitness forum.

AT Mod Usandthem
 
Last edited by a moderator:

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
You need only look at the Radeon VII and RX 590 for that answer. If the forums are to be believed, lots of people have been able to do it. A few were even able both to undervolt and give it a small overclock.

I believe the reason AMD overvolts so much is to be able to use as many dies as possible. However, although AMD has a history of doing that, they may or may not do it with Navi as well. I'll give it a 90% chance based on past history and how close it appears to be to the 2070.
Didn't some high-level engineer get sent to RTG to assist with power control? She was a CPU team member previously.

It's not just to maximize die yield; their voltage control is not self-regulating well enough. The circuitry should be able to vary voltages and tune itself on the fly, allowing dies of different quality to reach their individual best performance. We should not be able to undervolt and overclock; the die should be doing this itself. This alone, if fixed, would narrow the gap with Nvidia significantly.
 

RaV666

Member
Jan 26, 2004
76
34
91
A 175W TDP is the board power draw, confirmed by tests, you say?




Yeah.

If Nvidia states a 175W TDP for the RTX 2070, and then the GPU on average draws 200-215W depending on the model, they are blatantly lying, and they are getting away with it. Or maybe it just shows what perception Nvidia has. Perception beats REALITY!

I'm not spinning anything anywhere. You are grasping at straws that the RTX 2070 is a 175W card, when in reality that GPU draws 200-215W, depending on the model. I could also say that my ass has level-7 hurricane strength when I blow farts, but it would be a blatant lie when it cannot move a toy windmill.

First of all, AGAIN, the SAME TEST shows Vega 56 as a 230W card! The SAME TEST!
Second, I didn't say that "the 2070 draws 175W"!
I SAID that it WILL DRAW LESS than the 5700, the AMOUNT OF WHICH depends on a variety of factors.
What I SAID is that the 2070 draws LESS power than the 225W AMD advertises. And whether the amount is 50W or 25W doesn't really matter that much.
And AGAIN, you are showing a graph from one website that shows Nvidia "lying" by 20W (it's 195W on this graph, not 215W), while the SAME GRAPH shows AMD LYING by 20W.
You really don't see it?
And yes, every test confirms the fact that the 2070 draws less power than Vega 56, and AMD advertises Vega 56 as a 210W card, which means every test proves that the 2070 will draw less than the 5700's 225W TBP.
Dude, you are stuck. I'm ending this because I'm not sure anymore what you are trying to prove here.
We are not discussing the EXACT 2070 power figure; we are stating that the 2070 will draw less, so AMD can't price it the same. That's the concept, you get me?
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
First of all, AGAIN, the SAME TEST shows Vega 56 as a 230W card! The SAME TEST!
Second, I didn't say that "the 2070 draws 175W"!
I SAID that it WILL DRAW LESS than the 5700, the AMOUNT OF WHICH depends on a variety of factors.
What I SAID is that the 2070 draws LESS power than the 225W AMD advertises. And whether the amount is 50W or 25W doesn't really matter that much.
And AGAIN, you are showing a graph from one website that shows Nvidia "lying" by 20W (it's 195W on this graph, not 215W), while the SAME GRAPH shows AMD LYING by 20W.
You really don't see it?
And yes, every test confirms the fact that the 2070 draws less power than Vega 56, and AMD advertises Vega 56 as a 210W card, which means every test proves that the 2070 will draw less than the 5700's 225W TBP.
Dude, you are stuck. I'm ending this because I'm not sure anymore what you are trying to prove here.
We are not discussing the EXACT 2070 power figure; we are stating that the 2070 will draw less, so AMD can't price it the same. That's the concept, you get me?
Why are you constantly writing about Vega, which has ZERO relation to Navi's TDP/TBP?

You say that the RTX 2070 draws less power than AMD advertises. For which GPU? Navi? Why do you stick to Vega's power draw, and then talk about Navi's power draw, when NOTHING has been confirmed? We have seen the dual 8-pin connector GPU boards that ASRock is readying, but how confident are you that those are not for Big Navi, coming in 2020, and that small Navi will not stick to a single 8-pin connector, boards for which we have also seen from ASRock?

Why do you draw such far-fetched conclusions when we do not know a bloody ounce about Navi? And lastly: why are you so confident that the RTX 2070 will consume less power than the Navi GPU? Until we get any data, we can only speculate about it.

And yet here you are, so confident that it will be slower, more power hungry, and just as expensive as the RTX 2070.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
What about the rumors of both increased performance and lower prices on RTX cards? This is very Nvidia, as Huang hates losing. I wish this were true, as I'm not a bank.
 

RaV666

Member
Jan 26, 2004
76
34
91
Why are you constantly writing about Vega, which has ZERO relation to Navi's TDP/TBP?

You say that the RTX 2070 draws less power than AMD advertises. For which GPU? Navi? Why do you stick to Vega's power draw, and then talk about Navi's power draw, when NOTHING has been confirmed? We have seen the dual 8-pin connector GPU boards that ASRock is readying, but how confident are you that those are not for Big Navi, coming in 2020, and that small Navi will not stick to a single 8-pin connector, boards for which we have also seen from ASRock?

Why do you draw such far-fetched conclusions when we do not know a bloody ounce about Navi? And lastly: why are you so confident that the RTX 2070 will consume less power than the Navi GPU? Until we get any data, we can only speculate about it.

And yet here you are, so confident that it will be slower, more power hungry, and just as expensive as the RTX 2070.

Oh, I'm done doing logic with you. Just clean answers.
1. The first sentence does not make any sense; AMD is not advertising the 2070's power consumption.

2. Because Vega is a product from AMD, Navi follows the same power consumption testing regimen as Vega... you know, apples to apples.
3. We are working with the assumption that the 225W TBP is correct. If you plainly don't agree, then say so.
4. I am confident that 2x 8-pin is not for Big Navi in 2020, because that's a year from now and they are not yet producing either those chips or the PCBs for them. But they are showing the cards that will be here in a MONTH.
5. I am very, very confident that ASRock isn't showing us a product a year away. Confidence level over 9000.

As a side note, I am pretty sure we are going to see mid-range Navi shown by AMD first, then small Navi later this year (which will probably need one power connector or none), and only at the end of the year, or more probably next year, "Navi 20" or "Big Navi".
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
What about the rumors of both increased performance and lower prices on RTX cards? This is very Nvidia, as Huang hates losing. I wish this were true, as I'm not a bank.
Now they should also make a 6-pin-free GPU that is faster than the RX 570.

If they lower prices: FINALLY! Maybe there will be some choice in the GPU market. And that last sentence is the most telling about how ridiculous a situation we are in...
Oh, I'm done doing logic with you. Just clean answers.
1. The first sentence does not make any sense; AMD is not advertising the 2070's power consumption.

2. Because Vega is a product from AMD, Navi follows the same power consumption testing regimen as Vega... you know, apples to apples.
3. We are working with the assumption that the 225W TBP is correct. If you plainly don't agree, then say so.
4. I am confident that 2x 8-pin is not for Big Navi in 2020, because that's a year from now and they are not yet producing either those chips or the PCBs for them. But they are showing the cards that will be here in a MONTH.
5. I am very, very confident that ASRock isn't showing us a product a year away. Confidence level over 9000.

As a side note, I am pretty sure we are going to see mid-range Navi shown by AMD first, then small Navi later this year (which will probably need one power connector or none), and only at the end of the year, or more probably next year, "Navi 20" or "Big Navi".
So all of your points are just your assumptions about Navi, not facts. Good that you have made that clear now.

Mid-range Navi - July. Small Navi - September. Big Navi - January at the earliest.
 

RaV666

Member
Jan 26, 2004
76
34
91
Now they should also make a 6-pin-free GPU that is faster than the RX 570.

If they lower prices: FINALLY! Maybe there will be some choice in the GPU market. And that last sentence is the most telling about how ridiculous a situation we are in...

So all of your points are just your assumptions about Navi, not facts. Good that you have made that clear now.

Mid-range Navi - July. Small Navi - September. Big Navi - January at the earliest.
Well, we first agreed on a set of data that we worked from, and then apparently you decided that you don't like that data, so you started to disregard it. If "THE TALK" isn't about leaks, previews and some logical assumptions, then we can all go home and discuss nothing. But yet, still, here you are.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
How big would the blunt have to be for the CEO of AMD to decide that pricing a Navi GPU that is 10-15% faster than the RTX 2070 at 300-350$ is a good business decision?

That has been answered multiple times already, but it's clear they would do it for market share, which is greatly needed. An RTX 2070 competitor for $499 will at best keep market share at the status quo. And at some point, devs will stop caring about AMD cards altogether; then it's game over.

In that context: if Navi, a small GTX 1660 Ti competitor in die size, is faster than the RTX 2070, why would AMD have to price it at 300-350$? They CAN price it at 500$.

Same response I already made: they can price it at $500 if they only want/have a few to sell. Brand recognition. Talk to gamers clueless about hardware: they buy NV unless AMD has a clearly better offer (= cheaper for the same fps). At $500 it will only sell to the AMD crowd, and they might still lose more market share.
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
Well, we first agreed on a set of data that we worked from, and then apparently you decided that you don't like that data, so you started to disregard it. If "THE TALK" isn't about leaks, previews and some logical assumptions, then we can all go home and discuss nothing. But yet, still, here you are.
No. I just like to talk about FACTS, not assumptions. I do not assume anything about how much power Navi will consume, because... we haven't seen any review of it. You provided only your assumptions, and believe that you won some argument. How can I take you seriously? I can't; that is why I don't want to discuss this topic at this time.

I was talking only about RTX 2070 power, and how people perceive it to be better than it really is. I even proved to you that you yourself perceived it to be better than it is in reality. You said that if Navi has a 225W TBP, then it will chew 30-40W more than the RTX 2070. I proved to you that this might not be the case. For it to chew 30-40W more than the RTX GPUs, it would have to consume between 250 and 275W of power, which would put it on the same level as the Vega GPUs. Which is ridiculous in itself.
That has been answered multiple times already, but it's clear they would do it for market share, which is greatly needed. An RTX 2070 competitor for $499 will at best keep market share at the status quo. And at some point, devs will stop caring about AMD cards altogether; then it's game over.



Same response I already made: they can price it at $500 if they only want/have a few to sell. Brand recognition. Talk to gamers clueless about hardware: they buy NV unless AMD has a clearly better offer (= cheaper for the same fps). At $500 it will only sell to the AMD crowd, and they might still lose more market share.
Devs will stop caring about AMD GPUs? When they have ALL OF THE CONSOLE market share to themselves? Are you serious?

Money is not in market share but in margins. If the majority of the market, having the choice between a GTX 1050 Ti and an RX 570, still picks the slower, more expensive GPU over the faster and cheaper one, then it shows the level of stupidity in the market, and the power of brand recognition. No product with as short a life as a GPU will be able to change this perception.

So they can go for margins instead. Because they at least have a competitive product.

It's funny. The majority of the market picked the GTX 1050 Ti over the RX 570, which was faster and cheaper. What makes you believe the same thing would not happen with Navi GPUs vs. the RTX 2070?

And if so, why should AMD care at all, considering they have the consoles to themselves, and cloud gaming?
 
Last edited:

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
Devs will stop caring about AMD GPUs? When they have ALL OF THE CONSOLE market share to themselves? Are you serious?

Yeah, I laid that trap intentionally. Look what happened to Mantle (was it BF3?): it stopped really working beyond Tonga or Fiji. I don't remember exactly, but Tonga did need a fix to make it work at all. Look at the state of DX12. Low-level optimizations (hardware and software!) from consoles simply don't carry over to PC. If they did, why haven't we been seeing that for the last 5 years? Polaris still underperformed on power use and fps/TFLOPS. Exactly what should not happen with an optimized combo.

Money is not in market share but in margins. If the majority of the market, having the choice between a GTX 1050 Ti and an RX 570, still picks the slower, more expensive GPU over the faster and cheaper one, then it shows the level of stupidity in the market, and the power of brand recognition. No product with as short a life as a GPU will be able to change this perception.

And margins can increase with higher sales (market share), because the high costs for a GPU are up-front, not in the actual manufacturing of the part. The more units you sell, the more widely you distribute the R&D, and the higher the margin is at a lower price.
With Navi being mostly a consumer part (it's low margin compared to the Instinct product line, even at $500), volume matters a lot.
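The amortization argument can be made concrete with a toy calculation. All figures below are invented for illustration; they are not actual AMD or Nvidia numbers:

```python
# Toy model: per-unit margin when a fixed R&D cost is amortized over units sold.
# Every figure here is hypothetical, chosen only to illustrate the mechanism.
def unit_margin(price, unit_cost, rnd_total, units_sold):
    """Margin per card after spreading the fixed R&D cost over every unit sold."""
    return price - unit_cost - rnd_total / units_sold

rnd = 250e6      # hypothetical design cost (USD)
unit_cost = 180  # hypothetical per-card manufacturing cost (USD)

# High price / low volume vs. lower price / higher volume:
low_volume  = unit_margin(price=499, unit_cost=unit_cost, rnd_total=rnd, units_sold=1_000_000)
high_volume = unit_margin(price=399, unit_cost=unit_cost, rnd_total=rnd, units_sold=4_000_000)

print(f"$499 at 1M units: ${low_volume:.2f}/card")   # 499 - 180 - 250.00 = $69.00
print(f"$399 at 4M units: ${high_volume:.2f}/card")  # 399 - 180 -  62.50 = $156.50
```

Under these made-up numbers, the cheaper card earns more per unit once volume quadruples, which is the poster's point about up-front costs.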
 

RaV666

Member
Jan 26, 2004
76
34
91
No. I just like to talk about FACTS, not assumptions. I do not assume anything about how much power Navi will consume, because... we haven't seen any review of it. You provided only your assumptions, and believe that you won some argument. How can I take you seriously? I can't; that is why I don't want to discuss this topic at this time.

I was talking only about RTX 2070 power, and how people perceive it to be better than it really is. I even proved to you that you yourself perceived it to be better than it is in reality. You said that if Navi has a 225W TBP, then it will chew 30-40W more than the RTX 2070. I proved to you that this might not be the case. For it to chew 30-40W more than the RTX GPUs, it would have to consume between 250 and 275W of power, which would put it on the same level as the Vega GPUs. Which is ridiculous in itself.
Devs will stop caring about AMD GPUs? When they have ALL OF THE CONSOLE market share to themselves? Are you serious?

Money is not in market share but in margins. If the majority of the market, having the choice between a GTX 1050 Ti and an RX 570, still picks the slower, more expensive GPU over the faster and cheaper one, then it shows the level of stupidity in the market, and the power of brand recognition. No product with as short a life as a GPU will be able to change this perception.

So they can go for margins instead. Because they at least have a competitive product.

It's funny. The majority of the market picked the GTX 1050 Ti over the RX 570, which was faster and cheaper. What makes you believe the same thing would not happen with Navi GPUs vs. the RTX 2070?

And if so, why should AMD care at all, considering they have the consoles to themselves, and cloud gaming?
You are ridiculous.
You were assuming, just a few posts above, that the 225W TBP figure was real.
Then you said it doesn't mean that the 5700 will consume more power, because Nvidia specs the GPU's power consumption.
I have proven you WRONG, because Nvidia ALSO states TBP.
Then you proceeded with one cherry-picked result, ignoring of course the one I gave "just because", and MOREOVER IGNORING the fact that AMD's numbers are also inflated in that chart.
So it's LOGICAL that if you see a 195W result for a 175W card from Nvidia, and a 230W result for a 210W card from AMD, YOU ADJUST THE NEXT AMD CARD THE SAME WAY. But no, you see one thing and disregard another. One moment the 225W TBP was real, and now it's just "conjecture".

And now this:

"I proved to you that this might not be the case. For it to chew 30-40W more than the RTX GPUs, it would have to consume between 250 and 275W of power,"

That's a blatant lie! Even taking into account only the TechPowerUp graph!
195W + 40W is 235W, NOT 275W; what you are pulling is hideous.
AND 235W would be only 10 watts more than the TBP, while Vega 56, from the same manufacturer, using the same power-measuring standards, pulls 20W more than advertised.
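The "adjust the next AMD card the same way" reasoning amounts to scaling the advertised TBP by the overage measured on a previous card. A sketch, using the single-review wattages argued over in this thread (examples, not settled facts):

```python
# Estimate a new card's measured draw from its advertised TBP by applying
# the measured-vs-advertised overage of a previous card from the same vendor.
# Wattages are the single-review figures debated in this thread, not facts.
def estimate_draw(advertised_tbp, prev_advertised, prev_measured):
    """Apply the previous card's overage to the new card's advertised spec."""
    return advertised_tbp + (prev_measured - prev_advertised)

# Vega 56: advertised 210W, measured 230W -> a +20W overage applied to the 5700:
navi_estimate = estimate_draw(225, prev_advertised=210, prev_measured=230)
print(f"5700 at a rumored 225W TBP, adjusted like Vega 56: ~{navi_estimate}W")  # ~245W
```

The same +20W overage applied to the 2070's 175W spec reproduces the 195W reading cited above, which is the symmetry the poster is insisting on.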
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
Yeah, I laid that trap intentionally. Look what happened to Mantle (was it BF3?): it stopped really working beyond Tonga or Fiji. I don't remember exactly, but Tonga did need a fix to make it work at all. Look at the state of DX12. Low-level optimizations (hardware and software!) from consoles simply don't carry over to PC. If they did, why haven't we been seeing that for the last 5 years? Polaris still underperformed on power use and fps/TFLOPS. Exactly what should not happen with an optimized combo.



And margins can increase with higher sales (market share), because the high costs for a GPU are up-front, not in the actual manufacturing of the part. The more units you sell, the more widely you distribute the R&D, and the higher the margin is at a lower price.
Mantle became Vulkan, so who cares about it anyway? I guess everybody does.

Margins. Apple and Nvidia show that margins, not market share, are what any company should go for if they want to make money.

You still haven't answered my question.

If the majority of the market, having the choice of a faster and cheaper AMD GPU over Nvidia's, still picks the Nvidia GPU, what makes you believe the same thing will not happen with Navi vs. the RTX 2070?

What benefit is there for AMD in pricing their GPUs at a massive discount compared to Nvidia, while still not getting any sales?
 

DrMrLordX

Lifer
Apr 27, 2000
21,797
11,144
136
Do you think Nvidia got back 150 mln USD for every single design they made for the Turing GPUs? They have designed TU102, TU104, TU106, TU116 and TU117. Five designs. Do you think they have recouped all of those design costs already?

I doubt every Turing design cost them that much. Regardless, if RTX cards and Quadros haven't already paid for Turing, I'd be fairly surprised. The 1660 Ti will also contribute some healthy payback.

I don't know how much AMD has spent on Navi, but they will be lucky to make those kinds of profits off it. Vega 10 and Vega 20 are more or less bankrolling that entire project.
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
You are ridiculous.
You were assuming, just a few posts above, that the 225W TBP figure was real.
Then you said it doesn't mean that the 5700 will consume more power, because Nvidia specs the GPU's power consumption.
I have proven you WRONG, because Nvidia ALSO states TBP.
Then you proceeded with one cherry-picked result, ignoring of course the one I gave "just because", and MOREOVER IGNORING the fact that AMD's numbers are also inflated in that chart.
So it's LOGICAL that if you see a 195W result for a 175W card from Nvidia, and a 230W result for a 210W card from AMD, YOU ADJUST THE NEXT AMD CARD THE SAME WAY. But no, you see one thing and disregard another. One moment the 225W TBP was real, and now it's just "conjecture".

And now this:

"I proved to you that this might not be the case. For it to chew 30-40W more than the RTX GPUs, it would have to consume between 250 and 275W of power,"

That's a blatant lie! Even taking into account only the TechPowerUp graph!
195W + 40W is 235W, NOT 275W; what you are pulling is hideous.
AND 235W would be only 10 watts more than the TBP, while Vega 56, from the same manufacturer, using the same power-measuring standards, pulls 20W more than advertised.
****.

What I actually said, and maybe you do not understand this, is that WE DO NOT KNOW ANYTHING ABOUT NAVI, and it is blatantly stupid to assume anything about its power draw or performance.

I was not assuming anything about TBP a few posts above. I pointed out that Nvidia states a TDP for their RTX 2070, which in reality translates to 200-215W for the whole board. I said that if you state a TDP for Nvidia, then state a TDP for Navi, and do not mistake it for board power. Then you went off on your whole rambling in the last few posts trying to prove to me, incorrectly, that Nvidia's TDP is the board power, as "proven by tests".

Yeah, clearly it has been disproven. How hard is it for you to grasp the concept of reference design vs. custom design?

Is it a blatant lie about the RTX 2070's power draw? Why do you ignore that the power draws in the charts are for CUSTOM CARDS? ASRock demoed what? Reference designs? Why do you think comparing a reference design to custom cards is fair? If there is ever a reference Navi design, sure, let's compare their power draws.

It's the same thing with your argument about Vega 56. AMD states 210W TBP for the Vega 56 reference design. Is there anything like that here? Where are the bloody reference designs for that GPU, eh? Or maybe TechPowerUp tested custom cards only? How hard is it for you to see this?

I told you: you have way too many assumptions about the Navi GPU. We can come back to this discussion when the GPUs have been reviewed, and then form educated opinions. Right now, your ramblings were pointless.
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
I doubt every Turing design cost them that much. Regardless, if RTX cards and Quadros haven't already paid for Turing, I'd be fairly surprised. The 1660 Ti will also contribute some healthy payback.

I don't know how much AMD has spent on Navi, but they will be lucky to make those kinds of profits off it. Vega 10 and Vega 20 are more or less bankrolling that entire project.
7 nm designs are 250-300 mln USD per design, so they could have spent quite a lot... If we assume the design costs of Zen 2, Navi 10, Navi 14 and Navi 20 are the same, then Nvidia, at 150 mln per 12 nm design, still comes out at least 200 mln USD ahead of AMD.

Now it makes me wonder: how much will Nvidia pay for Ampere, which presumably will be similar to the Volta GPU?
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
It was a brilliant move. Because that way AMD could jack up the price point for the MAINSTREAM platform from 350$ for the top-end SKU to 500$. NOBODY cared, because the CPU was neck and neck with Intel's HEDT platform, even though it was just a mainstream platform.

Everybody compared a Core i7 7700K competitor with an HEDT CPU, and nobody minded the increased price tag.

In that context: if Navi, a small GTX 1660 Ti competitor in die size, is faster than the RTX 2070, why would AMD have to price it at 300-350$? They CAN price it at 500$.

You're still missing the point. Yes, the Ryzen chips were on a mainstream platform compared to Intel's HEDT. That meant less memory bandwidth, yes, but still roughly equivalent performance at LESS THAN HALF THE PRICE of Intel's offering.

That's the point. If you're the underdog, you HAVE to provide a compelling price/performance combo. It's not enough to match the dominant player; you have to beat them. Ryzen would have been a flop if it had only 10% better price/performance than Intel's chips.

If they can really beat the RTX 2070 across the board by 10%, that's a $399 card at most if AMD wants it to sell. If it matches the RTX 2070 in most games and sometimes falls short by 5% or so, then it's a $329-$349 card if they want it to sell.
 
Reactions: beginner99

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
I believe the reason AMD overvolts so much is to be able to use as many dies as possible. However, although AMD has a history of doing that, they may or may not do it with Navi as well. I'll give it a 90% chance based on past history and how close it appears to be to the 2070.

AMD can do this if they want. But they can't do this and also expect to get away with charging a premium price. It's a cheap, bargain-basement tactic.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Do you think Nvidia got back 150 mln USD for every single design they made for the Turing GPUs? They have designed TU102, TU104, TU106, TU116 and TU117. Five designs. Do you think they have recouped all of those design costs already?

That $150-million-per-die figure is pure speculation.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
If the majority of the market, having the choice of a faster and cheaper AMD GPU over Nvidia's, still picks the Nvidia GPU, what makes you believe the same thing will not happen with Navi vs. the RTX 2070?

Because the 1050 Ti doesn't need an 8-pin and has lower power use. That's why people opt for it, besides brand. If the 570 were at the same price, it would simply sell even less.

We don't know AMD's goals or supply. Pricing it at 2070 levels makes sense if most wafers are going to Zen 2 chiplets anyway for now. All I'm saying is that at similar performance/$, NV will always get more sales (for now). So either AMD is fine with low volume, or they will actually price it better than the leaks suggest, or they simply don't get the market. Given the 590, the latter isn't that unlikely.
 

RaV666

Member
Jan 26, 2004
76
34
91
****.

What I actually said, and maybe you do not understand this, is that WE DO NOT KNOW ANYTHING ABOUT NAVI, and it is blatantly stupid to assume anything about its power draw or performance.

I was not assuming anything about TBP a few posts above. I pointed out that Nvidia states a TDP for their RTX 2070, which in reality translates to 200-215W for the whole board. I said that if you state a TDP for Nvidia, then state a TDP for Navi, and do not mistake it for board power. Then you went off on your whole rambling in the last few posts trying to prove to me, incorrectly, that Nvidia's TDP is the board power, as "proven by tests".

Yeah, clearly it has been disproven. How hard is it for you to grasp the concept of reference design vs. custom design?

Is it a blatant lie about the RTX 2070's power draw? Why do you ignore that the power draws in the charts are for CUSTOM CARDS? ASRock demoed what? Reference designs? Why do you think comparing a reference design to custom cards is fair? If there is ever a reference Navi design, sure, let's compare their power draws.

It's the same thing with your argument about Vega 56. AMD states 210W TBP for the Vega 56 reference design. Is there anything like that here? Where are the bloody reference designs for that GPU, eh? Or maybe TechPowerUp tested custom cards only? How hard is it for you to see this?

I told you: you have way too many assumptions about the Navi GPU. We can come back to this discussion when the GPUs have been reviewed, and then form educated opinions. Right now, your ramblings were pointless.

Because custom cards can exceed reference TDP ratings; it's the same for AMD cards.
Nvidia states TBP as well; if they didn't, you could sue them, because that's what they state.
And yes, you spoke in a few instances about Navi's 225W TBP, and you talked about it like it was a real thing.
You are constantly manipulating the thread, ignoring all other data.
You want TDP?
Here you have the 2070 FE, with a TBP of 185W:
https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2070-founders-edition,5851-9.html
187W measured.
Here you have a reference 2070 non-FE with a 175W TBP:
https://www.computerbase.de/2018-10...st/4/#abschnitt_messung_der_leistungsaufnahme
168W measured.

And by your own account, TechPowerUp: 195W. I could give you like 10 more results, but they are total-system power consumption tests, so you would find a way to wiggle another lie out of them.
Now let's put all those 3 results together, and you get around 183W, for a mix of FE and non-FE cards.
That's a whole lot lower than 225.
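The averaging step can be checked directly; the three wattages are the review figures cited above:

```python
# Mean of the three measured RTX 2070 board-power figures linked above.
measured = [187, 168, 195]  # Tom's Hardware FE, ComputerBase reference, TechPowerUp
average = sum(measured) / len(measured)
print(f"Mean measured draw: {average:.0f}W")  # ~183W vs. the rumored 225W Navi TBP
```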

Now tell me again:
How did you end up adding 195W to 40W and getting 275W?
Did you lie, kind sir?
 

Glo.

Diamond Member
Apr 25, 2015
5,761
4,666
136
You're still missing the point. Yes, the Ryzen chips were on a mainstream platform compared to Intel's HEDT. That meant less memory bandwidth, yes, but still roughly equivalent performance at LESS THAN HALF THE PRICE of Intel's offering.

That's the point. If you're the underdog, you HAVE to provide a compelling price/performance combo. It's not enough to match the dominant player; you have to beat them. Ryzen would have been a flop if it had only 10% better price/performance than Intel's chips.

If they can really beat the RTX 2070 across the board by 10%, that's a $399 card at most if AMD wants it to sell. If it matches the RTX 2070 in most games and sometimes falls short by 5% or so, then it's a $329-$349 card if they want it to sell.
I asked beginner99 this; I will ask you the same thing.

If the majority of the market, having the option between a GTX 1050 Ti and an RX 570 that was 40% faster and most of the time 20$ cheaper, still picked the GTX 1050 Ti, what makes you believe that anything would be different with a Navi GPU?

And if so, what benefit would there be for AMD in pricing their GPU at a discount compared to Nvidia's offering, while still not getting any sales?
All I'm saying is that at similar performance/$, NV will always get more sales (for now). So either AMD is fine with low volume, or they will actually price it better than the leaks suggest, or they simply don't get the market. Given the 590, the latter isn't that unlikely.
No. Nvidia ALWAYS gets more sales. No matter what happens, Nvidia will always make more sales, because they have mindshare, which AMD does not.

There is no incentive for AMD to:
A) design powerful GPUs if they cannot sell them;
B) sell their GPUs at a discount compared to Nvidia, if they still cannot sell them.
The funniest part is that, if the rumors about AMD and Nvidia are 100% true, Nvidia appears to be the savior of consumers, if they really do lower the MSRPs of their GPUs...
Now tell me again:
How did you end up adding 195W to 40W and getting 275W?
Did you lie, kind sir?
And what do you get if you add this:

to 40W more? You get 268W of power. Isn't that about the 275W TDP/TBP?

COMPARE CUSTOM CARDS TO CUSTOM CARDS. ALWAYS. COMPARE REFERENCE DESIGNS TO REFERENCE DESIGNS. ALWAYS. And do it AFTER WE HAVE SOME REVIEWS of the GPUs being discussed.

There is no official TDP or TBP rating from AMD. Only the rumored ASRock designs. Why do you fail to grasp this concept? They are NOT reference designs.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
What I actually said, and maybe you do not understand this, is that WE DO NOT KNOW ANYTHING ABOUT NAVI, and it is blatantly stupid to assume anything about its power draw or performance.
^This. We need to wait till E3; maybe we will get a little more info on Navi and actually have something to talk about.
 