TaintedSquirrel
Member
- Jun 11, 2015
Based on what some media types have said, I'm expecting the 300 series to be nothing other than a straight rebrand unfortunately.
http://www.legitreviews.com/amd-shows-radeon-r9-300-series-cards-to-red-team-plus_165838
https://twitter.com/ryanshrout/status/609437983134089216 (didn't mention AMD specifically, but you can tell he was referring to them...)
It seems no review samples are out there; that's really not a good sign.
Quote:
We are glad to see AMD supporting the end users, but it is unusual to see the traditional hardware sites and media not briefed this close to new product launch. In the past when traditional hardware reviewers aren’t given a product with enough time to properly test it there is usually a reason for doing so.
Considering four months wasn't long enough to discover the 970 memory issues, AMD would need to ship engineering samples to these "traditional hardware reviewers" to give them enough time. Also, as a Kepler owner, good job not educating me on the driver nerfs Nvidia applied to previous-gen cards over the past year. End users now have to rely on product evaluations and issues discovered by buyers and posted to forums, because hardware media is no longer doing its job. Hope they enjoy their free swag for literally doing nothing of value.
Quote:
It is clear to me that more and more companies no longer respect media's role in educating buyers. Practices of balanced coverage are unwanted.
The high power consumption on the 290 series looks like it may have been addressed a little: http://www.tecmundo.com.br/amd/81391-exclusivo-tecmundo-descobriu-tudo-novas-placas-amd.htm
Specs show 208W power consumption. I think that's an improvement, but we'll have to see when real cards are tested by reputable sites.
I find it funny that people are complaining about GPU rebrands... at this point, why would AMD invest even more money in something that is already VERY close to the GTX 980?
To me GM204 was always a hole filler, and nothing was ever impressive about it. GM200 and Fiji are where the real battle is.
And people complain about power consumption but want high-end cards... c'mon.
From the looks of it the 390X, or for that matter all of the 300 series, will not have an HDMI 2.0 port. That just blows.
I saw this same page and at first had the same reaction, but it looks like this identical figure (208W) was originally used for the R9 290X. This slide from MSI's website and this one from Club3D's website both incorrectly represent reference R9 290X cards as having 208W power usage (Club3D even says this is the maximum). It's actually 271W-282W in gaming and 309W-315W in FurMark.
I have two theories about where this incorrect 208W figure originally came from in the R9 290X slides. One is that AMD's engineers determined that a power target of 208W made for the best performance per watt, and were originally going to release the card at that. But this was based on the assumption that the only GK110 card would be the $999 Titan, so when Nvidia dropped the GTX 780, AMD had to ramp up the juice to compete, and we ended up with the power-guzzling monstrosity that Hawaii is now. The other, more prosaic theory is that it was supposed to be 280W and the 208W figure is a simple transposition error.
The XFX listing says HDMI 2.
Tonga power consumption was dropped by 20%. If you drop Hawaii by 20%, it would sit at 200 watts. Seems reasonable given the improvement going from TSMC to GloFo.
Quote:
From the looks of it the 390X, or for that matter all of the 300 series, will not have an HDMI 2.0 port. That just blows.
I have like 3 HDMI->DVI converters. No big deal.
I think he means an HDMI 2.0 port, which would allow for 4K@60Hz over the HDMI cable. Useful for a lot of 4K TVs on the market.
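For anyone wondering why HDMI 1.4 can't handle 4K@60Hz, here's a rough bandwidth sketch. The numbers are my own back-of-the-envelope figures (standard CTA-861 4K60 timing, 8-bit color), not from any post above:

```python
# Rough check of why 4K@60Hz needs HDMI 2.0.
# Timing figures are the CTA-861 3840x2160@60 values (total raster
# includes horizontal/vertical blanking, hence 4400x2250).

h_total, v_total, refresh = 4400, 2250, 60
pixel_clock_hz = h_total * v_total * refresh          # 594,000,000 (594 MHz)

# TMDS sends 10 bits per 8-bit symbol (8b/10b) over 3 data channels
tmds_gbps = pixel_clock_hz * 10 * 3 / 1e9             # 17.82 Gbit/s

hdmi_14_limit = 10.2   # Gbit/s (340 MHz TMDS clock cap)
hdmi_20_limit = 18.0   # Gbit/s (600 MHz TMDS clock cap)

print(f"pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")
print(f"TMDS bandwidth needed: {tmds_gbps:.2f} Gbit/s")
print(f"fits HDMI 1.4: {tmds_gbps <= hdmi_14_limit}")  # False
print(f"fits HDMI 2.0: {tmds_gbps <= hdmi_20_limit}")  # True
```

So 4K60 needs ~17.8 Gbit/s, well past HDMI 1.4's 10.2 Gbit/s ceiling but just inside HDMI 2.0's 18 Gbit/s, which is why the port version matters for 4K TVs.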
Have you seen his 200 series review? Cherry-picked benchmarks all over the entire review. That was when I put him in the trash pile along with a few others.
Linus confirms he also doesn't have a 390X.
https://twitter.com/LinusTech/status/609543480071729153
But there's no evidence that they went to GloFo. "Trinidad" on the R7 370 is definitely the same old TSMC Pitcairn chip we've known since 2012 (you can read the "MADE IN TAIWAN" lettering on the die in one of the MSI card pictures), and "Antigua" has the same die dimensions as Tonga (which was made by TSMC, not GloFo). Unfortunately, the evidence so far points to the 300 series consisting entirely of straight rebrands with no substantive improvements of any kind.
By the way, dropping Hawaii's power consumption by 20% would put it around 250W maximum, not 200W.
Quote:
By the way, dropping Hawaii's power consumption by 20% would put it around 250W maximum, not 200W.
Quote:
I think he means an HDMI 2.0 port, which would allow for 4K@60Hz over the HDMI cable. Useful for a lot of 4K TVs on the market.
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/21
For Crysis 3, the power difference is about 60 watts. If AMD drops power consumption by 50 watts, that's good enough to match Maxwell.
Quote:
GTX 980’s power consumption is lower than everything else on the board, and noticeably so. With 294W at the wall, it’s 20W less than GTX 770, 29W less than 290X,
Average gaming load: 230-250W for R290/X, custom variants are towards the lower end.
Dropping that by 20% puts it at ~185-200W average gaming load. That's actually competitive if it's got 8GB of VRAM and +10% performance.
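Quick sanity check of the 20% arithmetic being thrown around in this thread (just multiplying the load figures quoted above, nothing official):

```python
# 20% cut applied to the Hawaii draw figures quoted in this thread.

reduction = 0.20

# Average gaming load for R9 290/290X, W
gaming_load_w = (230, 250)
reduced_gaming = tuple(round(w * (1 - reduction)) for w in gaming_load_w)
print(reduced_gaming)   # (184, 200) -> ~185-200W gaming load

# The FurMark maximums quoted earlier land near 250W after the cut,
# which matches the "250W maximum, not 200W" correction above.
furmark_w = (309, 315)
reduced_furmark = tuple(round(w * (1 - reduction)) for w in furmark_w)
print(reduced_furmark)  # (247, 252)
```

In other words, a 20% cut only gets Hawaii to ~200W if you start from the gaming-load numbers; starting from the FurMark maximums it's still about 250W.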
Read the explanation below the chart. The CPU is working harder to deliver more frames per second. The difference shows up when you isolate the card's power consumption, which is what TPU does.