I was expecting lower power consumption, especially after the HD 7790's surprising efficiency. Drawing ~50W more than the Titan in Uber mode is a little disappointing. But hell, it remains acceptable. I'm puzzled about overclocking though; power consumption seems to spiral out of control fast, so the card may not be good for that purpose at all. That said, this GPU is already powerful enough out of the box, and overclocking the memory shouldn't impact power consumption that much. The core is another story.
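A rough sketch of why core overclocking blows up power so quickly: switching power scales roughly as P ≈ C·V²·f, and pushing the core clock usually requires a voltage bump too, so power grows much faster than frequency. The baseline and overclock numbers below are illustrative assumptions, not measured 290X figures.

```python
# Illustrative sketch of the dynamic-power rule of thumb: P ~ C * V^2 * f.
# All numbers here are made-up examples, not real 290X measurements.

def dynamic_power(base_power_w, base_freq_mhz, base_volt,
                  new_freq_mhz, new_volt):
    """Scale a baseline dynamic power by the f * V^2 rule of thumb."""
    return base_power_w * (new_freq_mhz / base_freq_mhz) * (new_volt / base_volt) ** 2

# A +10% core clock paired with a +6% voltage bump:
p = dynamic_power(250.0, 1000.0, 1.15, 1100.0, 1.22)
print(f"{p:.0f} W")  # ~24% more power for 10% more clock
```

Memory overclocking touches only a small slice of the card's total power budget, which is why it tends to be much gentler than core overclocking under this scaling.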
I guess the high xtor density is increasing the power consumption.
That's a big one. There's no such thing as a free lunch after all.
The poor cooling isn't helping either. At such high temps the transistors are going to be significantly more leaky.
Seconded. And high leakage also generates unnecessary power consumption.
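To put a number on the temperature/leakage point: subthreshold leakage grows roughly exponentially with junction temperature, and a common back-of-the-envelope figure is that it doubles every ~10°C. The exact factor is process-dependent, so treat the doubling interval and reference temperature below as assumptions.

```python
# Rule-of-thumb sketch: transistor leakage grows roughly exponentially
# with temperature. A ~10 C doubling interval is a common back-of-the-
# envelope figure; the real factor depends on the process node.

def leakage_scale(temp_c, ref_temp_c=65.0, doubling_c=10.0):
    """Relative leakage versus a reference temperature."""
    return 2.0 ** ((temp_c - ref_temp_c) / doubling_c)

# A die sitting at its 95 C throttle point vs. one held at 65 C:
print(f"{leakage_scale(95.0):.1f}x")  # 8.0x
```

Under that rule of thumb, letting the cooler run the die all the way up to 95°C costs several times the leakage power of a cooler-running card, which feeds back into the wall-power numbers in the reviews.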
Why do NV and AMD make blower types as reference cards?
I mean, they don't sell many of them, if any of importance.
And with that in mind I don't understand why AMD didn't use an expensive cooler like NV did, as these blower cards look to be more for show than for selling.
If the OP wants, he can add these to the OP. No point in them being lost inside pages and pages of posts.
AMD Radeon R9 290X reviews
http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/16#.UmifAel3tsc
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review
http://www.hardware.fr/articles/910-1/amd-radeon-r9-290x-test-hawaii-sors-ses-watts.html
http://www.guru3d.com/articles_pages/radeon_r9_290x_review_benchmarks,1.html
http://www.computerbase.de/artikel/grafikkarten/2013/amd-radeon-r9-290x-im-test/
http://www.legitreviews.com/amd-radeon-r9-290x-video-card-review_126806
http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650.html
http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-290X-Hawaii-Review-Taking-TITANs
http://www.hardwarecanucks.com/foru...iews/63742-amd-radeon-r9-290x-4gb-review.html
http://www.techspot.com/review/727-radeon-r9-290x/
http://www.pcgameshardware.de/Gefor...-257241/Tests/Radeon-R9-290X-im-Test-1093796/
http://www.tweaktown.com/reviews/5823/amd-radeon-r9-290x-4gb-reference-video-card-review/index.html
http://www.kitguru.net/components/graphic-cards/zardon/amd-r9-290x-review-part-1/
http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed
http://www.hardwareheaven.com/revie...9-290x-graphics-card-review-introduction.html
http://www.techpowerup.com/reviews/AMD/R9_290X/
CrossFire reviews
http://www.pcper.com/reviews/Graphi...deon-R9-290X-CrossFire-and-4K-Preview-Testing (preview)
http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,1.html
http://www.techpowerup.com/reviews/AMD/R9_290X_CrossFire/
Pfft, I never asked for this thread to be a sticky, nor did I make it a "review" thread when the 280X was released. Someone else made it that, apparently. By all means take ownership and do it yourself. I'm not messing with that crap.
And with that in mind I don't understand why AMD didn't use an expensive cooler like NV did, as these blower cards look to be more for show than for selling.
Why? Cheaper initial cost, and it probably gives AMD's partners a way to differentiate themselves and upcharge. Otherwise, I can't fathom a reason to use the barely adequate coolers myself, either.
I couldn't care less about the noise issues. I have six reference 7950's in my room all mining cryptocurrency all at 100% fan usage. It's as loud as you think. However, when I use two of them for gaming I dial those fans down to 70% while the other four are still at 100% and I can't hear anything when using headphones.
95C is a concern though. That is pushing the limits and would leave me absolutely zero ability to Crossfire in the future.
As it stands, I get roughly the same performance at 1440p from my two 7950's as I would from a single 290X. Since I can't eBay them for anywhere near enough money for a 290X, and the 290X is a dead end upgrade-wise because the temps won't allow future Crossfire, I have no choice but to stand pat until 20nm in 2014...
...or cross my fingers that Nvidia slashes prices of the 780 to such an extent that the 780 Ti lands at $550. One can dream.
This is a quote from one of my prior posts:
"In the last decade, capacitor technology has improved gradually, so you now get good capacitor performance and durability even at high temperatures, and those capacitors are cheap. Capacitors are the weak part when it comes to temperature, so this technology improvement matters.
That means for graphics cards it's cheap to use capacitors rated for 105°C or higher instead of 85°C (the typical standard values).
Unlike the GTX 480, which used extremely expensive tantalum capacitors, the 290 series can do without them and still be specced for high temperatures."
(Edit: I'd guess the capacitors on this card are rated for 125°C as opposed to the usual 105°C; that gives 20°C of extra headroom compared to earlier generations.)
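For what that 20°C of headroom is worth in practice: electrolytic capacitor datasheets commonly use a 10-degree (Arrhenius-style) rule, where expected life roughly doubles for every 10°C the part runs below its rated temperature. The rated-life figure below is an assumed datasheet value for illustration, not a spec from the 290X's actual parts.

```python
# 10-degree rule of thumb for electrolytic capacitor life:
# expected life roughly doubles for every 10 C below the rated temperature.
# rated_life_h = 2000 h is an assumed datasheet value, not a real 290X spec.

def estimated_life_h(rated_life_h, rated_temp_c, operating_temp_c):
    """Arrhenius-style life estimate using the 10 C doubling rule."""
    return rated_life_h * 2.0 ** ((rated_temp_c - operating_temp_c) / 10.0)

# Same 85 C operating temperature, 105 C-rated vs. 125 C-rated parts:
print(estimated_life_h(2000, 105, 85))  # 8000.0 hours
print(estimated_life_h(2000, 125, 85))  # 32000.0 hours
```

So if the guess about 125°C-rated parts is right, the same operating temperature buys roughly four times the estimated capacitor life versus 105°C-rated parts.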
105°C electrolytic capacitors are currently about as cheap to produce as their 85°C siblings. They'll also never hit high temperatures on a graphics card's PCB, given where they're located. The most sensitive ones, those filtering the chip's SMPS, sit right under the fan and so stay at roughly case temperature. The capacitor-temperature argument is therefore about as bad-faith as it gets: no more than an irrelevant point caught on the fly by people who aren't favourable to this product.