Cookie Monster
Diamond Member
- May 7, 2005
- 5,161
- 32
- 86
it would be interesting to see the same test on the R9 285.
My thoughts precisely.
It was only recycled back to the 6970. The 5870 cooler was quite different and didn't have a vapour chamber. I actually think the heatsink portion of the 290X cooler isn't the problem as much as the fan (which is about half the depth of the Titan cooler's).
The chip itself is amazing for what it is. That said, 33% over a 980 is disappointing, so overclocking needs to be a redeeming feature. $1,000 is a really bad joke. I really hope Fiji is straight-up faster across the board at 1440p and 4K, overclocks well, comes in way cheaper, and has similar efficiency so Nvidia can't clutch onto perf/W as the end-all metric.
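To put rough numbers on the perf/W angle (a quick sketch; the 1.21x power figure below is an assumption for illustration, not a measurement):

```python
# Back-of-the-envelope perf/W vs the GTX 980. The 1.33x performance figure is
# from the reviews; the 1.21x power figure is an assumed value for illustration.
rel_performance = 1.33  # Titan X vs GTX 980 at 1440p/4K
rel_power = 1.21        # assumed relative power draw

print(f"Titan X perf/W vs GTX 980: {rel_performance / rel_power:.2f}x")  # ~1.10x
```

Under those assumptions perf/W only improves by about 10%, which is why similar efficiency from Fiji isn't a crazy ask.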
Yesterday, I made a list of the people I expected would trash the Titan X and, what do you know, it was spot on!
Even a 980 Ti at $700 is overpriced as hell.
Am I on that list? Because I haven't been trashing it at all. I've been expecting ~35% over a 980 at ~780 Ti power use, so I was neither surprised nor disappointed.
The only disappointment would be if NV does not allow custom models on the full GM200 SKU, because that blower is actually holding GM200 back a lot. Other reviewers have noted the same:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/36.html
"The Titan X's cooling seems to hold it back. Though NVIDIA's choice to reuse the original Titan cooler does not seem optimal, the card will be running at its 84°C temperature limit almost all the time during demanding games, which causes Boost clocks to go down. I also wonder why NVIDIA chose such a low temperature limit when it would have clearly made a performance difference.
Another shortcoming of the cooler, which looks fantastic with its black-powdered coat by the way, is fan noise. While not terribly noisy, it definitely emits more noise than NVIDIA's other recent releases, roughly matching the Radeon R9 290X's noise output. It was the wonderful Maxwell architecture that brought huge efficiency gains so cooling could be quieter yet still powerful enough to keep temperatures in check."
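To make that throttling behavior concrete, here's a rough sketch of how a Boost-style thermal limiter steps clocks down at the temperature target (NVIDIA's actual algorithm is proprietary; the clock and bin values below are assumptions):

```python
# Illustrative model of temperature-limited boost: while the die sits at the
# temperature target, the clock steps down one boost bin per interval, which
# matches the "Boost clocks go down" behavior TechPowerUp describes.

TEMP_LIMIT_C = 84      # Titan X default temperature target
BASE_CLOCK_MHZ = 1000  # Titan X base clock
BOOST_BIN_MHZ = 13     # typical Maxwell boost-bin granularity (assumption)

def throttle_clock(clock_mhz: float, temp_c: float) -> float:
    """Drop one boost bin while at/above the temperature limit, never below base."""
    if temp_c >= TEMP_LIMIT_C and clock_mhz > BASE_CLOCK_MHZ:
        return clock_mhz - BOOST_BIN_MHZ
    return clock_mhz

clock = 1190.0  # assumed out-of-box boost clock
for temp in [65, 75, 84, 84, 84, 84]:  # card heats up, then pins at the limit
    clock = throttle_clock(clock, temp)
    print(f"{temp}C -> {clock:.0f} MHz")
```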
It will be interesting when they drop a 980 Ti, a mildly neutered 6 GB GM200, in a few months at ~$699, allowing custom AIB models that end up with beastly overclocks and beat the Titan X OC vs OC.
I'm not even going to bash the $999 price tag and the Titan moniker without DP compute. Why? Because I know, you know, and NV knows there are a lot of gamers and benchers who would happily fork out that much for even less than a 35% performance increase... and it's got 12 GB of VRAM!
NV is going to bank big time with the 980 Ti at $699 and the Titan X at $999. Hopefully AMD can bring the competition back so NV is inclined to lower the price a bit and we can all enjoy!
Oy vey.
Got $450 for my 980 off Craigslist. Titan X should be coming on Thursday. I briefly tried 980 SLI, but it didn't work in some games and had microstutter in others, so the Titan X is the card I've been waiting for.
Depends on how long it takes for Pascal to come out, but if it's a long way off I'll probably just throw a water block on the Titan X, or an aftermarket cooler when they become available. My 980 could barely hold a 100 MHz overclock without artifacts, so if my Titan X can do at least that, I'm looking at a great upgrade at 4K.
Even just getting 40 fps is a big deal at 4K, especially if you've been looking at 25-30 fps for the last 5 months. :O
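For perspective, the difference is easier to see in frame time than in raw fps; quick math:

```python
# Frame time is what you actually perceive; the 25 -> 40 fps jump cuts 15 ms
# off every frame, which is why it feels so much bigger than "+15 fps" sounds.
for fps in (25, 30, 40, 60):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
```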
AT's power tests are different from so many other sites'. Almost all the other sites I've checked say that a Titan X consumes less power than an R9 290X.
They didn't show the result of running 8x MSAA, though; it's a slideshow, irrelevant. Pushing VRAM for the sake of it is stupid. Has [H] dropped in quality?
They even did this:
"Amazing performance and efficiency strikes again. These wattage numbers are full system wattages taken at the wall. Our average performance increase at 1440p and 4K was 33% over the GeForce GTX 980. Consider above that the power increase is only 21% at full-load to achieve this.
All of this is also contained in the same video card size package as the GeForce GTX 980, no size differences, no exotic cooling differences needed."
.. like duh, of course the percentage is smaller when you're measuring TOTAL SYSTEM power. They even took a swipe at the upcoming 390X with "no exotic cooling"..
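A quick illustration of why at-the-wall numbers dilute the GPU-only increase (all wattages below are assumptions for illustration, not measured values):

```python
# With a fixed chunk of non-GPU system power in the denominator, the same GPU
# power delta shows up as a much smaller percentage at the wall.
REST_OF_SYSTEM_W = 150  # assumed CPU/board/etc. load power
GTX_980_W = 180         # assumed GPU-only load power
TITAN_X_W = 250         # assumed GPU-only load power

gpu_pct = (TITAN_X_W - GTX_980_W) / GTX_980_W
wall_pct = (TITAN_X_W - GTX_980_W) / (REST_OF_SYSTEM_W + GTX_980_W)

print(f"GPU-only increase:    {gpu_pct:.0%}")   # ~39%
print(f"At-the-wall increase: {wall_pct:.0%}")  # ~21%
```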
Their own readers on the forums are taking note of how biased they have become of late, pushing more GameWorks into their tests and endlessly bashing/blaming AMD for the lack of CF support in NV games! Yeah, nice one, guys. Even Tom's Hardware is more respectable!
Depends on the test; some games have wildly different results.
http://www.computerbase.de/2015-03/nvidia-geforce-gtx-titan-x-im-test/8/
I actually did run the performance tests but did not have time to include them in the preview. Performance wasn't as bad as you'd think. Our full evaluation will use very high settings, pushing all video cards, to compare them and show what that looks like.
We are using newer games, and a lot of newer games use some GameWorks features. We don't choose games based on which IHV has injected 3D effects into them. No one complained when we used Tomb Raider for two years, which favored AMD cards via TressFX. We pick games that are popular, well played, forward-looking, and GPU-demanding, among other factors. The influence of GameWorks features is exaggerated. For the record, the newest game we added, Dying Light, only contains two GameWorks features: Depth of Field and HBAO+, which seem to perform similarly on AMD and NV GPUs. So let's stick to the facts on that topic.
FC4 has had no CrossFire support since November, and it is now March, so AMD rightly deserves criticism for the lack of CrossFire support in newer games. When over three months pass with no new drivers while several new games are released, with no driver optimizations or CF profiles, that should concern AMD and CrossFire users who want to play new games. It is a valid concern from a gamer's perspective.
You guys planning to do some in-depth analysis of overclocking/SLI at 4K?
Would be awesome if, say, a heavily overclocked R9 290X/GTX 980 was in the mix.
No one complained when we used Tomb Raider for 2 years which favored AMD cards via TressFX.
I actually recall quite a few people complaining, but that has gone away because the TressFX source was made available so anyone could optimize for it. GameWorks is not like this, as I am sure you are well aware; it is a closed system.
Thanks for addressing some concerns; your site has been respectable over the years.
The problem with CF in GameWorks titles is not entirely AMD's fault. As your posters and reviewers noted, a patch was required to enable CF in some GameWorks titles. It's clearly in the developers' control, not at AMD's whim.
GW licensing endeavors to prevent this on the AMD side, which is why I'm confused as to why you compared it to TressFX, which is open to Nvidia. GW comes with a special license that devs sign, preventing them from revealing what Nvidia is supplying and making it difficult for AMD to optimize.
I am pro frequent driver updates to support new games.