Ah, now we have a pickle. Is the extra 4GB useful on the 390X, and is the 4GB limit on Fiji going to be a problem? Or is 4GB on Fiji enough, and the 8GB on the 390X just a pointless cost addition over the 4GB 290X? Going to be difficult to have it both ways.
8GB is worthless for Hawaii but 4GB isn't. The key videocard that has been devastating AMD's 290/290X is the 970. The 3.5GB VRAM controversy resulted in lower sales in February, but since then most gamers have forgotten about the issue. By positioning the R9 390 as similar in performance to a 970 but with 8GB of GDDR5, AMD will really drive the message home to the type of gamers who buy things at places like BestBuy. People have been asking AMD to improve their marketing; here it is. If the 970 and 390 are similar in performance, it becomes a lot more difficult to recommend a 3.5GB card over an 8GB card, because at that point the 8GB becomes a 'free' bonus.
I have that monitor. It's huge. It took a bit to get used to it. If you have multiple monitors, this can effectively replace a couple of 27-inch monitors or three 24-inch monitors. The DPI is almost identical to the 27-inch 1440p monitors. The contrast is WAY better than IPS. Viewing angles are great. Input lag is low. I like it. But, it's a PWM monitor. That might turn some off. From my experience, a single GTX Titan X overclocked to 1550 MHz still isn't enough to drive 4K. You'll need 2 top-end GPUs to do 4K justice.
That means you'll still need two Fiji GPUs to do 4K with good eye candy.
:thumbsup: Thanks for sharing your views on the monitor and your experience. You also keep highlighting a recurring theme: PC gamers who actually own 4K monitors keep saying that you really need dual flagship cards to enjoy it, while other PC gamers on 1080P-1440P monitors with a mid-range 290X/970/980 claim that modern games can be played at 4K at high quality settings at 40-60 fps, a claim that contradicts every professional review site and every 4K owner's experience on this site.
The 290X cards with 8GB didn't make any waves:
A 290X with 16GB and a $199 price wouldn't make any waves either. The R9 200 series has a tarnished reputation among mainstream media/gamers for being hot and loud. Those two aspects will no longer apply with the 390/390X. The 290 just needed about 5-6% more performance to match a 970, and the 390 should be close to that while having at least a true 4GB of VRAM against the 970's 3.5GB, and now the 390 will run cool and quiet, addressing 2 major concerns of the 290 series' tarnished image.
OK, AMD is dead now... and HBM is the last blow from them.
AMD could sell 0 video cards for the next 18 months and not go bankrupt. You might want to look up how the firm actually works.
That could turn out to be a critical mistake. What is better: having a low-volume set of new GPUs, or having the rest of the lineup updated and competitive in all metrics?
True, but in this case all the GPU design work has already been done on the R9 300 series, which means if you amortize the R&D costs, and consider how little time is left on this 28nm node for these cards before next year, I don't think the cost-benefit analysis would have justified creating new GPUs top-to-bottom on 28nm for AMD. Their priorities are the future; the R9 300 series is not it. If it costs $250 million to $1B to design a new GPU, do you honestly think AMD would have been able to make money if they had designed 3 new ASICs? I think Lisa realized that from a financial perspective, it would have been a less profitable venture. Pretty sad state of affairs from both AMD and NV in the $300-500 range right now.
HDMI 2.0 certainly has enough bandwidth to do 4K/60Hz/4:4:4. I've tested it myself on one of the Samsung 4K sets.
However, it cannot do 4K/60Hz/4:4:4 if HDCP 2.2 is active due to the added overhead. No current video cards support HDCP 2.2, but it will be required for 4K Blu-ray.
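Not part of the post above, but here's a quick back-of-the-envelope check of the bandwidth claim (assuming the standard CEA 4K60 timing and HDMI's 8b/10b encoding; this is an illustration, not a spec quote):

```python
# Rough sanity check: does 4K/60Hz/4:4:4 (8-bit) fit in HDMI 2.0?
# Assumes the standard CEA 4K60 timing (4400x2250 total) and 8b/10b encoding.

def required_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Pixel-data rate in Gbit/s, including blanking."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

needed = required_gbps(4400, 2250, 60, 24)   # ~14.26 Gbit/s
hdmi20_payload = 18.0 * 8 / 10               # 18 Gbit/s link, ~14.4 Gbit/s payload

print(f"4K60 4:4:4 needs  ~{needed:.2f} Gbit/s")
print(f"HDMI 2.0 payload  ~{hdmi20_payload:.2f} Gbit/s")
print("fits" if needed <= hdmi20_payload else "does not fit")
```

It fits, but only just (~14.26 vs. ~14.4 Gbit/s), which at least makes it plausible that any extra overhead pushes it over the edge.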
Thanks for clarifying that. Sounds like most 4K TVs and all 4K capable videocards aren't future-proof then for 4K HTPC. Hopefully by the time Pascal launches, we will have DisplayPort 1.3 and next gen 4K hardware HEVC capabilities top-to-bottom.
This specific forum on Anandtech is Video Cards and Graphics. People have been asking for a Display forum for quite a while. This forum has its niche, just like others on the net. I'd recommend looking into AVSForum.com for broader display discussions. The HardOCP display forum is pretty good as well. Head-fi.org is great for audiophiles. But I'd understand if you want to stick with the Anandtech community.
:thumbsup: Good thing they are now adding a Display section. It never made sense to me to have so much focus on videocards but so little focus on monitors when a lot of gamers entering PC gaming (younger generations) need a new monitor too. Even the traditional review sites place too little emphasis on monitor reviews vs. videocard reviews. When we use our PC, we interact with a monitor in both 2D and 3D work, and it outlasts any GPU. A monitor is imo the more important purchase, since a $650 videocard with a crappy $200 monitor is still a crappy gaming experience.
Look again, you'll find plenty of games that are playable. Check HardOCP, TPU, AT, TechReport, etc... I know, they are American sites and they're run by typical Americans and blah blah blah, spare me the drama. Why are we only including games within the last 8 months? Who made up that random timeline?
I am not going to waste my time. 4K is not playable on a 970/980/290X, or therefore a 390/390X, at good settings and FPS, and all 4 sites above confirm it. Go read more reviews and come back with an objective response.
Face it, people are using the HDMI 2.0 ports on the Maxwell cards. Whether you think they should be or not (like that opinion matters to them at all), they are. And to say it is all just a gimmick is laughable at best, trolling at worst.
Way to take my posts out of context. Yes, HDMI 2.0 is useful but to imply that R9 300 series is DOA because it doesn't have HDMI 2.0 is stretching things. You keep saying how HDMI 2.0 is useful for gaming but the reality is for 4K gaming, none of these cards in question are sufficient.
And before my phone dies, I didn't upgrade out of necessity, I upgraded because I had to sell my 770 before the resale value tanked on it. We went over this before... At that time, I needed a new card. And the 970s were both faster and cheaper than the 290, so guess which one I went with! :awe:
1. The fact that you bought a 770 during the R9 280X generation shows you don't care about price/performance at all. The 280X was $299 vs. $380 for the 770 2GB (obsolete) and $450 for the 770 4GB. Also, the fact that you waited that long to buy a 770 and skipped the 1GHz 7970 cards that were $300 8 months before the 770 even released shows you had no interest in any AMD card.
2. The 970 was only cheaper than a 290 for 1 month at best. You aren't discussing how an after-market 290 was $350-375 5 months before the 970 launched, or how the 290 was $400 10 months before the 970 launched. Again, there was plenty of time to sell a 770 and buy a 290, but you didn't do that either. Instead you waited 10 months to get a card 5% faster. Again, it shows you had no interest in buying an AMD card. You aren't fooling anyone trying to spin things as if your 970 purchase was objective. This entire forum already knows you only buy NV cards, and the fact that you owned GeForce 5 and 7 series cards and Fermi puts you in a very special group of GPU owners, which I won't name as I'll get an infraction.
Just look at 1440p displays. Even in 2015 almost none of them support 1440p over HDMI, which is an HDMI 1.3/1.4 feature. Most 1440P monitors are still limited to 2005/2006 HDMI refresh rates and resolutions.
That's true, but if we are talking PC monitors and not TVs, as long as they have DisplayPort 1.2, that's enough to drive 1440P-4K. I have a BL3200PT and I don't care whether it has HDMI 2.0 since it has DP1.2. HDMI 2.0 in general is inferior since it can't do FreeSync/GSync, which means it's headed for legacy status on PC monitors once adaptive sync gains more traction, unless manufacturers release HDMI 2.0 monitors that can do adaptive sync. The real problem is that 4K TVs don't have DP connectors. That's a way bigger deal than a PC monitor lacking an HDMI 2.0 connector. We can use DP for a PC monitor, but if one gets a 4K TV, they are stuck using HDMI 2.0.
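For anyone curious about the actual numbers behind that, here's a rough comparison (my own figures, using the usual reduced-blanking / CEA timings and 8b/10b encoding; treat it as an illustration only):

```python
# Rough link-payload vs. mode-requirement comparison (8-bit 4:4:4).
# Timings: 1440p60 uses CVT-RB (2720x1481 total), 4K60 uses the CEA timing
# (4400x2250 total). Payloads assume 8b/10b encoding.

def hdmi_payload_gbps(tmds_mhz):
    # 3 TMDS data channels, 10 bits per TMDS clock, 8b/10b -> 80% payload
    return tmds_mhz * 3 * 10 * 0.8 / 1000

links_gbps = {
    "HDMI 1.0-1.2 (165 MHz)": hdmi_payload_gbps(165),  # ~3.96
    "HDMI 1.3/1.4 (340 MHz)": hdmi_payload_gbps(340),  # ~8.16
    "HDMI 2.0 (600 MHz)":     hdmi_payload_gbps(600),  # ~14.4
    "DP 1.2 HBR2 x4 lanes":   4 * 5.4 * 0.8,           # ~17.28
}

modes_gbps = {
    "2560x1440 @ 60 Hz": 2720 * 1481 * 60 * 24 / 1e9,  # ~5.8
    "3840x2160 @ 60 Hz": 4400 * 2250 * 60 * 24 / 1e9,  # ~14.3
}

for mode, need in modes_gbps.items():
    print(mode)
    for link, have in links_gbps.items():
        print(f"  {link:>24}: {'OK' if have >= need else 'too slow'}")
```

That lines up with the posts above: 1440p60 over HDMI needs the 1.3/1.4 link, 4K60 needs HDMI 2.0, and DP 1.2 covers both with room to spare.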
Sapphire's R9 290X Tri-X was clocked at 1040 MHz core and 1300 MHz RAM. It draws a maximum of 316W in FurMark and a peak of 253W during normal gaming. I think we'll be seeing similar figures here, since it's the same chip with maybe a new stepping if we're lucky. But it's not going to go to 375W.
Friendly advice: stop using FurMark as a measurement of a videocard's real-world power usage. Just like no one would run LinX + FurMark simultaneously on a PC to figure out the maximum power draw when choosing a power supply, we do not use FurMark to extrapolate a GPU's real-world power usage in games. As has already been explained to you many times, an identical GPU (GM204) can draw 350W of power in FurMark but only 200-210W in games if the PCB is designed for it (dual 8-pin, 8 power phases, etc.). One could easily design a 980 that is capable of drawing 500-600W of power in FurMark. You need to understand how FurMark works, both on the ASIC and on the PCB/power circuitry. It's a power virus, not a game.
Ugh, straight rebrands. How could anyone think this was a good idea?
It's not a good idea for us consumers this generation but for AMD's long-term future and hopefully a solid 14nm HBM2 line-up, this is probably a smart decision. Also, don't forget that AMD's GPUs tend to drop in price quicker than NV's. If R9 390/390X launch at $329/389, in about 5-6 months it should be possible to find them for $260/300 with rebates.
Agreed. Given AMD's R&D limits, they would have been better off putting the money into two smaller chips instead of Fiji.
Would have been true if this was the beginning of an all new 28nm node, if we were at the bidding phase for new laptop design wins and if AMD had a clear strategy of getting a lot more notebook OEM design wins. None of these apply. We are at the end of the 28nm node gen, all major laptop GPU design wins have been won by NV already. AMD did get GCN into Macs so that was a major marketing win. Fiji can be reused on a 14nm shrink since it was designed to use HBM. It's probably cost prohibitive to redesign Tonga, Bonaire, Hawaii, Tahiti to 14nm HBM2. At that point AMD will need an entirely new stack of chips -- and that's hopefully where they invested the money after choosing to refresh R9 300 series like that.
Also, your argument is inconsistent. You criticize AMD for not redesigning the entire stack top-to-bottom this round, but then you say NV can readily drop the prices of Maxwell. That's exactly why it would have been suicidal for Lisa Su to spend $250M-1B redesigning 28nm low- and mid-range GPUs: NV could have just dropped the 960 to $149, the 970 to $249-269 and the 980 to $399. Do you see now why what you are saying is completely inconsistent with the other point you keep making, that NV's high profit margins could easily allow them to make AMD's new line-up obsolete? Don't you think Lisa Su knows this very point, which is why she probably figured it's better to sell the aging 200 series at small profits than take a massive $500M investment risk on redesigning the entire stack, and then not even win because NV could drop prices?
Also, it seems you keep using market share and sales numbers to drive your point. I mean if we look at the performance of 750/750Ti/960, it's nothing special. So if someone was a budget gamer and had a 300W PSU, would you recommend them a 750/750Ti or a $20 more expensive 30-45% faster R9 370? You seem to have not addressed this point.
Maybe a lot of less knowledgeable gamers have no clue how Pitcairn pounds the 750Ti into the ground in games? The R9 300 series helps AMD reset this perception.
http://www.computerbase.de/2015-05/...ergleich/2/#abschnitt_leistungsratings_spiele
The negativity on the rebrands is understandable but declaring Fiji a failure before it ever comes out is silly at best.
There was hardly any negativity on the 560 and 770. The 770 2-4GB was $80-$150 overpriced compared to the 280X, but it was cheered by professional reviewers. Also, many gamers don't remember (or are ignorant of this), but the 680 sold for $320-340 when the GTX 770 came out for $379. That meant the 680's replacement had worse price/performance than the fire-sale 670/680 cards. I don't remember this forum bashing a 560 or 770 though. Also, it's quite ironic how the price/performance of the 390/390X is being attacked now by the same people who ignored the price/performance of 290/290X cards, because to them after-market 290/290X cards didn't exist for the last 1.5 years.
Fury is a new design, not a rebrand. But it will sell in minuscule volumes compared to the other chips, especially the 370 and down. Chips (250/270) that desperately needed to be moved away from GCN 1.0.
It's longer-term thinking. Regain flagship GPU performance, keep investing in future GPU tech, and the Fury design can be used as a foundation for a next-gen mid-range 14nm HBM2 product. People actually expect AMD to release R9 400 and 500 and 600 series. I know this concept is hard to grasp for you since you've been waiting for decades now for AMD to declare bankruptcy. That's why it's hard for you to grasp why AMD's GPU division keeps investing in the tech of tomorrow.
How confident are you with this prediction?
You have to ask? According to him AMD won't improve perf/watt this generation at all. He is in for a shock when Fiji is > 30% faster than Hawaii and it won't be just from HBM1.
Please explain this to me. I always thought that if a card is rated at 275 watts, it always consumes 275 watts.
Power usage is also impacted by temperature and by dynamic voltage/power load balancing. With lower temps, power usage falls. With fast-switching dynamic voltage/load balancing, a reference 980 can use 165W of power in games, but in compute that can easily go over 200W. Because compute is a constant workload, power load balancing doesn't help there, and dynamic voltage doesn't help since the GPU is pegged at 100% usage the entire time. Games do not do that, which is why high-frequency dynamic voltage switching and power load balancing let GPUs use far less power in games than in more strenuous compute workloads.
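To illustrate the difference (this is a toy model with made-up numbers, not measurements of any real card): dynamic power scales roughly with activity x V^2 x f, so a bursty game workload where DVFS can back off averages far less than a pegged compute load on the same silicon.

```python
# Toy model: bursty gaming load vs. pegged compute/FurMark-style load.
# Numbers are made up purely for illustration; the point is the shape of the
# behaviour (activity * V^2 * f), not the absolute wattage.

def dynamic_power_w(activity, voltage_v, clock_ghz, k=130.0):
    """k lumps effective switched capacitance into one fudge factor."""
    return k * activity * voltage_v ** 2 * clock_ghz

# Gaming: utilisation bounces around and DVFS drops volts/clocks when it can
game_samples = [
    (0.95, 1.20, 1.2),
    (0.60, 1.10, 1.1),
    (0.35, 1.00, 0.9),
    (0.90, 1.20, 1.2),
    (0.50, 1.05, 1.0),
]
game_avg = sum(dynamic_power_w(*s) for s in game_samples) / len(game_samples)

# Compute/power virus: pegged at 100% activity, max volts/clocks, all the time
compute = dynamic_power_w(1.00, 1.20, 1.2)

print(f"average 'gaming' power : {game_avg:.0f} W")  # ~126 W
print(f"constant compute power : {compute:.0f} W")   # ~225 W
```

Same card, same power limit, very different averages depending on whether the load ever lets DVFS back off.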
For the sake of argument, let's just say that AMD releases the 390X and its performance is pretty much equal to the Titan X but it's significantly cheaper. Do you think Nvidia might change the price of the Titan X at all? I'm keeping an open mind, though; I'd like a Titan X and would wait a few weeks to see if the price will budge at all.
No. Even after R9 290X came out at $549 and even after 780 Ghz editions beat the OG Titan, NV's $1K price remained. Even if Fiji is magically 15% faster than the Titan X, the TX will still cost $1K. Recall that NV sold slower, hotter, louder Titan Z for $3K when R9 295X2 cost $1.5K and beat it in nearly every key metric.
There is really no point to buying a TX anymore unless you are running 3 4K monitors in surround.
Don't say that! You might hurt someone's e-peen.
Site blocked at work, is that:
8% over GTX 980 power consumption for that 20% perf
or
8% over R9 290X power consumption for that 20% perf?
I'm assuming the former, which would be a huge boost to AMD. Reviews would have to point that out.
Chances are Simple Jack was either a made-up BS slide, or an early version of Fury X / Fury tested at lower clocks early in the development cycle.
Well that is sounding awesome for Fiji.
I haven't lost hope on it though.
This is good to hear
It's one of the reasons the 300 series and Fury are marketed differently. You have 'legacy' GDDR5 products and all-new HBM1 products. With rumoured 4000 shaders, 128 ROPs and 512GB/sec of bandwidth, Fury should be substantially faster than the 390X. In fact, if you look at AMD's new-gen flagship cards built from the ground up, besides HD5870->6970, AMD's next-gen flagship has always been at least 30% faster. That means Fury X should be at least 30% faster than the 290X.
Even if Fury Pro is 15% faster than a 980 and Fury X only matches a 980Ti, it's still good to see 2 strong players in the GPU market this close again. Remember that NV's flagships tended to outperform AMD's by 10-18% over the years. If Fury is identical in performance to a 980Ti, it would mean AMD has completely closed the flagship gap NV enjoyed with the 280/285/480/580, etc. Stronger competition is good for the consumer, since we could see more game bundles, rebates, price drops and lower-priced 2nd-tier SKUs as NV/AMD fight to win sales. Think about it: less than a month ago we had a $550 980, and today we have a $650 980Ti that's way faster. Add Fury to the mix and the flagship $500+ landscape will change dramatically, putting the 970/290X/390X/980 strictly into the mid-range category.
At the same time, we have to be realistic with our expectations. For example, after-market 980Ti cards, when max overclocked, are 24-35% faster than a reference 980Ti. As a result, no one should expect Fury to "blow away" GM200. It's not going to happen, simply because GM200 overclocks and subsequently scales very well with increased clock speed.
About 24 hours for the presentation and no solid leaks. :/
If you've got time to burn, there is a lot of E3 gaming content.
Where to Watch E3 2015 Press Conferences
http://www.playstationing.com/ps4/where-to-watch-e3-2015-press-conferences/2002
I am actually more excited about the new games coming out than Fiji -- Fallout 4, Doom, Forza 6, Rise of the Tomb Raider, Halo 5 Guardians, Xbox One gaining BC with the 360, the FF7 remake, Dishonored 2, etc.