It was a marketing decision and IMO a good one. The baggage accumulated from previous gens didn't overshadow Fury. Fury was presented in such a way that it would be judged on its own merits.
For me, the 6 big issues for Fury were (in no particular order):
1) 1080p DX11 driver overhead, although I'd question why anyone would buy a $650 GPU to play modern AAA console ports at 1080p when a $280 R9 390 (or competing product) is perfectly sufficient in that space. For 1080p gaming, it makes more sense to buy a $280-300 R9 390 now and another $280-300 card in 2 years than to buy a Fury X. Still, the very fact that you pay $650 for a card that only shines at 1440p and above ensures that 1080p gamers will buy the competitor every time. In other words, Fury X was too expensive for 1080p and not fast enough for its price at that resolution. While I would imagine that most 1080p gamers weren't even looking at this class of card, those that were would not buy the Fury X. For 2016 and beyond, AMD needs to keep improving its DX11 drivers while still having enough resources to allocate programmers to DX12 drivers. Big challenge.
2) The second major issue is lack of overclocking headroom.
Stock vs. stock, Fury X is very competitive at 1440p and is actually faster at 4K. But when the competing card has 20-25% overclocking headroom, that's another tier of GPU performance. Unfortunately for AMD, there is a double standard in the GPU industry: if the competitor overclocks much better, it matters a great deal, while if AMD overclocks well, it's largely ignored. But that's the reality AMD has to deal with, and for the R9 400 series they need to push clocks as high as possible OR allow AIBs to offer much higher factory pre-overclocked cards. This is one area where 14nm LPP could show great benefits over TSMC's 28nm process, on which Fury X was manufactured.
3) Lack of HDMI 2.0 support. For a large part of Fury X's intended target market -> 4K, this was a gross oversight. It's hard to measure how much this mattered, but I presume every consumer with a 4K HDTV went with the competition, since DP-to-HDMI 2.0 adapters were unavailable for most of 2015. Even if Fury X had been 10-15% faster, the lack of HDMI 2.0 connectivity would have ensured that 100% of sales for 4K HDTVs went to the competitor. This area will be completely addressed with the R9 400 series thanks to DP 1.3 and HDMI 2.0a, which will support HDR.
4) Launching after your competitor and yet offering inferior price/performance. This is self-explanatory, and it doesn't look good for brand image. If you launch later, it's better to bring something new to the table, whether that's superior price/performance OR more performance outright. A brand like AMD cannot afford to merely match the competition when launching late. This is something AMD needs to pay close attention to, especially if they plan to launch the R9 400 series only in 2H 2016 and end up launching behind their competitor again.
5) Lack of after-market air coolers. Had Fury X launched with air-cooled variants, the terrible QA/QC of the early AIO CLC pump samples wouldn't have smeared the entire Fury X line-up, since consumers would have had alternatives to the AIO CLC. Also, since not everyone feels safe using an AIO CLC in their rig, providing more options is important. Furthermore, the mandatory inclusion of the AIO CLC and the lack of non-reference PCB designs imposed two crucial limitations:
(i) It made SKU differentiation impossible for AMD's AIBs. This was a devastating strategy, since the competitor's AIBs could do whatever they wanted --> Design a cheaper PCB and lower the price? Check. Design a better PCB and components for higher overclocks? Check. Lots of options that AMD didn't give Fury X's AIBs.
(ii) An AIO CLC makes 0dBA operation below 60C impossible, something many flagship cards offer. As a result, AMD forced consumers to choose between a card that is quieter at load but louder in 2D/light 3D applications. Again, offering after-market air-cooled designs would have addressed both types of consumers.
For the R9 400 series and the Fury X successor, AMD needs to be more flexible: offer a reference AIO CLC solution if they so desire, but also allow after-market air-cooled designs, and as close as possible to the reference card's launch date.
6) 4GB of HBM1 vs. 6GB of GDDR5. It looks really bad when HBM1 was hyped as a next-gen memory technology breakthrough, yet you not only get worse performance than the competition, but the competitor also offers 50% more VRAM. Considering the competitor's customers are more loyal to begin with, delivering less in this area ensured that even if Fury X had matched the competition in all key metrics, it would have ultimately lost on the perception of VRAM capacity alone.
With HBM2 allowing capacities up to 32GB of VRAM, this issue should finally be addressed. For mainstream chips, they could use 8GB of GDDR5X.
-----------