Official Fury X specs

Status
Not open for further replies.

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I'll ask the question here I did in the benchmark thread: as 4k slowly becomes more of a real thing people are trying to use, will texture sizes increase? And if they do, will 4GB no longer suffice? What will the longevity of 4GB really be?

@4K? I think 4GB is just enough right now for 85% of games at max settings, but by the end of the year? I think it will get worse.
 
Last edited:

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
As games move to DX12, texture memory requirements will go down as texture streaming becomes commonplace (sparse/tiled texturing), either obviating the need for massive frame buffers or further ratcheting quality standards beyond what is feasible now.

Basically, stuff that stresses 4G cards now, won't with dx12, and the stuff that does will be far beyond what is considered impressive now.
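To make the sparse/tiled-texturing idea concrete, here is a toy sketch (not the actual DX12 tiled-resources API, just a model of it): only the tiles a frame actually samples need to be resident in VRAM, with an LRU pool evicting cold tiles. The tile size, budget, and tile counts are made-up illustrative numbers.

```python
from collections import OrderedDict

TILE_BYTES = 64 * 1024  # a common tile granularity is 64 KiB

class TilePool:
    """Toy LRU pool: only tiles actually sampled stay resident in VRAM."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # tile_id -> True, in LRU order

    def touch(self, tile_id):
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)  # mark as recently used
            return
        while len(self.resident) * TILE_BYTES >= self.budget:
            self.resident.popitem(last=False)   # evict least recently used
        self.resident[tile_id] = True

# A huge texture might be hundreds of MiB fully loaded, but if a frame
# only samples 500 of its tiles, residency is about 31 MiB:
pool = TilePool(budget_bytes=512 * 1024 * 1024)
for t in range(500):
    pool.touch(("huge_texture", t))
print(len(pool.resident) * TILE_BYTES // (1024 * 1024))  # -> 31
```

The point of the sketch: VRAM cost tracks what is visible per frame, not the total size of every texture in the scene.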
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,063
437
126
I'll ask the question here I did in the benchmark thread: as 4k slowly becomes more of a real thing people are trying to use, will texture sizes increase? And if they do, will 4GB no longer suffice? What will the longevity of 4GB really be?

That is hard to say. If the total texture size of everything viewed in a single frame exceeds 4GB, then yes, I believe the longevity will be pretty small. But if a texture isn't needed for this particular frame, and you have fast enough memory and disks, pulling that data back in won't be too bad, as long as enough memory remains free that you don't have to do the slower load-from-disk operation too many times.

Until we see more games that really use lots of video memory, we won't know the answer. That said, it is a "chicken and egg" problem in the first place: game developers work towards the hardware they have available. If a feature they could implement needs 8GB of video RAM to perform, they are likely to push it to the back of the priority list, since only a few graphics cards support that much memory. Really, we are already limited by what the consoles can do. Most development studios make sure their games are just as good on console as on PC (since they typically make more money from the console side), so the majority of development time goes into making the game great on consoles, with the PC release just exposing extra resolution (possibly higher-resolution texture packs, but you won't see them spend a lot of extra time on it).
 

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
It's great that AMD is covering everything on the card, GPU, HBM, as well as all the power circuitry. Guess that's why they said this card is for overclockers. Looking forward to custom cards.

Here's the cooler.

 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Copper pipe running through that channel must be for the VRMs.




This will bring much of the experience you get from closed loop GPU cooling. Acoustically, it will be quieter than anything else out there. The hybrid AIOs are still using a fan to cool the power delivery area.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
Get out of here with your facts man

Sorry, can't. We all gotta start dispelling all the FUD that was being thrown around for the past few months.


They weren't kidding when they claimed "built with the best materials" in the promotional video posted today. They've clearly gone out of their way to shed the image they created with the 200 series.

I like this new AMD. Hope they can do the same with the CPU division next year.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
I'll ask the question here I did in the benchmark thread: as 4k slowly becomes more of a real thing people are trying to use, will texture sizes increase? And if they do, will 4GB no longer suffice? What will the longevity of 4GB really be?

Consoles drive the gaming market and they are capable of only doing 1080p 30fps for the most part. I don't see game developers investing resources in huge textures just for an outlier resolution on PC.

VRAM demands always get higher as time goes by. With my 780tis they eventually hit a game that they could no longer run for me at 2560x1600 because of VRAM limits - Shadow of Mordor killed them on ultra settings. Eventually 4GB cards will find a similar situation.

Right now I haven't seen any benchmarks that show a 4GB card running out of VRAM. No sudden drops in framerate to a couple fps. You can't even run the really demanding games on their highest settings at 4K and get a halfway decent framerate yet. Dual Titan X in GTA 5 or Witcher 3 and you are getting 30-40fps on average with max settings, minimums in the teens or 20s. 4K works fine in most games, but the really tough ones to run you're still having to turn settings down no matter what hardware you have. It's GPU power that's lacking for that resolution right now.

For myself 4GB doesn't bother me. I'll be jumping on whatever is fastest as soon as it arrives just like I do now and won't have the cards for years and find myself running out of memory.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
That is probably where the components are packaged, or maybe who makes the package; we don't know where the GPU die is made.
 

crashtech

Lifer
Jan 4, 2013
10,554
2,138
146
Unfortunately for me, drivers are the one thing AMD is really not that good at doing. Don't get me wrong, they have most definitely gotten better than they were a few years ago (then again, a few years ago, you couldn't have been doing a whole lot worse). When AMD took over ATI, this is the one area where I feel they really screwed up. It always seems to be 6 months or more of lag before AMD has drivers that actually work with their hardware...
I'm not sure how long you've been in the PC biz, but ATI was never really known for their great driver support. I actually think things have become better since AMD took over. Small increments to be sure, but still...

That is hard to say. If the total texture size of everything viewed in a single frame exceeds 4GB, then yes, I believe the longevity will be pretty small. But if a texture isn't needed for this particular frame, and you have fast enough memory and disks, pulling that data back in won't be too bad, as long as enough memory remains free that you don't have to do the slower load-from-disk operation too many times.
I did the arithmetic once on how much RAM it takes to hold one 4K frame at 32-bit color, and it's something like 33MB. So that's a lot of textures hogging up the rest! Seems like there might be some savings to be found if devs are forced to do it.
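The arithmetic above is easy to check. A quick calc (decimal MB, which is how the ~33MB figure comes out; in binary MiB it is about 31.6):

```python
def frame_bytes(width, height, bytes_per_pixel):
    """Raw size of one uncompressed frame buffer."""
    return width * height * bytes_per_pixel

# 4K (3840x2160) at 32-bit (4 bytes) per pixel:
print(round(frame_bytes(3840, 2160, 4) / 1e6, 1))  # -> 33.2
```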
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Consoles drive the gaming market and they are capable of only doing 1080p 30fps for the most part. I don't see game developers investing resources in huge textures just for an outlier resolution on PC.

VRAM demands always get higher as time goes by. With my 780tis they eventually hit a game that they could no longer run for me at 2560x1600 because of VRAM limits - Shadow of Mordor killed them on ultra settings. Eventually 4GB cards will find a similar situation.

Right now I haven't seen any benchmarks that show a 4GB card running out of VRAM. No sudden drops in framerate to a couple fps. You can't even run the really demanding games on their highest settings at 4K and get a halfway decent framerate yet. Dual Titan X in GTA 5 or Witcher 3 and you are getting 30-40fps on average with max settings, minimums in the teens or 20s. 4K works fine in most games, but the really tough ones to run you're still having to turn settings down no matter what hardware you have. It's GPU power that's lacking for that resolution right now.

For myself 4GB doesn't bother me. I'll be jumping on whatever is fastest as soon as it arrives just like I do now and won't have the cards for years and find myself running out of memory.

I agree that 4GB might be fine for now, but I don't find myself in the bucket of buying cards every 2 years anymore. My 670 dates back a bit. The second one happened later on...and I'm done with SLI. So it's top tier cards for 3 years or so for me. In that period of time, we went from the 670 @ 2GB of RAM being enough for 1440p (I saw this as the general opinion on forums back then) to now 4GB being recommended for 1440p. I'd rather overshoot the recommendation in case I suddenly move to 1600p down the road or something.

That is hard to say. If the total texture size of everything viewed in a single frame exceeds 4GB, then yes, I believe the longevity will be pretty small. But if a texture isn't needed for this particular frame, and you have fast enough memory and disks, pulling that data back in won't be too bad, as long as enough memory remains free that you don't have to do the slower load-from-disk operation too many times.

Until we see more games that really use lots of video memory, we won't know the answer. That said, it is a "chicken and egg" problem in the first place: game developers work towards the hardware they have available. If a feature they could implement needs 8GB of video RAM to perform, they are likely to push it to the back of the priority list, since only a few graphics cards support that much memory. Really, we are already limited by what the consoles can do. Most development studios make sure their games are just as good on console as on PC (since they typically make more money from the console side), so the majority of development time goes into making the game great on consoles, with the PC release just exposing extra resolution (possibly higher-resolution texture packs, but you won't see them spend a lot of extra time on it).

While this is true, you have games that will also be "next gen" and try to push more towards the limits. I see DX12 only accelerating this trend, with the CPU being less of a limit allowing people to focus on top end cards. The trend has been in past to increase VRAM - and I see the 8GB 390/x being a continuation of this trend. The Fury series just screams "compromise" to me.

Regardless, touching system RAM for the next frame simply isn't doable - system RAM has a much *much* higher latency to it, and slower speed across the PCIe bus, along with possible memory contention. Touching the system RAM (or heaven forbid, hitting a page fault and going to the disk) is an order of magnitude slower than just hitting VRAM...last I knew. There's a reason CPUs are shoving wads of L3/L4 cache on these days too.
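A rough back-of-envelope on why spilling to system RAM hurts, using ballpark bandwidths (roughly 512 GB/s for Fury X's HBM per the official specs, roughly 16 GB/s theoretical for a PCIe 3.0 x16 link; real effective rates are lower, and this ignores latency entirely):

```python
def transfer_ms(size_gb, bandwidth_gbps):
    """Time to move size_gb at a given bandwidth, in milliseconds."""
    return size_gb / bandwidth_gbps * 1000

vram = transfer_ms(1, 512)  # stream 1 GB out of VRAM: ~2 ms
pcie = transfer_ms(1, 16)   # same 1 GB over PCIe: ~62 ms, several 60fps frames
print(round(vram, 1), round(pcie, 1))  # -> 2.0 62.5
```

Even with generous assumptions, a 1 GB spill over the bus costs several whole frames of budget, which is why it shows up as stutter rather than a gentle slowdown.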

I'm not sure how long you've been in the PC biz, but ATI was never really known for their great driver support. I actually think things have become better since AMD took over. Small increments to be sure, but still...


I did the arithmetic once on how much RAM it takes to hold one 4K frame at 32-bit color, and it's something like 33MB. So that's a lot of textures hogging up the rest! Seems like there might be some savings to be found if devs are forced to do it.

It isn't the 4K frame itself; the frame is actually 24-bit color (the extra 8 bits being transparency), and thus about 10MB at 2560x1440:
24 bits * 2560 * 1440 / 8 bits / 1024 / 1024 ≈ 10.5MB

Either way, the textures being loaded are using up the memory, not the frame itself (there's a flip buffer for that?) Also, if doing super sampling, the math changes some.
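On the supersampling point: since SSAA renders at a multiple of each axis, frame buffer memory grows with the square of the factor. A quick illustration (uncompressed, single buffer, ignoring the swap chain):

```python
def frame_mib(width, height, bits_per_pixel, ss_factor=1):
    # ss_factor scales each axis, so memory grows with its square
    w, h = width * ss_factor, height * ss_factor
    return w * h * bits_per_pixel / 8 / 1024 / 1024

print(round(frame_mib(2560, 1440, 24), 1))               # -> 10.5
print(round(frame_mib(2560, 1440, 24, ss_factor=2), 1))  # -> 42.2
```

So even 2x SSAA only pushes a 1440p frame to ~42 MiB; the frame buffers themselves remain a small slice of a 4GB card.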
 

Paratus

Lifer
Jun 4, 2004
16,846
13,777
146
One quick comment on the 4GB.

From the comments made by AMD they said they haven't ever bothered to optimize memory storage since they've basically been increasing it by 1/3 - 2X since the 4870.

The thing is both AMD and Intel have sophisticated algorithms to optimize the L1-3 caches on their CPUs.

Assuming AMD isn't BSing there's probably a lot they can do to efficiently utilize 4GB of space beyond just the color compression on Tonga.

To really tell if AMD has done its work AT needs to test the following:
  • VRAM usage at high settings
  • FPS at high settings (Min, Max, Avg, and FCAT)

They need to compare against:

  • 6GB 980Ti
  • 8GB 390X
  • 4GB 290X
  • 3.5GB / .5GB 970

If VRAM usage for the Fury shows 2/3 the usage of the 6GB 980Ti they'll have done their job optimizing usage.

If however Fury keeps maxing out at 4GB we'll need to compare FPS against the 4 GB cards to see if AMD made any improvement at all. If the 290X and 970 tank while Fury keeps chugging away they'll have done their job.

The benefit of using driver level optimization means that potentially every GCN card could get an effective memory size boost.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
One quick comment on the 4GB.

From the comments made by AMD they said they haven't ever bothered to optimize memory storage since they've basically been increasing it by 1/3 - 2X since the 4870.

The thing is both AMD and Intel have sophisticated algorithms to optimize the L1-3 caches on their CPUs.

Assuming AMD isn't BSing there's probably a lot they can do to efficiently utilize 4GB of space beyond just the color compression on Tonga.

To really tell if AMD has done its work AT needs to test the following:
  • VRAM usage at high settings
  • FPS at high settings (Min, Max, Avg, and FCAT)

They need to compare against:

  • 6GB 980Ti
  • 8GB 390X
  • 4GB 290X
  • 3.5GB / .5GB 970

If VRAM usage for the Fury shows 2/3 the usage of the 6GB 980Ti they'll have done their job optimizing usage.

If however Fury keeps maxing out at 4GB we'll need to compare FPS against the 4 GB cards to see if AMD made any improvement at all. If the 290X and 970 tank while Fury keeps chugging away they'll have done their job.

The benefit of using driver level optimization means that potentially every GCN card could get an effective memory size boost.
Great post. Level headed analysis.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,063
437
126
I did the arithmetic once on how much RAM it takes to hold one 4K frame at 32-bit color, and it's something like 33MB. So that's a lot of textures hogging up the rest! Seems like there might be some savings to be found if devs are forced to do it.

I don't think you understand how it actually works. The 3D object/model, essentially as a whole, gets its texture loaded, and then only the part that actually makes it into view is drawn on screen. But the card still loads the texture for the whole object/model, and there might be thousands or even millions of objects and models in a scene, each with its own 4K-resolution texture. Even though most of those texels never make it into the final 4K frame the player sees, all the textures still need to be loaded so that an algorithm can run against them and draw only the correct pixels for the camera's angle, distance, and partially obstructed view of each object. So while the output frame itself is only, say, 33MB, producing it may have required loading 10,000 4K-resolution textures, which vary in size with the object/model they cover (a texture on an object with a surface of only 1,000 pixels is roughly 1000 * <color bit depth> in size, but a model with a surface of 100,000,000 pixels will carry the full 4096 * 4096 * <color bit depth>).

There are some advanced techniques that try to avoid loading whole textures, such as loading only parts of a texture map into memory, for example when only one pixel of an object/model is visible because other objects/models obscure it (a second model placed on top of another to make a more complex shape, or one placed in front of it relative to the camera, etc.). These are the only things that will really reduce the texture load. Otherwise, if the object is on screen, its texture needs to be loaded: by the time the linear transform from the 3D environment to the 2D image on the display runs, it is too late to tell the graphics card to load just that part of the texture, because it has already been loaded...
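For a sense of scale on why whole-texture residency adds up: a single uncompressed 4096x4096 RGBA texture with a full mip chain is around 85 MiB, so a few dozen of them fill 4GB. (Real games use block-compressed formats like BC1/BC7 that cut this by roughly 4-8x; the numbers here are the uncompressed worst case.)

```python
def texture_bytes(size, bytes_per_texel=4, mips=True):
    """Uncompressed square RGBA texture; a full mip chain adds ~1/3 on top."""
    base = size * size * bytes_per_texel
    return base * 4 // 3 if mips else base

# One uncompressed 4096x4096 RGBA texture with mips, in MiB:
print(round(texture_bytes(4096) / (1024 * 1024), 1))  # -> 85.3
```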
 
Last edited:

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
Reads: Content under embargo until ?AM June 24, 2015 ???

So June 24 it is for Fury/X, and the 18th for the 300 series. Reviews on Fury should be interesting. Insane numbers if it ends up like that on reviews, must justify them having a 5k monitor there at the presentation.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,063
437
126
4:53 of the Computex presentation video: https://youtu.be/QQ92qWdVLsM

Well, you need to watch the Computex presentation at 4:53 and actually listen to what is being said: "it allows" and "in the future" are far different from "we built" and "we are releasing it in 2 weeks". They are simply talking about the technology in general and its potential. It is no different from a rocket engine company saying they have a new engine design that produces more thrust and uses less fuel than any other design in existence, while the working engine they are actually releasing for sale in two weeks is a 1/10,000-scale version of the overall design (because the public doesn't need to shoot a 10-ton object to Jupiter).
 
Last edited: