This URL was posted in response to someone asking Brent how his review comment about 4GB falling short in the 1440p benchmark makes sense, when the FuryX performs relatively better in the 4K tests of the same game. Brent's response was that the game ran so slowly at the highest-quality 4K settings that he had to dial down the image quality, which presumably reduced the VRAM usage and allowed the FuryX to close the gap.
If you look at the results across the many review sites, the FuryX is closest to the 980Ti at 4K, and the performance gap widens (against the FuryX) as you step down through the resolutions toward 1600x900. Something is holding back the FuryX, but it is very likely not a VRAM limitation; 4K is actually where the FuryX benchmarks best.
The biggest problem is Brent's assumption that VRAM usage numbers show the amount of VRAM *required* to run the game. That is, if a game takes up 5.2GB on a 6GB card, then that's what the game needs at those settings and that resolution. It is a completely incorrect assumption. The memory management modules in game engines work in many different ways. Some employ high and low watermarks (based on a percentage of available VRAM) to allow flexibility and some hysteresis in how data is swapped in and out of VRAM. So, for instance, the same game running on a 6GB card may have 5.2GB allocated, where perhaps 2.6GB of data has been referenced in the last 1000 frames or so, and maybe 1GB has not been referenced at all for a while. The engine is under no pressure to evict anything from VRAM (indirectly, of course, by freeing allocations or requesting new ones, since it's the driver's job to actually swap the data in or out) as long as the high watermark is not reached. A 4GB card can run the same game perfectly fine at the same asset quality, with minimal swapping to system RAM by the driver.
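To make the watermark idea concrete, here is a minimal toy sketch of that kind of residency scheme. It is purely illustrative (the class, the thresholds, and the eviction policy are my own assumptions, not any real engine's code), but it shows why a 5.2GB allocation on a 6GB card does not imply a 5.2GB requirement:

```python
class VramCache:
    """Toy sketch of watermark-based asset residency (illustrative only).

    Assets accumulate in VRAM until a high watermark is crossed; then the
    least-recently-referenced ones are evicted down to a low watermark.
    """

    def __init__(self, capacity_mb, high=0.9, low=0.7):
        self.high = high * capacity_mb   # start evicting above this
        self.low = low * capacity_mb     # evict down to this
        self.resident = {}               # asset -> (size_mb, last_used_frame)
        self.used = 0

    def touch(self, asset, size_mb, frame):
        """Reference an asset this frame, loading it if not resident."""
        if asset not in self.resident:
            self.used += size_mb
        self.resident[asset] = (size_mb, frame)
        if self.used > self.high:
            self._evict()

    def _evict(self):
        # Drop least-recently-referenced assets until below the low watermark.
        for asset, (size, _) in sorted(self.resident.items(),
                                       key=lambda kv: kv[1][1]):
            if self.used <= self.low:
                break
            del self.resident[asset]
            self.used -= size
```

With a 6GB card the high watermark sits around 5.4GB, so ~5.2GB of assets can stay resident with no eviction pressure at all; the same workload on a 4GB card simply evicts the stale assets and keeps running, as long as the actively referenced working set fits.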
Unless you have access to driver-level or engine profiling tools, it's not easy to determine how much of the allocated VRAM is actually in use. The only approach available is largely empirical: test the game across cards with different VRAM capacities at the same image quality, and find the actual VRAM cutoff by *noticing performance drops (frame rates and frame times)* between the cards and across different resolutions.
I am in no way saying that 6GB vs 4GB doesn't matter. It can, and surely does, for some applications. However, the current 4K results don't prove at all that the FuryX is limited by its VRAM in the vast majority of tested games.