This is a terrible review. Showing consumption measured in Afterburner tells you nothing about how much VRAM you actually need, because monitoring tools report how much VRAM is allocated, not how much is required. You can run two video cards with different amounts of VRAM in the same system, same game, and same settings, and still see two different VRAM figures. You have to show framerate over time to give a real indication of how much VRAM you need: when you actually run out of VRAM, the game drops to sub-5 fps and stutters while memory is refilled. Just because software says the game is using X amount of VRAM does not mean that is how much VRAM you need.
For example, my VRAM usage in Battlefield 4 differed between my 780 Ti cards and my Titan X cards; the Titan Xs used more VRAM. This had no impact on visuals or gameplay, which were identical on both, just different amounts of VRAM consumed.
The only time I've actually run out of VRAM in recent memory was trying to run Shadow of Mordor at 2560x1600 on my 3GB 780 Ti cards. The game ran out of VRAM constantly and stuttered horribly, and I had to reduce the texture settings to make it playable. On my Titan X cards I saw VRAM usage over 5GB, but if you look at reviews of 4GB GTX 980 cards, they don't stutter at 2560x1600 with maxed settings. The 980 with 4GB can even run Mordor at Ultra settings in 4K without running out of VRAM.
TLDR: Monitoring software only shows how much VRAM is being filled, not how much VRAM you actually need. To show actual VRAM requirements, you need to graph minimum framerate over time alongside VRAM usage on the same synchronized timeline, so you can see whether framerate collapses exactly when VRAM hits the card's limit.
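To make the point concrete, here's a minimal sketch of the analysis I'm describing, in Python. Everything here is hypothetical: the function name, the thresholds, and the sample data are made up for illustration; in a real test the samples would come from a frame-time logger and a VRAM poller recording on the same clock.

```python
# Sketch: correlate framerate collapses with VRAM usage on one shared
# timeline. A low-fps sample that coincides with near-full VRAM is the
# signature of actually running out of memory; low fps with plenty of
# free VRAM points to some other bottleneck.

def find_vram_stutters(samples, vram_capacity_mb, fps_floor=5.0, headroom_mb=128):
    """Return the sample times where fps collapses while VRAM is near capacity.

    samples: list of (time_s, fps, vram_used_mb) tuples on a shared timeline.
    fps_floor / headroom_mb: illustrative thresholds, not measured values.
    """
    stutters = []
    for time_s, fps, vram_used_mb in samples:
        near_full = vram_used_mb >= vram_capacity_mb - headroom_mb
        if fps < fps_floor and near_full:
            stutters.append(time_s)
    return stutters

# Hypothetical 3 GB card: high allocation alone (t=0, t=1) is fine, but the
# fps crash at t=2 and t=3 while VRAM sits at the cap is the real signal.
samples = [
    (0, 60.0, 2900),
    (1, 58.0, 2950),
    (2, 4.0, 3060),   # sub-5 fps stutter while memory is refilled
    (3, 3.5, 3070),
    (4, 55.0, 2980),
]
print(find_vram_stutters(samples, vram_capacity_mb=3072))  # → [2, 3]
```

The same log replayed against a larger capacity would report no stutters, which is exactly the 780 Ti vs. Titan X situation: identical allocation behavior can mean very different things depending on whether the framerate actually drops.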