I'm not sure why so many people don't differentiate between VRAM usage (including caching) and actual VRAM need, i.e. the amount below which performance suffers. GPUs with a lot of VRAM tend to use (cache) more of it in a game than GPUs with less VRAM. It's similar to Windows RAM management in some cases (Vista?): the more you have, the more it will use, whether it needs it or not. At least Windows reports it more intelligently: total RAM > cached > available > free, where 'available' includes the cached portion.
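If you want to see that usage-vs-budget distinction yourself on Windows, DXGI 1.4 exposes both numbers per adapter. A minimal sketch (error handling mostly stripped, and it just assumes adapter 0 is the discrete GPU):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Enumerate the first adapter and query its local (on-board) memory info.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter1;
    if (FAILED(factory->EnumAdapters1(0, &adapter1))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter1.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // CurrentUsage counts everything resident, including data an app keeps
    // around purely as a cache; Budget is what the OS will let it use before
    // it starts paging things out.
    printf("Budget:       %llu MiB\n", (unsigned long long)(info.Budget / (1024 * 1024)));
    printf("CurrentUsage: %llu MiB\n", (unsigned long long)(info.CurrentUsage / (1024 * 1024)));
    return 0;
}
```

Monitoring tools generally show something like CurrentUsage, which is why a 8 GB card can look "full" in a game that would run fine on a smaller card.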
I ran some tests because I wanted to understand texture/VRAM usage better.
Ubisoft games handle VRAM management a bit differently from GTA 5. I tested with custom low settings plus High texture quality (which requires 2 GB of VRAM in both games) on an R9 390 and a GTX 560 1 GB.
When I enter gameplay in Watch Dogs, it immediately stores in VRAM all the data it will use for the rest of the run. It starts out using 2.5 GB of VRAM and keeps that amount for the whole time I played.
I played the game on the GTX 560 1 GB at the same settings and it ran fine, without major stuttering.
GTA 5 behaves a bit differently: when I enter gameplay it loads only 1.7 GB into VRAM, then keeps loading data as I move around. VRAM usage keeps increasing until it hits a wall at around 3.5 GB, where it stays while the game evicts old data to make room for new data.
I also played GTA 5 on my GTX 560 1 GB at the same settings, without major stuttering.
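That fill-until-the-budget-then-evict pattern in GTA 5 is basically a streaming cache with an eviction policy. This is only a conceptual sketch of the idea, not how Rockstar actually implements it; the TextureCache class, sizes, and the LRU policy are all assumptions made up for illustration:

```cpp
#include <cstdint>
#include <cstdio>
#include <list>
#include <string>
#include <unordered_map>

// Hypothetical budget-bounded texture cache: stream assets in as the player
// moves, and once the VRAM budget is hit, evict the least recently used
// entries to make room for new ones.
class TextureCache {
public:
    explicit TextureCache(uint64_t budgetBytes) : budget_(budgetBytes) {}

    void request(const std::string& id, uint64_t sizeBytes) {
        auto it = index_.find(id);
        if (it != index_.end()) {                          // already resident:
            lru_.splice(lru_.begin(), lru_, it->second);   // mark as most recently used
            return;
        }
        while (used_ + sizeBytes > budget_ && !lru_.empty()) {
            used_ -= lru_.back().second;                   // evict the coldest texture
            index_.erase(lru_.back().first);
            lru_.pop_back();
        }
        lru_.emplace_front(id, sizeBytes);                 // "upload" the new texture
        index_[id] = lru_.begin();
        used_ += sizeBytes;
    }

    uint64_t usedBytes() const { return used_; }

private:
    uint64_t budget_;
    uint64_t used_ = 0;
    std::list<std::pair<std::string, uint64_t>> lru_;      // front = hottest
    std::unordered_map<std::string,
        std::list<std::pair<std::string, uint64_t>>::iterator> index_;
};

int main() {
    TextureCache cache(3500ull * 1024 * 1024);             // ~3.5 GB wall, like GTA 5 hit
    // Simulate streaming in new 100 MB textures while driving around the map.
    for (int i = 0; i < 100; ++i)
        cache.request("texture_" + std::to_string(i), 100ull * 1024 * 1024);
    // Usage climbs until the budget wall, then old entries keep getting evicted.
    std::printf("resident: %llu MiB\n",
                (unsigned long long)(cache.usedBytes() / (1024 * 1024)));
}
```

Watch Dogs, by contrast, behaves more like a one-shot preload: everything it needs goes into VRAM up front, so usage stays flat instead of climbing toward a wall.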
The way Watch Dogs stores data in VRAM is the same in other Ubisoft titles, Far Cry 4 and AC: Unity.
Running around the Far Cry 4 map for about 30 minutes, it loaded only 4 GB of data from the drive. It used ~3.1 GB of VRAM and 1.3 GB of system RAM.
On the other hand, after 30 minutes of running around the map, GTA 5 had loaded 17 GB from the drive. Average usage was 3.4 GB of VRAM and 3.5 GB of system RAM.