A lot of people seem to get the causation between computer hardware and software the wrong way around. They assume that software will require a feature and therefore the manufacturers build it. It doesn't work that way at all; it's the exact opposite. The hardware manufacturers build the hardware, and then the software developers utilise it. They have to do a bit of prediction about the near future (which is how games like Watch Dogs end up getting delayed over performance problems on the consoles), but basically they don't write games for hardware that isn't very common.
Let's look at the history a bit to say something about the future. In the 7970 vs 680 battle that took place two years ago, the 7970 had 50% more VRAM. How many games actually used it? Very, very few, maybe six in all. All the big games played well on both pieces of hardware for an obvious reason: their job is to sell copies. So where one card had better tessellation and the other more VRAM, what you effectively got was the worse performance of the pair, because as a game developer you are never going to choose one company over the other exclusively. Sure, some games performed better depending on which company helped with development: you had 3D Vision and Nvidia-specific AA in some titles, and AMD did similar things (HDAO+) with other games. But it's not like a game came out that only ran well on the 7970 because it needed more VRAM. In actuality, developers decided they had 2GB of VRAM to play with because of the 680; that they could only tessellate as much as the 7970 could cope with (it was weaker there than the 680); that they could only use so much compute performance because the 680 was a bit hampered there; and that they couldn't use many double-precision calculations, again because of the 680. The extra VRAM on the 7970 mostly went to waste, but so did much of the tessellation hardware on the 680.
This is just how the industry works. Even at the top end of performance you still have to target a relatively wide market to make those graphical features worth the development cost, so high-end graphics are targeted at both cards. Thus a game aimed at 680/7970-class hardware (they perform pretty similarly) is always going to be limited by both cards' weaknesses. The causation always runs hardware first, and the software guys then use as much of it as they can get away with without completely screwing anyone over. I know of only one game that even remotely intends to break that pattern, and it's Star Citizen; yet even they aren't making a game that no one can run, they're just adding vendor-exclusive features from both AMD and Nvidia in the same game, which IIRC hasn't happened before.
We see the same thing with CPUs. There is a tonne of software that can't run on the desktop because desktop CPUs aren't fast enough yet. It's not that all software ideas have been exhausted; it's that performance isn't really climbing, so we can't add features or whole new types of applications. The hardware leads and the software follows. The software can never lead, because if the hardware capable of running it doesn't exist, we can't test it, and no one would buy it because they couldn't run it.
I generally agree with the above, but consider:
- The upper limit for image quality is effectively unbounded, so a program can have settings that only the most powerful, VRAM-stuffed cards can use, and lower settings for the more common, less powerful, less VRAM-laden cards. When Crysis came out, all but the fastest cards choked on the highest settings, so the common peasantry had to use lower settings. Just as an example.
- Next-gen consoles have 8GB of combined RAM: some for the OS, some for the actual game program, and some acting as VRAM. It is possible that, once devs get more comfortable with the limits of the new consoles, games will be programmed with this 8GB of combined RAM in mind. So when it comes time to port a game to PC, there may be a "full" mode that requires more resources and a "common peasantry" mode for those with 1GB or 2GB of VRAM, or whatever they decide is the lowest target for the game.
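To make that shared-RAM point concrete, here's a back-of-the-envelope sketch of the budget. Every number below is an illustrative guess, not a published console spec:

```python
# Back-of-the-envelope budget for an 8GB shared-memory console.
# All figures below are illustrative guesses, NOT actual console numbers.
TOTAL_RAM_GB = 8.0
OS_RESERVED_GB = 2.5   # hypothetical OS reservation
GAME_LOGIC_GB = 2.0    # hypothetical footprint for game code and data

# Whatever remains can effectively be spent as "VRAM" on textures,
# render targets, and other graphics resources.
graphics_budget_gb = TOTAL_RAM_GB - OS_RESERVED_GB - GAME_LOGIC_GB
print(f"Graphics budget: {graphics_budget_gb:.1f} GB")
```

On those guesses a straight console port could comfortably want more VRAM than the 2GB cards discussed above, which is exactly why a separate "full" mode on PC is plausible.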
And lastly:
- Mods (see e.g. Skyrim eating >2GB of VRAM with enough mods). Skyrim runs OK on 1GB of VRAM at all but the highest resolutions, if you don't go crazy with settings. But mods can and do eat up more VRAM, so more VRAM gives you the flexibility to install more and bigger mods. Some people don't care about mods, but many people do.
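The settings-ladder idea above can be sketched in code: a launcher picks the highest quality preset that fits the detected VRAM, leaving headroom for mods. The `pick_preset` function, the tier names, and the thresholds are all made up for illustration; real engines use far more signals (GPU model, bandwidth, driver, etc.):

```python
# Hypothetical quality-preset picker keyed off detected VRAM.
# Tiers and thresholds are invented for illustration only.
PRESETS = [
    (4096, "ultra"),   # the "full" console-port mode
    (2048, "high"),
    (1024, "medium"),
    (0,    "low"),     # common-peasantry fallback
]

def pick_preset(vram_mb, mod_overhead_mb=0):
    """Return the highest preset whose VRAM floor still fits
    after subtracting the VRAM that mods are expected to eat."""
    usable = vram_mb - mod_overhead_mb
    for floor, name in PRESETS:
        if usable >= floor:
            return name
    return "low"

print(pick_preset(2048))        # 2GB card, no mods -> "high"
print(pick_preset(2048, 1024))  # same card, heavy texture mods -> "medium"
```

Note how mod overhead pushes the same card down a tier, which is the whole "more VRAM buys mod headroom" argument in one line.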
2GB will be enough at 1440p for future games at some quality level. Whether it will be enough for the quality level you want, I don't know. As others have pointed out, you can worry about this once we get there and upgrade your card only when forced to. That's a valid strategy.