Even with the official high-res packs, Skyrim still looks fuzzier than tunnel shooters from years earlier; the packs just make it worth having 1GB at <1080P, boosting it to just over console quality. Not to knock it overall, but there are some faults I've come to expect from Bethesda games out of the box, and Skyrim was no exception. Waiting a while, then playing a community-tailored version of it, is part of the routine, and really makes me pity the console players (and it's not even the visual improvements that make it, but that I get to worry about clothes and food, and still have a real challenge as the game progresses).
VRAM is ending up a lot like system RAM: a smaller amount is just fine, to a point, so long as your hardware isn't starved for more (1-2GB, today and in the near future), and getting more than a given card is specced for is largely a matter of wanting more detailed visuals, not of requisite performance. That doesn't mean the GPU will bottleneck before the RAM if you try to make a game need 2GB of textures when you only have 2GB total--more VRAM on a given GPU can be fine, without moving up to a higher-end GPU, if more screen pixels aren't having to be worked on. Even as it has risen in price, RAM has stayed cheap enough that neither AMD nor nVidia specs too little, in either total capacity or number of packages (interface width, effectively), to handle the software that's out there or soon to be.
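For a rough sense of why high-res textures eat VRAM so quickly, here's a back-of-envelope sketch (my own illustrative numbers, not tied to any particular game): a single 2048x2048 texture runs ~21 MiB uncompressed (RGBA8) or ~5 MiB as DXT5/BC3, mip chain included, so a few hundred unique high-detail textures in view chew through 1-2GB fast.

```python
# Back-of-envelope VRAM cost of a square texture, mip chain included.
# bpp: bytes per texel (4 = uncompressed RGBA8, 1 = DXT5/BC3, 0.5 = DXT1/BC1).
def texture_mib(side, bpp, mipmapped=True):
    base = side * side * bpp                      # top mip level, in bytes
    total = base * 4 / 3 if mipmapped else base   # full mip chain adds ~1/3
    return total / 2**20                          # bytes -> MiB

print(texture_mib(2048, 4))  # RGBA8:    ~21.3 MiB each, so ~48 fit in 1 GiB
print(texture_mib(2048, 1))  # DXT5/BC3:  ~5.3 MiB each, so ~192 fit in 1 GiB
```

The 4/3 factor is the usual geometric-series estimate for a full mip pyramid; real allocators pad and align, so actual usage lands a bit higher.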
For instance, with a decent broadband connection and a quick trip to The Nexus or ModDB, you can blow right past 1GB of VRAM in The Witcher; and The Witcher 2 doesn't even need mods for that, reaching ~1.5GB stock and exceeding 2GB handily with them. Should we expect the 3rd installment to stagnate in that realm, be it stock or after the fact? I don't think so. Much like Bethesda, but way more efficient at it, CDP have put in a wide variety of surfaces with varied textures, and enough view distance and clutter to chew up some RAM, should the textures be of sufficient detail--and if the maker doesn't do that sufficiently, the community will.
OTOH, 1GB is the common denominator, so of course games will run fine in 1GB of VRAM, too. Nothing will be unplayable in 1GB at 1080P, much less 2GB at 1080P, for some time; making it so would mean alienating customers.
IMO, either is fine for the OP. It all depends on what the OP is going to play, and how, and what the OP's priorities are: doubling VRAM adds significant cost, and which IQ improvements are worth some performance reduction is subjective, and may not be worth paying more to have or try out. (Until surfaces <1ft away from the "player eyes" have 2 or more texture/shader samples per screen pixel--at ~100 PPI viewed at ~3ft, for instance--I'll keep wanting higher texture detail.) Cards with double the regular spec have their place, even outside of SLI. But, just the same, no current stock-spec >$150 card is going to be crap for any game coming out in the next few years at 1080P. It's just a question of your performance and detail wants, and budget.
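To put a number on that texel-density wish (a rough small-angle estimate using my own assumed figures, nothing more): at 100 PPI viewed from 3ft, getting 2 texels per screen pixel on a face-on surface 1ft away works out to about 600 texels per inch of surface--on the order of a 4096x4096 texture for a half-foot-square prop.

```python
# Assumed setup (illustrative numbers): 100 PPI display viewed at 3 ft,
# a face-on surface 1 ft from the "player eyes", target 2 texels per pixel.
ppi = 100.0
view_dist_in = 36.0   # eye-to-screen distance
surf_dist_in = 12.0   # eye-to-surface distance in the game world

pixel_angle = (1.0 / ppi) / view_dist_in   # angle one pixel subtends (radians)
footprint = pixel_angle * surf_dist_in     # inches of surface covered per pixel
texels_per_inch = 2.0 / footprint          # density needed for 2 samples/pixel

print(round(texels_per_inch))  # -> 600 texels per inch of surface
```

Current games' texture densities fall well short of that, which is the point: there's headroom for texture detail (and the VRAM to hold it) for a long while yet.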