Maybe trying to make engines very scalable is part of the problem, and not "just" the consoles having limitations. To be honest, it surprises me that developers today still try to make games run "well" on hardware from three or four years ago, or even older, whatever in-game settings the player uses. The thing is... who actually DOES use lower in-game settings?
And if they do, are they willing to keep lowering settings until they reach the performance they want, whatever the price in graphics quality and detail? In other words, take Crysis: how many players actually "accepted" playing it on "low" settings, or even medium? When gamers look at all the videos and screenshots, they want their games (generally) to look like that, but they never expect the performance hit that comes with it, unless they have the very best hardware there is (and even then, it might be pushing it for a game like Crysis, as we know).
I think there's a bit of naivety as well from the developers building scalable engines: they honestly "think" gamers will take responsibility themselves and say something along the lines of: « Well, this game has the latest in graphics technology, my hardware can display it, but my performance is barely acceptable, so I think I'll have to play on low settings until I get new hardware. » ...I just don't think that's the case, at least not for the majority. I've never heard of a single gamer playing Crysis on low, being content with it, and saying « yeah, that game is amazing! ». Even if that's a possibility on paper, I just can't picture it.