Ok, I'm revising my recommendation:
I think that in terms of raw horsepower, the 8800GT is very similar to the 4850 when it's fully OC'ed: looking at the non-AA graph, you can essentially substitute the 8800GTX for the 9800GTX+ when it's OC'ed (and the OC'ed 8800GT is slightly lower in performance, about where the 4850 sits). Crysis is a very unusual game in that it uses a huge amount of video memory. For that particular game, I would skip AA entirely and instead use the game's built-in Edge AA (2) setting (which is automatically activated when 'Shaders' and 'Post Processing' are on High). This gives nice image quality and significantly reduces jaggies (while also making the vegetation look fuller) without the insane video memory requirements.
Basically, just slap another 8800GT in there for SLI, OC both cards and your CPU, run in DX9 mode (with the Very High settings activated), and you should get about 30fps. Also, I remember that the Tom's Hardware custom Crysis benchmark is especially grueling and sits at the low end of what you can expect in-game, so your average fps should be even higher than 30 (although I don't know where CPU speed becomes the bottleneck).
Of course, you can also increase performance by about 10% by playing at 1080p resolution (don't forget to disable scaling in the control panel, though).
***UPDATE:
I now understand how the whole price/performance dynamic works. The percentage increase in performance should always be significantly greater than the percentage increase in price. So if card A gets 50fps and card B gets 75fps (a 50% increase in performance) but costs 50% more, it's NOT worth it. Thanks to Moore's Law, we should get MORE for the SAME money (well, adjusting for inflation). In Moore's Law terms, the performance-to-cost ratio should always be about 2:1 (going by theoretical performance). In other words, if card A gets 50fps and card B gets 80-100fps without costing significantly more, it's worth it! Or, put another way, a 1% increase in price should buy a 2% increase in theoretical performance (more stream processors, higher clock speeds, etc.) to be worth it. It all sounds incredibly obvious, but browsing Newegg you can find a lot of scenarios where this fails (comparing a 9500GT to a 4670, for instance).
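Here's a quick sketch of that 2:1 rule in Python, just to make the arithmetic concrete (the fps numbers and prices are made-up examples, not benchmark data):

```python
# Sketch of the "2:1" rule described above: an upgrade is only worth it
# if the percentage gain in performance is at least about twice the
# percentage increase in price. All numbers below are hypothetical.

def worth_it(old_fps, old_price, new_fps, new_price, ratio=2.0):
    perf_gain = (new_fps - old_fps) / old_fps          # e.g. 0.5 = 50% faster
    price_gain = (new_price - old_price) / old_price   # e.g. 0.5 = 50% pricier
    return perf_gain >= ratio * price_gain

# Card A: 50fps for $100; Card B: 75fps for $150 (50% faster, 50% pricier)
print(worth_it(50, 100, 75, 150))    # False -- not worth it
# Card B': 100fps for $125 (100% faster, only 25% pricier)
print(worth_it(50, 100, 100, 125))   # True -- worth it
```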
One way to calculate this is to divide the framerate by the price; a higher number is better. To compare the price/performance ratio between cards: if card A is rated 0.5 (fps divided by price) and card B is rated about the same, you might think card B is better because it has a higher actual framerate, but it isn't, because it costs proportionally more. Its performance may be 50% higher, but its price is also 50% higher. So card A, despite its lower actual framerate, is just as good a deal, since both cards are rated 0.5 (fps divided by price).
To figure out whether it's worth upgrading from your current card, simply use the price you paid for it and compare its fps/price rating with that of the card you're considering. As long as the new card's fps/price rating is higher, it's worth it.
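As a minimal sketch of that fps-per-dollar comparison (again, all card numbers and prices here are hypothetical placeholders):

```python
# fps-per-dollar comparison from the two paragraphs above; the prices
# and framerates are invented for illustration only.

def fps_per_dollar(fps, price):
    return fps / price

card_a = fps_per_dollar(50, 100)     # 0.5
card_b = fps_per_dollar(75, 150)     # 0.5 -- faster, but proportionally pricier
print(card_a, card_b)                # equal ratios: card B isn't a better value

# Upgrade check: compare the card you own (at the price you paid)
# against the card you're considering (at today's price).
current   = fps_per_dollar(45, 200)  # 0.225
candidate = fps_per_dollar(60, 100)  # 0.6
print(candidate > current)           # True -> worth upgrading by this rule
```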
You can get another 8800GT/9800GT (does the same apply to SLI...? not sure) for less than $100 these days! So even though you bought yours a while ago for twice as much, you're getting another card with the same performance for half the price.
I suppose if you REALLY want to play Crysis at that resolution with AA, then go for the 4850 (even the 512MB edition seems pretty solid at those settings; i.e., ATI cards seem MUCH more efficient than NVIDIA cards when it comes to video memory).
*** UPDATE 2:
Forget the whole price/performance thing here; I've put together an entire guide on the subject in the forums!