Oops.
As it turns out, I did make a mistake in my math (that's what I get for trying to calculate at 12:30 AM). And not a little mistake, either... a very big one. I was way too optimistic. The amount of time it would actually take to brute-force a 1,048,576-bit key is much, much longer than I stated.
How much longer? Well, take that incredibly vast amount of time I described, and double it. Then double it again. And again. And again. After you've doubled it a million times, you're still at a number that's an insignificant speck compared to the correct answer. Double it another 46,500 times or so and you're there.
Hamburglar, here's how you figure something like this out:
A brute-force attack takes time proportional to the total number of possible keys. Adding one bit to the key length doubles the keyspace, just like adding one bit to a binary number doubles its highest value (or adding one digit to a decimal number multiplies its highest value by 10). So a 1,025-bit key will take twice as long to crack by brute force as a 1,024-bit key, a 1,026-bit key will take four times as long, and an N-bit key (N > 1024) will take 2^(N-1024) times as long.
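That doubling rule is short enough to sketch in Python (the `relative_cost` name and the 1,024-bit baseline are just mine, for illustration):

```python
# Relative brute-force cost: each extra key bit doubles the keyspace,
# so an N-bit key takes 2**(N - 1024) times as long as a 1,024-bit key.
def relative_cost(n_bits, baseline_bits=1024):
    return 2 ** (n_bits - baseline_bits)

print(relative_cost(1025))  # 2 -- one extra bit, twice the work
print(relative_cost(1026))  # 4 -- two extra bits, four times the work
```

Python's arbitrary-precision integers mean this works even for `relative_cost(2048)`, which is the full 2^1024 factor discussed below.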
This means a 2,048-bit key will take 2^1024 times longer than a 1,024-bit key... and the huge amount of time I gave in my last post was actually how long it would take to brute-force a 2,048-bit key. 2^1024 = approximately 10^308.

Here's where fun with exponents comes in. If you divide X^M by X^N, you get X^(M-N), which means that if M is large, you can divide your number by some pretty darned impressively big amounts without making much of a dent. So, for example... your 2,048-bit key will take about 10^308 times longer to crack than a 1,024-bit key. How long does a 1,024-bit key take? Let's say one microsecond. There are 10^6 microseconds in one second, so that means your 2,048-bit key will take 10^308 / 10^6 = 10^(308-6) = 10^302 seconds to crack. There are about 3x10^7 seconds in a year... divide and see that our 2,048-bit key takes somewhere around 10^295 years. The age of the universe is about 10^10 years; divide again and our key will take roughly 10^285 times the universe's age. Now, 10^308 is immensely larger than 10^285 (by a factor of roughly the age of the universe in microseconds), but 10^285 is still way, way, way outside anything we can comprehend. We need to shrink it by a lot before we can begin to understand it.
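Those back-of-the-envelope conversions are easy to check in Python by working in base-10 logarithms (the one-microsecond-per-1,024-bit-key figure is the same illustrative assumption as in the text):

```python
import math

# Assumption (illustrative only): a 1,024-bit key falls in one microsecond,
# so a 2,048-bit key takes 2**1024 microseconds. Track the base-10 exponent.
log10_microseconds = 1024 * math.log10(2)         # ~308.3
log10_seconds = log10_microseconds - 6            # 10**6 microseconds per second
log10_years = log10_seconds - math.log10(3.15e7)  # ~3.15e7 seconds per year
log10_universe_ages = log10_years - 10            # universe age ~10**10 years

print(round(log10_years))          # 295
print(round(log10_universe_ages))  # 285
```

Working with the exponents directly sidesteps the fact that 2^1024 overflows ordinary floating point.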
The error I made in my math earlier was that I for some reason divided 1,048,576 by 1,024 and decided that the megabit key would be 2^1,024 times harder to crack. What I actually should have done is subtracted 1,024 from 1,048,576, and come up with a difficulty factor of 2^1,047,552. This number is so big that I can't even think about it, or my head will explode.
In fact, a 1,048,576-bit key is so ridiculously huge, so absurdly much bigger than it needs to be to completely stymie a brute-force attack, that I suspect there's something else at work here. That key is part of the encryption at a fundamental level, and it needs to be so big not to prevent brute-force attack, but to prevent a more classical cryptographic analysis of the ciphertext. I bet that with smaller keys, patterns emerge in the encrypted text that can be analyzed.
(If you think about it, 1,048,576 bits is big enough to encrypt a 128K document using the key as a one-time pad. That sort of encryption really is unbreakable, even in theory.)
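As a sketch of that idea: 1,048,576 bits is 131,072 bytes, exactly 128K of key material, enough to XOR-pad a document of that size once. A minimal one-time pad in Python (the function name is mine; this is a toy illustration, not a real cryptosystem):

```python
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    # One-time pad: XOR each message byte with the corresponding pad byte.
    # The pad must be truly random, at least as long as the message,
    # and never reused -- under those conditions it's provably unbreakable.
    assert len(pad) >= len(message)
    return bytes(m ^ p for m, p in zip(message, pad))

pad = secrets.token_bytes(131_072)  # 1,048,576 bits of random key material
message = b"attack at dawn"

ciphertext = otp_encrypt(message, pad)
recovered = otp_encrypt(ciphertext, pad)  # XOR is its own inverse
print(recovered)  # b'attack at dawn'
```

Decryption is the same XOR applied again, which is why the same function handles both directions.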