I read an interesting article in an aviation mag years ago arguing that the usual calculation is all wrong, and deliberately obfuscates the real risk. I'm not good enough with math to have a firm opinion, but the premise _sounded_ plausible, at least. The idea is that simply comparing deaths per mile traveled ignores the fact that air travelers are exposed to the risk for much shorter periods of time. In other words, to travel from NY to LA you spend 6-7 hours in a plane versus roughly 50 hours in a car. If, for example, people died at about the same rate per mile traveled in planes as in cars, you might say planes were a lot more dangerous, because the same miles (and therefore the same risk) are packed into only about a seventh of the exposure time, so the risk per hour in the air would be several times higher. On the other hand, you have to choose one means or the other to cover the distance. You can't wish yourself there.
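
If I'm remembering the argument right, the arithmetic boils down to something like this rough sketch (the per-mile rate, distance, and travel times are made-up round numbers for illustration, not real statistics):

    # Hypothetical: identical deaths-per-mile rate for both modes,
    # re-expressed as deaths per hour of exposure.
    distance_miles = 2500            # roughly NY to LA
    plane_hours, car_hours = 6.5, 50 # time spent exposed to the risk

    deaths_per_mile = 1e-8           # made-up rate, same for plane and car

    plane_per_hour = deaths_per_mile * distance_miles / plane_hours
    car_per_hour = deaths_per_mile * distance_miles / car_hours

    print(plane_per_hour / car_per_hour)  # ~7.7: plane looks that much riskier per hour

Same total risk for the trip either way; the only thing that changes is whether you divide it by the miles or by the hours you spent exposed to it.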