Ok, first a bit of background...
1. I only scanned the first and last pages, so if I repeated or missed something, my apologies.
2. I'm an engineer, so, er.. I might be biased.... uh..yeah..
My reasoning:
Mathematics is mathematics. You can only apply the rules of mathematics to a mathematical problem. Generalized logic is not applicable either. Why? Because generalized logic does not recognize the foundations of mathematics; for example, the definitions of the operators.
Logic is applicable to mathematics only when it takes into account the rules upon which mathematics is based. I can always put forward the proposition "1 + 1 = 3" and not only claim it's true, but also prove it.
How? One way is by supplying my own definition for the operator '+'. Another way would be to define the number sequence as 1, 3, 4, 2, 11, 13, ... As long as we're working under my rules, the proposition is valid and true. If, however, you do what most people do instinctively and apply the standard rules of mathematics, it's obviously false.
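Just to make the point concrete, here's a toy sketch in Python. The `MyNum` class and its '+' rule are entirely my own invention, purely to show that "1 + 1 = 3" is trivially "provable" once you're allowed to redefine the operator:

```python
class MyNum:
    """A toy number type with a deliberately nonstandard '+'."""

    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        # My own rule: '+' means "sum the values, then add one".
        # Under this rule, 1 + 1 evaluates to 3.
        return MyNum(self.value + other.value + 1)

    def __eq__(self, other):
        # Compare against a plain integer for convenience.
        return self.value == other

one = MyNum(1)
print((one + one) == 3)  # True -- but only under my rules
```

Under the standard integer rules, of course, `1 + 1 == 3` is plainly false; the "proof" lives entirely in the redefined operator.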
Basically, my reason for largely ignoring MadRat's posts (sorry) is that each post seems to be an attempt at applying common sense or philosophy to mathematics.
I mean no offense, but I fail to see how invoking philosophical concepts or appealing to common sense and instinct proves anything in mathematics. A truth table can always provide a counterexample to a logical argument, but it cannot provide a proof. At least, not in digital logic design.
Therefore, philosophy has no place in mathematical proof. It may provide inspiration, and it may lead to new definitions, but it does not govern. This is similar to modern theoretical physics. Traditionally, physics has paved the road to new ideas and technology while mathematics has followed to provide a numerical basis. An example would be Newtonian physics.
Nowadays, with so much development in "strange" territory, from cosmology to quantum mechanics, mathematics has begun to lead the way more and more, simply because instinct fails. New discoveries are occurring not because a physicist had a sudden insight, but because the mathematical equations lead to, simplify into, or resolve themselves into seemingly strange and nonsensical forms that imply new discoveries, which are only confirmed years later when the technology catches up.
In essence, 0.9999... = 1 not because it's easy to understand, or because common sense tells us it is (which is a bad way to prove things, by the way), but because the rules of mathematics lay the foundation for the data and for the process of analysing it, and the logical conclusion, following those rules, is that the proposition is true.
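For reference, one standard rules-based argument is the geometric series: 0.999... is by definition the infinite sum of 9/10^n, and the usual formula for a geometric series with ratio 1/10 evaluates it exactly:

```latex
0.\overline{9}
  \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
  \;=\; 9 \cdot \frac{1/10}{1 - 1/10}
  \;=\; 9 \cdot \frac{1}{9}
  \;=\; 1
```

No appeal to intuition is needed; the equality follows directly from the definitions of decimal notation and infinite series.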
You can argue until you're blue in the face that 0.999... != 1 simply because it would be like implying that sitting in a chair with one's butt glued firmly to the seat makes one the chair. One can pull out as many degrees and accreditations as there are in the world to cement the right to offer such a proof. However, doing so is completely irrelevant and is akin to refereeing a U.S. football game with the rules of British football.
One last thing:
I don't truly, 100%, believe in the mathematics, but I do believe it has been proven correctly. If not, well, egg on my face...
On the other hand, few can accept the time-dilation effects of Special Relativity. That concept powers your light bulb, by the way. (Electromagnetic waves, a.k.a. light, are a relativistic phenomenon.)