People are trying too hard, I think, to figure out what "0.999..." "really means". It doesn't "really mean" anything, any more than "42" really means anything. Both are just a bunch of squiggles on a piece of paper. (Or patterns of light on a monitor. Whatever.)
They mean whatever we define them to mean.
One can reasonably ask, "Why did mathematicians choose the particular definition they did?" (Many of the supposed proofs in this thread should really be considered answers to this question, and not actual proofs in the mathematical sense.)
But how can one claim something is false, or even that there's some uncertainty about it, when it's true by definition?
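For the record, here is a sketch of the definition presumably being taken for granted above, the usual one from real analysis: "0.999..." denotes the limit of its finite truncations, a geometric series that can be summed directly:

$$
0.999\ldots \;=\; \lim_{n\to\infty}\sum_{k=1}^{n}\frac{9}{10^{k}} \;=\; \lim_{n\to\infty}\left(1-\frac{1}{10^{n}}\right) \;=\; 1.
$$

Once you adopt that definition, the equality isn't a discovery waiting to be doubted; it's simply what the notation denotes.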