See below. You've done the following, at least: a) invoked infinity casually, making everybody cringe, b) assumed 0.9999... exists and makes sense (i.e. doesn't contradict anything) and c) assumed it has a unique value. (Obviously, these are all OK, but you have to justify b) and c) in particular.)
OK.
a) I simply wouldn't call it infinity, because it isn't. What we have is a decimal that repeats "infinitely" (the adverb, not the noun). I wouldn't use that term either. At the 7th grade level, I would say that the 9 repeats "forever." I think that's a temporal concept that captures a more intuitive sense of "infinite" or "infinitely." We can grasp that there is no such thing as the day after forever. Forever isn't simply "a long time." There is no end to it. Hence, I think 7th grade students who are capable of doing pre-algebra can grasp the notion that the 9 never stops. You cannot simply go all the way out to the end and then add another 9, because there is no end. They get that -- well, the ones who are smart, but not too clever by half, do, anyway.
You think that's invoking infinity casually? OK. We can agree to disagree. I think it's a handy way of avoiding causing everyone's head to explode.
b) I've "assumed" .999... exists? No, I haven't. I have defined N to be .999... That's a given. It's the main axiom in the proof. You don't have to prove axioms. ETA: Oops, now I've gone and misapplied the term "axiom." I'll stick with given.
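For reference, the proof in question is presumably the familiar algebraic manipulation, written out here in LaTeX for clarity:

```latex
\begin{align*}
N       &= 0.999\ldots                      && \text{(the given)} \\
10N     &= 9.999\ldots                      && \text{(multiply both sides by 10)} \\
10N - N &= 9.999\ldots - 0.999\ldots = 9    && \text{(subtract the given)} \\
9N      &= 9 \quad\Longrightarrow\quad N = 1
\end{align*}
```

The subtraction step is exactly where the "no last 9" intuition does its work: because the 9s never end, every 9 after the decimal point cancels.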
What could it possibly contradict? Are you going to postulate that every number represented by a decimal string must eventually terminate? If so, then I can easily prove you wrong by taking the string out one digit further. I can do so indefinitely, which of course means we can make one infinitely long. Again, in this context, I prefer to say that we can keep adding digits to the end "forever." It's more intuitive, in my opinion.
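That "one digit further" point can be made concrete without any limit talk: however many 9s you write down, the truncated string falls short of 1 by exactly one part in 10^n, so no finite string ever is the repeating decimal. A quick sketch in Python (exact rational arithmetic via `fractions.Fraction`; the function name is mine, purely illustrative):

```python
from fractions import Fraction

def nines(n):
    """The decimal string with n nines after the point, as an exact
    rational: 0.9 = 9/10, 0.99 = 99/100, and so on."""
    return Fraction(10**n - 1, 10**n)

for n in (1, 2, 5, 10):
    gap = 1 - nines(n)
    # gap is exactly 1/10**n -- positive for every finite n, never zero
    print(n, nines(n), gap)
```

Adding one more 9 always shrinks the gap by a factor of ten, but no finite count of 9s closes it; that's precisely why the repeating decimal can't be identified with any of its truncations.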
c) I really don't understand what you mean here. Every real number has a unique value. That's what it means to be a real number. You aren't going to assert that, under certain conditions in the set of real numbers, sometimes 2 = 3, are you?
[Here's a more formal definition of "real number:" In mathematics, a quantity that can be expressed as a finite or infinite decimal expansion. The counting numbers, integers, rational numbers, and irrational numbers are all real numbers. Real numbers are used in measuring continuously varying quantities (e.g., size, time), in contrast to measurements that result from counting. The word real distinguishes them from the imaginary numbers. ]
What this proof demonstrates to younger students is that there is more than one way to represent a number notationally. .999... is simply a very well-disguised notation for the very same number as 1.
Numbers themselves do not change value. They simply are what they are. Pi is always Pi. 0 is always 0.
Please note that in the set of real numbers, infinity is not a number. It doesn't mean "really big." It means unbounded, without limit. It is undefined numerically.
".999... does exist" is a poor argument. So why is .999... justified and not 999...? Plus, did you prove it has unique value?
The set of real numbers is defined to include numbers which can be expressed as infinite decimal expansions. On the other hand, 999... is understood to repeat infinitely and become infinitely large; it's "infinity." As others have noted, it is divergent and has no limit. That's another way of saying it isn't defined within the set of real numbers, and it is not real.
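The contrast between the two strings can be shown numerically: truncations of .999... stay bounded below 1, while truncations of 999... grow without bound. A small illustration in Python (variable names mine):

```python
# Truncate each string at n digits and compare.
for n in (1, 3, 6, 9):
    bounded = (10**n - 1) / 10**n   # 0.9, 0.999, ... always below 1
    unbounded = 10**n - 1           # 9, 999, ... grows without bound
    print(n, bounded, unbounded)
```

That's the whole distinction: one family of truncations is trapped in [0, 1), so the infinite string names a real number; the other family exceeds every bound, so "999..." names no real number at all.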
ETA: Oops. I should tighten up that last paragraph: an infinitely large number isn't an imaginary number; it's simply undefined within the set of real numbers.
Again, the unique value question makes no sense to me. A number's value is what it is. You can perform a function on it and the result of having done so can be different from the original number, but that's the result of having performed the function, not because the number itself underwent some kind of transformation.
Answer without using sequences and limits, please.
Limits? I don't need no stinkin' limits.
(This is my point. Your proof is essentially correct, but you need to justify some of your assumptions - assumptions that a "7th grader" will not spot, and doesn't care about. The language of limits and sequences makes such justification unnecessary, because such language is hyperprecise. Also, I hope I'm not being patronising or condescending - I apologise if so. I don't mean to be. I have a certain lack of way with words...)
It's not "essentially" correct. It is correct, without qualification.
I don't think you're being patronising or condescending. I hope you don't think I am. I'm being stubborn and resolute, but not condescending.
AS