Masterchief, are you serious, or are you joking? The identity .99999...=1 is a well-known fact. Every so often people doubt it, but such people don't even know the definitions.

First question: what is meant by the representation .98604, or .33333..., or .9999....?

Answer: each of these is shorthand for a sum, where the nth digit is the multiple of 10^{-n} that we are adding. So .98604 is simply shorthand for 9*10^{-1}+8*10^{-2}+6*10^{-3}+0*10^{-4}+4*10^{-5}, and .3333... is shorthand for 3*10^{-1}+3*10^{-2}+3*10^{-3}+3*10^{-4}+....
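If you want to see this shorthand spelled out, here is a small sketch (mine, not part of the original post) that interprets a finite decimal string as exactly that sum of multiples of 10^{-n}, using Python's `fractions` module so everything is exact:

```python
from fractions import Fraction

def decimal_as_sum(digits):
    # ".d1 d2 d3 ..." means d1*10^-1 + d2*10^-2 + d3*10^-3 + ...
    return sum(Fraction(int(d), 10 ** (n + 1)) for n, d in enumerate(digits))

# .98604 really is 9/10 + 8/100 + 6/1000 + 0/10000 + 4/100000
print(decimal_as_sum("98604"))       # 24651/25000
print(Fraction(98604, 100000))       # 24651/25000, the same value
```

Exact rationals are used here instead of floats so that the sum is literally the number the notation names, with no rounding.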

This immediately raises a second question: what is meant by an infinite sum? You can't add infinitely many things, can you?

Answer: yes, you can, in some situations. You can add 1+0+0+0+0+0+..., because after the first term you aren't changing anything at all. You can add 1/2+1/4+1/8+1/16+..., because such a sum can be represented "physically" on the number line as follows: first you go halfway from 0 to 1, then you go halfway from where you are to 1, then you go halfway from where you are now to 1, etc. The sum is 1, because that's exactly where you end up when you move halfway to 1 infinitely many times. (This is not, of course, a proof, but it motivates the idea of limits.) You cannot, however, take the sum 1-1+1-1+1-..., because its partial sums alternate between 1 and 0 and never settle on any one number (anyone who tells you that sum exists is either completely nuts or has been staring at the zeta function too long).

The definition that makes all this work is the definition of a limit. A sequence of numbers x_n is just that: a sequence x_1, x_2, x_3, x_4, .... Such a sequence is said to converge to a limit x if for any e>0, there is an N>0 such that the difference between x and x_n is less than e for all n>N. An infinite sum is then defined to be the limit of its partial sums, provided that limit exists. So the decimal number .999999... *really is* the limit of the sums 9/10, 9/10+9/100, 9/10+9/100+9/1000, etc.
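The e-and-N definition can be checked numerically for .9999... itself. Here is a sketch (the choice e = 10^{-6} is just an arbitrary example tolerance) that computes the partial sums exactly and finds the N past which they are within e of 1:

```python
from fractions import Fraction

def partial_sum(n):
    # s_n = 9/10 + 9/100 + ... + 9/10^n, computed exactly
    return sum(Fraction(9, 10 ** k) for k in range(1, n + 1))

e = Fraction(1, 10 ** 6)   # an example epsilon
n = 1
while 1 - partial_sum(n) >= e:   # the gap 1 - s_n equals 1/10^n
    n += 1
print(n, 1 - partial_sum(n))     # 7 1/10000000
```

For any smaller e you pick, the loop still terminates, because the gap 1/10^n shrinks past every positive number; that is exactly what "the limit is 1" means.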

They don't teach you all this when they teach you decimals in school, because it's complicated, so they usually just gloss over it. But this is what decimal representations actually are, and so any demonstration that .999...=1 has to use this fact. (Of course, there are tricks people sometimes use to "cheat," but those involve taking on faith things that are true, but that are exactly the things Erebos is denying above.) The lack of such actual definitions is why so many people are confused, or try to deny that .9999...=1. What's true is that decimal expansions are just representations of real numbers, and those representations are not always unique. The fact that 1 can also be written as .9999... is just one example.

So, without further ado-

Proof that 1=.99999...:

We know that .999... is, by definition, the sum 9/10+9/100+9/1000+..., which is the limit of the partial sums 9/10, 9/10+9/100, 9/10+9/100+9/1000, etc. Finding a common denominator tells us that 9/10+9/100+9/1000+...+9/10^{n}=99...9/10^{n} (n nines) =(10^{n}-1)/10^{n}=1-1/10^{n}, and since for any e>0 the difference 1/10^{n} is eventually smaller than e, the limit of the partial sums is exactly 1.
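The common-denominator identity in the proof can also be verified mechanically. This sketch checks, with exact rational arithmetic, that every partial sum equals 1 - 1/10^n:

```python
from fractions import Fraction

# Verify 9/10 + 9/100 + ... + 9/10^n == 1 - 1/10^n for the first several n.
for n in range(1, 20):
    s = sum(Fraction(9, 10 ** k) for k in range(1, n + 1))
    assert s == 1 - Fraction(1, 10 ** n)

print("identity holds for n = 1..19")
```

Of course a finite check is not the proof; the algebra above is. But it makes it concrete that the partial sums march toward 1 with gap exactly 1/10^n.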

p.s. I decided to change my avatar in honor of this thread.