I understand the concept, but, like the idea of 0.999... = 1, it is more that someone decided it was better to accept the idea than that it is more valid than the alternative.
Hmm... I don't think that's the case. Try to remember that any "decimal number", by which I mean a string of decimal digits, optionally followed by a decimal point and a further (possibly infinite) string of digits, means something quite particular: it refers to an infinite sum, as I have discussed above. In some cases an infinite sum has no value, but decimal numbers always have an exact value. In some cases that exact value happens to be the ratio of two integers. In the specific case of the string "0.999...", where we take "..." to represent an infinite string of 9s, the exact value of the sum (computed by conventional means -- in other words,
not simply picked for convenience) is the integer 1.
It's a consequence of existing mathematics, not somebody's decision that it was "just better that way". It's a consequence in the sense that, if adding numbers together took you no time at all, you could actually add up all the terms (9/10, 9/100, 9/1000, etc.) of the infinite sum, and you would get the number 1.
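To make that concrete, here's a minimal sketch of my own (not part of the argument above) in Python, using exact fractions so floating-point rounding doesn't muddy the picture. The partial sum after n nines is exactly 1 - 1/10^n, so the gap to 1 shrinks by a factor of ten with each extra digit:

    from fractions import Fraction

    # Partial sums of 9/10 + 9/100 + 9/1000 + ..., computed exactly.
    total = Fraction(0)
    for n in range(1, 11):
        total += Fraction(9, 10**n)
        # The gap to 1 is exactly 1/10**n: never 0 for any finite n,
        # but shrinking toward 0 -- which is the precise sense in which
        # the infinite sum equals 1.
        print(n, total, 1 - total)

The third column (1/10, 1/100, 1/1000, ...) is what's left over after finitely many terms; "0.999... = 1" is the statement that its limit is 0.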
To my mind, 1 - 0.999... = 1.0 ^ 10^-Infinity ( can't get my keyboard to spin the 8 key : ) ) is equally valid.
It's understandable that to your mind they might be equally valid, but that's because you are not insisting that every symbol you use has a well-specified meaning, and that that meaning combines in a well-specified way with the meanings of the other symbols around it. In other words, you have not specified what "Infinity" means, nor what it means for Infinity to appear as the exponent of a number in an expression.
Some less important comments on that statement: 1.0^(anything) is 1.0. Your equation is, in fact, false, since 1 - 0.999... = 0, but the stuff on the right-hand side, if we assume it to have a value at all, is probably going to have the value 1.
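A quick sanity check of that last point (again my own sketch, standing in for the unspecified "Infinity" with ever-larger finite n): 1.0 raised to any such exponent stays exactly 1.0, so the right-hand side doesn't come out to 0, which is what the left-hand side equals.

    # Interpret "10^-Infinity" as 10**-n for ever larger n.
    for n in (1, 10, 100):
        rhs = 1.0 ** (10.0 ** -n)   # 1.0 raised to anything is 1.0
        print(n, rhs)               # prints 1.0 every time, never 0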
It's all in how you set the conditions. It's like saying 2 + 2 = 5 (for sufficiently large values of 2), or the joke about the engineer and the mathematician (the one that goes on about how every second you cover half the remaining distance to your girlfriend).
Well, you could pick other values for the symbol "2" such that the sentence "2 + 2 = 5" is true, but it would probably make other sentences very false, such as "2^2 = 4". It all comes back to having formal, consistent definitions for all our symbols, as well as a "calculus" for combining those symbols into larger expressions and getting consistent meanings for those expressions. So yes, you can pick any definition of the symbols that you want, but some will be better than others (some will be inconsistent; some will be consistent but very boring). The set of definitions and calculus that modern mathematicians use is a rather good one -- it's powerful enough to talk about how to build bridges so that they don't collapse, and it seems so far to be consistent -- and one of its consequences is that 0.999... = 1.