@pyro #78718
What it really comes down to can also be explained simply, without a formal mathematical proof, and the common-sense argument shows that this guy is lacking common sense as well as mathematical knowledge.
He’s absolutely right that .999 does not equal 1, at least as long as there’s a finite string of 9s after the decimal point. We can all agree on that much, because it’s simple, logical, and true.
But, much like the reasoning behind limit notation, it’s really all about making the math easier while staying accurate enough for the purpose at hand. And very early in mathematics, preceding calculus by centuries if not millennia, comes the concept of rounding: truncating past the point of accuracy you need, with the first digit you cut off determining whether your last kept digit stays the same or goes up. Since the first digit cut off here is a 9, the last remaining digit gets increased, and the carry propagates as far as it has to until there’s nothing left to carry. In this case, that turns .999 (repeated as far as you like) into 1, no matter what; a quick sketch of that carry is below.
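To make that carry concrete, here’s a minimal Python sketch of the ordinary round-half-up rule on a decimal string; round_decimal and its input format are just my own illustration, not something from the thread.

    # Round a decimal string like "0.9999" to `places` digits, half-up,
    # propagating the carry leftward through any run of 9s.
    def round_decimal(s: str, places: int) -> str:
        whole, frac = s.split(".")
        kept, dropped = frac[:places], frac[places:]
        digits = list(whole + kept)
        whole_len = len(whole)
        if dropped and dropped[0] >= "5":      # first dropped digit decides: round up
            i = len(digits) - 1
            while i >= 0 and digits[i] == "9":
                digits[i] = "0"                # a 9 rolls over and the carry moves left
                i -= 1
            if i >= 0:
                digits[i] = str(int(digits[i]) + 1)
            else:
                digits.insert(0, "1")          # carry ran off the front: 0.99... -> 1.00...
                whole_len += 1
        return "".join(digits[:whole_len]) + "." + "".join(digits[whole_len:])

    print(round_decimal("0.9999999", 3))   # 1.000 -- every kept 9 carries up
    print(round_decimal("0.9874", 2))      # 0.99  -- an ordinary round-up, no long carry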
Basically, for all intents and purposes, .999 is 1 when you need to do math, unless the purpose requires significantly greater accuracy. And there’s not a whole hell of a lot that requires so much accuracy that 1 is fundamentally different from even a flat .999 (not repeated); in almost every case you’re best served by rounding. The more digits you add, meaning the more 9s, the smaller the difference gets and the fewer situations there are where it could possibly matter.
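To put numbers on how fast that difference shrinks, here’s a tiny Python check (my own illustration, using the standard decimal module):

    # The gap between a finite string of n nines and 1 is exactly 10**-n.
    from decimal import Decimal

    for n in (3, 6, 9, 12):
        nines = Decimal("0." + "9" * n)
        print(n, "nines ->", 1 - nines)    # 0.001, 0.000001, 1E-9, 1E-12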
Frankly, the difference between .999 and 1 only really matters when you’re dealing with spacecraft launches and long-distance tunnel digging. And if it’s an infinitely repeated 9, then even in those cases it is the same as 1, and 1 is the much better way to write it.
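For what it’s worth, the computers running those spacecraft and tunnel calculations already treat a long enough string of 9s as exactly 1: in ordinary double-precision floating point, the parser can’t tell them apart past about 17 digits. A small demo in plain Python (my own illustration, assuming IEEE 754 doubles):

    # Past roughly 17 nines, a double can no longer distinguish the value from 1.0.
    print(float("0." + "9" * 16) == 1.0)   # False: still lands just below 1
    print(float("0." + "9" * 17) == 1.0)   # True: rounds to exactly 1.0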