The equation 0.999... = 1 will always be a matter of contention, for a few reasons.
a. The equation is a convention. It's generally agreed to be true. The convention is not without its uses, but as conventions go, you will always have naysayers.
b. 0.000...1 cannot exist as a non-terminating decimal: it terminates at the decimal place where the 1 occurs. This means that you cannot subtract 0.999... from 1. If you don't believe me, try solving 0.999... + X = 1 within the constraints of Dedekind cuts (see the sketch after this list).
c. Until someone in the mathematical community has the spine to stand up and say 1/0 = ∞, one will always have to defend the idea that certain non-terminating decimals (the repeating ones) are rational while a non-terminating decimal like pi is irrational.
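To make (b) concrete, here is a minimal sketch of the arithmetic in standard notation; the framing is my own illustration, not part of the original exchange. Truncate 0.999... at any finite place and subtract:

\[
1 - 0.\underbrace{9\ldots9}_{n\text{ nines}} = 10^{-n} = 0.\underbrace{0\ldots0}_{n-1\text{ zeros}}1
\]

At every finite stage the remainder is a terminating decimal whose 1 sits in the n-th place. A candidate X = 0.000...1 for the full non-terminating 0.999... would need a last decimal place to hold that 1, and a non-terminating decimal has no last place, which is exactly the obstruction (b) describes.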
Those who simply dismiss this age-old discussion as a waste of time fail to recognize the importance of accurately defining the basic principles of math.
Keep the discussion going, Netorama. I'm glad to see someone in the mainstream bringing this discourse back to light.