There is no difference, not even an infinitesimally small one. 1 and 0.999… represent the exact same number.
They only look different because 1/3 out of 1 can’t be represented well in a decimal counting system.
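The usual thirds argument, spelled out (just making the multiplication explicit, nothing new):

```latex
\[
  3 \times \tfrac{1}{3} \;=\; 3 \times 0.333\ldots \;=\; 0.999\ldots \;=\; 1 .
\]
```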
Right, it’s only a problem because we chose base ten (a rather inconvenient number). If we did math in base twelve, 1/3 would simply be 0.4. It doesn’t repeat. So 1/3 = 0.4, and (0.4 × 3) = (0.4 + 0.4 + 0.4) = 1 in base twelve. No issues, no limits, just clean, simple addition. No more complicated than how 0.5 + 0.5 = 1 in base ten.
One problem in base twelve is that 1/5 does repeat: it comes out to 0.2497… repeating. But eh, who needs 5? So what, we have 5 fingers, big whoop, it’s not that great of a number. 6, on the other hand, what an amazing number. I wish we had 6 fingers on each hand; that’d be great, and we would have evolved to use base twelve, a much better base!
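A quick sketch to check both expansions (plain Python doing long division digit by digit; the helper name `expand_fraction` is just for illustration):

```python
def expand_fraction(numerator, denominator, base, digits=12):
    """Long division of numerator/denominator in the given base,
    returning the first `digits` fractional digits as a string."""
    symbols = "0123456789AB"  # A = ten, B = eleven in base twelve
    out = []
    remainder = numerator % denominator
    for _ in range(digits):
        remainder *= base
        out.append(symbols[remainder // denominator])
        remainder %= denominator
    return "0." + "".join(out)

# 1/3 terminates in base twelve; 1/5 repeats with period 4:
print(expand_fraction(1, 3, 12))  # 0.400000000000
print(expand_fraction(1, 5, 12))  # 0.249724972497
```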
I mean, there is no perfect base. But the 1/3 = 0.333… thing is to be understood as a representation of 1 split three ways.
Well, technically “infinitesimally small” means zero sooooooooo
Edit: this is wrong
An infinitesimal is a non-zero number that is closer to zero than any positive real number. An infinitesimal is what would have to be between 0.999… and 1.
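In symbols, that definition (just restating the sentence above):

```latex
\[
  \varepsilon \text{ is infinitesimal}
  \iff
  \varepsilon \neq 0 \ \text{ and } \ |\varepsilon| < r \ \text{ for every real } r > 0 .
\]
```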
You are correct and I am wrong; I always assumed it meant the same thing as a limit that goes to 0 as the variable goes to infinity.
It’s a weird concept and it’s possible that I’m using it incorrectly, too - but the context at least is correct. :)
Edit: I think I am using it incorrectly, actually, as in reality the difference is infinitesimally small. But the general idea I was trying to get across is that there is no real number between 0.999… and 1. :)
I think you did use it right tho. It is an infinitesimal difference between 0.999… and 1.
“Infinitesimal” means immeasurably or incalculably small, or taking on values arbitrarily close to but greater than zero.
The difference between 0.999… and 1 is 0.
It is possible to define a number system in which there are numbers infinitesimally less than 1, i.e. they are greater than every real number less than 1 (but are not equal to 1). But this has nothing to do with the standard definition of the expression “0.999…,” which is defined as the limit of the sequence (0, 0.9, 0.99, 0.999, …) and hence exactly equal to 1.
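Written out, that limit is just standard geometric-series arithmetic:

```latex
\[
  0.999\ldots
  \;=\; \lim_{n\to\infty} \sum_{k=1}^{n} \frac{9}{10^{k}}
  \;=\; \lim_{n\to\infty} \left(1 - 10^{-n}\right)
  \;=\; 1 .
\]
```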
Wait what
I always thought infinitesimal was one of those fake words, like gazillion or something
It sounds like it should be, but it’s actually a real (or, non-real, I suppose, in mathematical terms) thing! :)