I’m not a math person at all, so I’m not really debating your proof, but it seems to me that if 0.9• = 1, then what does 0.1• equal? It “fits” perfectly into the “space” between 0.9• and 1, but if 0.9•=1 then 0.1• should equal 0, right? Except it doesn’t, because 0.1<0.1• and 0.1 definitely isn’t 0.
I definitely understand why some religious people think numbers are a tool of Satan.
0.1111… is equal to 1/9, and it isn’t the gap between 0.9999… and 1. That gap is 1 − 0.9999… = 0.0000…, which is trivially equal to 0. (Notice that 0.9999… + 0.1111… = 1.1111…, not 1, so 0.1111… doesn’t “fit” in that space at all.)
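You can check this with exact arithmetic rather than trusting floats. Here’s a quick sketch (standard library only; the variable names are mine) that builds the n-digit truncations 0.99…9 and 0.11…1 as exact fractions and watches both gaps — to 1 and to 1/9 — shrink toward zero:

```python
from fractions import Fraction

# n-digit truncations of the repeating decimals, as exact fractions:
#   0.99...9 (n nines) = (10^n - 1) / 10^n
#   0.11...1 (n ones)  = (10^n - 1) / (9 * 10^n)
for n in (1, 5, 20):
    nines = Fraction(10**n - 1, 10**n)
    ones = Fraction(10**n - 1, 9 * 10**n)
    gap_to_one = 1 - nines            # shrinks toward 0
    gap_to_ninth = Fraction(1, 9) - ones  # also shrinks toward 0
    print(n, gap_to_one, gap_to_ninth)
```

Each extra digit divides both gaps by 10, so 0.999… lands exactly on 1 and 0.111… lands exactly on 1/9 in the limit — and the leftover “space” between 0.999… and 1 is 0, not 0.111….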