
Slypenslyde t1_iuiiryp wrote

Basically, our eyes and the tools we use don't have the precision to tell the difference.

Think about it: if I draw a line on a piece of paper and then mark a dot on that line, the center of the dot can't be EXACTLY any one value past a certain precision. And even if I could place it with high precision, your eye can't tell the difference between "1.33333333333333" and "1.33333333333334" without highly calibrated measuring tools and microscopes.
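Here's a rough Python sketch of that idea (just using the two values from above): the numbers really are different, but compared with any realistic tolerance they count as the same point.

```python
import math

a = 1.33333333333333
b = 1.33333333333334

print(b - a)               # ~1e-14: they really are different numbers
print(math.isclose(a, b))  # True: within a practical tolerance, "the same" point
```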

Likewise, on a computer screen it's hard to display anything between pixels accurately, so there's always a bit of fudging there, too.
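And a minimal sketch of the pixel side (the number-line range and screen width here are made-up values for illustration): mapping a position onto a row of pixels snaps it to the nearest whole pixel, so nearby values become literally indistinguishable on screen.

```python
def to_pixel(x, line_min=0.0, line_max=10.0, screen_width=800):
    # Map a number-line position onto one of screen_width pixel columns,
    # rounding to the nearest whole pixel.
    return round((x - line_min) / (line_max - line_min) * (screen_width - 1))

print(to_pixel(1.33333333333333))  # 107
print(to_pixel(1.33333333333334))  # 107 -- same pixel, so they look identical
```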

TL;DR: Number lines aren't 100% accurate. They're "close enough", and the definition of "close enough" depends on the tools being used.
