You're saying that Americans preferentially use base-2 fractions (nearly) everywhere except on computers; on computers you preferentially use base-10, even though said computers are going to store the values as base-2?
> "on computers you preferentially use base-10, even though said computers are going to store the values as base-2?"
I don't understand this comment at all. Most people are not computer programmers and therefore have no insight into how computers represent numbers. They just populate text fields in Excel, or whatever, with numbers. It's easier to use decimal points to do that than fractions.
When I'm not using computers or calculators, I do prefer American-style fractions over decimal point representations. Fractions are a lot easier for me to manipulate in my head. If I'm out in my garage, upside down, underneath a car, and trying to decide which drill bit to use, I don't have convenient access to the calculator on my cell phone or a pencil and paper, and reliable mental math is important. The fractions are easier, so that's what I want.
I believe they're saying it's ironic that, despite the base-2 values being ideal for binary computers, users have to enter them in base-10 and hence get none of the possible advantage. I don't believe they're making a value judgement.
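To illustrate the irony (a small Python sketch, not anything from the parent comments): dyadic fractions like 3/8 survive the trip through binary floating point exactly, while a "simple" decimal like 0.1 does not, because it has no finite base-2 expansion.

```python
from fractions import Fraction

# A base-2 fraction (3/8 = 0.375) is stored exactly in IEEE-754
# binary floating point, so converting back recovers the same value.
assert Fraction(0.375) == Fraction(3, 8)

# 0.1 has no finite binary expansion, so the stored double is only
# the nearest representable value, not 1/10 itself.
assert Fraction(0.1) != Fraction(1, 10)
print(Fraction(0.1))  # the exact rational the computer actually stored
```

So a woodworker typing "0.375" into a spreadsheet gets their 3/8 back exactly; typing "0.1" gets an approximation.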
A feature of some English measures is unbiased division into three equal parts, e.g. three teaspoons to the tablespoon, twelve inches to the foot, and twelve ounces to the pound (for one of several definitions of “ounce”).
Thirds fit neatly into neither binary nor decimal. That is what the comment refers to.
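The underlying rule (my addition, not from the thread): a reduced fraction has a finite expansion in a given base exactly when every prime factor of its denominator divides the base. A quick sketch, with a hypothetical helper `terminates`:

```python
from fractions import Fraction
from math import gcd

def terminates(frac: Fraction, base: int) -> bool:
    """True iff `frac` has a finite (non-repeating) expansion in `base`.

    Works by stripping from the denominator every prime factor it
    shares with the base; anything left over forces a repeating tail.
    """
    d = frac.denominator
    while (g := gcd(d, base)) > 1:
        while d % g == 0:
            d //= g
    return d == 1

third = Fraction(1, 3)
assert not terminates(third, 2)    # repeats in binary
assert not terminates(third, 10)   # repeats in decimal
assert terminates(third, 12)       # finite in base 12: 1/3 = 0.4 duodecimal
```

This is why a dozen-based measure divides cleanly into thirds while neither binary nor decimal notation can write 1/3 exactly.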
> "You're saying that Americans preferentially use base-2 fractions (nearly) everywhere except on computers; on computers you preferentially use base-10, even though said computers are going to store the values as base-2?"
That's the funniest thing I've heard all week.