Hi, I think in metric units, so almost everything is some form of a power of 10, like a kilogram is 1000 grams, etc.

Sometimes I will think of an hour and a half as 150 minutes before remembering that it is 90 minutes.

Does something similar happen to imperial units users? Because as far as I understand, you don’t have obvious patterns that would cause you to make these mistakes, right?

  • TheRealKuni@lemmy.world
    8 months ago

    You don’t need the extra gradations. Trust me.

    And if you don’t trust me, do what I did!

    (I will preface this by saying “the best unit is the one understood by the audience.” So there is obviously no reason to do this if it doesn’t interest you. But I enjoyed it!)

    I’m American, raised on Fahrenheit. I long used the argument that Fahrenheit was really good for humans, because it has lots of specificity and it describes a range that represents weather for temperate climates.

    But I decided back in 2019 to learn Celsius. This was precipitated (hah! weather pun) by a trip to the UK and memorizing a few points so I’d be able to understand the weather if someone told me. Specifically I memorized 10°C = 50°F, 20°C = 68°F, 30°C = 86°F. Halfway between those also: 15°C = 59°F, 25°C = 77°F. Then if someone told me the temperature in Celsius I could find my nearest memorized point and move 2°F per °C (a close enough approximation). So for 22°C, start at 68°F and add 4 to reach 72°F. (The actual value is 71.6°F, so this is clearly accurate enough for weather.)
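    The nearest-anchor trick described above can be sketched in a few lines of Python (the anchor table and function names here are my own, just to illustrate the arithmetic):

    ```python
    # Anchor points memorized above, °C -> °F.
    ANCHORS = {10: 50, 15: 59, 20: 68, 25: 77, 30: 86}

    def approx_fahrenheit(celsius):
        """Approximate °F by moving ~2 °F per °C from the nearest anchor."""
        nearest = min(ANCHORS, key=lambda a: abs(a - celsius))
        return ANCHORS[nearest] + 2 * (celsius - nearest)

    print(approx_fahrenheit(22))  # 72, vs. the exact 22 * 9/5 + 32 = 71.6
    ```

    The error never exceeds 1°F within 2.5°C of an anchor, since the true slope is 1.8°F per °C.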

    After I got back from the UK I decided to just keep using Celsius as an experiment. After all, I had been saying for years that Celsius was better for science and Fahrenheit was better for weather, why not test the hypothesis?

    Well, it’s been almost 5 years and I’m still using Celsius for weather.

    It turns out there are two major things making Celsius better for the weather. 1: having too much specificity actually hurts the scale. A single degree Fahrenheit doesn’t carry enough meaning, so it’s harder to have a sense of how much change it represents. Once I started to get a feel for Celsius (which did take a few weeks or months), it was remarkable how quickly I attained a sense for what those degrees communicated.

    But the much more important one is point 2: freezing is 0.

    I didn’t think this would matter as much as it did, but oh boy is it fantastic. Temperatures below freezing in Fahrenheit never really meant much to me. They were just cold. But since in Celsius they are just negatives, I can actually understand them much more easily.

    That is to say, 23°F doesn’t really mean anything to me, but -5°C means “as far below freezing as 5°C(41°F) is above freezing.”

    (Anyway, your chart of temperature ranges to words maps just fine in Celsius if you use 5s. <-15, -15 to -10, -10 to -5, -5 to 0, 0 to 5, etc.)

    • sorghum@sh.itjust.works
      8 months ago

      This might be the first time I’ve been told that more specificity in a measurement is bad, lol. I use both imperial and metric every day. As an American, cooking in the kitchen was my entry point. Calculating percentages for recipes is always easier in metric. Short distances when working on projects are easy enough too. The finer gradations of millimeter wrenches over fractional-inch ones were the main reason I wanted to switch in the first place. Which brings me to the problem I’ve always had with temperature: I’d rather have the extra gradations for weather, but I’m fine with Celsius everywhere else, especially in applications where I measure temps near water’s boiling point, for instance filament temps for 3D printing or CPU/GPU temp monitoring.

      • Loki@feddit.de
        8 months ago

        Not trying to be mean, but why not use fractions if you need to be more precise? If you need to express “halfway between 20°C and 21°C” you could just say 20.5°C.