• ped_xing [he/him]@hexbear.net
    1 year ago

    When it’s above 100, people who have the option of somewhere cooler will generally take it, and similarly for anything under 0. OK, so as PancakeLegend@mander.xyz pointed out, such sensitivities might be specific to US culture, but theoretically, how much would we have to expand the 0-100 Fahrenheit range so that 0 is too cold for pretty much everyone and 100 is too hot for pretty much everyone? Maybe 0 goes to -10 and 100 goes to 140? A new-Fahrenheit degree would still be finer than a Celsius degree.
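    A quick back-of-the-envelope check of that last claim (a sketch in Python, assuming the old -10 to 140 °F span gets relabeled 0 to 100 on the new scale):

```python
# Relabel the old -10..140 °F span as 0..100 "new-Fahrenheit"
# and see how big one new degree is in Celsius.
f_low, f_high = -10.0, 140.0       # proposed endpoints in old °F (assumption)
span_f = f_high - f_low            # 150 old Fahrenheit degrees
span_c = span_f * 5 / 9            # same span in Celsius, about 83.3
new_degree_in_c = span_c / 100     # size of one new-Fahrenheit degree
print(new_degree_in_c)             # about 0.833 °C, still finer than 1 °C
```

    So even stretched that far, each new degree covers roughly 0.83 °C.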

    • Xavienth@lemmygrad.ml
      1 year ago

      My point is that “really hot” and “really cold” are not useful reference points, no matter what numbers you’re using. If I were coming up with a measurement system for brightness and I said 1000 was “really bright”, would you be able to tell me anything about 500? No, because you literally have no reference frame for what I mean by “really bright”. It’s the same thing when Americans describe Fahrenheit to the rest of the world. You have to experience the data points, and at that point, whether you use 0 to 100, -20 to 40, or 250 to 310, it doesn’t matter. You will just intuitively understand the scale, so there’s no inherent benefit.