• Kusimulkku@lemm.ee
    1 year ago

    It’s better to pick the scale that does conform to it for the vast majority of applications, and then just deal with the others, either by using C or by simply living with it. For every one time you need to deal with your computer’s temps, you’ll interact with the environmental temperature a thousand times. And neither C nor F is inherently better for describing CPU temps.

    I mean neither conforms very well; that’s the whole point. And what’s the deal with 0-100? Why is that so beneficial in your opinion?

    And neither C nor F is inherently better for describing CPU temps.

    Well yeah, it was simply about the 0-100 thing.

    Oh, I forgot to pull out my cooking manual. Yeah, C is MUCH better.

    Wait till you see the ovens. It’s incredible. There are usually only a few temps you need to care about, and they change in 20-degree steps. Incredible, I know.