I like to refer to them as Freedom units and Communist units (in jest, obviously). I will say, though, that Fahrenheit feels like a more precise scale for measuring temperature even if the units are goofy.
What additional arguments besides personal experience would you give to back this precision claim?
Temperature scales are arbitrary by nature, and the criteria behind their definitions can be more or less useful. Fahrenheit’s aren’t especially useful compared to Celsius’s or Kelvin’s.
I’m not arguing on Fahrenheit’s behalf or saying it IS more precise. I just said it “feels” more precise because you get finer increments in whole numbers. 70 degrees F is about 21 degrees C, while 90 degrees F is about 32 degrees C: a span of 20 degrees in F versus about 11 in C, which feels more precise. It’s the same way metric length measurements feel more precise, because there are whole-number millimeters rather than fractional inches.
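For what it’s worth, the standard conversion is C = (F − 32) × 5⁄9, so you can sanity-check the increments with a couple of lines (the function name here is just for illustration):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

print(round(f_to_c(70), 1))               # 21.1
print(round(f_to_c(90), 1))               # 32.2
print(round(f_to_c(90) - f_to_c(70), 1))  # 11.1 — a 20 F-degree span covers about 11 C degrees
```

Since each Celsius degree is 1.8 Fahrenheit degrees, any span in F shrinks by that factor in C, which is the whole “more whole numbers” effect.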
I have no strong opinion either way, other than feeling that everyone should endeavor to be comfortable converting between the various systems of measurement.
You can simply use as many decimals as you want to make Celsius more precise. You don’t see that done in general because it really isn’t needed.
The little digital thermometers I have around the house read to one decimal place. The precision argument is just bizarre.
I don’t get the precision argument. It really doesn’t matter for personal use, because you wouldn’t feel the difference anyway, and if you really needed it to be as precise as possible (for… I don’t know, science) you’d use decimals. And if you’re sciencing, you’d use the system that allows easy conversion, which is metric.
I’m scared to ask now if Fahrenheit has decimals or if it’s like 74 and one eighth degrees.