Having lived in places that used both systems, I have to say - I’m fully on board with distance and weights in metric, but I’ve been less on board with temperature. The Celsius scale is good for science, but less useful for everyday human purposes than Fahrenheit. Fahrenheit zooms in closer to the human experience of temperature (roughly 0°F/-18°C to 100°F/38°C) and so allows slightly more variation when describing temperature in sets of ten (that 100-degree range in Fahrenheit spans only about 56 degrees in Celsius, which makes Celsius feel roughly half as detailed in conversation). Anything below 0 in Fahrenheit is unbelievably cold. Anything above 100 is unbelievably hot. Celsius centers on freezing and boiling, which I get, but that’s not terribly useful for daily human purposes, namely weather. The temperatures from around 40 to 100 in Celsius aren’t useful to humans; it’s all just “really fucking hot”. So I give a big thumbs up to everything metric except for Celsius.
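To make the span arithmetic above concrete, here is a quick sketch using the standard Fahrenheit-to-Celsius conversion (the helper name is just for illustration; endpoints are rounded to one decimal):

```python
def f_to_c(f: float) -> float:
    """Standard Fahrenheit-to-Celsius conversion."""
    return (f - 32) / 1.8

low_c = f_to_c(0)        # -17.8 °C
high_c = f_to_c(100)     # 37.8 °C
span_c = high_c - low_c  # ~55.6: the 0-100 °F band covers about 56 Celsius degrees

print(f"0 °F = {low_c:.1f} °C, 100 °F = {high_c:.1f} °C, span = {span_c:.1f} °C")
```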
It’s not like you need 100 degrees of granularity to describe the weather. Also, 0 degrees Celsius is by far the most influential temperature in day-to-day life, at least if you live somewhere it occurs.
The boiling point of water is also a fairly important temperature, and it doesn’t take much brainpower to figure out roughly where between freezing and boiling you like to be.
Plus it’s nice that the reference points are objective: you can build a Celsius thermometer that’s reasonably accurate without much work, and I feel like it makes relating to temperatures outside the 0-100 range easier too.
The amount of energy it takes to bring a volume of water from freezing to boiling is plainly observable, and the interval is a nice size for comparing to, e.g., the melting points of metals.
If we could start from scratch, I would define an absolute temperature scale where water freezes at 500, roughly 1.83x the Kelvin scale.
So anything in the 4xx range is below freezing, and the maximum survivable temperature is around 570. (Water boils at 683, but freezing and boiling can’t both be round numbers on an absolute scale.)
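A minimal sketch of that hypothetical scale, assuming only what’s given above (freezing water sits at exactly 500, so the factor relative to Kelvin is 500/273.15); the function name is illustrative:

```python
# Hypothetical absolute scale: water freezes at 500,
# so the factor relative to Kelvin is 500 / 273.15 ≈ 1.83.
FACTOR = 500 / 273.15

def celsius_to_proposed(c: float) -> float:
    """Convert Celsius to the hypothetical 'freezing = 500' absolute scale."""
    return (c + 273.15) * FACTOR

print(celsius_to_proposed(0))    # 500.0   -> freezing
print(celsius_to_proposed(100))  # ~683.05 -> boiling, matching the 683 above
print(celsius_to_proposed(38))   # ~569.6  -> in the ballpark of the ~570 figure above
```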
But we do tend to round to the nearest half degree when discussing temperature in the UK. Do people do that with temps in the US, or just round to the nearest degree? If it’s the latter, then the two are similarly granular in practice.
People round to the nearest degree in the US. But that’s kind of my point: it’s more awkward to throw in fractional temperatures, and the fact that you need to shows that Celsius isn’t expanded enough on its own. In Canada, in my anecdotal experience, people actually haven’t been rounding to the nearest half degree, just the nearest degree, which makes the scale feel less granular.
Not to knock anyone - everyone invariably likes what they’re used to better, and I usually get a lot of pushback for this opinion. But that’s my point - I concede that, even with my familiarity with miles and pounds, kilometres and kilograms are better units of measurement. The wonder of the metric system is the simple ratios in multiples of 10, but temperature is a realm where that advantage doesn’t exist. And on an objective level, I think Fahrenheit has a better argument for function.
Knowing what properties water will have is really useful in day-to-day life, though.
I mean - boiling is boiling, right? Do you ever really need to measure whether your water is boiling in daily life? I would concede that it’s useful to know more easily when water will freeze when it comes to the weather; it’s really the higher end of the Celsius scale that I’m critical of. Fahrenheit could share Celsius’s 0 and my criticism would have less bite. Though Fahrenheit’s logic around 0 is that anything below 0 weather-wise is exceedingly rare and momentous in northern climates, and I think that makes sense as an argument. Negatives in Celsius are common (at least in North America), but a negative in Fahrenheit is mouth-gaping, dreadful levels of cold. That’s at least as intuitive to me as having 0 be freezing, since 0 implies the bottom of the scale.
It’s intuitive for you, but anything is intuitive if you grow up with it. Fahrenheit is anything but intuitive for me, even though, thanks to the internet, I’ve been confronted with it for years now.
Both have their pros and cons in day-to-day life; it’s about getting used to it. The switch to scientific applications is easier with Celsius, however…
But the point of the reference values is to be… referenceable…
How do you reference 0°F? Referencing 0°C is trivial (an ice-water mixture), albeit not strictly accurate, and 100°C is equally trivial (boiling water), though only accurate to within something like 5 degrees depending on altitude.
And this isn’t a theoretical issue: it’s genuinely useful to be able to double-check that a thermometer is at least roughly correct.