Americans always regurgitate the “Fahrenheit is how people feel” nonsense, but it is just that: nonsense. Americans are familiar with Fahrenheit so they think it is more intuitive than other systems, but unsurprisingly people who are used to Celsius have no problem using it to measure “how people feel” and find it a very intuitive system.
Can confirm. Moved from the US to Canada and maybe a year of using Celsius revealed to me just how fucking stupid and convoluted Fahrenheit is. My dad spent three weeks out here and started using Celsius on his phone. Now I only use Fahrenheit when dealing with fevers or temping cases of suspiciously overripe produce.
Fellow Americans: Celsius is superior and more intuitive for those who take a moment to adjust to it. It is okay to accept this as fact without developing an inferiority complex. USA not always #1. USA quite often not #1 and that is okay. It is okay for USA to not be #1 without developing an inferiority complex.
deleted by creator
Fahrenheit is European.
*was
I use it and I am not European.
Fahrenheit has a fine granularity that is lost in cold climates. It’s why the Bahamas/Belize use it as well.
Well, you know that you can use decimals?
How is -40.000001°F finer than -40.00000000001°C?
23°C is a nice room temperature.
18°C is a bit chilly but still a comfortable temperature.
If you want to go for a finer distinction, then we can say 18.5°C is warmer, but I personally can’t feel the difference.
Our bodies are mostly water, so why not use a system that reflects this?
The universe is mostly empty space with an average temperature of like… 4 Kelvin or some shit. Why not use a system that reflects that? Oh, we do? Right. Kelvin is just Celsius + 273.15.
Are you made of mostly empty space? Your response does leave me questioning. Please acknowledge that you are made of 64% water and not 4°K of nothing.
I mean, yeah, we all are. That’s how atoms work.
alternatively, yeah, mostly between his ears.
As a matter of fact…
Please do not use Kelvin with a degree symbol. There is no “degree Kelvin”.
Please make sure you are right before you correct someone https://www.techtarget.com/whatis/definition/kelvin-K
…rankine glowers in your general direction…
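For anyone who wants to sanity-check the offsets being tossed around in this sub-thread, here is a minimal sketch; the function names are just illustrative, and the constants are the standard scale definitions.

```python
# Kelvin and Rankine are absolute scales; Celsius and Fahrenheit are shifted versions of them.

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15        # same size of degree, zero moved down to absolute zero

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32    # smaller degree (5/9 °C each) and a different zero point

def fahrenheit_to_rankine(f: float) -> float:
    return f + 459.67        # Fahrenheit-sized degrees counted up from absolute zero

print(celsius_to_kelvin(20))        # 293.15 K for a 20 °C room
print(celsius_to_fahrenheit(-40))   # -40.0: the two scales cross at -40
print(fahrenheit_to_rankine(32))    # 491.67 °R at the freezing point of water
```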
So we should use the system that puts the freezing and boiling points of water at nice round values such as 0 and 100, then? Sounds like Celsius is the better system.
Slightly off topic, but 23°C is a nice room temperature? We have our thermostats at 20°C and I find it quite warm. In the bedroom we have 18°C, and the same in my office, which I find quite comfortable. I hate visiting my parents; they always have 22.5°C, which I find uncomfortably warm.
Well, it’s all subjective after all; I’ll be happy about a chilly 23°C inside when summer comes.
I can feel the difference between 71 and 73 in my house.
At 73, my kids room is uncomfortably hot. At 71, it has a perfect chill for sleeping.
What is your point? That people who use Celsius can’t feel the difference between 21.7°C and 22.8°C?
If you’re worried about your thermometer, you’ll be happy to hear that metric ones usually have finer precision than Fahrenheit ones, since they go in 0.5°C steps. Since +1°F means +5/9°C (about 0.56°C), you have less precision!
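To put numbers on that precision claim, a quick back-of-the-envelope check (nothing assumed beyond the 5/9 ratio and typical half-degree thermostat steps):

```python
# Size of one Fahrenheit degree expressed in Celsius degrees.
fahrenheit_step_in_c = 5 / 9   # ≈ 0.556 °C
thermostat_step_in_c = 0.5     # typical half-degree step on metric thermostats

print(fahrenheit_step_in_c)                          # 0.5555...
print(fahrenheit_step_in_c > thermostat_step_in_c)   # True: whole-°F steps are coarser
```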
The point was they need that extra decimal because C isn’t good for human temperature sense.
It’s not like you are prohibited from using decimals in Fahrenheit. It’s that you don’t need 3 digits because it works better for people.
And fuck you for making me defend the most ass backwards measurement system on the planet.
It’s just an incredibly weak defense. Why is it worse for C to use an extra decimal for these differences? I can just as well argue that C is a more accurate representation, because small differences in temperature are smaller. Just like your argument, this is purely an opinion - until you can show me that not needing the extra decimal is objectively better, or until I can show you that smaller differences being represented as such is objectively better, neither of them holds any weight.
It’s the same reason we use abbreviations and contractions when speaking. A trivial simplification is still a simplification.
Why bother with Celsius at all when there is Kelvin? Even Kelvin is arbitrary. Best to use Planck normalized temperature. The scale would run from 0 to 100, where 0 is absolute zero and 100 is 10^32 Kelvin.
So whenever you have to tell someone the temperature outside, you say it’s 0.00000000000000000000000000029315 Planck
If 3 digits isn’t a tiny bit more cumbersome than 2, then 32 digits is fine too.
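If you want to check that many-zeroed figure, here is a minimal sketch of the proposed scale, assuming (as above) that 100 maps to exactly 10^32 K; the helper name is made up for illustration.

```python
# Map kelvin onto the proposed scale: 0 = absolute zero, 100 = 1e32 K.
def kelvin_to_planck_scale(t_kelvin: float) -> float:
    return t_kelvin / 1e32 * 100

# A mild 20 °C day is 293.15 K:
print(kelvin_to_planck_scale(293.15))   # ≈ 2.9315e-28, i.e. 0.00000000000000000000000000029315
```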
Dude, 71 is way too warm for sleeping; try 64-65, it’s healthier.
I don’t know if my thermostat is just wrong or if the layout of my house makes it inaccurate, but 64-65 in my house is frigid.
Plus we have a baby so 67-68 is really the lowest we could go at night I think.
But I agree, I sleep better in general when the blankets are warm and the house is cold!
Well it’s all subjective, I guess. Also depends on where you live.
I would argue it’s because of historical usage, familiarity, and resistance to change. Most countries and most people living in hot climates use Celsius.
It is really easy to map onto human feel though. 0-100 pretty accurately maps onto our minimum and maximum realistically survivable temps, long-term, and the temperatures in the middle are the most comfortable. It’s far more round, when it comes to describing human preference and survivability, than Celsius is.
I wanna say that with this logic 50 should be right around the most comfortable temp… But for most people it’s closer to 70.
I’ll try to explain how easily mappable Celsius is to people as well.
-40 to +40… -40 being extremely cold, and +40 being extremely hot. 21°C is the equivalent of 70°F.
It’s all the same stuff. Just matters what you’re used to.
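A rough sketch of that mapping, for anyone who wants to translate between the two “human feel” ranges; this is just the standard conversion formula applied to a few sample points.

```python
# Translate the -40 to +40 Celsius range quoted above into Fahrenheit.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

for c in (-40, 0, 21, 40):
    print(f"{c:>4} °C = {c_to_f(c):6.1f} °F")
# -40 °C = -40.0 °F   (the two scales meet here)
#   0 °C =  32.0 °F   (freezing)
#  21 °C =  69.8 °F   (roughly the "70 °F" room temperature)
#  40 °C = 104.0 °F   (extremely hot)
```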
0-150 is the better range, and 75 is right in the middle. 100 is just a hot air temperature most people don’t want to be in but it’s not an extreme.
Saunas can get up to 200 degrees
Hot tubs are usually at 100
Freezers need to be at least 0
You say 15°C. 6° cooler than room temperature. But how much is 6°?
It’s 60°F.
50°F or 10°C is where you need clothes to survive
300, 325, 350 is where you bake cookies (149-176°C)
Fahrenheit has a bunch of 5s and 10s
Saying something like high 70s or low 70s for temp represents an easy way to tell temperature.
21°C to 26°C for Celsius
I walk outside and say “It feels like high 70s today”; someone using Celsius would say, “Feels like 25°”. If it was a little warmer, then “low 80s” compared to “Ehh, about 26 or 27°C”.
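For the Celsius-natives, here is roughly where those Fahrenheit landmarks land; ordinary conversion, with the labels made up to match the list above.

```python
# Convert the Fahrenheit reference points listed above to Celsius.
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

landmarks = {
    "sauna (max)": 200,
    "hot tub": 100,
    "freezer": 0,
    "cookie oven (low)": 300,
    "cookie oven (high)": 350,
}
for name, f in landmarks.items():
    print(f"{name}: {f} °F ≈ {f_to_c(f):.1f} °C")
# sauna (max): 200 °F ≈ 93.3 °C
# hot tub: 100 °F ≈ 37.8 °C
# freezer: 0 °F ≈ -17.8 °C
# cookie oven (low): 300 °F ≈ 148.9 °C
# cookie oven (high): 350 °F ≈ 176.7 °C
```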
Yeah, I get your point. I think I’m just trying to explain that it all just matters where you grew up and what you used. I go outside today and I do say it feels like a 12 degree day. It’s not that much different.
I must admit, the oven temps are nice, but they are a product of being written in Fahrenheit (if they were written in Celsius, they would be round too, like 150°C, 160°C, 170°C, 175°C, etc.)
But the more I look at it, the more I see it’s all just numbers. We put importance on these numbers, but they’re all pretty arbitrary, except Celsius using 0 as the freezing point of water and 100 as the boiling point; these are two very important measures that land on weird numbers in Fahrenheit.
When do you use 0°C and 100°C?
Those are also at standard pressure, and most people do not live at sea level.
I don’t put a thermometer in my water to make sure it is boiling, or in it to make sure it freezes.
It can snow and roads can ice over before the air hits 0°C.
It has no real-world applications.
Why is it okay to say high 70s/low 80s and not high 20s? No one goes outside and says, “Ehh, it feels like 26.6°C today.” We just know it is a bit warmer than 25.
I bet a lot more people know what 0°C feels like than 0°F. One is the freezing point; the other is a completely arbitrary temperature which only gets called “the lowest you’ll experience” as a post hoc rationalisation of Fahrenheit. Most people will never experience anything that cold, and some people experience colder.
I’d even bet more people know what 100°C feels like than 100°F. One is accidentally getting scalded by boiling water; the other is a completely arbitrary temperature which is quite hot but not even the hottest you’ll experience in America.
What? People experience 100°F regularly. It’s literally their body temperature.
100F is a fever; if you’re experiencing those regularly you should go see a doctor.
Boiling water isn’t necessarily 100°C. If you’re boiling water, it can be any arbitrary temperature above 100.
That’s like going to a geyser pit and saying that’s 100°C, when it isn’t. When you cook and let water come to a boil, the chef doesn’t care that it’s exactly 100°C, only that it’s in the state above 100.
That’s not how boiling works. The water heats up to its boiling point, where the temperature stops rising and the water boils. While boiling, the temperature does not increase; it stays exactly at the boiling point. This is called “latent heat”: at its boiling point, water will absorb heat without increasing in temperature until it has absorbed enough for its phase to change.
There is an exception to this called superheating
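To put a number on the latent-heat point, a small worked example with standard textbook values (the exact figures shift slightly with pressure):

```python
# Energy to take 1 kg of water from 20 °C to fully boiled away at 100 °C.
mass = 1.0                # kg
c_water = 4.186           # kJ/(kg·K), specific heat of liquid water
latent_heat_vap = 2257.0  # kJ/kg, latent heat of vaporization at 100 °C

heating = mass * c_water * (100 - 20)   # ≈ 335 kJ just to reach the boiling point
boiling = mass * latent_heat_vap        # ≈ 2257 kJ for the phase change, with no temperature rise

print(heating, boiling)   # the phase change takes roughly 6-7x the energy of the heating
```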
Both are equally arbitrary. You just have to know a handful of temperatures that you use in your day to day life either way.
Celsius is more intuitive for like science or lab work but for day to day use either one is really arbitrary based on what you’re used to.
I mean, you’re 100% wrong. Fahrenheit isn’t “how people feel” arbitrarily, it’s almost literally a 0-100 scale of how hot it is outside. You need no prior knowledge to interpret a Fahrenheit measurement. Which really reflects poorly on everyone who says “Fahrenheit doesn’t make any sense”, because if they were capable of any thought at all they would figure it out in 2 seconds, like everyone else. I’m a lab rat that uses Celsius all day every day; I’m just not a pretentious, stuck-up tool about alternate measurements just because I refuse to understand them.
I like that Fahrenheit has a narrower range for degrees. 1°C is 1.8°F. So F allows you to have more precision without the use of decimals. Like, 71°F feels noticeably different to me than 64°F, but that is only about a 3.9 degree difference in C.
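For the record, temperature differences convert with just the 5/9 ratio (no offset), so that gap works out like this:

```python
# A temperature difference in Fahrenheit converts to Celsius with the 5/9 ratio alone.
def delta_f_to_c(delta_f: float) -> float:
    return delta_f * 5 / 9

print(delta_f_to_c(1))        # ≈ 0.56 °C per 1 °F
print(delta_f_to_c(71 - 64))  # ≈ 3.89 °C for the 7 °F gap mentioned above
```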
But that also doesn’t matter, because the granularity is meaningless if you don’t make decisions based on the difference between 71°F and 70°F.
Not at those exact temperatures, but one degree matters in grilling meat, making mash for beer, making candy, etc.
Sure, but you should be using Celsius for those things. That’s the main argument here.
You win best username. I’m assuming you’re a Linux nerd as well. <3
Where in the chicken I jam the thermometer makes several degrees difference. If you truly require that level of granularity whilst grilling, I’d wager reading a decimal figure isn’t the end of the world. Us normies can continue to bring chicken to 74°C and call it a day.
3 degrees Celsius is easily noticeable too, so that’s a bit of a moot point. If anything, 1 degree Celsius is much harder to discern, and therefore an even more granular scale is unnecessary.