[email protected] 2 points 11 months ago

It’s better to pick the scale that does conform to the 0-100 range for the vast majority of applications, and then just handle the exceptions, either by switching to C or by living with it. For every one time you need to deal with your computer’s temps, you’ll interact with the environmental temperature a thousand times. And neither C nor F is inherently better for describing CPU temps.
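(For reference, a rough Python sketch of what each scale's 0-100 span covers in the other, using the standard conversion formulas; the weather reading for Fahrenheit is just the usual rule of thumb, not anything official:)

```python
# Standard Celsius/Fahrenheit conversions
def c_to_f(c):
    return c * 9 / 5 + 32

def f_to_c(f):
    return (f - 32) * 5 / 9

# 0-100 °C spans water freezing to water boiling:
print(c_to_f(0), c_to_f(100))                      # 32.0 212.0 °F
# 0-100 °F spans (very roughly) "very cold" to "very hot" weather:
print(round(f_to_c(0), 1), round(f_to_c(100), 1))  # -17.8 37.8 °C
```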

I mean, neither conforms very well; that's the whole point. And what's the deal with 0-100? Why is that so beneficial, in your opinion?

> And neither C nor F is inherently better for describing CPU temps.

Well yeah, it was simply about the 0-100 thing.

Oh, I forgot to pull out my cooking manual. Yeah, C is MUCH better.

Wait till you see the ovens. It's incredible. There are usually only a few temps you need to care about, and they change in 20-degree marks. Incredible, I know.
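(A quick sketch of those marks, assuming the usual 160-220 °C oven settings, converted with the standard formula:)

```python
# Typical Celsius oven settings (assumed range) and their Fahrenheit equivalents
for c in (160, 180, 200, 220):
    print(f"{c} °C ≈ {c * 9 / 5 + 32:.0f} °F")
# 160 °C ≈ 320 °F, 180 °C ≈ 356 °F, 200 °C ≈ 392 °F, 220 °C ≈ 428 °F
```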